Does AI turn good into good enough, or does it make good better?
This tricky question applies especially to AI-created music and other content, much of which most of us do not realize is machine-generated. AI-produced music generally cannot be copyrighted because the U.S. and other governments do not consider it the work of a human creator.
But that won’t necessarily stop providers and streaming services from attempting to profit from synthetic content. The bigger question still: what does this output do to the DNA of music, and does it condition listeners to accept okay as opposed to unique?
Platforms that provide AI-generated songs typically learn, without license, from existing musical data (like vocals, chords, and rhythms), not all of it in the public domain, which enables them to compose new music in various genres and styles.
Doubling Down
Between January and April, the number of AI-generated tracks uploaded daily to Deezer, a popular French music streaming service, doubled from 10,000 to 20,000, or roughly 18% of all its uploads.
The problem may be less a technology issue than a human one. Songwriters, musicians, and audiences increasingly need to defend their turf against the inevitable onslaught by machines capable of convincingly mimicking their styles and genres, and even their voices. Discerning where human creativity ends and AI algorithms begin is neither simple nor fair.
Some might say, “Who cares, if the music is worth listening to?” Not everyone would agree.
Let’s not minimize people’s originality or competitive nature, including that of musicians. Humans are built to survive. They will do whatever they must to prevent machines from replicating the full depth of their personality. Despite what some experts say about machines eventually becoming smarter than humans, I would like to think we humans will always be a vital step ahead of even the most advanced LLMs. Think of it as parents and their adolescent children. IQ does not always represent intelligence.
Approximating the Human Touch
The way a creator or inventor is influenced is not the same as how an algorithm aggregates, processes, and deploys data to approximate human output. No matter how much data or processing power they have, machines can only mimic human thought based on what they are provided, i.e., the data and content they have access to.
With the still-evolving relationship between LLMs and humans in mind, Deezer reports that approximately one-fifth of all tracks now being uploaded to its platform are fully generated by artificial intelligence. That’s right – they are created by machines calculating what they believe people want to hear. Some people find this output acceptable. Many cannot tell the difference.

Deezer disclosed in April, reports Music Business Worldwide, that “more than 20,000 AI-generated tracks are being delivered to its platform every day – around double the 10,000 daily AI uploads in January.”
“AI-generated content continues to flood streaming platforms like Deezer and we see no sign of it slowing down,” Aurelien Herault, Chief Innovation Officer at Deezer, told MBW.
Deezer launched an AI detection tool in January, after filing two patent applications for the technology in December.
Deezer’s new tool “can detect artificially created music from a number of generative models such as Suno and Udio, with the possibility to add on detection capabilities for practically any other similar tool as long as there’s access to relevant data examples.”
Herault said that this tool is helping the company filter fully AI-generated tracks from algorithmic recommendations for its 9.7 million subscribers.
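For readers curious about the mechanics, a detector of this kind is, at its core, a classifier trained on labeled examples, which is why access to “relevant data examples” matters. The sketch below is purely illustrative and is not Deezer’s actual system; it uses made-up feature vectors standing in for audio descriptors of human-made and AI-generated tracks.

```python
# Illustrative only: a toy supervised detector, NOT Deezer's actual system.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical 8-dimensional feature vectors standing in for audio descriptors
# (e.g., spectral statistics) extracted from human-made and AI-generated tracks.
human_tracks = rng.normal(0.0, 1.0, size=(200, 8))
ai_tracks = rng.normal(0.6, 1.0, size=(200, 8))

X = np.vstack([human_tracks, ai_tracks])
y = np.array([0] * 200 + [1] * 200)  # 0 = human, 1 = AI-generated

# Train on part of the labeled data, then measure accuracy on held-out tracks.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Held-out accuracy:", model.score(X_test, y_test))
```

The point of the sketch is simply that such detectors learn from examples of both kinds of music, so their coverage can be extended to new generative tools as long as labeled samples are available.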
For centuries, songwriters have attempted to create appealing, genre-based music, with mixed success. Will AI do a better job? Not likely.
Does it matter whether listeners know which music is human-made and which is machine-generated? Should creators know when they are being influenced by other artists, who are influenced by still others, or by machines that aggregate content drawn from other music?
How do we feel about music aggregators generating licensing fees from selections that are arguably unoriginal and based on the copyrighted content of others?
As the recently filed Deezer patents point out, businesses, consumers and creators need weapons in the battle over AI creation – our own bots and algorithms that can detect the misdeeds of other bots and algorithms, and make transparent what is AI-generated and what is AI-assisted.
Emmy Award-winning music producer and inventor Albhy Galuten, whose albums have generated more than 100 million sales, says, “AI may eventually drive audiences back to live performance. They may grow to identify and appreciate the intimacy of a club and the fingerprints of the human touch.”
The Battle of Good vs. Good Enough
My greatest fear about AI is losing the battle of good vs. good enough. I get vaguely credible pitches all the time for people to appear on my podcast, “Understanding IP Matters,” that are clearly AI-generated. To the sending parties, PR firms for tech businesses and law firms, they may appear credible enough to grab the attention of their targets.
If you think of communicating as merely a numbers game, then AI is both efficient and sufficient. The difficulty of discerning with the naked eye or ear fact from fiction in almost all communication and content is spreading like wildfire. It is more insidious than it may at first appear.
Commodifying content like music, images, and writing, as well as inventions, through questionably fair data aggregation, volume, and speed is a significant fear. In this context, creation becomes less meaningful, little more than a vast array of digitized content reassembled. In some cases AI output may be sufficient, but humans as a race need to get better at determining who provided what in a work. As one pundit put it, GenAI responses are incomplete, often more of a parlor trick than a complete answer. Greater transparency would help.
Deezer has been among the most aggressive digital service providers (DSPs) when it comes to detecting AI-generated content, “noise” tracks meant to skim royalty revenue, and other low-quality content.
The company announced last year that it had deleted 26 million “useless” tracks from its platform following the rollout of its artist-centric payment model.
A report released late last year by CISAC, the global umbrella group for authors’ societies, estimated that AI could “cannibalize” up to 24% of music creators’ revenues by 2028.
Here to Stay
AI is here to stay, whether people use it to generate work that mimics acceptable content, to “fool” recipients into thinking it was birthed by a human, or as a tool to increase efficiency and quality.
People can allow AI platforms to churn out acceptable songs that sound a lot like human output, or learn to use the machines as tools to create better content. The choice is ours.
Image source: musicbusinessworldwide.com; dl-sounds.com
