Sony Music Fights Back Against Deepfake Music


Sony Music has taken a stand against fraudulent 'deepfake' music by requesting the removal of over 135,000 songs impersonating its artists, including Beyoncé and Harry Styles.


The music giant said these AI-generated tracks cause direct commercial harm to legitimate recording artists and have targeted major acts in the midst of promoting new albums.


Deepfake music, created using generative AI, exploits artists' popularity and can potentially tarnish their reputation and release campaigns, according to Dennis Kooker, president of Sony's global digital business.


As AI technology becomes more accessible, Sony cautions that the 135,000 tracks it has discovered may represent only a fraction of the total uploaded to streaming services.


Since last March, approximately 60,000 songs falsely claiming to feature Sony artists have been identified, affecting other major figures such as Bad Bunny and Miley Cyrus.


Amid the rise of AI technology, the music industry gathered for the Global Music Report launch, which showed a 6.4% growth in recorded music revenue, signaling a robust recovery underpinned by streaming subscriptions.


However, the industry is alarmed by growing streaming fraud, with estimates suggesting that up to 10% of content on streaming platforms is fraudulent. Industry figures are calling for better identification and labeling of AI-generated music to ensure transparency and maintain consumer trust.

