Emily Portman’s case highlights growing concerns over AI music tools scraping artists’ work to create unauthorized releases
When British folk musician Emily Portman began receiving congratulations on a new album, she assumed fans meant music she had actually released. Instead, she discovered something far more disturbing: an entire album of AI-generated songs, uploaded to her streaming profiles under her name, that appeared to be modeled on her previous work.
The unauthorized release appeared in July, even though Portman had not put out new music since 2022. The incident highlights a rapidly emerging problem in the music industry as AI tools grow sophisticated enough to mimic artists' styles and voices, and as unscrupulous actors exploit gaps in streaming platforms' content moderation.
A new frontier for musical impersonation
Unlike traditional copyright infringement, where someone illegally redistributes existing recordings, AI-generated music creates an entirely new category of violation. The technology can analyze an artist's catalog, learn their vocal patterns, instrumentation choices, and compositional style, then produce original-sounding tracks the artist never recorded.
For independent and folk artists like Portman, who lack the legal resources of major label acts, the violation is particularly damaging. The fake album could confuse fans, dilute her brand, and potentially siphon streaming royalties that should flow to her legitimate work.
Platforms struggle to police AI content
Streaming services like Spotify and Apple Music have faced mounting pressure to address AI-generated music flooding their platforms. While some AI music is legitimately uploaded by users who created it, cases like Portman’s represent outright impersonation — someone using AI to masquerade as an established artist.
The challenge for platforms is scale. With tens of thousands of tracks uploaded daily, detecting which ones are AI-generated impersonations versus legitimate releases requires sophisticated content moderation that most services haven’t yet implemented effectively.
Why it matters
Portman’s experience foreshadows what could become a widespread problem as AI music generation tools become more accessible and convincing. If platforms can’t reliably prevent unauthorized AI-generated releases under artists’ names, it could erode trust in streaming services and complicate how listeners discover authentic work.
For artists, the implications extend beyond lost revenue. Having AI-generated music attributed to you means losing control over your artistic identity — fans might judge you based on work you never created, or worse, assume you’ve sold out by using AI when you haven’t.
The incident also raises thorny legal questions about who’s liable when AI-generated content impersonates an artist. Is it the person who generated the tracks? The company that made the AI tool? The distribution platform that failed to catch it? Current copyright frameworks weren’t designed for this scenario.
As generative AI continues to advance, cases like Portman’s are likely just the beginning. The music industry — and the platforms that power it — will need to develop new verification systems and legal protections to ensure artists maintain control over their own names and creative identities.
