On a crisp, bitterly cold morning, I needed a little background music to motivate myself. On Spotify, I played The Gap Band’s “Early in the Morning,” a funk bop from 1982 that’s currently making a resurgence on TikTok.
Charlie Wilson’s smooth vocals put me in the mood to get a little work done, and Spotify’s Smart Shuffle feature kept the old-school funk and soulful R&B vibes coming. The audio streaming platform played songs like “She’s a Bad Mama Jama” by Carl Carlton, “Candy” by Cameo, and “All Night Long” by Mary Jane Girls.
But then a song I didn’t recognize started playing: “I’m Letting Go of the Bullshit” by Nick Hustles. It blended right in with the ’70s and ’80s tracks Spotify had queued up around The Gap Band, but the lyrics were unmistakably modern: “This year I’m in my flow / Fuck anything that doesn’t help me grow / Fake friends are the worst.” The song had 1,823,488 plays on Spotify at the time of writing.
Out of curiosity, I took a look at the artist’s profile on Spotify and found that Nick Hustles has nearly 600K monthly listeners and a string of popular songs with catchy titles like “Minding My Goddamn Business,” “I Do Whatever The F*ck I Want,” and “Stop B*ching.” There was no artist biography in the “About” section of the Hustles profile, so I turned to Google.
It was here that I discovered that this catchy little tune I was hearing was actually AI-generated. I’ll admit: It hadn’t even occurred to me that this ’70s funk singer could be anything other than human. Considering that 97 percent of people can’t tell whether a song was created by AI or humans, according to a recent Deezer and Ipsos study, I don’t feel so bad about my lack of understanding. But these are questions we need to start asking ourselves as we enter the AI music era.
How did an AI artist get into my Spotify queue?
Nick Hustles is the alias of 35-year-old Nick Arter, a human producer who uses the AI music tools Suno and Udio. Not only is Hustles not a 1970s musician, Arter wasn’t even born in the ’70s. Arter did not immediately respond to Mashable’s request for comment.
Then it occurred to me: Spotify had actually recommended this artist to me. I couldn’t find the track on TikTok or Instagram. It was suggested in my Smart Shuffle queue, a feature launched in 2023 that adds personalized recommendations to match the vibe of the first song you play. So, is Spotify now recommending AI artists to listeners who have previously shown zero interest in anything other than human-created music? Well, it certainly seems that way.
Spotify’s mission statement is to “unlock the potential of human creativity.” Can that mission really square with the company’s role in not only hosting AI-generated music, but actively recommending it?
A spokesperson for Spotify told Mashable that “Spotify does not give any special treatment to AI-generated music.”
The spokesperson added, “While we don’t penalize artists for using AI responsibly, we are aggressive about removing content farms, impersonators, or anyone trying to game the system.”
Recommendation algorithms respond dynamically to online trends, which are often linked to viral social media activity, media coverage or public conversation. This means that if an AI-generated song is going viral on TikTok, or getting press attention, it could result in that song ending up in your Spotify shuffle queue.
Spotify’s track record with AI music isn’t great. In July 2025, the streamer published AI-generated songs on the pages of deceased musicians, including Blaze Foley, who was murdered in 1989. That same summer, the viral band The Velvet Sundown released two albums and racked up 1 million plays on Spotify before admitting that its music, images, and band backstory were all created using AI.
Is it wrong to recommend AI songs to listeners?
How should we feel about AI-generated songs ending up in our queues and libraries? Some people aren’t necessarily opposed to trying AI music, but that openness changes when they feel deceived. Research from Deezer and Ipsos found that 80 percent of people want AI music on streaming platforms to be clearly labeled. Transparency matters: A strong majority (72 percent) say they would want to know if a streaming platform is recommending music created entirely by AI. Nearly half would prefer to filter out AI music altogether, and four in 10 say they would skip it if it came up anyway.
In September 2025, following user feedback, Spotify introduced an AI labeling system through metadata disclosure, working with the Digital Data Exchange (DDEX). This means AI credits will appear in a track’s metadata on the platform, but it doesn’t mean you’ll see an “AI-generated” badge when you’re playing a track on Spotify. According to the company’s announcement: “This isn’t about punishing artists who use AI responsibly or down-ranking tracks for disclosing information about how they were made.”
It’s a step in the right direction, but is it enough? Currently (as of January 2026), there is no universal, front-and-center badge on Spotify track pages declaring a song “AI” or “not AI.” Instead, those AI disclosures are tucked into credits and metadata.
I would like to know immediately that the song I’m listening to isn’t being performed by a human being. Labeling songs and artist pages as AI-generated seems like the minimum step needed to avoid misleading listeners. I also think Spotify’s settings should include an option to exclude AI-generated music from Smart Shuffle. Those of us who feel strongly about AI-generated music should be able to opt out, so that our streams don’t undercut the economic incentives for human-made work.
It is important to note the difference between entirely AI-generated work and work created by human artists with the aid of AI tools. For generations, technology has played an integral role in music production, such as multitrack recording, digital mixing consoles, Auto-Tune, audio editing software, etc. AI tools, when used responsibly to enhance human talent, are undoubtedly the next step in technology’s relationship with music.
What’s the problem with AI-generated music?
Spotify’s mission statement also states that it provides “an opportunity for millions of creative artists to make a living from their art and for billions of fans to enjoy and be inspired by it.” But when the market is saturated with AI slop, it becomes harder for musicians to earn a living. Research suggests music-sector workers could lose almost a quarter of their income to AI by 2028. In the UK, MPs are calling on the government to regulate the use of AI in the music industry and introduce protections to ensure the public is not inadvertently duped into listening to AI-generated music.
Part of the joy of music comes in the wonder that a human being made it. Good music is born out of creativity, innovation, skill, talent, effort, sensitivity, emotion and perseverance.
According to research conducted by UK Music, 80 percent of people want to see laws to prevent musicians’ work being used to train AI without their consent. From the same study, 77 percent believe that AI music that does not credit the original creator is tantamount to piracy, and 83 percent believe that a musical artist’s creative “individuality” should be legally protected against AI copies.
AI-generated art usually relies on material that already exists. It reduces art to a formula, which it repeats. The result is a homogenous, generic song, a revived version of music that already exists. What you lose is the thrill of discovering a new sound, of hearing an artist do something that hasn’t been done before. AI can’t replicate this, and it never will.