

At its best, artificial intelligence can help analyse data, automate tasks and address major problems such as disease and climate change. At its worst, it can exploit people, damage livelihoods and dull human creativity.
The music industry now sits squarely in that tension. As a recording engineer and professor of music technology and production, I see a large grey area between helpful tools and questionable authorship.
The National Academy of Recording Arts and Sciences has tried to draw a line. Under current Grammy rules, only humans are eligible for awards: “A work that contains no human authorship is not eligible in any categories.” Any human contribution must also be meaningful and significant.
In practice, this allows many routine uses of AI. Tools that standardise volume levels, organise files or speed up workflows in a digital audio workstation are considered acceptable. They function much like advanced automation, helping creators work faster rather than replacing them.
What is not acceptable, at least for Grammy consideration, is using an AI music service to generate an entire song from text prompts — for example, blending the style of a folk singer and a pop star into a novelty duet. That crosses from assistance into authorship.
Between those extremes lies considerable ambiguity. Would it be acceptable to use AI to generate backing vocals for a song written and recorded by humans? Almost certainly. The same applies to tools that add swing or variation to drum patterns during production.
More complicated questions arise when AI contributes to core creative elements. A user can now prompt an AI tool to generate a melody and lyrics for a song’s hook, specifying tempo, key and theme. If a human then builds verses and a bridge around that hook, who is the true author? Is the human’s contribution still “meaningful and significant” if the most memorable part was machine-generated?
Performance is clearly human, but songwriting is less clear. If AI produces the foundation, does that diminish the human role — or does guiding the prompt itself constitute creative input?
The Recording Academy is trying to keep pace with technology that evolves faster than policy. Music has faced similar disruptions before. Pitch-correction tools like Auto-Tune were once controversial; today they are ubiquitous and no barrier to winning a Grammy.
Listeners, too, are adapting. Many already consume AI-generated content, knowingly or not. Streaming platforms do little to identify or limit AI music. On Spotify, an AI “artist” has amassed more than a million monthly listeners, with little indication that the music is machine-generated. Many listeners assume a human creator.
AI can also be used to generate artificial streams, potentially influencing recommendation algorithms. Platforms say they do not support this, but concerns persist.
Listeners who want to avoid AI music do have options. Detection tools exist, though they are imperfect. Some platforms have taken firmer positions. Bandcamp, for instance, has adopted rules prioritising human creation and restricting music generated wholly or substantially by AI.
Ideally, major streaming services would disclose AI use clearly and allow listeners to filter content. Until then, music will continue to occupy a grey area — between tools that enhance creativity and practices that blur authorship itself.
The Conversation