Here’s the thing: identifying a mystery song just got a lot easier. On March 9, 2026, OpenAI rolled out a major update to ChatGPT that lets users identify music tracks inside the chatbot itself, using Shazam’s music recognition technology. No more switching between apps, no more awkward pauses trying to recall a title on your own: just type a simple prompt and let the AI do the work.
How the Shazam Feature Works Inside ChatGPT
The new feature plugs Apple’s well‑known music recognition service directly into ChatGPT’s conversation interface. After linking Shazam through the Apps section of ChatGPT, users can trigger the tool with prompts like “Shazam, what’s playing?” or by tagging @Shazam in a message.
Once invoked, ChatGPT activates a listening interface that captures nearby audio and matches it against Shazam’s vast database, returning the song title, artist name, and album artwork inline within the chat. Users can even preview the track right there and, if they already have the Shazam app installed, save discoveries to their personal library.
The feature works on iOS, Android, and the web, and importantly, no standalone Shazam app is required for basic identification.

Why This Matters: From Discovery to Deep Engagement
The simplicity of typing “Shazam this song” masks what could be a bigger shift in how people discover and discuss music. Previously, Shazam lived as a separate app on our phones or tucked into operating system features. Now, it’s part of a broader conversational AI environment.
Once a song is identified, users can immediately ask follow‑up questions about the artist’s background, the track’s genre, lyrics meaning, or even have AI help curate a playlist that fits a mood or vibe. That moves recognition from quick utility to deeper engagement with music.
Integrations with streaming services like Apple Music and Spotify, which were already part of ChatGPT’s ecosystem, mean your discoveries can feed directly into playing or organizing music.
The Bigger Picture: OpenAI’s Strategy and Industry Trends
This Shazam addition isn’t random. It fits into a pattern where ChatGPT is evolving beyond question‑answering into platform territory. OpenAI launched its “Apps” ecosystem in October 2025, bringing in partners like Spotify, Booking.com, Canva, Coursera, Expedia, Figma, and Zillow.
For Apple, it’s another example of how the company is embedding its technologies, from Siri and system‑wide music recognition to, now, conversational AI. The move underscores the strategic value of AI that interfaces fluidly with everyday digital needs.
There’s also a backdrop of shifting industry dynamics. Apple and OpenAI have navigated legal and competitive pressures, including past disputes with rivals about fair access and competition, though Apple has defended its partnerships as non‑exclusive and pro‑competitive.
What Users Should Know Before They Try It
A couple of things to keep in mind:
- You need to connect Shazam inside ChatGPT’s Apps section before using it.
- The feature listens through your device’s microphone, so you’ll need to grant microphone access.
- Even without the Shazam app installed, identification works; installing the app lets you save finds more conveniently.
- Privacy and data use align with Apple’s and OpenAI’s existing policies, and microphone access is only activated when you ask for it. (Those terms are typically disclosed within the apps.)
A Shift in How We Interact With Sound
This update feels like more than a convenience. It hints at a future where conversational AI and real‑world context blur: your chat interface can now listen and respond to the world around you. Identification, exploration, and interaction with content are all converging in one place.
That raises interesting questions: Will this change how music discovery happens? Will AI start to replace standalone utilities entirely? As this integration proves, the boundaries between apps and AI helpers are already dissolving.