
Meta adds Spotify AI music, Kannada/Telugu, and noise filtering to AI Glasses

Meta’s latest firmware push for its AI‑powered glasses does more than tighten up audio clarity. The company rolled out on‑device noise filtering, added support for two Indian languages—Kannada and Telugu—and opened the door to a new kind of soundtrack that reacts to what you see. While the hardware itself hasn’t changed, the software now taps into a broader set of sensors, letting the device interpret visual cues in real time.

That move signals Meta’s intent to turn its wearables into a more context‑aware companion rather than just a passive display. Here’s the thing: the upgrade isn’t just a tidy language patch or a modest microphone tweak. It’s the first time Meta pairs its on‑device vision stack with an external music service, aiming to blend what you’re looking at with what you like to hear.

The result is a multimodal experience that could reshape how users interact with sound while on the go.

The update also introduced Meta's first multimodal AI music experience in partnership with Spotify. By combining on-device vision with Spotify's personalisation engine, users can ask Meta AI to play music that matches what they are looking at, blending visual context with individual listening preferences to create moment-specific soundtracks.

In a move that strengthens its India-focused AI strategy, Meta has added Telugu and Kannada language support to Ray-Ban Meta and Oakley Meta HSTN glasses. The rollout enables fully hands-free interaction with Meta AI in two additional regional languages, making the devices more accessible and natural to use for millions of users across the country.

Meta's v21 software push adds a handful of new capabilities to its AI‑powered glasses. Conversation Focus, for instance, boosts a speaker's voice when background noise spikes, using the open‑ear drivers built into Ray‑Ban Meta and Oakley Meta HSTN frames. The feature sounds useful, yet real‑world performance in bustling streets or crowded cafés remains to be seen.

Noise filtering arrives in the same update, promising cleaner audio without headphones. In parallel, Meta is rolling out its first multimodal music experience with Spotify: users can point the glasses at a scene and ask Meta AI to cue tracks that fit the visual context, tapping Spotify's personalization engine.

The integration hints at a tighter link between vision and sound, though how intuitive the voice prompts will be remains an open question. Support for Kannada and Telugu expands the language roster, pointing to a broader market focus. Overall, the update layers on incremental improvements rather than a wholesale redesign.

Whether these tweaks will translate into sustained user engagement is something only future usage data can confirm.

Common Questions Answered

What new multimodal AI music experience does Meta introduce with Spotify?

Meta’s v21 update launches its first multimodal AI music experience in partnership with Spotify. By combining on‑device vision with Spotify’s personalization engine, users can ask Meta AI to play tracks that match the visual scene they are looking at, creating moment‑specific soundtracks that blend visual context with personal listening preferences.

Which Indian languages were added to the AI‑powered glasses in the latest firmware?

The v21 software push adds support for two Indian languages: Kannada and Telugu. These languages are now available on both Ray‑Ban Meta and Oakley Meta HSTN frames, expanding Meta’s India‑focused AI strategy and allowing native‑language interactions with the glasses.

How does the Conversation Focus feature improve audio clarity in noisy environments?

Conversation Focus monitors background noise levels and automatically boosts the speaker’s voice when ambient sounds spike. It works with the open‑ear drivers built into Ray‑Ban Meta and Oakley Meta HSTN frames, helping users hear conversations more clearly even in bustling streets or crowded cafés.

What hardware components enable the on‑device noise filtering and audio enhancements in Meta’s AI glasses?

The on‑device noise filtering leverages the open‑ear drivers integrated into Ray‑Ban Meta and Oakley Meta HSTN frames. These drivers, combined with the glasses’ built‑in microphones and processing chips, filter out background noise in real time, delivering cleaner audio without the need for separate headphones.