Meta Ray-Ban Smart Glasses Launch AI Features Including Live Translation
Meta is enhancing Ray-Ban smart glasses with advanced AI capabilities, including live AI viewing and conversational queries through the Meta View app. Users can ask questions about their surroundings and get real-time translations in multiple languages. Privacy concerns remain, but new controls aim to address them. These updates will roll out in spring and summer, expanding smart glasses' practical use in daily life.
Meta is advancing the smart glasses market by bringing powerful artificial intelligence features to the Ray-Ban smart glasses it develops in partnership with Ray-Ban's parent company, EssilorLuxottica. These updates, launching this summer for users in the US and Canada, leverage the Meta View app on connected smartphones to enable a live AI experience. By saying the "Hey Meta, start live AI" command, users give Meta AI a real-time view of their surroundings and can engage in conversational queries about what they see.
This functionality is similar to what Google has demonstrated with Gemini, allowing users to ask Meta AI questions about objects or scenarios in their environment. For example, a user looking into their pantry can ask Meta AI to suggest substitutes for butter based on the items it recognizes. Even without live AI active, users can query specific objects they are viewing for information or assistance.
In addition to AI enhancements, the glasses will support live translation capabilities. By saying "Hey Meta, start live translation," users can receive automatic translations of incoming speech in languages such as English, French, Italian, and Spanish. The glasses' built-in speakers will relay translated speech, and users can display translated transcripts on their phones for others to read, facilitating seamless multilingual communication.
These smart glasses also gain new social media integration, allowing users to post on Instagram or send Messenger messages using voice commands. Music streaming compatibility is expanded to include Amazon Music, Apple Music, and Spotify, enabling users to listen to music directly through the glasses without needing earbuds.
Privacy concerns remain a significant topic in the adoption of AI-enabled smart glasses. Inna Tokarev Sela, CEO of AI data company illumex, noted that people often react to the recording indicator light on Ray-Ban glasses, which signals when the device is capturing video. While newer models give wearers control over this notification light, that control itself raises potential privacy risks. Tokarev Sela emphasizes the importance of user consent and transparency regarding data collection and sharing.
She also highlights the need for consent mechanisms that let individuals control how their information is exposed when they appear in recordings, similar to privacy settings on platforms like LinkedIn. She further argues that recordings made by smart glasses should require explicit permission before being admissible in legal contexts, to protect privacy rights.
Meta plans to roll out these AI and translation features during spring and summer, with object recognition updates for European Union users expected in late April and early May. These advancements position Ray-Ban smart glasses as a versatile tool for everyday tasks, communication, and entertainment, showcasing the growing integration of AI in wearable technology.