Smart Glasses Daily

Be My Eyes Unveils Hands-Free AI Assistant for Ray-Ban Meta Glasses

The integration allows users with visual impairments to access Be My Eyes' AI-powered visual assistance directly through their smart glasses. This marks a significant accessibility upgrade for Meta's wearable tech.

M. Bell · April 19, 2026 · 2 min read
Close-up of a person wearing Ray-Ban Meta smart glasses, looking forward, with a subtle glow around the frames indicating active use of an AI assistant.

On March 11, 2026, Be My Eyes announced a new hands-free connection feature for Ray-Ban Meta and Oakley Meta smart glasses. The integration lets users reach the Be My Eyes AI assistant, a tool that provides visual assistance, directly through their eyewear, eliminating the need to hold a smartphone.

Michele of Be My Eyes reports that the feature enables seamless interaction with the Be My Eyes AI. Users can now receive real-time, context-aware visual descriptions streamed straight to their glasses, enhancing independence for people with visual impairments. Michele emphasizes the convenience of hands-free operation: users can navigate their environment or perform tasks while receiving auditory feedback through the glasses.

The piece from Be My Eyes highlights that this development builds upon existing collaborations between Meta and Be My Eyes, focusing on improving accessibility through smart technology. Michele notes that the move specifically leverages the on-device AI capabilities of the latest generation of Meta's smart glasses, making the assistance more immediate and integrated into the user's daily life.

Our take: This hands-free integration marks a crucial step forward in making smart glasses genuinely assistive devices. While previous iterations required some level of smartphone interaction, enabling direct AI assistance via voice commands through the glasses themselves significantly lowers the barrier to entry and increases the utility for users with visual impairments. This commitment to accessibility could set a new standard for future smart eyewear developments, positioning Meta and Be My Eyes as leaders in inclusive technology.
