Smart Glasses Daily


Vision Pro's Next Leap: Enhanced Accessibility with On-Device AI

Apple is set to redefine accessibility on the Vision Pro, introducing advanced features leveraging on-device AI for users with low vision.

S. Whitman · American correspondent · May 13, 2025 · 2 min read
A person using an Apple Vision Pro with magnified text and object descriptions

Video: Apple on YouTube

Apple is poised to significantly upgrade accessibility functionalities on the Vision Pro. Upcoming updates introduce features tailored for users with low vision or blindness.

Central to these enhancements is an overhaul of the Zoom feature. While previously limited to virtual content, the updated Zoom will now magnify the physical world viewed through passthrough cameras.

Another significant addition is "Live Recognition", an expansion of the VoiceOver screen reader. This new feature harnesses on-device machine learning to describe surroundings, identify objects, and read text.

The implications are profound: these features transform the Vision Pro from a spatial computing device into an intelligent visual assistant.

These features are slated for release in a visionOS update later this year. We anticipate a debut as part of visionOS 3 at WWDC25.

Apple continues to push the envelope for assistive technology. Integrating these AI-powered tools underscores a commitment to making spatial computing accessible to everyone.
