Meta Ray-Bans already provide plenty of utility for people with low vision, but the glasses are about to get even better.
Meta has announced two new accessibility advancements for its smart glasses designed to help users navigate the world around them.
The first feature is available now: users can ask for descriptive responses about what their glasses see. Meta says the feature is useful for anyone, but especially for people with vision impairments.
In a demonstration video, a user with a vision impairment asked her glasses what they saw. "I see a pathway leading to a grassy area with trees and a body of water in the distance," the glasses responded, going on to describe the height of the trees, give more detail about the pathway, and note that the grassy area was "well manicured." Later in the video, another user asked the glasses the same question and received a breakdown of the food items on a kitchen countertop.
Meta's glasses already offer conversational support through a feature called Live AI, but that feature is geared toward asking questions and pulling in additional information. The new feature focuses on describing what the glasses actually see.
To start using this feature, head to your device settings in the Meta AI app and turn on "detailed responses" under Accessibility.
The second accessibility feature arrives later this month. A new "Call a Volunteer" feature will let users place a call through their glasses to a service called Be My Eyes.
The app connects blind and low-vision users with sighted volunteers around the world for help with everyday tasks, such as choosing a shirt color or reading a label at the grocery store. Users will be able to sign up for the service through the app, and volunteers will see through the user's glasses to provide guidance.