Meta is opening up its smart glasses to developers


Jay Peters is a senior reporter covering technology, gaming, and more. He joined The Verge in 2019 after nearly two years at Techmeme.

Meta is going to let apps access the vision and audio capabilities of its smart glasses thanks to a new Wearable Device Access Toolkit for developers.

“Our first version of the toolkit will open up access to a suite of on-device sensors — empowering you to start building features within your mobile apps that leverage the hands-free benefits of AI glasses,” Meta says. “With our toolkit, you’ll be able to leverage the natural perspective of the wearer and the clarity of open-ear audio and mic access.”

The company has highlighted a few examples of how early testers are thinking about the toolkit: Twitch will let creators livestream from their glasses, and Disney’s Imagineering R&D team is working on prototypes to let people visiting Disney parks access tips while they’re wearing Meta’s smart glasses, according to a blog post.

It seems like this toolkit is super early, though. Developers can now get on a waitlist to be notified when a preview of the toolkit is available later this year. Publishing experiences that use the toolkit will “be available to limited audiences in the preview phase,” Meta says, and the company notes in an FAQ that it doesn’t expect publishing to be generally available until 2026. But given how popular the Ray-Ban Meta glasses have been and how promising Meta’s new smart glasses with a display seem, there could be a lot of developer appetite to try this toolkit out.
