Meta's Wearables Device Access Toolkit, which lets smartphone apps access the camera of its smart glasses, is now available as a public preview.
That means developers can download the toolkit, integrate it into their iOS and Android phone apps, and test it on their own glasses, but they can't yet ship these integrations to the general public.
Announced at Connect 2025, the Wearables Device Access Toolkit lets phone apps capture a photo or initiate a video stream from the glasses. The app can then store or process the frames it receives. And since Meta smart glasses function as Bluetooth audio devices, developers can combine this visual capability with audio input and output.
Developers could, for example, leverage the SDK to add first-person livestreaming or recording features to their apps. Or they could feed the camera imagery to a third-party multimodal AI model to analyze what you're looking at and answer questions about it.
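In pseudocode terms, the app-side pattern this enables looks roughly like the sketch below. Everything here (`GlassesSession`, `Frame`, the handler registration, the stubbed `analyze` model call) is a hypothetical stand-in for illustration; the actual toolkit ships as iOS and Android libraries with its own API.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Frame:
    """A single video frame received from the glasses."""
    width: int
    height: int
    data: bytes

@dataclass
class GlassesSession:
    """Hypothetical stand-in for a toolkit camera session: the app
    subscribes a handler, and the transport layer delivers frames."""
    handlers: List[Callable[[Frame], None]] = field(default_factory=list)

    def on_frame(self, handler: Callable[[Frame], None]) -> None:
        self.handlers.append(handler)

    def _deliver(self, frame: Frame) -> None:
        # In a real app this would be driven by the Bluetooth transport.
        for handler in self.handlers:
            handler(frame)

def analyze(frame: Frame) -> str:
    """Stub for a third-party multimodal AI model call."""
    return f"saw a {frame.width}x{frame.height} frame"

# App code: forward each incoming frame to the (stubbed) model.
session = GlassesSession()
results: List[str] = []
session.on_frame(lambda f: results.append(analyze(f)))
session._deliver(Frame(1280, 720, b"\x00" * 16))
print(results[0])  # -> "saw a 1280x720 frame"
```

A real integration would also handle session teardown and frame back-pressure, but the subscribe-and-process shape is the core of both the livestreaming and AI-analysis use cases described above.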
Hi AI devs!

In case you were wondering how the workflow looks like and what you can do with the Meta Wearables Device Access Toolkit (DAT) right now: The Meta AI app acts as a bridge between your glasses and your app, handling all connections and permissions!

👓↔️ Meta AI app…

— Robi ᯅ (@xrdevrob) December 4, 2025
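The broker arrangement that tweet describes can be sketched abstractly: the third-party app never talks to the glasses directly, but requests a session from the Meta AI app, which grants it only if the user has approved the permission. All names and the session shape below are hypothetical illustrations, not the toolkit's actual API:

```python
class MetaAIBridge:
    """Hypothetical broker mirroring the flow described in the tweet:
    the Meta AI app sits between the glasses and third-party apps,
    enforcing per-app permissions before any camera session starts."""

    def __init__(self) -> None:
        self._granted: set[str] = set()

    def grant(self, app_id: str) -> None:
        # In reality the user approves this in the Meta AI app's UI.
        self._granted.add(app_id)

    def request_camera_session(self, app_id: str) -> dict:
        if app_id not in self._granted:
            raise PermissionError(f"{app_id} has no camera permission")
        # Illustrative session descriptor; 720p is the documented ceiling.
        return {"app": app_id, "stream": "720p"}

bridge = MetaAIBridge()
bridge.grant("com.example.golfcaddy")       # hypothetical app ID
session = bridge.request_camera_session("com.example.golfcaddy")
print(session["stream"])  # -> "720p"
```

The design point is that permission state lives in the bridge, not in each app, so the user manages all glasses access from one place.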
For a video stream, the maximum resolution is 720p and the maximum frame rate is 30 FPS, limits imposed by the Bluetooth connection. When available Bluetooth bandwidth is constrained, the resolution and frame rate are reduced automatically.
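That adaptation amounts to picking the best resolution/frame-rate tier that fits the currently available bandwidth. Here is a sketch of that logic with made-up tiers and a made-up compression constant; Meta has not published its actual algorithm or thresholds, only the 720p30 ceiling:

```python
# Illustrative tiers, best first; 720p30 is the documented maximum.
# The lower tiers and the bits-per-pixel constant are invented for
# this sketch, not values from the toolkit.
TIERS = [
    (720, 30),
    (720, 24),
    (480, 30),
    (360, 24),
]

BITS_PER_PIXEL = 0.1  # rough assumption for a compressed video stream

def estimated_kbps(height: int, fps: int) -> float:
    """Rough bitrate estimate for a 16:9 stream at this tier."""
    width = height * 16 // 9
    return width * height * fps * BITS_PER_PIXEL / 1000

def pick_tier(available_kbps: float) -> tuple:
    """Return the best tier that fits the link, else the lowest tier."""
    for tier in TIERS:
        if estimated_kbps(*tier) <= available_kbps:
            return tier
    return TIERS[-1]

print(pick_tier(4000))  # ample bandwidth -> (720, 30)
print(pick_tier(1500))  # constrained link -> a reduced tier
```

In practice an encoder would also adapt bitrate continuously within a tier, but stepping down resolution and frame rate together is the behavior the article describes.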
Ray-Ban Meta and Oakley Meta HSTN glasses are currently supported, with support for Oakley Meta Vanguard and Meta Ray-Ban Display coming in the near future. But to be clear, support for the latter will only include receiving camera imagery, not displaying anything on the HUD.
Interested developers can find Wearables Device Access Toolkit at wearables.developer.meta.com.
Early Developer Experiments
Several months ago, Meta provided an early version of the Wearables Device Access Toolkit to a handful of developers, including Twitch, Microsoft, Logitech Streamlabs, and Disney.
Twitch and Logitech Streamlabs are using the SDK to let you livestream your first-person view on their platforms, just as you already can on Instagram, while Microsoft is using it for its Seeing AI platform that helps blind people navigate and interact with the world around them.
How 18Birdies is using the toolkit.
One particularly interesting use case comes from 18Birdies. The golf app is experimenting with using the Wearables Device Access Toolkit for real-time yardages and club recommendations, so golfers can get guidance without taking their phone out of their pocket.
Another is from Disney's Imagineering team, which explored using the toolkit to give guests a personal AI guide in Disney parks.

