Guest Article

Analyzing Apple Vision Pro As An Accessibility Engineer

Apple’s Accessibility menu in visionOS

February 2nd, 2024 is set to go down in history - it's yet to be seen whether that will be for an earth-shattering paradigm shift in how we think about computing technology, or for one of the most expensive product flops of all time.

Before I dive into my experience using Apple Vision Pro, I think it's worth a quick discussion on my experience actually getting it. As someone who is a veteran iPhone launch day-er, I can say that this experience was entirely different in just about every way.

Whenever I pre-order new Apple products, I do so as soon as the online store opens. Normally, that means that so long as I move through the checkout process with some urgency, I can get a midday pickup time slot at my local Apple Store in Maryland without a problem. But this time, after walking through the additional steps required to get a properly-sized Apple Vision Pro strap, my payment process failed three times because the midday pickup time slot I had chosen was no longer available.

To my displeasure, by the fourth try I had a 7pm appointment, but I was still going to make it on launch day.

Fast forward to February 2: I rounded the corner toward my Apple Store expecting to see a huge line out the door of people at least taking advantage of the free demos. But there was no one - in fact, I had never seen the store so empty. As the employees shepherded me to a table where I was going to get a personal demo and quick training on the device before they brought out my own, I asked one of them if it had been like that all day. Based on his comments, it seemed as though it had. They had done only a couple handfuls of demos and sold fewer than 10 devices.

Now I recognize that if I lived in Manhattan or SF this would have been very different, but I DO live in a city, and I therefore present this information to you as much more representative of the wider population than whatever scene I'm sure you saw on Fifth Avenue in New York.

Let’s Talk Accessibility

I saw a James Cameron quote where he described his demo with Apple Vision Pro as “a religious experience.”

I couldn’t agree more. This device is so immersive, it almost takes you to a world where you didn’t sell a kidney to afford one. Jokes aside, it’s a really incredible device that provides a pretty seamless user experience. How you interact with the interface feels new and exciting, yet familiar and easy to understand. That being said, it is clearly a first generation device.

There are a number of inefficiencies and issues, and this is normally the point at which accessibility gets overlooked in favor of other, flashier features. But in true Apple style, that was not the case with Vision Pro. It comes jam-packed with extensive accessibility features that clearly aim to tackle many of the things that could go awry for someone with a disability. As I began to use the device, I realized just how critical it was that Apple included these features from the get-go. I have a moderate vision impairment, and I have gone my whole life using only one accessibility feature on my phone: slightly larger text. But from the moment I put on the Vision Pro, I realized I was going to need a lot more help.

My experience looking through Apple Vision Pro is similar to how a typically-sighted person would see while wearing someone else's glasses. Sharpness of imagery and text is low, and everything is just out of focus enough to be difficult. I had high hopes, because I have always used cameras and digital displays as a way to enhance my existing sight, so it was sad to acknowledge that, at least right now, I was going to have to deal with this blur. Then again, I'm used to my world being out of focus. Because of that experience, I'm actually much better equipped to deal with it than the average person would be to spend a day wearing their friend's glasses.

One part of my vision impairment is a condition called "nystagmus", which is characterized by rapid, involuntary eye movements. I saw a blind creator on TikTok who also has nystagmus talk about how he was worried about the eye tracking on Apple Vision Pro, so he promptly switched the setup to control his cursor via head tracking instead. My nystagmus isn't too pronounced, so I was determined to make the device work for me out of the box - and to my shock, it did. Mostly. While it worked, I noticed that when I tried to focus on smaller or faraway "buttons" my cursor would get jumpy, so I worked my way into Settings and turned on the accessibility option to ignore rapid eye movements, which seemed as though it had been created just for me. Unfortunately, I haven't felt that this setting changed my experience much, and the jumping cursor remains. There are still a number of settings and combinations I plan to try to maximize my vision inside Apple Vision Pro, but I thought it worth conveying what I experienced out of the box, and my best attempts to make the device work with my current vision, given the information available at the headset's release.

Nuts and Bolts

Vision impaired or not, I'm an engineer by training and would be remiss not to mention a few additional things I noticed:

It is VERY easy to drop - You can't tell from the initial product photos or demos, but the headset is five pieces: the device itself, the strap, the eyepiece, the light seal, and the battery. The eyepiece and the light seal attach to the device only through magnets. This would be fine if the magnets were strong enough to support the device's weight, but they aren't. That means that if you pick up the device the way you naturally want to (by the eyepiece), the pieces detach from one another and the device nearly hits the ground. When I made this fateful error in the Apple Store during my demo, it was clear from the employee's face that I wasn't the first.

It has to sit just right - This is one of the more challenging accessibility issues, in my opinion. The device has to sit JUST right on your face for everything to be properly aligned with your eyes. When you first put it on, the device automatically aligns its displays to your interpupillary distance. While I'm not entirely sure how this happens automatically, I can tell you it almost certainly doesn't work right for a person, like me, with strabismus. Because both of my eyes don't look straight ahead - at any given time one of them is turned in toward my nose (kind of like a lazy eye) - I find it hard to believe that AVP is able to accurately set itself up. And that's just alignment on one axis. The device also has to sit vertically in exactly the right spot on your head, or else you are again met with a lack of image clarity. For me, this meant that I couldn't use the fancy strap you see in their marketing. AVP comes with a second strap in the box that adds support over your head, as well as around it. I'm not sure exactly what aspect of the shape of my face causes me to need this extra support, but without it the device sags off my face.

Typing is a Nightmare - I've seen a lot of people talk about this, and I'll get into one of the reasons for it below, but typing in space on the AVP is a huge disaster. One of my friends spent his first 15 minutes with the device just trying to type in my passcode. My visually impaired friends will understand - it's about as precise as using your nose to type on the Apple Watch. I know we all do it.

It is Very Uncomfortable for the First 10 Minutes - You do get used to the extra weight on your head fairly quickly, especially when using the strap that supports the device's weight vertically (which is much more natural, since your body is used to holding your head up all day), but it is noticeably uncomfortable for the first few minutes.

I Can’t Drink My Coffee - The mug bumps into the device and it just doesn’t work.

Higher Order Challenges

For the sake of completeness, it's worth diving into a few bigger-picture challenges I found with the Vision Pro:

Our Brains Aren't Designed to Always Look At What We Are Doing - This might be the biggest fundamental flaw of the entire Apple Vision Pro experience. Because your eyes act as your cursor, you have to be LOOKING at what you are DOING constantly. This just isn't natural, especially when it comes to typing. Focusing on each individual letter as you press it, in order for the press to register properly, is exactly the kind of behavior that would have gotten me an F in computer class in the 2000s.

Does This Actually Solve a Problem? - While the Vision Pro is easily the coolest piece of technology I have ever interacted with, the question remains of whether it solves enough of a problem, or provides a significant enough upgrade to some workflows, to really take off. I see the device bringing a certain ease and fluidity to collaboration sessions across timezones, but only if it is non-invasive enough not to give me wrinkles from sitting so heavily on my cheeks.

A Call to Action

In wrapping up my exploration of the Apple Vision Pro, the journey has been enlightening, particularly through the lens of accessibility.

This device has not only showcased Apple's innovative spirit but also underscored its dedication to inclusivity, making technology accessible to those of us with disabilities from the start. That dedication is a beacon of hope, offering a glimpse into a future where technology adapts to the user's needs, not the other way around. I share similar goals in my own explorations with technology, and they are why I started my company, ReBokeh. When I was entering college, the options for college-bound students with visual disabilities amounted to a huge digital camera/magnifier or an audio recorder. It hit me then just how far behind the times these technologies really were, and how much room there was to improve them. We're exploring a future where new technologies aren't just usable by those with disabilities, but are designed to enhance our living experience.

When it comes to Apple Vision Pro, I see the potential for a wonderful new home for ReBokeh's vision enhancement tool, which currently lives on the iPhone and iPad and enables users to adjust the way the real world looks to best suit their needs. It's technology that is currently loved - and needed - by users in more than 100 countries around the world.
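For the technically curious, the core of a tool like this is a simple loop: capture camera frames, apply the user's adjustments, and hand the enhanced image to the display. Here's a minimal Swift sketch of that pattern - the class name, parameters, and filter choices are illustrative assumptions for this example, not our actual implementation:

```swift
import AVFoundation
import CoreImage

// Illustrative sketch only - names and filter choices are assumptions
// for this example, not ReBokeh's actual implementation.
final class EnhancementPipeline: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let ciContext = CIContext()

    // User-tunable parameters, analogous to in-app sliders.
    var contrast: Float = 1.4
    var saturation: Float = 1.1

    // The display layer consumes enhanced frames from this callback.
    var onEnhancedFrame: ((CGImage) -> Void)?

    func start() throws {
        // On iPhone/iPad this resolves to a real camera. visionOS does not
        // currently expose the headset's passthrough cameras to third-party
        // apps, which is the access gap discussed below.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else {
            throw NSError(domain: "EnhancementPipeline", code: 1)
        }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        if session.canAddOutput(output) { session.addOutput(output) }
        session.startRunning()
    }

    // Called once per camera frame: apply the user's adjustments and pass
    // the enhanced image along for display.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        let enhanced = CIImage(cvPixelBuffer: pixelBuffer)
            .applyingFilter("CIColorControls", parameters: [
                kCIInputContrastKey: contrast,
                kCIInputSaturationKey: saturation,
            ])

        if let cgImage = ciContext.createCGImage(enhanced, from: enhanced.extent) {
            onEnhancedFrame?(cgImage)
        }
    }
}
```

On iPhone and iPad, AVFoundation delivers frames to an app in exactly this way. The catch, as I'll get to next, is that visionOS offers no equivalent path to the headset's cameras.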

There's just one problem. Apple, like Meta before it, has yet to grant developers access to the on-board cameras. While this decision is understandable from a privacy and security standpoint, it stands to hold companies like ours back while leaving consumers with only the vision enhancement capabilities that Apple deems fit. In the spirit of competition, or maybe just good sportsmanship, I urge Apple to consider granting camera access to companies that can present legitimate use cases and agree to uphold privacy and security standards, so we can keep expanding the value of the incredible hardware Apple has built while continuing to make life for people with disabilities just a little easier.

Rebecca Rosenberg is an accessibility engineer and the founder of ReBokeh, a startup working to "enhance vision, not overpower it." You can find ReBokeh on the Apple App Store for iOS. On visionOS, ReBokeh can only access the virtual camera feed Apple provides, not the headset's real cameras.
