Zappar Wants To Bring You a HoloLens-Style Mixed Reality Experience for $30
I’ve been pining for a HoloLens ever since I demoed it a few weeks ago. The truth is that once you try out Mixed Reality, it’s really hard to go back. Other tech – from smartphones and computers to VR headsets – feels clunky in comparison, and the world seems a much duller place without those friendly and convenient holograms. You start seeing potential use cases for MR literally everywhere.

But as Microsoft made very clear when we met, HoloLens is nowhere near a consumer launch yet. Sure, if you’re a dev with $3,000 to spare and live in one of the eight countries they currently supply, you can get your hands on one, but mere mortals will have to wait a long time to experience MR for themselves. Or so I thought.

ZapBox – a Kickstarter-funded project from UK start-up Zappar – is proposing to leapfrog Microsoft (and everyone else) by making an MR equivalent of Google Cardboard for a hundredth of the HoloLens cost – a mere $30. They launched it just over a week ago, aiming to raise $30,000 toward development. At the time of writing, however, they had already exceeded that goal, with 19 days still to go.


Although I kept an open mind as I went down to London* to try this out for myself, it was hard to picture how it would be possible to replicate the complex tech of the HoloLens on a cardboard device. It’s one thing to project 360 video on one of those, but quite another to integrate computer-generated elements in your real-world environment, in real time.

But as I put on the prototype headset – endearingly held together with brown tape – that Zappar’s Co-Founder Simon Taylor handed me, I was immediately impressed. The feed it displayed from Taylor’s iPhone 6 camera was accurate – if very slightly blurry at times – in every detail except for the rabbit peeking out of a hole on the table in front of me.

Like my Wonderland namesake I felt the urge to investigate further, and took a peek down that rabbit hole. As I stood above it I could actually see quite a way down, with no sign of the table underneath it. When I reached out my hand, it of course went straight through the rabbit hologram, but – crucially – it did touch the edge of the table exactly where I expected it to be. That’s when I started to get excited.

I tested my boundaries by walking around the small room, and was able to accurately and effortlessly navigate around obstacles such as chairs and laptops. I picked up my glass of water, and saw the half-eaten mince pie that Simon had been working on as he set up the demo. It all felt natural and real, in spite of the big blue bunny insistently waving at me, which reminded me of the opening scenes of that Playtest episode in Black Mirror.

This “video see-through” approach (in contrast to the optical see-through employed by HoloLens) works through software that uses the smartphone’s camera to scan the environment for a set of specific physical markers. These allow for accurate, real-time mapping of the user’s location in relation to objects, both real and computer-generated.

“We call these markers pointcodes, and they’re the fundamental unit that makes all of this possible on a smartphone,” explains Taylor, who did his Cambridge University PhD on Fast Object Localisation for Mobile Augmented Reality Applications before becoming Zappar’s Head of Research. “We lay out our marker codes onto the world like we want it and they help us map out the environment, whether that’s a table-top setup or a combination of wall and floor if you want a larger play area.”
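To get an intuition for how printed markers can anchor holograms in real space, here’s a deliberately simplified sketch – not Zappar’s actual pointcode algorithm, whose details aren’t public – of how a pinhole camera model can recover a marker’s 3D position from its detected footprint in the camera image. The focal length and image resolution below are assumptions, roughly in line with an iPhone-class camera:

```python
# Simplified marker-based localisation sketch (illustrative only; the
# real pointcode system tracks many markers and recovers full pose).

FOCAL_PX = 1500.0        # assumed focal length, in pixels
CX, CY = 960.0, 540.0    # assumed principal point (centre of a 1920x1080 frame)

def locate_marker(marker_size_m, centre_px, apparent_width_px):
    """Estimate a marker's 3D position in the camera's coordinate frame.

    marker_size_m      -- the printed marker's real edge length, in metres
    centre_px          -- (u, v) pixel coordinates of the detected marker centre
    apparent_width_px  -- the marker's detected width in the image, in pixels
    """
    # Similar triangles: the closer the marker, the bigger it appears,
    # so depth = focal * (real size / apparent size).
    z = FOCAL_PX * marker_size_m / apparent_width_px
    # Back-project the pixel centre through the pinhole model to get X/Y.
    u, v = centre_px
    x = (u - CX) * z / FOCAL_PX
    y = (v - CY) * z / FOCAL_PX
    return (x, y, z)

# A 10 cm marker seen dead-centre at 150 px wide sits 1 m straight ahead.
pos = locate_marker(0.10, (960.0, 540.0), 150.0)
# -> (0.0, 0.0, 1.0)
```

With several such markers laid out at known positions, the phone can triangulate its own location relative to the whole scene – which is what lets a virtual rabbit hole stay glued to a real table edge as you walk around it.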

The next demo is more interactive, so I get to try out the hand controllers: a pair of origami-like paper devices with pointcodes on every facet, neatly held together with Sellotape. And that’s another thing that shows how different this is from a regular VR experience: I put on the headset and by default look straight ahead, expecting Taylor to guide me and put the controllers in my hands. He does, before pointing out politely that I can look down and see my own hands – not a pair of floating gloves or a graphic representation of the controllers, but my hands – and so can do the job much better myself.

After I get the hang of it – it’s all pretty intuitive, especially if you’re familiar with devices like the Vive and the Oculus Touch – I then use one of the controllers as a laser pointer, which interacts with a hologram of a building projected on the table. It’s easy to see how that setup would be useful in educational settings or for presentations. Next up I’m instructed to use the other controller to press a holographic button with a moon symbol on it, which immediately dims the ambient light in the room. Pressing the sun-symbol button next to it undoes the effect, and I spend a fun few extra seconds just turning the lights on and off for kicks. It’s surprising how, in these virtual worlds, the simplest experiences are often the most compelling.

Finally, I indulge in some “air finger painting” with their ZapBrush tool. And although this was nowhere near as slick as Tilt Brush, the potential is evident once they get around to refining colors, textures and templates properly. The main thing is that the 3D element worked flawlessly, and I could see my handiwork from all angles as I walked around the room, smugly avoiding the laptop bag I had left on the floor (something that surely would have sent me flat on my face had I been wearing the Vive).

There are certainly a few kinks yet to be ironed out: there’s a slight lag if you move around too quickly, and some design work on the controllers is needed to make them easier to self-assemble. There are also plans to fit a wide-angle adaptor for the phone camera lens that would let it continually scan for pointcodes without the need to keep your target area at the centre of your FOV. But the fact that this was supposed to be such an early beta, and that they feel confident in shipping to their 800+ current Kickstarter backers by April next year, almost seems too good to be true.

But it starts to make sense once you look at the company’s background. These are no newbies to this space: Zappar has been experimenting with Augmented Reality and delivering commercial AR projects with a broad range of partners for the past four years. Their existing technology uses “zapcodes” (an evolution of bar/QR codes) to help devices recognize real-world objects and trigger interactions, and that is what they built on to create the pointcodes that make ZapBox MR possible.

Their ever-growing list of backers is a healthy mix of developers and consumers, and Taylor says that this is the main reason they decided to go for Kickstarter in the first place.

“It’s not about the money,” he said. “If we want to make this work as an ecosystem we need to build that community.”

In addition to the flat-packed headset and hand controllers, those first users will get access to ZapWorks Studio, which uses an integrated JavaScript-based API and promises to be user-friendly, making it easy to develop original content for the device without a lot of assets or code. The SDK should work seamlessly with the latest generation of iOS and Android smartphones.

When I was a kid, I remember building a “computer” out of cardboard and sticky tape. Back then I had to rely on my imagination to make that work of course, so it’s funny how I lived to see the day where cutting-edge technology actually does work with cardboard in such interesting ways. And if Zappar succeeds in making this an affordable and intuitive entry-level device for Mixed Reality – and fostering an ecosystem that produces the content to make people want to use it – they could very well do for MR what Google Cardboard did for VR.

Disclosure: Zappar paid for my train ticket to this demo as I live a couple of hours away from London.
