
Hands-On With NASA's Mixed Reality HoloLens Tools


During a recent test, NASA engineers placed a microwave-sized nuclear power system onto a car-sized rover while it was inside a rocket on the launchpad. The engineers were able to see that the new rover, and its power system, should be able to make the tight fit into the rocket during that critical step.

The whole thing was simulated in HoloLens and the test provided helpful insight that could point to a future where prototypes can be tested virtually before any actual parts are made. The rover in question won’t head to Mars until at least 2020.


The software is called ProtoSpace, and it was one of two pieces of software shown in detail last week at NASA’s Jet Propulsion Laboratory in Pasadena, Calif. The other was OnSight, a Mars visualization tool that lets scientists examine terrain up close. Recently, NASA also successfully tested HoloLens in space on the International Space Station as part of a third project, called Sidekick, which could allow step-by-step instructions for a complicated task to be overlaid onto an astronaut’s view. Sidekick can also let ground crews connect over a Skype call to give live guidance.

Here’s our hands-on report of ProtoSpace and OnSight and what these mixed reality applications could mean for the future of space travel.


There was $27,000 worth of HoloLens headsets sitting on a table. That’s nine completely mobile headsets, each selling for $3,000. I put one on, along with journalists from other media outlets, and we spread out in a circle across the room.


Jeff Norris, NASA’s lead on mixed reality projects, led us through a demonstration of HoloLens as both a collaboration tool and a way of saving costs. Elon Musk could revolutionize space travel with reusable rockets that dramatically lower the cost of putting people and things in space, but HoloLens (and VR in general) could also do its part in making work in space more accessible and affordable. For example, ProtoSpace can help save money by showing that an expensive part isn’t the right size for the project before it is built.

The biggest benefit of ProtoSpace down the line is the ability for people wearing future AR and VR headsets to view the same model at the same time, even from different locations. Design review meetings could take place at any time, anywhere. During our demo, Norris said that with people in the same room, the location of the virtual model can currently be synced to within a couple of centimeters. This means that, in real time, people wearing headsets in the same room can watch Norris point to and interact with a particular area of the rover, a rover that people without headsets can’t see.



The second demo I tested at JPL put me on the surface of Mars. I got to examine the highly detailed area around the Curiosity rover, leaning down close to look at the lines between different layers of rock. Unlike VR headsets such as the Vive or Rift, HoloLens renders a virtual environment in a rectangle at the center of your vision while leaving your periphery free to see the real world.

With OnSight, I was able to walk freely (no wires) around a very large demo space, several times larger than any Vive demo space I’ve seen. Throughout the space I saw the Martian landscape while walking around confidently, able to see the floor and the people around me in my periphery, outside the Martian rectangle.

HoloLens recognizes a simple finger gesture for input, and in OnSight I was able to use it to measure distances and teleport from place to place to investigate areas beyond walking distance. I also talked to a NASA researcher who seemed to be standing on a nearby ridge on Mars. He told several of us wearing HoloLens about various features of the Martian surface, flagging in our view exactly what he was discussing.

The distance between Mars and Earth creates a logistical problem for scientists remotely operating a robot on the surface. Signals traveling at the speed of light take several minutes to traverse that distance, so movements need to be carefully programmed for the rover to follow step by step several minutes later. Scientists then receive new imaging data from Mars showing the rover’s new location, and the process repeats to plan out the next move.
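That delay is easy to quantify with a back-of-the-envelope calculation. The sketch below uses approximate published figures for the closest and farthest Earth–Mars separations (they are not numbers from NASA’s demo) to show why one-way signal times range from roughly 3 to 22 minutes:

```python
# One-way light delay between Earth and Mars.
# Distances are approximate orbital figures, not values from the article.
C_KM_PER_S = 299_792.458  # speed of light in km/s

def one_way_delay_minutes(distance_km: float) -> float:
    """Return the one-way signal travel time in minutes."""
    return distance_km / C_KM_PER_S / 60

CLOSEST_KM = 54.6e6    # approximate closest approach
FARTHEST_KM = 401.0e6  # approximate maximum separation

print(f"closest:  {one_way_delay_minutes(CLOSEST_KM):.1f} min")   # ~3.0 min
print(f"farthest: {one_way_delay_minutes(FARTHEST_KM):.1f} min")  # ~22.3 min
```

A command-and-confirm round trip doubles those numbers, which is why rover drives are planned as batched sequences rather than joysticked live.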

OnSight merges imaging data captured by the Curiosity rover and by satellites in orbit through an automated process that creates a detailed view of the Martian terrain. Scientists working in different parts of the world can then “meet” in VR/HoloLens and plan next steps while discussing the unique features currently surrounding the rover.

The process holds enormous potential for more efficient operation of current and future spacecraft. A geologist could, for instance, highlight a specific rock on Mars as one that should be tested by a rover. The people who control those movements and tools can see the exact highlighted rock in HoloLens, too, no matter where they are in the real world.
