Niantic launched Lightship during its developer conference this week; the video embedded above shows footage of phone-based AR apps using its new features, starting at the 50:20 mark. The system is essentially a new type of map that developers can use for AR experiences, with the aim of providing location-based persistent content that’s synced up for all users.
Niantic is building the map from scanned visual data, which it says will offer “centimeter-level” accuracy when pinpointing the location and orientation of users (or multiple users, in relation to each other) at a given location. The technology is similar to large-scale visual positioning systems in active development at Google and Snap.
While the promise of the system is to work globally, it’s not quite there just yet — as of launch yesterday, Niantic’s VPS covers around 30,000 public locations that developers can hook into. These locations are mainly spread across six key cities — San Francisco, London, Tokyo, Los Angeles, New York City and Seattle — and include “parks, paths, landmarks, local businesses and more.”
To expand the map, Niantic developed the Wayfarer app, now available in public beta, which lets developers scan in new locations using their phones. Niantic has also launched a surveyor program in the aforementioned six launch cities to expedite the process.
“With only a single image frame from the end user’s camera, Lightship VPS swiftly and accurately determines a user’s precise, six-dimensional location,” according to a Niantic blog post.
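The “six-dimensional location” in that quote refers to a six-degree-of-freedom (6DoF) pose: three coordinates for position plus three angles for orientation, which matches the “location and orientation” accuracy Niantic describes. As a purely illustrative sketch — the class name and fields here are hypothetical, not Niantic’s API — a localization result of that shape might look like:

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """A six-degree-of-freedom pose: where a device is, and which way it faces."""
    # 3D position (e.g., meters in a map-aligned frame)
    x: float
    y: float
    z: float
    # 3D orientation as Euler angles, in radians
    roll: float
    pitch: float
    yaw: float

# Example result a visual positioning query might return for one camera frame
pose = Pose6DoF(x=12.4, y=-3.1, z=1.6, roll=0.0, pitch=0.05, yaw=1.57)
print(pose)
```

Pinning shared AR content then reduces to storing such poses against map locations, so every user’s device can resolve the same content to the same real-world spot.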
Scaling VPS to a global level is a lofty goal for Niantic, but it could meaningfully improve mobile AR, unlocking far more interesting experiences by using accurate maps to pin content to real-world locations.