Many Metaverse visions aren't yet reality because the technology isn't there; even as technical capability advances exponentially, it remains a limiting factor on human imagination. One vision in particular, which some call "the real-world Metaverse" or even "Metavearth" (and which I just call the future of the internet), can seem especially out of reach. Recently, however, Niantic took an important step toward a viable Metavearth with its Lightship Visual Positioning System (VPS).
The Real-World Metaverse
The real-world Metaverse is a digital layer added on top of our physical reality, mediated through augmented-reality devices and constructed from the Internet of Things (IoT), artificial intelligence (AI), and blockchain. While that grand vision faces many technical challenges, one stands above all: building a replica of the physical world in a language that computers understand. We can't augment city streets, apartment units, or historical sites without a 3D model of the environment for digital content to interact with.
Not only does a device need to know the environment around it, but it also needs to know its position and orientation within that environment and to recognize key features, like surfaces and shapes. Some in the industry call this digital-replica backbone the "AR [augmented reality] cloud," by analogy to Web2 cloud services like Amazon Web Services (AWS), the backbone of the current internet.
Enter Niantic and Lightship Visual Positioning System
In mid-2022, Metaverse heavyweight Niantic released a tool that tackles this challenge: the Lightship Visual Positioning System, or Lightship VPS. The tool will be instrumental in building the internet's next iteration because it lets users quickly construct digital maps of physical environments. Select participants (developers, surveyors, and players) scan real-world environments and send the footage to Niantic, which processes the data and integrates the location into its growing AR world map.
Once a location has been scanned sufficiently, Niantic releases it to Lightship VPS, where other developers can build AR experiences anchored to that real-world spot. These locations are usually famous landmarks, public squares, or places with interesting geometry.
End users can visit those VPS-activated locations, localize their devices, and interact with persistent AR content. Because the environment is already mapped to a high degree of precision, a device can locate itself in that environment down to the centimeter from just a single image.
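The core idea can be sketched in a few lines: match features seen in a single camera frame against a prebuilt map of the location. This is a toy illustration only, not Niantic's actual pipeline; the `AR_CLOUD` data, integer feature IDs, and `localize` function are all invented for demonstration (real systems use learned visual descriptors and solve for a full 6DoF pose).

```python
# Toy "AR cloud": each VPS-activated location stores 3D map points
# keyed by a feature descriptor. Real systems use learned visual
# descriptors; plain integer IDs are a stand-in here.
AR_CLOUD = {
    "ferry_building": {101: (0.0, 0.0, 0.0), 102: (1.2, 0.0, 0.4), 103: (0.6, 2.1, 0.0)},
    "trafalgar_square": {201: (0.0, 0.0, 0.0), 202: (3.0, 0.1, 0.0)},
}

def localize(query_features):
    """Match features from a single camera frame against the AR cloud.

    Returns the best-matching location and its matched 3D map points.
    A real VPS would then solve a perspective-n-point problem over
    those 2D-3D correspondences to recover a centimeter-level pose.
    """
    best_loc, best_matches = None, {}
    for loc, map_points in AR_CLOUD.items():
        matches = {f: map_points[f] for f in query_features if f in map_points}
        if len(matches) > len(best_matches):
            best_loc, best_matches = loc, matches
    return best_loc, best_matches
```

Because the heavy lifting (mapping) happened ahead of time, the device-side query stays cheap, which is why localization can succeed from one image.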
So far, Lightship VPS is available in more than 30,000 locations worldwide, with dense coverage in cities like San Francisco, Los Angeles, Seattle, New York, London, and Tokyo. Niantic grows its coverage with a dedicated team of surveyors and developers trained to scan environments, which tend to be around 10 meters (33 feet) in diameter. According to the Lightship VPS blog, "Both the size and the quantity of these VPS-activated locations will increase over time; Lightship VPS will be available in over 100 global cities by the end of 2022."
The Possibilities of Lightship VPS
Great, Niantic built a nifty new tool. What’s the big deal?
Lightship VPS lets users have much richer, more immersive AR experiences in real-world locations, because devices no longer have to perform as much on-device computation and computer vision to determine surfaces and key points.
As a result, by enabling occlusion, physics, and interaction between real and virtual objects, you can build experiences that truly mesh the virtual with the physical instead of simply overlaying virtual head-up displays (HUDs) on the real world. That might include real-world scavenger hunts, enhanced storefronts, visualized IoT data, collaborative public art installations that persist across users, or guided historical walking tours through your city.
Enabling these high-quality experiences advances AR's consumer adoption as the narrative shifts from gimmicky and low-quality toward genuinely informative, entertaining, and immersive. It also boosts other AR-based services, such as storefront enhancement, IoT visualization, and Google Maps-style AR navigation, which increasingly compete with their two-dimensional counterparts.
Niantic also recently announced the integration of Lightship VPS into its WebAR development platform, 8th Wall. That means users can build browser-based AR experiences that leverage Lightship without requiring apps or logins. These WebAR experiences load on any device in a matter of seconds, including devices without LiDAR (light detection and ranging) sensors. All you need is a web browser.
Behind the scenes, the race to build the AR cloud is underway. Just as Google Maps cemented Google's primacy in mapping, the company that controls the 3D scans of the real world will become the next tech giant, because it will have laid the foundation for a new generation of Metaverse applications.
That being said, the AR cloud should not be owned by a single company. I understand the strategy behind Niantic's moves, but a monopoly over the AR cloud would hold too much leverage over the rest of the tech stack built on top of it in the coming decades. While Niantic crowdsources its data collection, it stores all scans in-house; perhaps there are open-source approaches to building the AR cloud that we should all be looking to instead.