My first experience in VR was not actually VR.
I pulled the strap of my new Oculus Quest over my head, excited to try VR for the first time. But before I could play games and watch immersive videos, Oculus required me to draw my play area on the floor around me. A low-quality, bluish-gray, Photoshop-filter-looking version of my immediate surroundings appeared, and I could walk around my room without crashing into anything. Using the hand controller, I drew freely on the floor in magic purple paint.
My first experience in VR was actually so-called passthrough augmented reality. This is a technique that uses stereo cameras and a standard virtual reality display to present the real environment around you. It seems unnecessary—why let cameras painstakingly pass their input to a complex VR display when you can just take your headset off and let your eyeballs do the work?
Passthrough AR is often contrasted with see-through AR, in which strong lasers or LEDs project images onto a piece of transparent glass, or directly into your eye, while you see the real world through the optics. This is the basis of headsets like the HoloLens series, which still face major technical challenges: the brightness required for daytime visibility, a limited field of view, and battery life. Despite these challenges, see-through is often seen as the pinnacle of augmented reality.
However, passthrough AR shouldn’t be dismissed so quickly. It comes with several upsides that make it a serious contender for interfacing with the Metaverse.
VR headsets are built from digital components and are therefore driven by Moore’s law; see-through augmented reality, by contrast, relies on strong light sources projected onto glass or directly into the eye. To develop see-through AR, companies must create these optical systems from scratch. VR’s rebirth and successful commercialization, on the other hand, are a result of the widespread availability of digital components (cameras, sensors, displays) brought along by smartphones.
That’s why creating augmented-reality experiences through the existing hardware of VR is cheaper and more feasible on a shorter time scale than creating see-through AR. Passthrough will help familiarize users with AR and drive innovation in the space before the optics required for see-through exist.
Not only is it an ideal transition to fully-fledged AR glasses, but passthrough is also great to have in VR regardless. Allowing users to see their surroundings makes VR headsets much safer to use, and certain applications benefit even from lower-fidelity passthrough. Passthrough was also used in industrial applications such as remote control of robots, thermal imaging, and night vision long before consumer VR existed. Headset manufacturers will therefore keep improving passthrough regardless.
Meta also released a Passthrough API within its Quest developer SDK last summer, and I’m excited to see what people build for the Quest 2. Since the hardware and the software are already available, developers can start building applications that only work in AR. If you’re keen on starting to build the augmented Metaverse today, passthrough is your best bet. And if you’re not a Meta fan, there’s also the Varjo XR-3 and the upcoming Lynx R-1, which even offers color passthrough.
So, what’s the timeline for passthrough AR? Will it beat pure VR or see-through AR in the long run? Well, the boundaries between the something-Rs will continue to blur. At Oculus’s Connect 5 conference, the company’s chief scientist, Michael Abrash, expressed his belief that AR and VR will eventually converge into a single piece of hardware, with users alternating between passthrough and see-through functionality whenever one is more appropriate than the other.
However, I believe this piece of hardware will need to be lightweight, socially acceptable, and transparent to the real world to be a truly groundbreaking component of the Metaverse. In the long term, we’ll have to figure out see-through AR.
Want to compete in the Metaverse? Subscribe to the My Metaverse Minute channel.