Meta laid out its vision for next-generation VR in an Inside the Lab video featuring CEO Mark Zuckerberg and Chief Scientist Michael Abrash. Zuckerberg introduced the "visual Turing test," the technologies needed to pass it, and the prototype headsets Holocake 2, Butterscotch, and Starburst.
Zuckerberg presented Holocake 2 as the thinnest and lightest VR headset Meta has built. The secret to its slim profile lies in a new lens design and the way it delivers the image to the eye.
As VR headset users know, there is a gap between the lenses and the display that renders the image in front of the eyes. Conventional headsets use thick, curved refractive lenses, while the thinner pancake design folds light back and forth between two curved optical elements. The holocake lens uses the same polarization-based folded optics as a pancake lens, but replaces the curved elements with flat holographic ones stacked together, with light entering through the flat side. This shortens the distance between the display and the eyes, allowing a much thinner front section.
Implementing a holocake lens requires a specialized laser light source rather than the LEDs used in typical VR headsets. Abrash said that considerable development is still needed before lasers that are safe, efficient, and cheap enough for consumer products are available. If such a light source can be secured, however, he predicted that sunglasses-like VR displays will become possible.
Meanwhile, Zuckerberg noted that Holocake 2 can run any existing PC VR title, and Meta did not rule out the design being applied to standalone mobile headsets as well.
Butterscotch, by contrast, looks much like Meta's existing headsets. Where Holocake 2 rethinks the physical design of the device, Butterscotch is a prototype focused on delivering resolution close to natural human vision.
Butterscotch, the first near-retinal-resolution VR display Meta has shown publicly, has a pixel density of 55 PPD (pixels per degree). That is enough to render the 20/20 line of an eye chart, equivalent to decimal acuity of about 1.0. The Meta Quest 2, at roughly 20 PPD, can render only the 20/60 line (about 0.33), so Butterscotch offers about 2.5 times the pixel density of Quest 2.
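The relationship between pixel density and eye-chart acuity can be sketched with a rough rule of thumb: 20/20 vision corresponds to resolving detail about one arcminute across, so a display with p pixels per degree spans 60/p arcminutes per pixel. This is a simplified illustration, not Meta's own calculation; the PPD figures are from the article.

```python
def snellen_denominator(ppd: float) -> float:
    """Approximate smallest Snellen line (20/X) a display of `ppd`
    pixels per degree can render, assuming 1 pixel per resolvable detail."""
    arcmin_per_pixel = 60.0 / ppd      # angular size of one pixel
    return 20.0 * arcmin_per_pixel     # scale to the 20/X Snellen notation

# Quest 2 at ~20 PPD -> roughly 20/60
print(round(snellen_denominator(20)))   # → 60
# Butterscotch at 55 PPD -> roughly 20/22, close to 20/20
print(round(snellen_denominator(55)))   # → 22
```

The 55 PPD figure lands just short of the ~60 PPD that exactly matches 20/20 under this model, which is why Butterscotch is described as approaching, rather than fully reaching, retinal resolution.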
Starburst, a bulky prototype with two fans and a pair of handles, is an HDR headset. Abrash noted that while TVs adopted HDR technology early on, the Oculus Quest 2 headset reaches only about 100 nits of brightness.
Starburst, by contrast, can reach 20,000 nits, enough to accurately reproduce night scenes or indoor lighting. Zuckerberg said Starburst is a wired prototype that has to be held up by its two handles and is not suitable as a commercial product. However, he predicted that research built on experience with the device will drive positive evolution in the VR headsets released later.
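To put the two brightness figures from the article in perspective, the gap can be expressed in photographic "stops" (doublings of luminance). This comparison is my own illustration, not a calculation from the event.

```python
import math

# Peak luminance figures cited in the article.
quest2_nits = 100        # Oculus Quest 2
starburst_nits = 20_000  # Starburst HDR prototype

ratio = starburst_nits / quest2_nits
stops = math.log2(ratio)  # each stop is a doubling of brightness

print(f"Starburst is {ratio:.0f}x brighter (~{stops:.1f} stops more headroom)")
```

A gap of roughly 7.6 stops is why dim LED-backlit headsets cannot convincingly render bright highlights like sunlight or lamps, which is the problem Starburst was built to study.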
Beyond HDR and retinal-level resolution, Meta also discussed the importance of variable focus and lens-distortion correction.
Existing VR renders the world at a fixed focal distance, unlike the real world seen with the naked eye. When human eyes look at objects near or far, the lens of the eye changes shape and thickness to focus ("accommodation") at the same time as the two eyes rotate toward the object ("vergence"). Because a VR device has only one focal plane, the vergence distance and the accommodation distance can diverge, producing the "vergence-accommodation conflict."
This vergence-accommodation conflict is one cause of the dizziness and eye strain that can occur when wearing a VR device. Meta discussed tracking where the wearer is looking and implementing variable focus in place of a fixed focal plane. Since 2015, Meta has been iterating on varifocal prototypes that shift focus as the wearer's gaze moves from near objects to far ones, or vice versa. Zuckerberg explained that this will enable more convincing worlds and a more comfortable VR experience. However, no new prototype with the feature was shown that day.
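The conflict described above can be made concrete with a little geometry: in a fixed-focus headset the eyes accommodate at the optics' focal distance, but converge on the virtual object's simulated depth. The sketch below is illustrative only; the interpupillary distance and the focal distance are assumed values, not figures from the event.

```python
import math

IPD_M = 0.063  # assumed typical interpupillary distance, metres

def vergence_angle_deg(distance_m: float) -> float:
    """Angle between the two eyes' lines of sight for a target at distance_m."""
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

focal_distance = 2.0   # assumed fixed focal plane of the headset optics
object_distance = 0.5  # virtual object rendered half a metre away

print(f"Eyes converge as if the object were at {object_distance} m "
      f"({vergence_angle_deg(object_distance):.1f} deg of vergence),")
print(f"but accommodate at the fixed {focal_distance} m focal plane "
      f"({vergence_angle_deg(focal_distance):.1f} deg would match it).")
```

The mismatch between those two angles (about 7.2 vs. 1.8 degrees here) is what eye-tracked varifocal optics aim to eliminate by moving the focal plane to match the gaze.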
Zuckerberg also cited the distortion caused by light refracting through the lenses as an important problem to solve. Abrash said the Meta Quest 2, which corrects distortion in software, shows relatively good results but still exhibits some residual distortion, and he discussed developing systems that simulate and correct it.
Meta framed the visual Turing test, by analogy with the Turing test that evaluates AI, as the criterion for whether VR imagery can be distinguished from reality. Zuckerberg said that retinal resolution across the field of view, HDR, and eye-tracked variable focus and distortion correction are the basic requirements of the visual Turing test, and that implementing them properly will deliver a new generation of visual experience. The technical progress required for this realism was illustrated through the newly revealed prototypes and concepts.
The Mirror Lake concept was also newly revealed. Meta described Mirror Lake as a next-generation VR design intended to pass the visual Turing test through a light, thin form factor built on holocake lenses, with eye tracking and support for prescription correction. Abrash acknowledged that Mirror Lake is still only a concept, but said that if realized it would be a clear game changer for the VR experience.
The prototypes and technologies shown that day are unlikely to reach products in the near term. Meta is currently preparing Project Cambria as its next-generation consumer VR/AR headset; the device will adopt pancake lenses along with further improved augmented-reality capabilities.