This post is taken from ZeroLight's tech blog:
https://zerolight.com/news/tech/boundless-xr-the-next-step-to-cloud-rendered-vr
VR hardware has been developing rapidly since the Oculus Rift and HTC Vive were introduced in 2016. In the last 12 months, we've seen many incredible headsets like the Varjo VR1 and the StarVR One burst onto the scene and rewrite the rules with advancements like human-eye resolution, ultra-wide FOVs, and accurate eye tracking.
Then, in 2019, we saw a new wave of advancements with the introduction of standalone headsets like the VIVE Focus Plus and the Oculus Quest. A huge hit with users, these headsets no longer require a high-powered, tethered PC or complicated trackers placed around the room. Unfortunately, however, the quality of the experiences these HMDs can deliver is limited by the power of the on-board chipset. This is all set to change, though, as an exciting new generation of XR is on the horizon. At this year's EWTS, we presented the next step along this journey with our partners at Qualcomm, Pico, and Cadillac: boundless XR.
The graphics below illustrate the difference between traditional, tethered XR and the boundless alternative that we have created. Tethered XR relies on external trackers (lighthouses) and a physical connection between the PC and the headset (wires). Boundless XR, on the other hand, uses inside-out tracking and wireless content streaming to remove both of these constraints.
The showcase demonstrated this latest evolution of XR, which combines the quality of high-end VR with the freedom of standalone VR by using Qualcomm's split rendering to maximise the experience and minimise the perceived latency. The high-quality graphics are rendered on the PC and then transmitted to a standalone headset over a 60 GHz wireless access point. Using cameras to track the user's real-world environment, the HMD builds a 3D point cloud of the play area and uses its on-board computing power to re-project the received image using the very latest tracking data. This ensures that the display matches the user's current position, rather than their position a few milliseconds earlier, when the render request was sent to the PC. In other words, the re-projection compensates for the latency accumulated between capturing the tracking data, rendering on the PC, transmitting the image, and presenting the result to the user's eyes.
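To make that flow concrete, here is a minimal sketch of a single frame of such a split-rendering loop. The helper names (sample_pose, stream_over_60ghz, reproject) are hypothetical stand-ins for illustration only, not Qualcomm's or ZeroLight's actual APIs, and the warp itself is left as a placeholder.

```python
# A minimal sketch of one frame of a split-rendering loop with late re-projection.
# All helper names are hypothetical; real systems perform the warp on the HMD GPU.
import numpy as np


def render_on_pc(pose: np.ndarray) -> np.ndarray:
    """Stand-in for the high-quality PC render step (one eye buffer)."""
    return np.zeros((1440, 1440, 3), dtype=np.uint8)


def reproject(frame: np.ndarray, render_pose: np.ndarray,
              display_pose: np.ndarray) -> np.ndarray:
    """Late-stage re-projection: correct the frame from the pose it was rendered
    at to the user's current pose. Here we only compute the pose delta; a real
    headset applies it as an image-space warp."""
    correction = display_pose @ np.linalg.inv(render_pose)  # 4x4 pose delta
    _ = correction  # placeholder: the actual image warp is omitted in this sketch
    return frame


def boundless_frame(sample_pose, stream_over_60ghz):
    """One frame of the hypothetical split-rendering loop."""
    # 1. The HMD samples its pose from inside-out tracking and sends it to the PC.
    render_pose = sample_pose()
    # 2. The PC renders at that pose and streams the encoded frame back wirelessly.
    frame = stream_over_60ghz(render_on_pc(render_pose))
    # 3. By the time the frame arrives, the user has moved: re-sample the pose.
    display_pose = sample_pose()
    # 4. Re-project on the headset so the image matches the *current* pose.
    return reproject(frame, render_pose, display_pose)


if __name__ == "__main__":
    pose = np.eye(4)                                    # identity pose for the demo
    frame = boundless_frame(lambda: pose, lambda f: f)  # no real tracking or radio here
    print(frame.shape)                                  # (1440, 1440, 3)
```

The key point of the design is that the pose is sampled twice: once when the render is requested and again just before display, with the difference between the two applied as a correction on the headset rather than waiting for another round trip to the PC.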
The showcase used a new Pico prototype headset featuring the Qualcomm Snapdragon 845 chipset. The experience itself was an adapted version of the Cadillac VR solution, enabling users to configure a range of highly detailed Cadillac vehicles. The 7-million-polygon models can be displayed in a range of photorealistic environments, with all interactions managed via the 6DOF controllers. The device displayed a perfectly antialiased, ultra-realistic QHD image per eye at 75 frames per second (see the full blog at the link above for the showcase video).
At ZeroLight, we believe that the cloud is the future, not only for enterprise and showrooms but also for the home user. Cloud rendering makes high-quality 3D visuals accessible to all users across a range of devices, and our cloud configurator solution has already shown how it's possible to deliver high-quality 3D content to millions of users around the world. With 5G on the horizon, more devices will become connected and gain access to higher bandwidth with lower latency. As the key components fall into place, it won't be long until we see the future of XR powered by the cloud.