As we get closer to Connect 2022, I wanted to revisit what Meta presented almost a year ago at Connect 2021, to see what might be announced at the upcoming event in 2022.
At Connect 2021, Meta unveiled the vision for the Metaverse, a more connected digital experience that allows you to move seamlessly from one place to another and spend time with people who are physically far away from you, while maintaining your unique virtual identity and digital goods from one world to the next.
A vast and interoperable Metaverse cannot be built by one company alone. It will require the contributions of many companies, developers and creators - and it will take time.
Some building blocks for the Metaverse already exist, but to create virtual environments that feel natural and authentic, we need better ways of representing movement and space in apps. During the keynote, Meta announced the Presence Platform, a wide range of machine perception and AI capabilities - including Passthrough, Spatial Anchors and Scene Understanding - that enable you to create more realistic mixed reality, interaction and voice experiences that seamlessly integrate virtual content with the user's physical world. Meta also gave a sneak peek at the next generation of all-in-one hardware, Project Cambria, which will launch in 2022 - an advanced device at a higher price point that will feature the latest VR technologies. With the advances in Meta's hardware and the capabilities of the Presence Platform, Meta is opening up groundbreaking possibilities for mixed reality experiences and natural interactions in VR. This brings us ever closer to the promise of the Metaverse and the potential to bring people together in new, more immersive ways.
With the Presence Platform, Meta aims to enable a wide range of mixed reality experiences that align with its principles for responsible innovation. That starts with responsible handling of the information needed to create mixed reality experiences that are both compelling and safe.
Meta announced the Insight SDK, which lets you create mixed reality experiences that give a realistic sense of presence.
Earlier in 2021, Meta launched the experimental Passthrough API, which lets you create experiences that merge virtual content with the physical world. At Connect 2021, Meta announced the general availability of Passthrough in the next release, which means you can build, test and deploy experiences with Passthrough capabilities.
Meta also announced Spatial Anchors: world-locked frames of reference that let you place virtual content in a physical space so that it persists across sessions. With Spatial Anchors Experimental, which will be available soon, you will be able to create anchors at specific 6DoF positions, track their 6DoF position relative to the headset, persist anchors on the device and retrieve a list of currently tracked anchors.
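Conceptually, that anchor lifecycle - create at a 6DoF pose, persist on device, enumerate what is tracked - can be sketched in a few lines. This is a deliberately simplified, hypothetical Python model; the names (`AnchorStore`, `create_anchor`, `tracked_anchors`) are illustrative inventions, not the actual SDK API:

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class Pose:
    """A 6DoF pose: position (x, y, z) plus orientation as a quaternion."""
    position: tuple
    orientation: tuple = (0.0, 0.0, 0.0, 1.0)

@dataclass
class SpatialAnchor:
    """A world-locked frame of reference, identified across sessions by a UUID."""
    pose: Pose
    id: str = field(default_factory=lambda: str(uuid.uuid4()))

class AnchorStore:
    """Toy store mimicking the described lifecycle (hypothetical, not the real API)."""
    def __init__(self):
        self._anchors = {}

    def create_anchor(self, pose: Pose) -> SpatialAnchor:
        anchor = SpatialAnchor(pose)
        self._anchors[anchor.id] = anchor   # "maintain anchors on the device"
        return anchor

    def tracked_anchors(self) -> list:
        # "retrieve a list of currently tracked anchors"
        return list(self._anchors.values())

store = AnchorStore()
lamp = store.create_anchor(Pose((1.0, 0.0, -2.0)))
print(len(store.tracked_anchors()))  # 1
```

The key idea the sketch captures is that content is keyed to a stable identifier plus a pose in the user's room, rather than to coordinates relative to the headset.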
A new Scene Understanding feature was another major announcement at Connect 2021. Together with Passthrough and Spatial Anchors, Scene Understanding enables the rapid creation of complex, scene-aware experiences with rich interactions with the user's environment. As part of Scene Understanding, the Scene Model provides a geometric and semantic representation of the user's space, allowing you to create spatially rich mixed reality experiences. The Scene Model is a single, comprehensive, up-to-date representation of the physical world that is indexable and queryable. For example, you can attach a virtual screen to the user's wall or have a virtual character navigate the floor with realistic occlusion. You can also bring real, physical objects into VR. To create this Scene Model, Meta offers a system-driven Scene Capture flow that lets the user walk through and capture their space. Meta plans to make Scene Understanding capabilities available as an experimental feature early next year.
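To make "indexable and queryable" concrete, a toy model of a captured room might look like the following. The names here (`SceneModel`, `SceneEntity`, `query`) are hypothetical and only sketch the idea of semantic labels attached to room geometry:

```python
from dataclasses import dataclass

@dataclass
class SceneEntity:
    label: str     # semantic class, e.g. "WALL", "FLOOR", "DESK"
    bounds: tuple  # axis-aligned box as (min_xyz, max_xyz), in meters

class SceneModel:
    """Toy indexable, queryable representation of a captured room (hypothetical)."""
    def __init__(self, entities):
        self._by_label = {}
        for entity in entities:
            self._by_label.setdefault(entity.label, []).append(entity)

    def query(self, label):
        """Return every entity carrying the given semantic label."""
        return self._by_label.get(label, [])

room = SceneModel([
    SceneEntity("WALL", ((0, 0, 0), (0, 2.5, 4))),
    SceneEntity("WALL", ((0, 0, 0), (3, 2.5, 0))),
    SceneEntity("FLOOR", ((0, 0, 0), (3, 0, 4))),
])

# e.g. pick a wall to attach a virtual screen to:
walls = room.query("WALL")
print(len(walls))  # 2
```

The point is that an app never reasons about raw mesh data directly; it asks the model for "a wall" or "the floor" and gets back geometry it can anchor content to.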
With the new Passthrough, Spatial Anchors and Scene Understanding features in the Insight SDK, you can create mixed reality experiences that merge virtual content with the physical world, creating new opportunities for social connection, entertainment, productivity and more.
With the Interaction SDK, Meta makes it easier for you to integrate hand- and controller-centric interactions. The Unity library, which became available early in 2022, includes a set of ready-to-use, robust interaction components such as Grab, Poke, Target and Select. All components can be used together or independently, or even integrated with other interaction frameworks. The Interaction SDK solves many of the difficult interaction challenges associated with computer vision-based hand tracking, providing standardized interaction patterns and preventing regressions as the technology evolves. Last but not least, it provides tools to help you develop your own gestures.
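The idea of independent, composable interaction components can be sketched roughly like this - a hypothetical Python analogy rather than the actual Unity library - where each component decides on its own which input events it responds to:

```python
class Grab:
    """Toy component reacting to a pinch gesture (hypothetical, illustrative only)."""
    def supports(self, event): return event == "pinch"
    def respond(self, event): return "grabbed"

class Poke:
    """Toy component reacting to a fingertip touch."""
    def supports(self, event): return event == "touch"
    def respond(self, event): return "poked"

class Interactable:
    """An object that composes independent interaction components."""
    def __init__(self, name, components):
        self.name = name
        self.components = components

    def handle(self, event):
        # Dispatch the event only to components that declare support for it.
        return [c.respond(event) for c in self.components if c.supports(event)]

button = Interactable("menu_button", [Grab(), Poke()])
print(button.handle("touch"))  # ['poked']
```

Because each component is self-contained, an object can mix and match behaviors, and a single component can be swapped out or reused in another framework without touching the rest.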
The privacy protections Meta has always offered for hand tracking also apply here: the images and the estimated points specific to your hands are deleted after processing and are not stored on Meta's servers.
The Voice SDK Experimental was also announced at Connect 2021 and will be available in the next release, so you can start creating and experimenting. The Voice SDK is a set of natural language features that lets you build hands-free navigation and new voice-controlled games. With the Voice SDK, you can create voice navigation and search, or enable a voice FAQ that lets users ask for help or a reminder. It also enables new voice-controlled gameplay, such as winning a battle with a voice-activated spell or speaking to a character or avatar. The Voice SDK is powered by Wit.ai, Meta's natural language platform, which is free to sign up for and get started with.
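At its core, this kind of voice control maps a transcribed utterance to an intent, which the game then acts on. A deliberately naive Python sketch - plain keyword matching standing in for Wit.ai's actual natural language understanding, with made-up intent names - illustrates the concept:

```python
# Hypothetical intents, each triggered when all of its keywords appear.
INTENTS = {
    "cast_spell": ["cast", "spell"],
    "open_menu":  ["open", "menu"],
    "search":     ["search", "find"],
}

def resolve_intent(utterance: str):
    """Return the first intent whose keywords all appear in the utterance."""
    words = set(utterance.lower().split())
    for intent, keywords in INTENTS.items():
        if all(k in words for k in keywords):
            return intent
    return None  # no intent matched; the app can fall back or ask again

print(resolve_intent("cast a fire spell"))  # cast_spell
```

A real NLU platform goes far beyond this - handling paraphrases, extracting entities such as the spell's target, and returning confidence scores - but the intent-resolution contract an app programs against is essentially this shape.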
With everything listed above announced at Connect 2021 and improved throughout 2022, it is clear that Meta has been preparing its 2022 product releases, developer tools and SDKs. If Project Cambria goes live in 2022, it should enable a new set of capabilities and use cases.
Stay tuned for updates and features coverage from Connect 2022!