<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: T, Sato</title>
    <description>The latest articles on DEV Community by T, Sato (@tkr1234st).</description>
    <link>https://dev.to/tkr1234st</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2984669%2F921f8833-dca9-4418-8d34-5b02b3e4a0a0.jpg</url>
      <title>DEV Community: T, Sato</title>
      <link>https://dev.to/tkr1234st</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/tkr1234st"/>
    <language>en</language>
    <item>
      <title>How to Show “North” on Vision Pro—Even Without a Compass: Navigation with iPhone + Vision Pro</title>
      <dc:creator>T, Sato</dc:creator>
      <pubDate>Thu, 26 Jun 2025 14:07:29 +0000</pubDate>
      <link>https://dev.to/tkr1234st/vision-pro-x-iphone-enabling-location-awareness-x-spatial-navigation-4fjn</link>
      <guid>https://dev.to/tkr1234st/vision-pro-x-iphone-enabling-location-awareness-x-spatial-navigation-4fjn</guid>
      <description>&lt;h2&gt;
  
  
  [Vision Pro] and the Limitations of Location Awareness
&lt;/h2&gt;

&lt;p&gt;The Vision Pro cannot perform navigation based on latitude/longitude and directional heading. This is because the device lacks built-in GPS and magnetic-compass (geomagnetic sensor) hardware, so it has no way to acquire that information.&lt;/p&gt;

&lt;p&gt;Note: While approximate latitude/longitude data can be obtained via Wi-Fi—making some location-based navigation possible—it is not possible to determine the direction the Vision Pro is facing based on any built-in sensors.&lt;/p&gt;

&lt;p&gt;However, for users to fully benefit from the unique potential of &lt;strong&gt;spatial navigation&lt;/strong&gt;, such as &lt;strong&gt;overlaying virtual information on the real world&lt;/strong&gt; and &lt;strong&gt;doing so in 360 degrees&lt;/strong&gt;, leveraging accurate location data becomes essential.&lt;/p&gt;

&lt;p&gt;For example, imagine a user wearing the Vision Pro: being able to intuitively understand the direction and distance to a destination—such as a store or landmark—from their current standing position.&lt;/p&gt;

&lt;p&gt;While smartphone map apps already allow users to check directions, having a constant visual reference for cardinal directions (North, South, East, West) overlaid in the real world feels fundamentally different. It's like being able to see the North Star at all times, even during the day.&lt;/p&gt;

&lt;h3&gt;
  
  
  Can Vision Pro Acquire [Latitude/Longitude and Directional Heading]?
&lt;/h3&gt;

&lt;p&gt;To reiterate, the Vision Pro has no way to acquire latitude, longitude, or directional heading. This is due to the absence of hardware such as GPS or magnetic sensors.&lt;/p&gt;

&lt;p&gt;Still, we want to make spatial navigation based on real-world location data possible.&lt;/p&gt;

&lt;p&gt;So how can we achieve that?&lt;/p&gt;

&lt;p&gt;Fortunately, we already have a device capable of obtaining accurate latitude, longitude, and heading—that device is the iPhone, or any modern smartphone.&lt;/p&gt;

&lt;h3&gt;
  
  
  Can the iPhone Work in Tandem with Vision Pro?
&lt;/h3&gt;

&lt;p&gt;If the Vision Pro lacks certain capabilities, why not let the iPhone fill in the gaps?&lt;br&gt;
But first—can the two devices even communicate with each other?&lt;/p&gt;

&lt;p&gt;If they can, the iPhone can provide the Vision Pro with the location data it gathers, allowing that information to be reflected in the MR space.&lt;/p&gt;
&lt;h3&gt;
  
  
  [LLD Compass]: A Spatial Compass Experience on Vision Pro
&lt;/h3&gt;

&lt;p&gt;To explore this possibility, we created a spatial compass app called &lt;strong&gt;LLD Compass&lt;/strong&gt;, which links the Vision Pro and iPhone to recreate directional orientation within a mixed reality (MR) space.&lt;br&gt;
The name LLD stands for Latitude, Longitude, and Direction.&lt;/p&gt;

&lt;p&gt;In this article, we’ll explore how &lt;strong&gt;LLD Compass&lt;/strong&gt; was made possible—covering the technical architecture, key implementation strategies, and the potential use cases it unlocks.&lt;/p&gt;
&lt;h2&gt;
  
  
  LLD Compass – App Overview and Use Cases
&lt;/h2&gt;

&lt;p&gt;By linking the Vision Pro with an iPhone, LLD Compass enables the visualization of cardinal directions (North, South, East, West) within a mixed reality environment.&lt;br&gt;
This is the core concept behind the app.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8k38xq5m6bdjrqmopb1r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8k38xq5m6bdjrqmopb1r.png" alt="Image description" width="800" height="527"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A red sphere represents North.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;
  
  
  What Kind of Experience Does It Offer?
&lt;/h3&gt;

&lt;p&gt;With the Vision Pro on, you can intuitively see spatial cues like:&lt;br&gt;
“This direction leads to Tokyo Tower,”&lt;br&gt;
“Haneda Airport is over that way,” or&lt;br&gt;
“From Tokyo, Paris lies in this direction”—all visualized directly in your surrounding space.&lt;/p&gt;

&lt;p&gt;While smartphone map apps can provide similar information, simply looking around to understand direction—without needing to glance at a screen—is a completely different experience.&lt;/p&gt;

&lt;p&gt;By leveraging the Vision Pro’s unique ability to deliver &lt;strong&gt;spatial awareness&lt;/strong&gt;, LLD Compass offers a new kind of value that traditional navigation tools can't provide.&lt;/p&gt;
&lt;h3&gt;
  
  
  Potential Use Cases
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Tourism Navigation: Instantly see which direction famous landmarks, shops, or attractions are located—just by looking around.&lt;/li&gt;
&lt;li&gt;Stargazing Guidance: Display constellations based on your orientation and current field of view.&lt;/li&gt;
&lt;li&gt;Venue Navigation: In theme parks or zoos, intuitively understand where your destination lies within the venue.&lt;/li&gt;
&lt;li&gt;Breaking News &amp;amp; Local Buzz: Get real-time directional cues for incidents, accidents, or trending reviews happening nearby.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Simply being able to see spatial and directional information at a glance greatly enriches the mixed reality experience.&lt;/p&gt;
&lt;h2&gt;
  
  
  System Architecture: Connecting Vision Pro and iPhone
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Device Roles and Responsibilities
&lt;/h3&gt;

&lt;p&gt;LLD Compass leverages the strengths of both devices by assigning them distinct roles:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;iPhone&lt;/strong&gt;&lt;br&gt;
Acquires latitude, longitude, and heading data, then sends it to the Vision Pro&lt;br&gt;
Technologies used: SwiftUI / CoreLocation / MultipeerConnectivity&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Vision Pro&lt;/strong&gt;&lt;br&gt;
Receives the location data and renders it spatially within the MR environment&lt;br&gt;
Technologies used: Unity / PolySpatial / Swift Bridge&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;
  
  
  How the Devices Communicate
&lt;/h3&gt;

&lt;p&gt;Using Apple’s MultipeerConnectivity framework, the iPhone and Vision Pro communicate over a peer-to-peer (P2P) connection.&lt;br&gt;
LLD Compass leverages this framework to send real-time location and heading data from the iPhone to the Vision Pro, where it’s immediately reflected in the mixed reality space.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;iPhone

&lt;ul&gt;
&lt;li&gt;Retrieves latitude, longitude, and heading information&lt;/li&gt;
&lt;li&gt;Serializes the data into JSON&lt;/li&gt;
&lt;li&gt;Sends it via MultipeerConnectivity&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Vision Pro (Swift Layer)

&lt;ul&gt;
&lt;li&gt;Receives the data using MultipeerConnectivity&lt;/li&gt;
&lt;li&gt;Passes the JSON to Unity&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Unity

&lt;ul&gt;
&lt;li&gt;Deserializes the JSON into usable latitude, longitude, and heading values&lt;/li&gt;
&lt;li&gt;Updates object positions and orientations accordingly in the spatial environment&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
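&lt;p&gt;A minimal wire format for this flow might look like the following (the field names are our own convention; the payload is simply whatever the iPhone app chooses to serialize before handing it to MultipeerConnectivity):&lt;/p&gt;

```json
{
  "latitude": 35.6586,
  "longitude": 139.7454,
  "heading": 272.4
}
```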
&lt;h3&gt;
  
  
  App Structure
&lt;/h3&gt;

&lt;p&gt;Given the system architecture described above, the solution requires two separate apps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;The iPhone app&lt;/strong&gt;&lt;br&gt;
Responsible for acquiring latitude, longitude, and heading data, and sending it to the Vision Pro&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;The Vision Pro app&lt;/strong&gt;&lt;br&gt;
Receives the location data and visualizes it within the MR environment&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyhsyd60k52frhjgs42fq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyhsyd60k52frhjgs42fq.png" alt="Image description" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Implementation Challenges and Solutions
&lt;/h2&gt;
&lt;h3&gt;
  
  
  Bridging [iPhone], [Vision Pro], and [Unity]
&lt;/h3&gt;

&lt;p&gt;The MR environment in this project was built using Unity.&lt;br&gt;
One key challenge was how to reflect the data acquired on the iPhone within the Unity world running on Vision Pro.&lt;/p&gt;
&lt;h4&gt;
  
  
  Solution: Create a Swift Bridge on Vision Pro
&lt;/h4&gt;

&lt;p&gt;To pass data from the native layer to Unity, we used UnitySendMessage.&lt;br&gt;
A custom bridge was implemented in Swift, allowing the Swift layer on Vision Pro to forward received data directly into Unity.&lt;/p&gt;

&lt;p&gt;As a result, the MultipeerConnectivity data exchanged between iPhone and Vision Pro can now be accessed and utilized within Unity.&lt;/p&gt;
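&lt;p&gt;Conceptually, the Unity side of the bridge boils down to a method that accepts the JSON string forwarded from Swift and deserializes it. A standalone sketch (class and method names are illustrative; in a real Unity project this method would live on a MonoBehaviour whose GameObject name is passed to UnitySendMessage, and parsing would likely use JsonUtility rather than System.Text.Json, which we use here only so the sketch runs anywhere):&lt;/p&gt;

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// Payload sent from the iPhone. Field names are our own convention.
public class LocationPayload
{
    [JsonPropertyName("latitude")]  public double Latitude  { get; set; }
    [JsonPropertyName("longitude")] public double Longitude { get; set; }
    [JsonPropertyName("heading")]   public double Heading   { get; set; } // degrees clockwise from North
}

// Receives the JSON string forwarded by the Swift bridge and keeps the
// latest values available for the rendering code.
public static class LocationReceiver
{
    public static LocationPayload LatestPayload { get; private set; }

    // UnitySendMessage invokes a method of exactly this shape: a public
    // method taking a single string parameter.
    public static void OnLocationReceived(string json)
    {
        LatestPayload = (LocationPayload)JsonSerializer.Deserialize(json, typeof(LocationPayload));
    }
}
```

&lt;p&gt;The Swift layer can forward the JSON string it received over MultipeerConnectivity as-is; no additional marshalling is required.&lt;/p&gt;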
&lt;h3&gt;
  
  
  Vision Pro Has No Concept of "North"
&lt;/h3&gt;

&lt;p&gt;The Vision Pro has no way to determine true North.&lt;br&gt;
While its internal sensors can detect relative rotation, it cannot identify any absolute orientation—such as the direction of magnetic North on Earth.&lt;/p&gt;
&lt;h4&gt;
  
  
  Solution: Perform an Initial Orientation Sync with iPhone
&lt;/h4&gt;

&lt;p&gt;After the user puts on the Vision Pro, we initiate communication while both the Vision Pro and iPhone are facing the same direction.&lt;br&gt;
This moment is treated as the reference point—a shared initial orientation.&lt;/p&gt;

&lt;p&gt;From then on, the direction the iPhone is facing is assumed to be the same as that of the Vision Pro, effectively giving the Vision Pro a sense of absolute direction.&lt;/p&gt;
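&lt;p&gt;This sync step reduces to a single yaw offset, recorded at the moment both devices face the same way. A minimal sketch of the math (the names are ours):&lt;/p&gt;

```csharp
using System;

// At the moment of the sync, the iPhone and Vision Pro face the same way.
// Record the difference between the iPhone's compass heading and the
// headset's internal yaw; afterwards, any headset yaw can be mapped to a
// real-world heading by applying that offset.
public static class OrientationSync
{
    static double headingOffset; // degrees

    public static void Calibrate(double iphoneHeadingDeg, double headsetYawDeg)
    {
        headingOffset = Normalize(iphoneHeadingDeg - headsetYawDeg);
    }

    public static double WorldHeading(double headsetYawDeg)
    {
        return Normalize(headsetYawDeg + headingOffset);
    }

    // Wrap any angle into the [0, 360) range.
    static double Normalize(double deg) => ((deg % 360) + 360) % 360;
}
```

&lt;p&gt;For example, if the iPhone reads 90° (East) at sync time while the headset's internal yaw is 0°, a later headset yaw of 30° corresponds to a real-world heading of 120°.&lt;/p&gt;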
&lt;h3&gt;
  
  
  Determining the Direction of a Landmark
&lt;/h3&gt;

&lt;p&gt;Suppose the user's current location—where they’re wearing the Vision Pro—is point A, and a landmark (e.g., Tokyo Tower) is point B.&lt;br&gt;
We need to determine the relative direction from point A to point B.&lt;/p&gt;
&lt;h4&gt;
  
  
  Solution: Calculate the Bearing (Azimuth)
&lt;/h4&gt;

&lt;p&gt;Note: A bearing (or azimuth) is the angle measured clockwise from a reference direction—typically true North—to the line connecting two points.&lt;br&gt;
It’s widely used in fields such as geography, navigation, aviation, surveying, and GPS systems.&lt;/p&gt;

&lt;p&gt;By using the latitude and longitude of both point A and point B, we can calculate the bearing from A to B, which tells us the direction in which the landmark lies.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="kt"&gt;double&lt;/span&gt; &lt;span class="nf"&gt;GetBearing&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;double&lt;/span&gt; &lt;span class="n"&gt;lat1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kt"&gt;double&lt;/span&gt; &lt;span class="n"&gt;lon1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kt"&gt;double&lt;/span&gt; &lt;span class="n"&gt;lat2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kt"&gt;double&lt;/span&gt; &lt;span class="n"&gt;lon2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;phi1&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;lat1&lt;/span&gt; &lt;span class="p"&gt;*&lt;/span&gt; &lt;span class="n"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;PI&lt;/span&gt; &lt;span class="p"&gt;/&lt;/span&gt; &lt;span class="m"&gt;180&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;phi2&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;lat2&lt;/span&gt; &lt;span class="p"&gt;*&lt;/span&gt; &lt;span class="n"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;PI&lt;/span&gt; &lt;span class="p"&gt;/&lt;/span&gt; &lt;span class="m"&gt;180&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;deltaLambda&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;lon2&lt;/span&gt; &lt;span class="p"&gt;-&lt;/span&gt; &lt;span class="n"&gt;lon1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;*&lt;/span&gt; &lt;span class="n"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;PI&lt;/span&gt; &lt;span class="p"&gt;/&lt;/span&gt; &lt;span class="m"&gt;180&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;y&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;deltaLambda&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;*&lt;/span&gt; &lt;span class="n"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Cos&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;phi2&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Cos&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;phi1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;*&lt;/span&gt; &lt;span class="n"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;phi2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;-&lt;/span&gt;
            &lt;span class="n"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;phi1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;*&lt;/span&gt; &lt;span class="n"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Cos&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;phi2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;*&lt;/span&gt; &lt;span class="n"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Cos&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;deltaLambda&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;theta&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Atan2&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;bearing&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;theta&lt;/span&gt; &lt;span class="p"&gt;*&lt;/span&gt; &lt;span class="m"&gt;180&lt;/span&gt; &lt;span class="p"&gt;/&lt;/span&gt; &lt;span class="n"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;PI&lt;/span&gt; &lt;span class="p"&gt;+&lt;/span&gt; &lt;span class="m"&gt;360&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;%&lt;/span&gt; &lt;span class="m"&gt;360&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;bearing&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="c1"&gt;// 北=0, 東=90, 南=180, 西=270&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
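&lt;p&gt;Once the bearing from A to B is known, placing a marker only requires converting that angle into a horizontal direction. A small sketch, assuming the calibration step has aligned the +z axis with North (plain System.Math so it runs anywhere; in Unity the result maps to a Vector3 with y = 0):&lt;/p&gt;

```csharp
using System;

public static class BearingMath
{
    // Convert a compass bearing (degrees clockwise from North) into a unit
    // direction on the horizontal plane: x = East, z = North.
    public static (double x, double z) BearingToDirection(double bearingDeg)
    {
        var rad = bearingDeg * Math.PI / 180.0;
        return (Math.Sin(rad), Math.Cos(rad));
    }
}
```

&lt;p&gt;A landmark marker can then be placed at, say, the user's position plus this direction scaled by a fixed display radius.&lt;/p&gt;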



&lt;h2&gt;
  
  
  Future Outlook and Applications
&lt;/h2&gt;

&lt;p&gt;Through the development of LLD Compass, we’ve come to truly appreciate the value and potential of bringing direction and location into MR space.&lt;/p&gt;

&lt;p&gt;This technology can be applied to a wide range of services, such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Integration with Map Services&lt;br&gt;
By projecting maps into spatial environments, users can experience the sensation of having a map unfold at their feet, making it easier to understand their surroundings.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Real-Time Flight Tracking and Live Positioning&lt;br&gt;
Aircraft flying overhead can be displayed in real time—showing their direction and path—allowing users to visually understand where each plane is coming from and where it’s headed.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Shared Location Awareness Among Multiple Users&lt;br&gt;
In fieldwork or at event venues, users can share their location data with others. This enables a shared experience where everyone can see each other’s position and orientation in the same MR space.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Final Thoughts
&lt;/h2&gt;

&lt;p&gt;By combining the strengths of two distinct devices—Vision Pro and iPhone—LLD Compass delivers a simple yet powerful experience:&lt;br&gt;
giving the &lt;strong&gt;Vision Pro a true concept of North&lt;/strong&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;While Vision Pro excels at spatial expression, it has inherent limitations in acquiring location data&lt;/li&gt;
&lt;li&gt;The iPhone compensates for this, acting as a sensor to enable spatial navigation&lt;/li&gt;
&lt;li&gt;Real-time integration is achieved through Unity × Swift × MultipeerConnectivity&lt;/li&gt;
&lt;li&gt;From the user’s current position, it becomes possible to calculate the bearing to any external point&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The true value of Vision Pro lies in &lt;strong&gt;how it overlays information onto the real world&lt;/strong&gt;.&lt;br&gt;
By embedding location and directional context into the spatial environment, we unlock entirely new and deeply immersive experiences.&lt;/p&gt;

&lt;p&gt;LLD Compass is both a prototype and an experiment—a first step toward exploring that potential.&lt;/p&gt;

&lt;p&gt;Moving forward, I hope to continue exploring &lt;strong&gt;what new kinds of experiences&lt;/strong&gt; the Vision Pro can offer.&lt;/p&gt;

</description>
      <category>visionos</category>
      <category>visionpro</category>
      <category>unity3d</category>
      <category>spatiallocation</category>
    </item>
    <item>
      <title>Getting Started with Vision Pro App Development Using Unity: “Metal” vs “RealityKit”</title>
      <dc:creator>T, Sato</dc:creator>
      <pubDate>Mon, 31 Mar 2025 02:12:54 +0000</pubDate>
      <link>https://dev.to/tkr1234st/getting-started-with-vision-pro-app-development-using-unity-metal-vs-realitykit-4mb7</link>
      <guid>https://dev.to/tkr1234st/getting-started-with-vision-pro-app-development-using-unity-metal-vs-realitykit-4mb7</guid>
      <description>&lt;h2&gt;
  
  
  “Metal” vs “RealityKit”
&lt;/h2&gt;

&lt;p&gt;When developing Vision Pro applications, the most common method is to use “Xcode”. However, it is also possible to develop using “Unity”.&lt;/p&gt;

&lt;p&gt;This article explains two approaches available when developing Vision Pro apps with Unity: &lt;strong&gt;“Metal-based Apps on visionOS”&lt;/strong&gt; and &lt;strong&gt;“RealityKit apps on visionOS”&lt;/strong&gt;. We’ll refer to these two methods as &lt;strong&gt;“Metal mode”&lt;/strong&gt; and &lt;strong&gt;“RealityKit mode”&lt;/strong&gt; throughout this article.&lt;/p&gt;

&lt;p&gt;Note: Unity does not officially define something called “Metal mode” or “RealityKit mode”. In this article, we use these terms simply to make the distinction clearer when discussing each approach’s features and use cases.&lt;/p&gt;

&lt;p&gt;Note: In Unity, there is another development approach called “Windowed Apps in visionOS”. However, since it does not utilize the XR features unique to visionOS, it is not covered in this article.&lt;/p&gt;

&lt;h2&gt;
  
  
  Two Rendering Methods in Unity for Vision Pro
&lt;/h2&gt;

&lt;p&gt;When developing for Vision Pro in Unity, two main rendering methods are available:&lt;/p&gt;

&lt;h3&gt;
  
  
  Metal
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;An approach that directly uses Apple’s low-level graphics API, “Metal”, for rendering&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Allows full utilization of Unity’s built-in rendering pipeline (e.g., shaders, post-processing, lighting)&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  RealityKit (PolySpatial)
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;An approach that uses Apple’s high-level 3D/AR framework, “RealityKit”&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Passes the Unity scene data to RealityKit, making it easier to build apps that leverage Vision Pro’s unique spatial computing capabilities&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  What is Metal?
&lt;/h2&gt;

&lt;p&gt;Metal is Apple’s low-level graphics API, which allows applications to send commands directly to the GPU for maximum performance.&lt;br&gt;
Unity has long supported rendering functions that target Metal, so if you choose “Metal mode”, the development process will be quite similar to creating other 3D games for iOS.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is RealityKit?
&lt;/h2&gt;

&lt;p&gt;On the other hand, RealityKit is a high-level 3D framework designed by Apple specifically for AR/VR. It includes not only 3D rendering but also a physics engine, multi-user features, and other functions tailored for AR and VR.&lt;br&gt;
Because Vision Pro is designed with mixed reality (MR) capabilities in mind—overlapping virtual objects onto the real-world environment—RealityKit is especially well-suited for this device.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnn0zu05vj4z0r1sdgjcl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnn0zu05vj4z0r1sdgjcl.png" alt="Image description" width="800" height="297"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Metal Mode
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Unity handles the rendering on its own&lt;/strong&gt;&lt;br&gt;
You can fully utilize Unity’s built-in rendering pipeline, setting up shaders, post-processing, and more with great flexibility.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Easy to use Unity’s standard XR toolkit&lt;/strong&gt;&lt;br&gt;
You can employ development methods similar to other VR platforms in Unity, such as XR Origin, camera setups, and Tracked Pose Driver.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  RealityKit Mode
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Delegates rendering to Apple’s AR engine&lt;/strong&gt;&lt;br&gt;
Unity sends its scene and object data to RealityKit for rendering, enabling native use of RealityKit’s spatial computing and AR/MR features.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Constraints on Unity-specific features&lt;/strong&gt;&lt;br&gt;
Custom shaders do not run directly on the RealityKit side. Instead, you must go through Shader Graph, and some features are limited.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Creating a Sample Scene
&lt;/h2&gt;

&lt;p&gt;Below is a quick overview of how to set up a sample project in Unity for Vision Pro.&lt;/p&gt;

&lt;p&gt;Note: This section focuses only on the unique steps for each “Metal mode” and “RealityKit mode”. Also, keep in mind that versions and settings may change in the future, so always refer to Unity’s official documentation for the latest details.&lt;/p&gt;

&lt;h3&gt;
  
  
  Creating a Project in Metal Mode
&lt;/h3&gt;

&lt;h4&gt;
  
  
  1. Project Settings
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Install the Apple visionOS XR Plugin&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;From the “Package Manager”, install the “Apple visionOS XR Plugin”.&lt;br&gt;
(Or skip this step initially and let the next steps prompt you to install it automatically.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fezsibi834vophtz48qqh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fezsibi834vophtz48qqh.png" alt="Image description" width="800" height="345"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Set up XR Plug-in Management&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In “Project Settings &amp;gt; XR Plug-in Management”, select “Apple visionOS”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwhzgakbr31n3f885bghm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwhzgakbr31n3f885bghm.png" alt="Image description" width="800" height="497"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Select App Mode&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In “XR Plug-in Management &amp;gt; Apple visionOS &amp;gt; App Mode”, set it to “Metal Rendering with Compositor Services”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3cwgnmmbbvqdq26bv487.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3cwgnmmbbvqdq26bv487.png" alt="Image description" width="800" height="319"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  2. Scene Creation
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Add AR Session and XR Origin&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;From the menu, go to “GameObject &amp;gt; XR &amp;gt; AR Session” to add an AR Session.&lt;br&gt;
  Then go to “GameObject &amp;gt; XR &amp;gt; XR Origin (Mobile AR)” to add an XR Origin.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyzsbl3dyj26px5ttfuow.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyzsbl3dyj26px5ttfuow.png" alt="Image description" width="800" height="621"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Background Settings&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you want an immersive VR-like environment, set the Main Camera’s “Background Type” to Skybox.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9qycw7lgvmwtwfym9p5b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9qycw7lgvmwtwfym9p5b.png" alt="Image description" width="800" height="656"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you want an MR-like scene that shows the real world in the background, set “Background Type” to Solid Color and choose a fully transparent black (RGBA [0, 0, 0, 0], i.e., Alpha 0).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flgx8zxiq62abk6l77k6z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flgx8zxiq62abk6l77k6z.png" alt="Image description" width="800" height="656"&gt;&lt;/a&gt;&lt;/p&gt;
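&lt;p&gt;The same background settings can also be applied from a script. A sketch of the editor-equivalent configuration (the component name is ours):&lt;/p&gt;

```csharp
using UnityEngine;

// Editor-equivalent of the settings above: a transparent solid-color
// background lets the real world show through (MR); a skybox gives an
// immersive VR-like environment instead.
public class BackgroundSetup : MonoBehaviour
{
    public bool passthrough = true;

    void Start()
    {
        var cam = Camera.main;
        if (passthrough)
        {
            cam.clearFlags = CameraClearFlags.SolidColor;
            cam.backgroundColor = new Color(0f, 0f, 0f, 0f); // Alpha 0 = fully transparent
        }
        else
        {
            cam.clearFlags = CameraClearFlags.Skybox;
        }
    }
}
```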

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Place 3D Objects&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Place a 3D object near the center of the scene.&lt;/p&gt;

&lt;h4&gt;
  
  
  3. Build and Check Rendering
&lt;/h4&gt;

&lt;p&gt;Verify that the 3D object you placed is rendered correctly.&lt;/p&gt;

&lt;h3&gt;
  
  
  Creating a Project in RealityKit Mode (PolySpatial)
&lt;/h3&gt;

&lt;h4&gt;
  
  
  1. Project Settings
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Install the Apple visionOS XR Plugin&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;From the “Package Manager”, install the “Apple visionOS XR Plugin”.&lt;br&gt;
(Or skip this step initially and let the next steps prompt you to install it automatically.)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fezsibi834vophtz48qqh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fezsibi834vophtz48qqh.png" alt="Image description" width="800" height="345"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Set up XR Plug-in Management&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In “Project Settings &amp;gt; XR Plug-in Management”, select “Apple visionOS”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwhzgakbr31n3f885bghm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwhzgakbr31n3f885bghm.png" alt="Image description" width="800" height="497"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Install PolySpatial&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;From the “Package Manager”, install the “PolySpatial” package.&lt;br&gt;
(Or skip this step and proceed to the final step, which may automatically prompt installation.)&lt;/p&gt;
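&lt;p&gt;As with the XR plugin, PolySpatial can also be added via &lt;code&gt;Packages/manifest.json&lt;/code&gt;. A sketch, assuming the standard PolySpatial package names (the versions are placeholders and should match your XR plugin version):&lt;/p&gt;

```json
{
  "dependencies": {
    "com.unity.polyspatial": "1.2.3",
    "com.unity.polyspatial.visionos": "1.2.3",
    "com.unity.polyspatial.xr": "1.2.3"
  }
}
```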

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhroopamgpzla2pee608m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhroopamgpzla2pee608m.png" alt="Image description" width="800" height="272"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Select App Mode&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In “XR Plug-in Management &amp;gt; Apple visionOS &amp;gt; App Mode”, set it to “RealityKit with PolySpatial”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ocxg6g75tb37inww3bw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ocxg6g75tb37inww3bw.png" alt="Image description" width="800" height="312"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  2. Scene Creation
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Add AR Session&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;From the menu, go to “GameObject &amp;gt; XR &amp;gt; AR Session” to add an AR Session.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwchxqcisdbkmuz6bvztp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwchxqcisdbkmuz6bvztp.png" alt="Image description" width="800" height="797"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Add Volume Camera&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;From the menu, go to “GameObject &amp;gt; XR &amp;gt; Setup &amp;gt; Volume Camera” to add a Volume Camera.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv8x4l1jtvd0nqpnd2abr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv8x4l1jtvd0nqpnd2abr.png" alt="Image description" width="800" height="796"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;VolumeCamera Window Configuration&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;From the menu, choose “Assets &amp;gt; Create &amp;gt; PolySpatial &amp;gt; Volume Camera Window Configuration” to create a configuration asset.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd61o8qtj0tbs213rqlia.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd61o8qtj0tbs213rqlia.png" alt="Image description" width="800" height="788"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then choose either a Bounded volume (content is confined to a fixed display area) or an Unbounded volume (content extends across the entire space), and attach the configuration asset to the Volume Camera.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe3klvpfujanoys2jt8ul.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe3klvpfujanoys2jt8ul.png" alt="Image description" width="800" height="278"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpib6tymdmofw85nc0zob.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpib6tymdmofw85nc0zob.png" alt="Image description" width="800" height="278"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Place 3D Objects&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Place a 3D object within the Volume Camera’s area.&lt;/p&gt;
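&lt;p&gt;For a quick check, you can also spawn a test object from a script instead of placing one in the Editor. A minimal sketch (&lt;code&gt;PlaceTestObject&lt;/code&gt; is our name; attach it to any object in the scene):&lt;/p&gt;

```csharp
using UnityEngine;

// Spawns a simple cube at the center of the scene so there is
// something visible to verify once the build runs on the device.
public class PlaceTestObject : MonoBehaviour
{
    void Start()
    {
        var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = Vector3.zero;         // center of the volume
        cube.transform.localScale = Vector3.one * 0.2f; // small enough to stay
                                                        // inside a bounded volume
    }
}
```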

&lt;h4&gt;
  
  
  3. Build and Check Rendering
&lt;/h4&gt;

&lt;p&gt;Verify that the placed 3D object is rendered correctly.&lt;/p&gt;

&lt;h2&gt;
  
  
  Use Cases for Each Mode
&lt;/h2&gt;

&lt;p&gt;How do you decide whether to use Metal mode or RealityKit mode? Below are some practical examples.&lt;/p&gt;

&lt;h3&gt;
  
  
  When Metal Mode Is Suitable
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;If you want a fully virtual space&lt;/strong&gt;&lt;br&gt;
For VR games, simulations, or immersive cinematic experiences, where you want rich graphics in a purely virtual environment.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;If you want advanced shaders or post-processing&lt;/strong&gt;&lt;br&gt;
Ideal when you want custom shaders or the high-end effects available through Unity’s standard rendering pipeline.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;If you want to align with other VR platforms&lt;/strong&gt;&lt;br&gt;
For instance, when porting an existing VR project (for PC VR or standalone devices) to Vision Pro and you prefer to maintain a similar Unity XR workflow.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  When RealityKit Mode Is Suitable
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;If you want to overlay virtual objects onto the real world&lt;/strong&gt;&lt;br&gt;
Take advantage of Vision Pro’s spatial computing features (tracking floors, walls, etc.) to create MR experiences.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;If you prefer a simple, utility-focused MR app&lt;/strong&gt;&lt;br&gt;
Great for displaying additional information or guides while viewing the real world.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;If you need a persistent, always-on app that works alongside other apps&lt;/strong&gt;&lt;br&gt;
For example, placing widgets within your environment. Instead of complex custom shaders, you focus on smooth spatial placement and multitasking.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  In Closing
&lt;/h2&gt;

&lt;p&gt;When developing Vision Pro apps with Unity, choosing between “Metal mode” and “RealityKit mode” largely depends on your app’s purpose and the kind of experience you want to create.&lt;/p&gt;

&lt;h3&gt;
  
  
  Metal mode
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Development follows a workflow similar to conventional VR, allowing for rich graphical expression and advanced shaders.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  RealityKit mode
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;By leveraging RealityKit’s optimization for spatial computing, it becomes easier to create apps that incorporate MR elements.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each approach has its pros and cons, so you should carefully consider factors such as “the direction of your app”, “the level of visual quality required”, and “the depth of interaction” early in development.&lt;/p&gt;

&lt;p&gt;Since Vision Pro is likely to see substantial updates and new SDK features, it’s crucial to keep an eye on the official documentation and release notes, and, of course, to experiment hands-on to find the best workflow for your needs.&lt;/p&gt;

</description>
      <category>visionos</category>
      <category>visionpro</category>
      <category>unity3d</category>
    </item>
  </channel>
</rss>
