<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Art Sh</title>
    <description>The latest articles on DEV Community by Art Sh (@shiaart).</description>
    <link>https://dev.to/shiaart</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F917297%2Fb9cbb397-f2a0-4fae-9513-4fcc8f9eb6d4.jpg</url>
      <title>DEV Community: Art Sh</title>
      <link>https://dev.to/shiaart</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/shiaart"/>
    <language>en</language>
    <item>
      <title>Role of Unity and Unreal Engine in VR development</title>
      <dc:creator>Art Sh</dc:creator>
      <pubDate>Sun, 18 Dec 2022 19:27:35 +0000</pubDate>
      <link>https://dev.to/shiaart/role-of-unity-and-unreal-engine-in-vr-development-e8e</link>
      <guid>https://dev.to/shiaart/role-of-unity-and-unreal-engine-in-vr-development-e8e</guid>
      <description>&lt;p&gt;&lt;strong&gt;Virtual reality (VR)&lt;/strong&gt; technology has made significant strides in recent years, and developers have a variety of tools at their disposal to create immersive VR experiences. Two of the most popular game engines for VR development are Unity and Unreal Engine.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgpcpue1pld5utuaghs0g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgpcpue1pld5utuaghs0g.png" alt="unity logo" width="800" height="327"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;&lt;a href="https://unity.com/" rel="noopener noreferrer"&gt;Unity&lt;/a&gt;&lt;/strong&gt; is a cross-platform game engine that has been widely adopted by the game development community. It is known for its flexibility and ease of use, making it a good choice for developers who are new to VR development. Unity also has a large community of developers and a robust asset store, which can be helpful for finding resources and getting support.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk6gt5vx76ov2x8z17kpn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk6gt5vx76ov2x8z17kpn.png" alt="Unreal engine logo" width="800" height="467"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;&lt;a href="https://www.unrealengine.com/" rel="noopener noreferrer"&gt;Unreal Engine&lt;/a&gt;&lt;/strong&gt;, on the other hand, is a more powerful and feature-rich game engine that has been used to develop a number of high-quality VR experiences. Unreal Engine has a robust visual scripting system and advanced rendering capabilities, which can be useful for creating more complex and visually impressive VR environments. However, Unreal Engine can also be more difficult to learn and may require a steeper learning curve for developers who are new to the engine.&lt;/p&gt;

&lt;p&gt;Both Unity and Unreal Engine offer a range of tools and features specifically designed for VR development. These include support for motion controllers, haptic feedback, and real-time rendering. They also both support a variety of VR headsets, including &lt;a href="https://www.oculus.com/experiences/quest" rel="noopener noreferrer"&gt;Oculus Quest 2/Pro&lt;/a&gt;, &lt;a href="https://www.vive.com/" rel="noopener noreferrer"&gt;HTC Vive&lt;/a&gt;, and &lt;a href="https://www.playstation.com/en-gb/ps-vr/" rel="noopener noreferrer"&gt;PlayStation VR&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Ultimately, the choice between Unity and Unreal Engine for VR development will depend on the needs and experience level of the developer. Unity is a good choice for developers who are just starting out with VR development and are looking for an easy-to-use and flexible game engine. Unreal Engine is a better choice for more experienced developers who are looking for advanced features and powerful rendering capabilities.&lt;/p&gt;

&lt;p&gt;In conclusion, both Unity and Unreal Engine are excellent choices for VR development, and developers can choose the engine that best meets their needs and experience level. Regardless of the engine they choose, developers will need to be familiar with VR development principles and techniques, such as locomotion, user interaction, and optimization for VR. By understanding these principles and using the right tools, developers can create immersive and engaging VR experiences for users.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>rust</category>
      <category>cleancode</category>
      <category>discuss</category>
    </item>
    <item>
      <title>How AR will disrupt retail and sales</title>
      <dc:creator>Art Sh</dc:creator>
      <pubDate>Fri, 02 Dec 2022 22:28:46 +0000</pubDate>
      <link>https://dev.to/shiaart/how-ar-will-disrupt-retail-and-sales-5fbk</link>
      <guid>https://dev.to/shiaart/how-ar-will-disrupt-retail-and-sales-5fbk</guid>
      <description>&lt;p&gt;There are two ways in which AR operates:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Mobile AR&lt;/strong&gt; – Arguably the most common form of AR, where customers hold their smartphone cameras in front of their surrounding environment. The mobile AR application superimposes digital images onto the camera feed, allowing digital objects to “latch onto” real-world surroundings as if following the laws of physics. Snapchat and Pokémon Go are good examples of mobile AR.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AR glasses&lt;/strong&gt; – Users can also view AR feeds in the field of view (FoV) of a pair of smart glasses. This method is more expensive than mobile AR, but supports more precise applications.&lt;/p&gt;

&lt;p&gt;The retail industry is no stranger to technological innovation and disruption. From the rise of e-commerce to the incorporation of virtual reality in store design, the industry has constantly adapted to new technologies in order to enhance the customer experience and stay ahead of the competition.&lt;/p&gt;

&lt;p&gt;One such technology that is set to revolutionize the retail sector is augmented reality (AR). AR technology allows for the overlaying of digital content onto the physical world, creating an enhanced reality that can be experienced through devices such as smartphones or specialized headsets and powered by software platforms such as &lt;a href="https://dev.to/shiaart/metas-presence-platform-introduction-4kf0"&gt;Meta’s Presence Platform&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;In the retail sector, AR technology has the potential to transform the way customers interact with products and stores. For example, AR can be used to create virtual dressing rooms where customers can try on clothes without actually having to physically change into them. This not only saves time and effort for the customer, but also reduces the need for physical inventory and increases the potential for impulse purchases.&lt;/p&gt;

&lt;p&gt;AR can also be used to create interactive displays and advertisements within stores. For example, customers can scan a product with their smartphone and see an AR demonstration of how it works, or scan a promotional poster and receive a digital coupon or special offer. This not only enhances the customer experience, but also increases engagement and drives sales.&lt;/p&gt;

&lt;p&gt;In addition to AR, the concept of the Metaverse is also set to transform the retail sector. The Metaverse refers to a collective virtual shared space, where users can interact with each other and with digital content in real time. This allows for the creation of fully immersive virtual experiences, such as virtual stores or events, where customers can shop, socialize, and engage with brands in a virtual environment.&lt;/p&gt;

&lt;p&gt;The potential for the Metaverse in the retail sector is vast. For example, brands can create virtual stores within the Metaverse where customers can browse and purchase products, just as they would in a physical store. These virtual stores can be customized and personalized to each customer’s preferences and needs, providing a more tailored and engaging shopping experience.&lt;/p&gt;

&lt;p&gt;In addition, the Metaverse allows for the creation of virtual events and experiences, such as fashion shows or product demonstrations, where customers can interact with each other and with the brand in real time. This not only enhances the customer experience, but also increases engagement and brand loyalty.&lt;/p&gt;

&lt;p&gt;In conclusion, both augmented reality and the Metaverse have the potential to transform the retail sector in significant ways. By enhancing the customer experience and increasing engagement, these technologies have the power to drive sales and move the industry forward. As such, it is important for retailers to keep an eye on them and consider incorporating them into their business strategies in order to stay competitive in an ever-evolving market.&lt;/p&gt;

</description>
      <category>ar</category>
      <category>metaverse</category>
      <category>retail</category>
      <category>discuss</category>
    </item>
    <item>
      <title>2022 Virtual Reality Startups in the UK</title>
      <dc:creator>Art Sh</dc:creator>
      <pubDate>Sun, 16 Oct 2022 22:37:54 +0000</pubDate>
      <link>https://dev.to/shiaart/features-to-expect-from-apples-upcoming-arvr-headset-5ded</link>
      <guid>https://dev.to/shiaart/features-to-expect-from-apples-upcoming-arvr-headset-5ded</guid>
      <description>&lt;p&gt;Virtual and mixed reality market continues to grow even during challenging economic times. UK as one of the biggest tech hubs has a few startups, in addition to Meta, Google and Apple - as they have thousands of employees in the UK, building their next generation Virtual, Augmented and mixed reality platforms and products.&lt;/p&gt;

&lt;h2&gt;
  
  
  Ultraleap
&lt;/h2&gt;

&lt;p&gt;Provider of spatial interaction technology solutions.&lt;/p&gt;

&lt;p&gt;Founded Year: 2013&lt;br&gt;
Location: Bristol&lt;br&gt;
Funding: 190M USD&lt;br&gt;
Major Investors: Tencent, Mayfair, IP Group&lt;/p&gt;

&lt;p&gt;Ultraleap provides spatial interaction technology solutions. It offers a haptic module that makes it possible to integrate virtual touch, and an optical hand-tracking module that can capture the movements of a user's hands with unparalleled accuracy and near-zero latency. It also offers the TouchFree application, which lets you add touchless gesture control to interactive screens. The software runs on an interactive kiosk or advertising totem, detects a user's hand in mid-air and converts it into an on-screen cursor.&lt;/p&gt;

&lt;h2&gt;
  
  
  nDreams
&lt;/h2&gt;

&lt;p&gt;VR game development&lt;/p&gt;

&lt;p&gt;Founded Year: 2006&lt;br&gt;
Location: Farnborough&lt;br&gt;
Funding: 47M USD&lt;br&gt;
Investors: Mercia, Tech Nation&lt;/p&gt;

&lt;p&gt;nDreams is a developer of indie VR games. The company develops multiple VR titles and VR-based entertainment content, and provides game development services for other companies.&lt;/p&gt;

&lt;h2&gt;
  
  
  Gravity Sketch
&lt;/h2&gt;

&lt;p&gt;Provider of a 3D sketching tool for VR applications&lt;/p&gt;

&lt;p&gt;Founded Year: 2014&lt;br&gt;
Location: London&lt;br&gt;
Funding: 40M USD&lt;br&gt;
Investors: Accel, GV, Kindred Capital&lt;/p&gt;

&lt;p&gt;Gravity Sketch provides a 3D sketching tool for VR applications. It comprises a pen and pad specifically designed for sketching in real space through the use of augmented reality glasses. The pen and pad give the physical feel of drawing in the digital world, and by adjusting the plane the user can give volume to the sketch. The augmented reality glasses also help visualize the product right in front of the user's eyes. The virtual model can be printed using a 3D printer.&lt;/p&gt;

&lt;h2&gt;
  
  
  Immerse
&lt;/h2&gt;

&lt;p&gt;Developer of a VR platform to create and share VR experiences with multiple users.&lt;/p&gt;

&lt;p&gt;Founded Year: 2005&lt;br&gt;
Location: London&lt;br&gt;
Funding: 16M USD&lt;br&gt;
Investors: Erik Blachford, Trevor Fenwick, David Tomlinson&lt;/p&gt;

&lt;p&gt;Immerse develops a VR platform to create and share VR experiences with multiple users. It offers an SDK-based platform compatible with Unity that provides users with tools, including a VR library, for creating and sharing VR/3D experiences. It also allows users to import OBJ models to interact with in VR. It is compatible with VR headsets such as Oculus and Vive, as well as Chrome browsers. The company plans to add real-time user interaction analytics to the platform in the future. The main applications are in the areas of defense, healthcare, mining, oil and gas, satellite communication, and nuclear and renewable energy.&lt;/p&gt;

&lt;h2&gt;
  
  
  FitXR
&lt;/h2&gt;

&lt;p&gt;Virtual reality based personalized fitness coaching platform.&lt;/p&gt;

&lt;p&gt;Founded Year: 2016&lt;br&gt;
Location: London&lt;br&gt;
Funding: 11M USD&lt;br&gt;
Investors: Hiro Capital, Boost VC, Maveron&lt;/p&gt;

&lt;p&gt;FitXR provides a virtual reality-based personalized fitness coaching platform. It claims to be building an ecosystem to deliver virtual reality fitness experiences that guide, motivate and entertain. The company has launched its first product, BOXVR, a fitness game described as "Guitar Hero crossed with a studio boxing workout". It is available on HTC Vive, Oculus, and Steam.&lt;/p&gt;

&lt;h2&gt;
  
  
  FundamentalVR
&lt;/h2&gt;

&lt;p&gt;Provider of a virtual reality-based medical education platform.&lt;/p&gt;

&lt;p&gt;Founded Year: 2012&lt;br&gt;
Location: London&lt;br&gt;
Funding: 30M USD&lt;br&gt;
Investors: EQT, Downing Ventures, Downing&lt;/p&gt;

&lt;p&gt;FundamentalVR provides a virtual reality-based medical education platform. Its cloud-based platform provides surgical simulation and training solutions using haptics in a virtual environment. Its proprietary haptics system uses AI technologies and allows sensing resistance, texture, substance, and pressure, enabling professionals to practice or refine their surgical skills. Its hardware-agnostic software engine claims to deliver kinesthetic force feedback haptics into a variety of handheld devices, ranging from base-station-held instruments to haptic gloves.&lt;/p&gt;

&lt;h2&gt;
  
  
  NCTech Imaging
&lt;/h2&gt;

&lt;p&gt;360 degree camera for spherical immersive images.&lt;/p&gt;

&lt;p&gt;Founded Year: 2010&lt;br&gt;
Location: Edinburgh&lt;br&gt;
Funding: 25M USD&lt;br&gt;
Investors: Archangels, Scottish Enterprise, Eg Thomson Holdings&lt;/p&gt;

&lt;p&gt;NCTech provides two products, iris360 &amp;amp; iSTAR, for 360-degree capture of the environment. It provides software to process images captured by the cameras, apply color overlays to laser scans and share the results, and also offers an SDK to integrate the cameras with third-party systems and software. Its automatic 360-degree camera is approved by Google for Street View. Originally developed for the military and police, the cameras are now used in applications such as 3D laser documentation and asset management for the engineering and heritage sectors. The company partners with 12 companies, including Google, Apple, Autodesk and Nokia. The iris360 camera, available from October 2015, was priced at $2K for pre-order.&lt;/p&gt;

&lt;p&gt;This is not a complete list of all UK-based startups, and the market keeps growing year on year. Big tech companies continue to invest billions into the metaverse, virtual reality and augmented reality - demonstrating the trend and the huge potential for startups in this area.&lt;/p&gt;

</description>
      <category>ar</category>
      <category>vr</category>
    </item>
    <item>
      <title>Engineering Leadership in Big Tech</title>
      <dc:creator>Art Sh</dc:creator>
      <pubDate>Sun, 18 Sep 2022 17:50:41 +0000</pubDate>
      <link>https://dev.to/shiaart/engineering-leadership-in-big-tech-23g4</link>
      <guid>https://dev.to/shiaart/engineering-leadership-in-big-tech-23g4</guid>
      <description>&lt;p&gt;In this article, we'll go through some of the common job titles used for senior management positions in tech companies. We created these to have transparent career paths, both for engineers already working at the company and for those applying for our open positions, so they can see how they can develop over time if they get the job.&lt;/p&gt;

&lt;p&gt;We won't go into the role of CTO in this article because we think it's pretty well understood: you lead a department either by working your way up through the management levels - sometimes over the course of an entire career - or by being the first senior engineer in a company. However, the levels in between are often a bit of a mystery, so we'll take some time to unravel them. This should give you a better idea of what to expect if you take this route, and also a better understanding of what people in your current company do in such positions.&lt;/p&gt;

&lt;p&gt;The job titles in the executive ranks of most good technology companies are pretty standard. There are Engineering Managers, Directors of Engineering, VPs of Engineering and finally CTO. However, what the role actually entails depends on the size of the company. For example, a Director of Engineering in a medium-sized company may have a few small teams reporting to them, but in the largest companies (e.g. FAANG) the same job title may mean managing a department of 250 people. So if you move to a very large company, you may have to accept a "downgrade" in job title even though the role you take on is bigger.&lt;/p&gt;

&lt;p&gt;In this article, we'll look at each job title and what it might mean in a start-up or smaller company, and then compare it to the equivalent position in some of the largest companies in our industry. This way you'll not only get an idea of what the job entails, but also where you should best start thinking about your next move. For example, would you rather take on a bigger role with more autonomy in a smaller company, or do you want to establish yourself in larger tech companies first by taking a sideways step and then working your way up? Different people have different motivations.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Three Levels
&lt;/h2&gt;

&lt;p&gt;To better classify the following job titles, there's a nice, if scary-sounding, definition from the military: the three levels of war. I'm by no means a military person, but they've certainly thought a lot about leadership issues over the years.&lt;/p&gt;

&lt;p&gt;These three levels are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Tactical: leading others to win individual battles and engagements.&lt;/li&gt;
&lt;li&gt;Operational: planning, executing, sustaining and adapting campaigns to achieve strategic objectives.&lt;/li&gt;
&lt;li&gt;Strategic: defining outcomes that form strategic goals: why and with what we want to achieve something.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Using these criteria, we can better understand what it means to move up the leadership ladder.&lt;/p&gt;

&lt;h2&gt;
  
  
  Engineering Manager
&lt;/h2&gt;

&lt;p&gt;An engineering manager (EM) usually leads a team of about 5-7 people. If you are an individual contributor, it can be difficult to get your first EM job, as companies often prefer candidates who have already managed staff.&lt;/p&gt;

&lt;p&gt;By our definition, EMs are tactical. They lead a team to deliver its part of the whole. What that part of the whole is has usually already been defined for them.&lt;/p&gt;

&lt;p&gt;One strategy for getting your first EM role is to join a fast-growing company as an IC with the aim of developing yourself and ensuring that you practise and demonstrate skills that show you are suited to a leadership position. This may include putting yourself forward to lead projects, mentoring others, influencing decisions and continually building a track record of delivering high-quality software. If you are not already working in a company that regularly has EM vacancies, it is much harder to progress as you will need to apply elsewhere.&lt;/p&gt;

&lt;p&gt;It's worth noting that you may also see the job title Senior Engineering Manager. This usually means someone who manages a team but has much more seniority and experience than an EM without the Senior prefix. Larger companies offer this level of advancement to ensure that EMs have more opportunities for career development without jumping straight to Director, as this requires more movement on the organisational chart. In addition, for EMs who enjoy technical contribution and management roles, the career path from EM to Senior EM can be very rewarding and long-lasting. They can gain influence, seniority and impact and still contribute code.&lt;/p&gt;

&lt;h2&gt;
  
  
  Director of Engineering
&lt;/h2&gt;

&lt;p&gt;The role of Director of Engineering is usually the first one where you start managing managers. I mentioned at the beginning of this article that I would explore what the role means in start-ups and in larger companies, but it is rare to see the Director of Engineering in start-ups as it is more of an artefact of middle management in medium and large companies. &lt;/p&gt;

&lt;p&gt;By our definition, directors are operational. They coordinate and execute multiple actions as part of a larger strategic goal. They usually have more control over the how, but the why has already been decided for them.&lt;/p&gt;

&lt;p&gt;The role itself varies from company to company. However, there are often some commonalities in the definition as the next major development step from EM:&lt;/p&gt;

&lt;p&gt;1) You begin to lead other managers. This means that your scope can grow to a reasonable size within the organisation. If you assume that an EM has about 7 direct reports who are ICs, then the largest team an EM could manage is that size. However, as a director manages managers, they could have 7 or more managers reporting to them, each with their own team. That's a lot more people to consider, manage and promote.&lt;/p&gt;

&lt;p&gt;2) Usually you are responsible for an operational area. This means you might have a comma after your job title, followed by a few words describing the area you are responsible for. For example, a Director of Engineering, Data Infrastructure might lead several teams that build and maintain the application's core storage infrastructure. A Director of Engineering, ABC might lead an organisation made up of all the engineering teams that develop new features for the ABC application, one of many applications across the Foobar application suite.&lt;/p&gt;

&lt;p&gt;3) You step away from driving the vehicle. While EMs typically continue to write code for their team - albeit usually less on the critical path - Directors of Engineering are typically much less involved, if at all, in hands-on coding. Instead, they focus on keeping their teams productive, coaching their managers, working on the combined technical roadmap for their area, and maximising efficiency and collaboration. If we apply Andy Grove's management equation, where a manager's output is equal to the output of their team plus the output of those they influence, it becomes clear that there are activities with greater leverage than diving into an IDE and shipping code. Deciding what to do and what not to do, connecting and sharing information with colleagues, and effectively delegating tasks through the team will almost always lead to higher output.&lt;/p&gt;

&lt;p&gt;So how does this role come about? I’ve often seen it happen in two ways.&lt;/p&gt;

&lt;p&gt;The first is that it occurs naturally through growth. As a department hires more people, EMs begin to acquire more direct reports that they can effectively manage. Teams get too big. Thus teams split, and the need for the org chart to maintain a logical grouping creates gaps for people to begin managing managers. &lt;/p&gt;

&lt;p&gt;Although this presents a great opportunity, it’s important not to make yourself redundant if you happen to be the person getting promoted into the Director role. For example, if you end up splitting an overly large team in two, promote an EM to run one of them and report to you first whilst you run the other one. This way you can gradually ease away from driving the vehicle, which gives you a longer period of time in your comfort zone of running one team whilst delegating another to a new manager who will need ramping up.&lt;/p&gt;

&lt;p&gt;The second way is that Director of Engineering roles at the biggest technology companies are an entry point for experienced external managers of managers to begin to establish themselves in bigger companies. For example, someone who has experience of being a Director (but often above) at a medium-sized company may get recruited externally in order to begin to scale a new initiative, or to provide stable engineering management for an acquired team that the larger company wants to retain and grow. &lt;/p&gt;

&lt;p&gt;It’s not uncommon to see CTOs who have run departments of around 100 join big technology companies to run a smaller team with a plan to grow rapidly. From what I have learned talking to contacts, Directors of Engineering at FAANG companies can run orgs in the hundreds of people, whereas VPs have thousands reporting up into them.&lt;/p&gt;

&lt;p&gt;All of this sounds very exciting and important, but the step upwards to Director of Engineering is where an EM must firmly commit to management and coaching being their primary, and often only, output. Trying to hang on to technical contributions causes conflicts of interest across their teams and is, most often, inefficient as per Andy Grove’s equation. However, the good news is that excellent managers of managers are rare. If you are motivated to do it and successful at it, you are extremely hirable, and you can also make a real difference in the working day of a substantial amount of people.&lt;/p&gt;

&lt;p&gt;Typically a Director of Engineering will report to the VP Engineering, or perhaps a Senior Director of Engineering. The Senior prefix works in the same way as it does for EM. It signposts tenure and experience.&lt;/p&gt;

&lt;h2&gt;
  
  
  VP Engineering
&lt;/h2&gt;

&lt;p&gt;Ah, the VP level. Here, managers usually manage managers of managers. How meta! The size of the organisation a VP is responsible for depends very much on the company they work for. Below we look at two ways in which the role of a VP of Engineering manifests itself.&lt;/p&gt;

&lt;p&gt;In our level definition, VPs are strategic. They help define why and with what we want to achieve something.&lt;/p&gt;

&lt;p&gt;Looking at VP Engineering as an evolution of Director, there are some common themes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Accountability for a specific part of the strategy. Perhaps the VP Engineering leads the platform department, which includes everything from data ingestion to classification, storage and APIs. Uptime, easy access and fast throughput are critical and are managed by dozens of teams. They may also be responsible for the organisation producing a product or set of products that make up a large part of the entire company's revenue. In any case, it involves a high level of responsibility and a strategy that's closely linked to the direction of the company.&lt;/li&gt;
&lt;li&gt;They spend time thinking about the what and why rather than the how. Directors may spend their time on how to create and maintain an important part of the application real estate, but Vice Presidents typically spend more time on what those parts should be in the first place and how they impact the company's bottom line. They're often involved in discussing corporate and departmental strategy as it affects the direction of their organisation. It's an upper management role where VPs draw on their technical knowledge to contribute to the discussion.&lt;/li&gt;
&lt;li&gt;They coach and steer many people towards the future. What should the department be working on in 3, 6, 9, 12 months? What about the possible development in the next three years? What would that look like in terms of resources and technology? How can they communicate this vision and coach their staff to take their own teams on this journey?&lt;/li&gt;
&lt;li&gt;Reporting to the CTO is worth a separate point because it can be either brilliant or frustrating. In smaller companies, the VP of Engineering may own process and delivery while reporting to a hacker-in-chief CTO, which can lead to tension. In larger companies, they may be geographically distant from their supervisor and have many competing priorities in their schedules, which makes it difficult to spend time together. The same strategy applies: you have to be independent, be able to make important decisions with minimal support, and know how to fill in the gaps where your supervisor cannot or won't spend their time.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;How to become a VP Engineering?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One way is to become the first engineering leader in a start-up company. The VP engineering is the counterbalance and complement to the CTO in the early stages. He/she will be responsible for the delivery process, performance of engineers, resources and prioritisation of projects, hiring of staff, etc. The CTO will lead the development of the product. If the start-up is successful, it is a great way to accelerate one's career, but the experience can definitely be a trial by fire. Start-ups are not easy.&lt;/p&gt;

&lt;p&gt;The other path is to establish yourself as Director of Engineering and have a proven track record of operational excellence (i.e. making the trains run on time comes naturally to you), while demonstrating your ability to develop and implement strategic direction in collaboration with your VP and colleagues. Think of cross-departmental initiatives, efficiencies through building systems for reuse, and a sense of how best to invest time, money and people to achieve results that benefit both engineering (e.g. interesting, innovative, challenging work) and the wider business (e.g. improving speed, reducing costs or opening up new products). Think again of Andy Grove's equation: ever more powerful teams, ever more powerful impact on others.&lt;/p&gt;

&lt;p&gt;If you want to work for one of the biggest technology companies in the world as VP engineering, you should know that these positions are rarely filled externally. Because of the expertise, experience and confidence required for this position at FAANG (or similar companies), you will need to start at the lower management level and then work your way up from there. I have spoken to FAANG recruiters who say that VP engineers only ever join externally by taking on the same role at other FAANG companies.&lt;/p&gt;

&lt;p&gt;Like the EM and Director roles before, you can have a VP with the Senior Prefix (SVP). You may even see Executive VP (EVP). Again, it's about length of service, experience and remit, and sometimes whether they are part of the company's leadership team.&lt;/p&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;I hope that this article has helped to demystify what's expected in the various management-level roles that sit between an individual contributor and the CTO. I'm aware that these roles can still seem very nebulous after reading it, especially if you aren't used to thinking about fairly abstract concepts like corporate or departmental strategy. And believe me, sometimes you'll wish you didn't have to!&lt;/p&gt;

&lt;p&gt;Check out career paths on &lt;a href="//progression.fyi"&gt;progression.fyi&lt;/a&gt; for more detailed descriptions of what each level means at Brandwatch, and be sure to compare them to those published for other companies - it's not the same everywhere. You can also check out &lt;a href="//levels.fyi"&gt;levels.fyi&lt;/a&gt; to see what these levels are called at much larger companies like Amazon, Google and Facebook.&lt;/p&gt;

&lt;p&gt;The journey from EM to Director to VP takes you along a path from tactical to operational to strategic. It’s not for everyone. Beyond EM you typically have to make the conscious choice to put down your IDE and spend more time on coaching, people, resourcing and, dare I say it, competing priorities and politics within an organization. But it’s not all bad. It can be incredibly rewarding seeing teams, divisions and whole departments succeed.&lt;/p&gt;

</description>
      <category>career</category>
      <category>engineering</category>
      <category>leadership</category>
    </item>
    <item>
      <title>Meta’s Presence Platform introduction</title>
      <dc:creator>Art Sh</dc:creator>
      <pubDate>Sat, 10 Sep 2022 23:36:20 +0000</pubDate>
      <link>https://dev.to/shiaart/metas-presence-platform-introduction-4kf0</link>
      <guid>https://dev.to/shiaart/metas-presence-platform-introduction-4kf0</guid>
      <description>&lt;p&gt;As we are getting closer to Connect 2022, I wanted to cover what Meta has presented almost a year ago at Connect 2021, to see what could be announced on upcoming event in 2022.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7a8Nu-T3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qtra6q1i39zjdrsdfo9a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7a8Nu-T3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qtra6q1i39zjdrsdfo9a.png" alt="Image description" width="800" height="361"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At Connect 2021, Meta unveiled the vision for the Metaverse, a more connected digital experience that allows you to move seamlessly from one place to another and spend time with people who are physically far away from you, while maintaining your unique virtual identity and digital goods from one world to the next.&lt;/p&gt;

&lt;p&gt;A vast and interoperable Metaverse cannot be built by one company alone. It will require the contributions of many companies, developers and creators - and it will take time.&lt;/p&gt;

&lt;p&gt;Some building blocks for the Metaverse already exist, but to create virtual environments that feel natural and authentic, we need to improve the way movement and space are represented in an app. During the keynote, Meta announced the &lt;strong&gt;Presence Platform&lt;/strong&gt;, a wide range of machine perception and AI capabilities - including Passthrough, Spatial Anchors and Scene Understanding - that enable you to create more realistic mixed reality, interaction and speech experiences that seamlessly integrate virtual content with the user's physical world. Meta also gave a sneak peek at the next generation of its all-in-one hardware, Project Cambria, which will launch in 2022 - an advanced device at a higher price point that will feature the latest VR technologies. With the advances in Meta's hardware and the capabilities of the Presence Platform, Meta is opening up groundbreaking possibilities for mixed reality experiences and natural interactions in VR. This brings us ever closer to the promise of the Metaverse and the potential to bring people together in new, more immersive ways in the future.&lt;/p&gt;

&lt;p&gt;With the Presence Platform, Meta aims to enable a wide range of mixed reality experiences that align with principles for responsible innovation. This starts with being responsible with the information we need to create amazing mixed reality experiences that are safe and seamless.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/QHWu8WdN9mk"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;h2&gt;
  
  
  Capabilities Overview
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Insight SDK
&lt;/h3&gt;

&lt;p&gt;Meta announced the &lt;strong&gt;Insight SDK&lt;/strong&gt;, which lets you create mixed reality experiences that give a realistic sense of presence.&lt;/p&gt;

&lt;p&gt;Earlier in 2021, Meta launched &lt;strong&gt;Passthrough API Experimental&lt;/strong&gt;, which lets you create experiences that merge virtual content with the physical world. At Connect 2021, Meta announced the general availability of Passthrough in the next release, which means you can create, test and deploy experiences with Passthrough capabilities.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/wOc7KjRiDt8"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;Meta also announced &lt;strong&gt;Spatial Anchors&lt;/strong&gt;, world-spanning frames of reference that allow you to place virtual content in a physical space that can be maintained across sessions. With Spatial Anchors Experimental, which will be available soon, you will be able to create Spatial Anchors in specific 6DoF positions, track the 6DoF position relative to the headset, maintain Spatial Anchors on the device and retrieve a list of currently tracked Spatial Anchors.&lt;/p&gt;
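
&lt;p&gt;As a rough Unity sketch of the idea - this assumes the OVRSpatialAnchor component that later shipped in the Oculus Integration SDK, not the exact experimental API described above - placing an anchor amounts to pinning a GameObject's pose to the physical world:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;using UnityEngine;

public class AnchorPlacer : MonoBehaviour
{
    // Pins a pose to the physical world so it can be tracked in 6DoF across the session.
    public void PlaceAnchor(Vector3 position, Quaternion rotation)
    {
        var anchorObject = new GameObject("SpatialAnchor");
        anchorObject.transform.SetPositionAndRotation(position, rotation);
        anchorObject.AddComponent(typeof(OVRSpatialAnchor));  // the anchor is created asynchronously
    }
}
&lt;/code&gt;&lt;/pre&gt;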

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/0Fo0hYSXcUw"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;The new &lt;strong&gt;Scene Understanding&lt;/strong&gt; feature was also a major announcement at Connect 2021. Together with Passthrough and Spatial Anchors, Scene Understanding enables the rapid creation of complex, scene-based experiences with rich interactions with the user's environment. As part of Scene Understanding, Scene Model provides a geometric and semantic representation of the user's space, allowing you to create spatially rich mixed reality experiences. Scene Model is a single, comprehensive, up-to-date representation of the physical world that's indexable and queryable. For example, you can attach a virtual screen to the user's wall or have a virtual figure navigate the floor with realistic occlusion. You can also include real, physical objects in VR. To create this scene model, Meta offers a system-driven Scene Capture flow that allows the user to walk through and capture the scene. Meta plans to make Scene Understanding capabilities available as an experimental feature early next year.&lt;/p&gt;

&lt;p&gt;With the new Passthrough, Spatial Anchors and Scene Understanding features in the Insight SDK, you can create mixed reality experiences that merge virtual content with the physical world, creating new opportunities for social connection, entertainment, productivity and more.&lt;/p&gt;

&lt;h3&gt;
  
  
  Interaction SDK
&lt;/h3&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/6Ybt1L8XVhU"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;With the &lt;strong&gt;Interaction SDK&lt;/strong&gt;, Meta makes it easier for you to integrate hands and controller-centric interactions. The Unity library, which became available early in 2022, includes a set of ready-to-use, robust interaction components such as Grab, Poke, Target and Select. All components can be used together or independently, or even integrated with other interaction frameworks. The Interaction SDK solves many of the difficult interaction challenges associated with computer vision-based hand tracking, providing standardized interaction patterns and preventing regressions as the technology evolves. Last but not least, it provides tools to help you develop &lt;strong&gt;your own gestures&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--S7nVTGqb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2cp31hl3mfho23plngzs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--S7nVTGqb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2cp31hl3mfho23plngzs.png" alt="Image description" width="512" height="288"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The privacy approach Meta has always taken for hand tracking also applies here: the images and estimated points specific to your hands are deleted after processing and aren't stored on Meta's servers.&lt;/p&gt;

&lt;h3&gt;
  
  
  Voice SDK
&lt;/h3&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/6hoCwFXk5v0"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Voice SDK Experimental&lt;/strong&gt; was also announced at Connect 2021 and will be available in the next release so you can start creating and experimenting. &lt;strong&gt;Voice SDK&lt;/strong&gt; is a set of natural language features that lets you create hands-free navigation and new voice-controlled games. With Voice SDK, you can create voice navigation and search, or enable Voice FAQ to allow users to ask for help or a reminder. New voice-controlled gameplay is also enabled, such as winning a battle with a voice-controlled spell, or speaking to a character or avatar. &lt;a href="https://developer.oculus.com/documentation/unity/voice-sdk-overview/"&gt;The Voice SDK&lt;/a&gt; is powered by Meta's Wit.ai natural language platform and is free to sign up for and get started with.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/6O3teZL8UKw"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;h3&gt;
  
  
  Imagine What’s Possible
&lt;/h3&gt;

&lt;p&gt;With everything listed above announced at Connect 2021 and improved further during 2022, it is clear that Meta is getting ready for its 2022 product releases, developer tools and SDKs. If Project Cambria goes live in 2022, it should enable a new set of capabilities and use cases.&lt;br&gt;
Stay tuned for updates and features coverage from &lt;a href="https://www.metaconnect.com/"&gt;Connect 2022&lt;/a&gt;!&lt;/p&gt;

</description>
      <category>vr</category>
      <category>ar</category>
      <category>metaverse</category>
    </item>
    <item>
      <title>How real is Apple's realityOS?</title>
      <dc:creator>Art Sh</dc:creator>
      <pubDate>Wed, 07 Sep 2022 21:46:03 +0000</pubDate>
      <link>https://dev.to/shiaart/how-real-is-apples-realityos-399o</link>
      <guid>https://dev.to/shiaart/how-real-is-apples-realityos-399o</guid>
      <description>&lt;p&gt;We saw a lot of realityOS leaks during WWDC 2022 - but we don’t know it yet. All enhancements seem solid build-out toward a conceptual MR environment. Here are a few examples.&lt;/p&gt;

&lt;h3&gt;
  
  
  Leaks
&lt;/h3&gt;

&lt;p&gt;As reported by &lt;a href="https://www.techspot.com/news/93333-references-realityos-found-apple-source-code.html"&gt;Techspot&lt;/a&gt; and &lt;a href="https://www.macrumors.com/2022/02/09/apple-realityos-ar-vr-headset-reference/"&gt;Macrumors&lt;/a&gt;, there are a number of references to realityOS in Apple's source code. This gives the community a clear signal that both the OS and an MR device are coming to market - only the release date is unknown.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;“Apple's upcoming mixed reality headset will be driven by a custom operating system known as realityOS, developers have discovered. The naming scheme makes sense, at least on paper, as Apple has given similar names to its other operating systems including iPadOS and watchOS.”&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The source code in question is still available on &lt;a href="https://github.com/apple-oss-distributions/dyld/blob/5c9192436bb195e7a8fe61f22a229ee3d30d8222/common/MachOFile.cpp#L578"&gt;GitHub&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dhR4VvxG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q1pctyr79lneiv7j4kie.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dhR4VvxG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q1pctyr79lneiv7j4kie.png" alt="Image description" width="800" height="472"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Features
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Stage Manager&lt;/strong&gt; jumps out from the screenshot below as something similar to what Meta has in its Oculus ecosystem. The bold, floating windows and the side-docked window design seem perfect for peripheral vision, and since there was no pressing need for a new window-management system in macOS, this could be preparation for future products.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--uqooZtiP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3ptkz1zy62d2vgfh8trp.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--uqooZtiP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3ptkz1zy62d2vgfh8trp.jpeg" alt="Image description" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Generally, &lt;strong&gt;UI Elements&lt;/strong&gt; - e.g. bold text, overlays, key features to make content visible in a noisy mixed reality overlay.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The &lt;strong&gt;Lock Screen&lt;/strong&gt; is revised from time to time, and this year is no exception. There is a lot of work around occlusion, fonts and design that lends itself to RealityOS. Apple's Newsroom post says: “Notifications have been redesigned to roll up from the bottom, ensuring that users have a clear view of their personalized Lock Screen” A clear view of what? - Your MR headset Viewport.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The &lt;strong&gt;App Intents&lt;/strong&gt; session recommended that developers accommodate audio-only environments in specific ways. The new live designs for intents also seem very relevant.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The &lt;strong&gt;CarPlay dashboard&lt;/strong&gt; looks like a proof of concept for the UI they're planning. Apple is known to experiment on one platform to prove concepts for other devices.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Live Text&lt;/strong&gt; looks like a perfect feature for an MR headset, and live translation is an obvious capability you would add to a revolutionary headset device.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SIAmFph---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6hagwhwpiq8dagaqv00u.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SIAmFph---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6hagwhwpiq8dagaqv00u.jpg" alt="Image description" width="653" height="653"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Frameworks
&lt;/h3&gt;

&lt;p&gt;Apple has unveiled some new framework features that seem to form the basis of the OS. Taken out of context, they might raise the question of why these investments are worthwhile; viewed from an AR perspective, however, the reasoning becomes clear. Here are some examples:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Charts and calendar UI&lt;/strong&gt; - third-party engineers have been developing open-source libraries for these since iOS 4. There must be a reason Apple has developed its own frameworks now, probably to fulfil an internal product or platform need.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;WeatherKit&lt;/strong&gt; has a nice benefit around consistency: if it's raining, you can be sure that every app and the system itself know it's raining and can go into a rain mode.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;LiDAR&lt;/strong&gt; has been touted as a new feature on iPad and iPhone for the past few releases. You know how Oculus asks users to onboard by defining a virtual safe zone to avoid bumping into things? LiDAR is a major player in Apple's RoomPlan functionality, and potentially a key capability for mixed reality as well.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  One more thing
&lt;/h3&gt;

&lt;p&gt;Look at how Apple is adding more mixed reality foundational features to its products - for example, the Apple Watch AssistiveTouch feature. Have a look and imagine how this could enable an input system for a mixed reality headset.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/8ZqNMRUSlHc"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

</description>
      <category>apple</category>
      <category>ar</category>
      <category>macos</category>
    </item>
    <item>
      <title>Unity AR capabilities overview</title>
      <dc:creator>Art Sh</dc:creator>
      <pubDate>Mon, 05 Sep 2022 22:55:46 +0000</pubDate>
      <link>https://dev.to/shiaart/unity-ar-capabilities-overview-3njm</link>
      <guid>https://dev.to/shiaart/unity-ar-capabilities-overview-3njm</guid>
      <description>&lt;p&gt;To start developing AR, Unity recommends using AR Foundation to build your application for Unity supported handhelds AR and portable AR devices.&lt;/p&gt;

&lt;p&gt;AR Foundation enables cross-platform work with augmented reality platforms in Unity. This package provides an interface for Unity developers, but does not implement AR functions itself.&lt;/p&gt;

&lt;p&gt;To use AR Foundation on a device, you must also download and install packages for each of the target platforms supported by Unity:&lt;br&gt;
For Android: &lt;a href="https://docs.unity3d.com/Packages/com.unity.xr.arcore@3.1" rel="noopener noreferrer"&gt;ARCore XR Plug-in&lt;/a&gt; &lt;br&gt;
For iOS: &lt;a href="https://docs.unity3d.com/Packages/com.unity.xr.arkit@3.1" rel="noopener noreferrer"&gt;ARKit XR Plug-in&lt;/a&gt;&lt;br&gt;
For Magic Leap: &lt;a href="https://docs.unity3d.com/Packages/com.unity.xr.magicleap@5.1" rel="noopener noreferrer"&gt;Magic Leap XR Plug-in &lt;/a&gt;&lt;br&gt;
For HoloLens: &lt;a href="https://docs.unity3d.com/Packages/com.unity.xr.windowsmr@3.4" rel="noopener noreferrer"&gt;Windows XR Plug-in&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwbmgu7h00lq0dhvgq5cx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwbmgu7h00lq0dhvgq5cx.png" alt="Image description"&gt;&lt;/a&gt;&lt;br&gt;
            &lt;em&gt;(Source:&lt;a href="https://unity.com" rel="noopener noreferrer"&gt;https://unity.com&lt;/a&gt;)&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;With Apple expected to come to market with a mixed reality headset, Unity will probably extend its supported platforms quickly to win over that developer audience and broaden the capabilities offered on the Unity platform.&lt;/p&gt;

&lt;p&gt;For instructions on how to configure your project using the XR Plug-in Management system, see the &lt;a href="https://docs.unity3d.com/Manual/configuring-project-for-xr.html" rel="noopener noreferrer"&gt;Configuring your Unity Project&lt;/a&gt; for XR page.&lt;/p&gt;
&lt;h2&gt;
  
  
  AR platform support
&lt;/h2&gt;

&lt;p&gt;AR Foundation doesn't implement AR functions from scratch, but instead defines a cross-platform API that allows developers to work with functions common to multiple platforms.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhmoj4hjx05zy9osh815o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhmoj4hjx05zy9osh815o.png" alt="Image description"&gt;&lt;/a&gt;&lt;br&gt;
            &lt;em&gt;(Source:&lt;a href="https://unity.com" rel="noopener noreferrer"&gt;https://unity.com&lt;/a&gt;)&lt;/em&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  AR Foundation capabilities
&lt;/h2&gt;

&lt;p&gt;Here is a great overview of AR capabilities and a tutorial to get started with AR in Unity.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/gpaq5bAjya8"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;AR Foundation supports the following capabilities within the platform:&lt;/p&gt;

&lt;h3&gt;
  
  
  Raycast
&lt;/h3&gt;

&lt;p&gt;Raycasting is commonly used to determine where virtual content will appear: a ray (defined by an origin and a direction) is tested for intersection with real-world features detected and/or tracked by the AR device. Unity has built-in functions that let you use raycasting in your AR app, as in the sketch below.&lt;/p&gt;
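
&lt;p&gt;As a minimal illustrative sketch (not official sample code), assuming an ARRaycastManager is present in the scene and a placeholder contentPrefab is assigned, a screen tap could be raycast against detected planes roughly like this:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class TapToPlace : MonoBehaviour
{
    public ARRaycastManager raycastManager;  // assign the scene's raycast manager
    public GameObject contentPrefab;         // hypothetical prefab to place
    static readonly List&amp;lt;ARRaycastHit&amp;gt; hits = new List&amp;lt;ARRaycastHit&amp;gt;();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Vector2 touch = Input.GetTouch(0).position;

        // Cast a ray from the screen point against detected planes.
        if (raycastManager.Raycast(touch, hits, TrackableType.PlaneWithinPolygon))
        {
            // The closest hit comes first; place content at its pose.
            Pose pose = hits[0].pose;
            Instantiate(contentPrefab, pose.position, pose.rotation);
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;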

&lt;h3&gt;
  
  
  Plane detection
&lt;/h3&gt;

&lt;p&gt;Detect the size and location of horizontal and vertical surfaces (e.g. coffee table, walls). These surfaces are called “planes”.&lt;/p&gt;
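
&lt;p&gt;As a rough sketch (assuming an ARPlaneManager component is set up in the scene; the PlaneLogger name is just for illustration), an app can react to newly detected planes by subscribing to the manager's change event:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaneLogger : MonoBehaviour
{
    public ARPlaneManager planeManager;  // assign the scene's plane manager

    void OnEnable()  { planeManager.planesChanged += OnPlanesChanged; }
    void OnDisable() { planeManager.planesChanged -= OnPlanesChanged; }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // Newly detected horizontal or vertical surfaces.
        foreach (ARPlane plane in args.added)
            Debug.Log($"New {plane.alignment} plane, size {plane.size}");
    }
}
&lt;/code&gt;&lt;/pre&gt;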

&lt;h3&gt;
  
  
  Reference points
&lt;/h3&gt;

&lt;p&gt;Track the positions of planes and feature points over time.&lt;/p&gt;

&lt;h3&gt;
  
  
  Participant tracking
&lt;/h3&gt;

&lt;p&gt;Track the position and orientation of other devices in a shared AR session.&lt;/p&gt;

&lt;h3&gt;
  
  
  Gestures
&lt;/h3&gt;

&lt;p&gt;Recognize gestures as input events based on human hands.&lt;/p&gt;

&lt;h3&gt;
  
  
  Point cloud detection
&lt;/h3&gt;

&lt;p&gt;Detect visually distinct features in the captured camera image and use these points to understand where the device is relative to the world around it.&lt;/p&gt;

&lt;h3&gt;
  
  
  Face tracking
&lt;/h3&gt;

&lt;p&gt;Access face landmarks, a mesh representation of detected faces, and blend shape information, which can feed into a facial animation rig. The Face Manager configures devices for face tracking and creates GameObjects for each detected face.&lt;/p&gt;

&lt;h3&gt;
  
  
  2D and 3D body tracking
&lt;/h3&gt;

&lt;p&gt;Provides 2D (screen-space) or 3D (world-space) representations of humans recognized in the camera frame. For 2D detection, humans are represented by a hierarchy of seventeen joints with screen-space coordinates. For 3D detection, humans are represented by a hierarchy of ninety-three joints with world-space transforms.&lt;/p&gt;

&lt;h3&gt;
  
  
  2D image tracking
&lt;/h3&gt;

&lt;p&gt;Detect specific 2D images in the environment. The Tracked Image Manager automatically creates GameObjects that represent all recognized images. You can change an AR application based on the presence of specific images.&lt;/p&gt;
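
&lt;p&gt;A small illustrative sketch, assuming an ARTrackedImageManager with a reference image library is already configured in the scene (the ImageReactor name is hypothetical):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ImageReactor : MonoBehaviour
{
    public ARTrackedImageManager imageManager;  // assign the scene's tracked image manager

    void OnEnable()  { imageManager.trackedImagesChanged += OnChanged; }
    void OnDisable() { imageManager.trackedImagesChanged -= OnChanged; }

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage image in args.added)
        {
            // React to the specific reference image that was recognized.
            Debug.Log($"Detected reference image: {image.referenceImage.name}");
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;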

&lt;h3&gt;
  
  
  3D object tracking
&lt;/h3&gt;

&lt;p&gt;Import digital representations of real-world objects into your Unity application and detect them in the environment. The Tracked Object Manager creates GameObjects for each detected physical object to enable applications to change based on the presence of specific real-world objects.&lt;/p&gt;

&lt;h3&gt;
  
  
  Environment probes
&lt;/h3&gt;

&lt;p&gt;Detect lighting and color information in specific areas of the environment, which helps enable 3D content to blend seamlessly with the surroundings. The Environment Probe Manager uses this information to automatically create cubemaps in Unity.&lt;/p&gt;

&lt;h3&gt;
  
  
  Meshing
&lt;/h3&gt;

&lt;p&gt;Generate triangle meshes that correspond to the physical space, expanding the ability to interact with representations of the physical environment and/or visually overlay the details on it.&lt;/p&gt;

&lt;h3&gt;
  
  
  Human segmentation
&lt;/h3&gt;

&lt;p&gt;The Human Body Subsystem provides apps with human stencil and depth segmentation images. The stencil segmentation image identifies, for each pixel, whether the pixel contains a person. The depth segmentation image consists of an estimated distance from the device for each pixel that correlates to a recognized human. Using these segmentation images together allows for rendered 3D content to be realistically occluded by real-world humans.&lt;/p&gt;
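
&lt;p&gt;As a hedged sketch (assuming an AROcclusionManager configured for human segmentation on the AR camera; the material and shader property names below are hypothetical), the per-frame stencil and depth textures can be read like this:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class SegmentationReader : MonoBehaviour
{
    public AROcclusionManager occlusionManager;  // assign the AR camera's occlusion manager
    public Material occlusionMaterial;           // hypothetical material that consumes the textures

    void Update()
    {
        // Per-pixel "is this a person?" mask and estimated per-pixel distance.
        Texture2D stencil = occlusionManager.humanStencilTexture;
        Texture2D depth = occlusionManager.humanDepthTexture;
        if (stencil != null)
        {
            occlusionMaterial.SetTexture("_HumanStencil", stencil);
            occlusionMaterial.SetTexture("_HumanDepth", depth);
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;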

&lt;h3&gt;
  
  
  Occlusion
&lt;/h3&gt;

&lt;p&gt;Apply depth information from objects in the physical world to rendered 3D content, so that virtual objects are realistically occluded by real-world geometry and blend convincingly with it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Code and how to get started
&lt;/h2&gt;

&lt;p&gt;If you are looking to get started with AR on the Unity platform and want to dive straight into code, here is a great GitHub repo with a detailed overview and some well-structured code samples:&lt;br&gt;
&lt;a href="https://github.com/Unity-Technologies/arfoundation-samples" rel="noopener noreferrer"&gt;https://github.com/Unity-Technologies/arfoundation-samples&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ar</category>
      <category>unity3d</category>
      <category>arcore</category>
      <category>discuss</category>
    </item>
    <item>
      <title>Apple is getting ready for Mixed Reality development through iOS 16 APIs</title>
      <dc:creator>Art Sh</dc:creator>
      <pubDate>Wed, 31 Aug 2022 21:22:38 +0000</pubDate>
      <link>https://dev.to/shiaart/apple-is-getting-ready-for-mixed-reality-development-through-ios-16-apis-2g6f</link>
      <guid>https://dev.to/shiaart/apple-is-getting-ready-for-mixed-reality-development-through-ios-16-apis-2g6f</guid>
      <description>&lt;p&gt;If you’re following recent Apple announcements probably you’ve noticed the fact that Apple is just landing foundation for future mixed reality development. Without saying a word during WWDC keynotes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qOB8dP0E--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fmake59zcfijmolflj26.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qOB8dP0E--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fmake59zcfijmolflj26.png" alt="Image description" width="800" height="532"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;(Source: Bloomberg)&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Before Apple’s WWDC 2022 keynote kicked off, the media and the tech world were speculating about whether Apple would mention, or even announce, a mixed reality headset.&lt;/p&gt;

&lt;p&gt;At the very least, a toolkit similar to the DTK for M1 development was expected, given the many leaks and rumors circulating in the media, but WWDC wrapped up without even mentioning Apple’s revolutionary project.&lt;br&gt;
What is more interesting? The SceneKit and RealityKit frameworks barely got any updates this year. Instead, we got M2-powered Macs, Stage Manager in iPadOS and a revamped iOS.&lt;/p&gt;

&lt;p&gt;The release of Apple’s reality headset was initially planned for 2020, was then moved to 2023, and could now slip even further, to 2024.&lt;/p&gt;

&lt;p&gt;Apple is the kind of company that invests significantly in its products and ecosystem before revealing any product details, and it makes sense: deep integration into its ecosystem and winning over AR developers now will enable them to build for the metaverse later.&lt;/p&gt;

&lt;p&gt;Despite no news of realityOS, the iPhone maker has been making significant enhancements to its APIs and frameworks to prepare developers for mixed reality.&lt;/p&gt;

&lt;p&gt;Let’s cover some of the APIs announced during WWDC 2022. Some of these are well known and received a lot of limelight during the keynotes; however, from an AR/VR development perspective, the role of these APIs wasn’t that obvious to the public.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Live Text API and PDFKit for Scanning Text from Video&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Apple introduced the Live Text feature to extract text from images in iOS 15. They took the feature to the next level in iOS 16 by releasing a Live Text API to easily grab text from images and video frames. Released as part of the VisionKit framework, the DataScannerViewController class lets you configure various parameters for scanning. Under the hood, the Live Text API uses VNRecognizeTextRequest to detect text.&lt;/p&gt;

&lt;p&gt;At first glance, the Live Text API seems a lot like Google Lens. However, think of the possibilities it will bring when Apple’s future headset is in front of your eyes. For starters, imagine turning your head to quickly extract information with your eyes. Yup, head tracking was already possible in iOS 15 through AirPods spatial awareness, which leverages CMHeadphoneMotionManager. Now throw iOS 16’s new personalized spatial audio into the mix and I can already see VR mechanics unfolding.&lt;/p&gt;

&lt;p&gt;Two enhancements in the PDFKit framework — the ability to parse text fields and convert document pages into images — will matter a lot in building a rich AR experience.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DbBw9hEN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2gf8x8ixbzn4r79bwyu4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DbBw9hEN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2gf8x8ixbzn4r79bwyu4.png" alt="Image description" width="800" height="411"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;(Source: WWDC 2022 video)&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;To ensure Apple’s mixed-reality device isn’t just a fancy gadget on your face, providing a toolset to interact with text, images and graphics is important.&lt;/p&gt;

&lt;p&gt;With the introduction of two powerful image recognition functions, Apple is on the right path. A path that’ll lead to AR/VR apps with rich interactive interfaces connected to the real world.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Speech Recognition and Dictation&lt;/strong&gt;&lt;br&gt;
Let’s put text and images aside for a moment: iOS 16 has also rebuilt the Dictation feature, letting users switch seamlessly between voice and touch.&lt;/p&gt;

&lt;p&gt;So, you could be walking down a hall and want to quickly edit a text message on your phone. In iOS 16, you can simply use your voice to modify a piece of text.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4j6T6kB9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dgon6x0g6kpn8ic6ihzy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4j6T6kB9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dgon6x0g6kpn8ic6ihzy.png" alt="Image description" width="800" height="457"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;(Source: WWDC 2022 video)&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;More? The Speech framework got a small enhancement: the ability to toggle automatic punctuation in SFSpeechRecognitionRequest through addsPunctuation. I’m optimistic this will give rise to rich communication apps, as it has already found its way into live captions in FaceTime calls.&lt;/p&gt;

&lt;p&gt;From a mixed reality perspective, these are fantastic features. Using voice to enter text would minimise our dependency on keyboard-based input in the virtual world. Apple is also making it easy to integrate Siri into our apps using the new App Intents framework.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;RoomPlan API and the Background Assets Framework&lt;/strong&gt;&lt;br&gt;
The Background Assets framework is another tool that hasn’t gotten a lot of attention yet. It was introduced to handle downloads of large files across different app states, but I think the possibilities extend beyond this utility.&lt;/p&gt;

&lt;p&gt;By downloading 3D assets from the cloud, we can quickly build and ship augmented reality apps with much smaller asset sizes, which could be critical on a headset.&lt;/p&gt;

&lt;p&gt;Similarly, the RealityKit framework didn’t get any significant changes, but Apple quietly unveiled a new RoomPlan API. Powered by ARKit 6, the Swift-only API provides out-of-the-box support for scanning rooms and building 3D models from those scans.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--sgvy7JvC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i94jlaox0tb2a070670n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--sgvy7JvC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i94jlaox0tb2a070670n.png" alt="Image description" width="800" height="547"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;(Source: &lt;a href="https://developer.apple.com"&gt;https://developer.apple.com&lt;/a&gt;)&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Now, you can deem the RoomPlan API an extension of the Object Capture API. But think of it in AR/VR terms: considering that Apple’s mixed reality headset is expected to have LiDAR sensors and multiple cameras, RoomPlan is going to be a game-changer for developers. Expect a lot of AR apps that let you reconstruct and restyle houses.&lt;/p&gt;

&lt;p&gt;While those were the major APIs that fit mixed reality use cases, Spatial is another new framework that enables working with 3D math primitives. It might prove its mettle in dealing with graphics in virtual space.&lt;/p&gt;

&lt;p&gt;Finally, Apple didn’t mention a single word about its virtual reality headset plans, but the new APIs released this year will play a crucial role in putting all the pieces in place for metaverse development. I think these APIs demonstrate a trend of Apple shifting towards preparing its ecosystem for the future device.&lt;/p&gt;

&lt;p&gt;It’s critical to prepare developers to build apps for the new ecosystem today. After all, for a product to get widespread adoption, there needs to be a mature ecosystem of apps and tools — which requires getting developers on board.&lt;/p&gt;

</description>
      <category>ar</category>
      <category>apple</category>
      <category>ios</category>
      <category>discuss</category>
    </item>
  </channel>
</rss>
