<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Chrystian Vieyra</title>
    <description>The latest articles on DEV Community by Chrystian Vieyra (@chrystianv1).</description>
    <link>https://dev.to/chrystianv1</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F125216%2F7206d09a-81f2-4e8c-9825-0c1b4ce01521.png</url>
      <title>DEV Community: Chrystian Vieyra</title>
      <link>https://dev.to/chrystianv1</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/chrystianv1"/>
    <language>en</language>
    <item>
      <title>ARCore and ARKit Feature Sets Compared to Hamsters and DaVinci: How they see the world</title>
      <dc:creator>Chrystian Vieyra</dc:creator>
      <pubDate>Thu, 10 Jan 2019 17:12:29 +0000</pubDate>
      <link>https://dev.to/chrystianv1/arcore-and-arkit-feature-sets-compared-to-hamsters-and-davinci-how-they-see-the-world-4n6d</link>
      <guid>https://dev.to/chrystianv1/arcore-and-arkit-feature-sets-compared-to-hamsters-and-davinci-how-they-see-the-world-4n6d</guid>
      <description>&lt;p&gt;In the previous entry of this blog series, we looked into the market share of each AR platform, device compatibility, and perceived developer interest. In this article we will look into the main feature set that ARCore and ARKit provide. Overall, both platforms offer the same feature set, but there are differences in how they categorize and name these features.&lt;/p&gt;

&lt;p&gt;ARCore identifies three main features: (1) motion tracking, (2) environmental understanding, and (3) light estimation. In contrast, ARKit refers to (1) tracking and (2) scene understanding, but the underlying technical aspects of each AR platform are essentially the same. Because Google and Apple provide fairly limited information to developers about how these features work, we provide some examples below.&lt;/p&gt;

&lt;h4&gt;Tracking Features&lt;/h4&gt;

&lt;p&gt;As the name of this feature suggests, smartphone AR technology tracks the position of the mobile device in space and builds a map of its environment through visual and inertial inputs. Google ARCore refers to this as &lt;a href="https://developers.google.com/ar/discover/concepts#motion_tracking"&gt;motion tracking&lt;/a&gt;, while Apple ARKit refers to this simply as &lt;a href="https://developer.apple.com/documentation/arkit/understanding_world_tracking_in_arkit"&gt;tracking&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;ARCore and ARKit use the camera and a visual algorithm to identify feature points in the environment, such as sharp edges denoted by color and brightness contrast, shadows, and straight lines. Similar to a facial recognition system that can identify eyes, nose, a mouth, and the outline of a face based upon an understanding of how these objects are situated with respect to one another, the camera can get some sense of where two walls join together to make a corner of a room, or where a table edge is located.&lt;/p&gt;

&lt;p&gt;Simply seeing a picture of a room is not enough to give it its three-dimensional quality, however. Unlike &lt;a href="https://medium.com/@DAQRI/depth-cameras-for-mobile-ar-from-iphones-to-wearables-and-beyond-ea29758ec280"&gt;depth-sensing cameras&lt;/a&gt; available on some dedicated AR products (more on this later), the vast majority of smartphones use a simple camera that can be likened to the eye of a hamster. As is the case with most prey animals that need to be wary of their surroundings, hamsters do not have &lt;a href="http://www.open.edu/openlearn/nature-environment/natural-history/studying-mammals-the-social-climbers/content-section-3.4"&gt;stereoscopic vision&lt;/a&gt; that gives them a sense of depth — each eye has an independent field of vision that ultimately allows them to have a wider field of view.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1TuJeVcE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/7n0bcb93k1175xy12hwf.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1TuJeVcE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/7n0bcb93k1175xy12hwf.jpg" alt="alt text for accessibility"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Using visuals alone, hamsters cannot immediately tell if the hungry predator is near or far, giving something of a flat, cartoonish effect to their perception of the world. However, hamsters aren’t totally incapacitated when it comes to understanding depth. Occasionally, you might notice your hamster pause, stand on its hind legs, and move its head from side to side. By doing this, the hamster slightly adjusts its view of the world and uses parallax cues to get a sense of what is near or far. When viewed from slightly different vantage points, objects that are nearer seem to shift their position with respect to a more static background.&lt;/p&gt;
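&lt;p&gt;The hamster’s parallax trick reduces to a simple pinhole-camera relation: depth is inversely proportional to how far a feature appears to shift between two vantage points. The following sketch is purely illustrative (made-up baseline and focal-length values, not ARCore or ARKit code):&lt;/p&gt;

```python
# Toy illustration of depth from parallax, as a hamster recovers it by
# swaying its head. A pinhole-camera model is assumed; values are made up.

def depth_from_parallax(baseline_m, focal_px, disparity_px):
    """Nearer objects shift more between vantage points (larger
    disparity in pixels), so depth is inversely proportional to it."""
    return baseline_m * focal_px / disparity_px

# A 5 cm head sway and a 600 px focal length:
near = depth_from_parallax(0.05, 600.0, 60.0)  # big shift: nearby object
far = depth_from_parallax(0.05, 600.0, 3.0)    # tiny shift: distant object
print(near, far)  # 0.5 m vs 10.0 m
```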

&lt;p&gt;Even with our — mostly — binocular, stereoscopic vision, humans can get a sense of how parallax adds to the perception of a simple camera. We say “mostly,” because a small but significant fraction of people have &lt;a href="https://nei.nih.gov/health/amblyopia"&gt;amblyopia&lt;/a&gt;. And, even this deviation from normal vision function doesn’t always present a handicap. Many artists, &lt;a href="https://www.washingtonpost.com/news/morning-mix/wp/2018/10/19/leonardo-da-vincis-genius-may-be-rooted-in-a-common-eye-disorder-new-study-says/?utm_term=.dd0adea0245f"&gt;including Leonardo DaVinci&lt;/a&gt;, likely owe some of their visual genius to amblyopia because of their increased sensitivity to other visual cues like hue and intensity of color.&lt;/p&gt;

&lt;p&gt;To illustrate how movement-induced parallax can help a simple camera get a sense of depth, recall the classic school activity in which you hold out your index finger or a pencil vertically, view it through one eye and then the other, and notice how your finger “jumps” with respect to the background. Rather than moving our heads about like a hamster, or moving a smartphone camera about a room, this activity gives us two vantage points simply by ignoring input from one eye and then the other.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tiH7GXkk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/btxo9z49ubf5q5me12of.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tiH7GXkk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/btxo9z49ubf5q5me12of.png" alt="alt text for accessibility"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Image source: &lt;a href="https://pumas.nasa.gov/files/04_28_05_1.pdf"&gt;NASA&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Just as we can correlate small head movements with the visual changes due to parallax, the smartphone combines inertial sensor data with the visual features. Essentially, the smartphone notices how much it has been shifted in space using its accelerometer and how much it has been rotated using its gyroscope, and correlates that motion with how much the visual features have shifted with respect to the background. (Curious to know how an accelerometer and gyroscope work? Check out our &lt;a href="https://www.vieyrasoftware.net/sensors-sensor-modes"&gt;Sensor and Generator Info&lt;/a&gt; page to learn how each major sensor functions, or one of our &lt;a href="https://www.vieyrasoftware.net/single-post/2016/10/23/Inside-Mobile-Sensors-MEMS-Technology"&gt;previous blog&lt;/a&gt; posts that explains the foundational physics behind micro-electro-mechanical technology.)&lt;/p&gt;
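&lt;p&gt;One classic, much-simplified way to blend two such sensors is a complementary filter, which trusts the gyroscope over short intervals and an absolute reference (here, the accelerometer’s gravity reading) over long ones. The sketch below is our own toy illustration with invented values, not the fusion either platform actually ships:&lt;/p&gt;

```python
# Toy complementary filter (not the platforms' actual fusion code):
# the gyroscope gives smooth but drifting angle estimates, while the
# accelerometer gives noisy but drift-free ones. Blending the two keeps
# the estimate both smooth and anchored. The 0.98/0.02 split is an
# illustrative choice.

def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    angle = accel_angles[0]
    history = []
    for rate, accel_angle in zip(gyro_rates, accel_angles):
        # Integrate the gyro for short-term accuracy, then nudge the
        # estimate toward the accelerometer's absolute reference.
        angle = alpha * (angle + rate * dt) + (1.0 - alpha) * accel_angle
        history.append(angle)
    return history

# A device held still at 10 degrees: a slightly biased gyro reports a
# steady 0.5 deg/s of phantom rotation, but the accelerometer keeps
# pulling the estimate back toward the true 10 degrees.
est = complementary_filter([0.5] * 200, [10.0] * 200, dt=0.01)
```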

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qwsyoP5_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/b9j14lgc56k5rl9o61ez.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qwsyoP5_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/b9j14lgc56k5rl9o61ez.png" alt="alt text for accessibility"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Image source: &lt;a href="https://developers.google.com/ar/images/MotionTracking.jpg"&gt;Google Developers&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This technique of combining motion and visual information to navigate the world is known as &lt;a href="https://en.wikipedia.org/wiki/Visual_odometry"&gt;visual odometry&lt;/a&gt;, and is a step up from &lt;a href="https://en.wikipedia.org/wiki/Dead_reckoning"&gt;dead reckoning&lt;/a&gt;, which estimates changes in position blindly. Using an accelerometer to determine changes in position involves double-integrating the data, and the cumulative positional errors can result in drift.&lt;/p&gt;
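&lt;p&gt;The effect of double integration on even a tiny sensor error is easy to demonstrate. In this hypothetical sketch, a constant bias b in acceleration accumulates into a positional error that grows as 0.5 * b * t^2:&lt;/p&gt;

```python
# Minimal sketch of why double-integrating accelerometer data drifts.
# Values are illustrative, not from either SDK.

def integrate_position(accel_samples, dt):
    velocity, position = 0.0, 0.0
    for a in accel_samples:
        velocity = velocity + a * dt         # first integration
        position = position + velocity * dt  # second integration
    return position

bias = 0.01              # m/s^2, a very small constant sensor bias
dt = 0.01                # 100 Hz sampling
samples = [bias] * 1000  # 10 seconds of a phone sitting perfectly still
drift = integrate_position(samples, dt)
# Roughly 0.5 * 0.01 * 10**2 = 0.5 m of phantom displacement after 10 s,
# even though the phone never moved.
```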

&lt;p&gt;Drift is perhaps one of the biggest concerns for AR developers. Drift becomes apparent when the AR experience is no longer anchored to the original position of the smartphone. In a sense, drift is a miscalibration between the augmented experience and reality itself. For example, in an ideal AR experience, you might place a virtual object on a table. If you turn and walk away with your phone, only to come back later, you would expect to find the virtual object in its original location. However, without strong visual anchors to counteract drift (for instance, when there is too little visual data about a space, or when a room lacks distinctive visual characteristics), you might return to find the object in a different location: on a different part of the table, or not on the table at all. The strength of tracking in ARCore and ARKit is in the convergence of visual and motion inputs to provide a checks-and-balances system.&lt;/p&gt;

&lt;p&gt;For best performance, and to help provide visual anchors to the smartphone, it is best to translate (move) the device substantially before trying to interact with AR content. For reference, Google’s patented process is referred to as concurrent odometry and mapping, while Apple’s is referred to as visual-inertial odometry, but the general idea is the same.&lt;/p&gt;

&lt;p&gt;A special mention should be made of the Apple iPhone X. Users might note that this popular phone does have a depth-sensing camera, which uses an infrared dot-projection system. Although developers can access that camera data and incorporate it into ARKit, the sensor is attached only to the front-facing camera, which makes it good for little more than &lt;a href="https://twitter.com/verge/status/927317628783230976"&gt;Face Recognition&lt;/a&gt;.&lt;/p&gt;

&lt;h4&gt;Understanding Features&lt;/h4&gt;

&lt;p&gt;The remaining features of ARCore and ARKit refer to making sense of the environment once it has been mapped. Google refers to this as &lt;a href="https://developers.google.com/ar/discover/concepts#environmental_understanding"&gt;environmental understanding&lt;/a&gt; and &lt;a href="https://developers.google.com/ar/discover/concepts#lightestimation"&gt;light estimation&lt;/a&gt;, while Apple combines these two into &lt;a href="https://developer.apple.com/videos/play/wwdc2017/602/"&gt;scene understanding&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2d6yRnbm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/3i3lljzvgzk6doc66vny.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2d6yRnbm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/3i3lljzvgzk6doc66vny.png" alt="alt text for accessibility"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A cursory look at how AR views its environment can be seen in the following two examples. In both approaches, the algorithm looks for planes. In the image below, the evenly-spaced white dots form a grid that marks the detected plane of the floor.&lt;/p&gt;
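&lt;p&gt;As a toy illustration of how a horizontal plane might be proposed from mapped points, one could bucket feature points by height and take the largest cluster. The real plane-detection algorithms in ARCore and ARKit are proprietary and far more sophisticated; this sketch only conveys the idea:&lt;/p&gt;

```python
# Hypothetical sketch of floor-plane proposal: bucket 3-D feature
# points by height and pick the height where the most points lie.
from collections import Counter

def dominant_height(points, bucket=0.05):
    """points: (x, y, z) tuples with y up, in meters. Returns the
    height at which the most points cluster, a candidate horizontal
    plane such as a floor or tabletop."""
    counts = Counter(round(y / bucket) * bucket for _, y, _ in points)
    return counts.most_common(1)[0][0]

# Six points on the floor (y near 0) and two on a table (y near 0.7):
pts = [(0.1, 0.01, 0.2), (0.5, 0.0, 0.9), (0.3, 0.02, 0.4),
       (0.8, 0.01, 0.1), (0.2, 0.0, 0.6), (0.9, 0.02, 0.3),
       (0.4, 0.71, 0.5), (0.6, 0.70, 0.5)]
print(dominant_height(pts))  # the floor, at height 0.0
```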

&lt;p&gt;Separately, the algorithm looks for feature points: visual anchors, within or outside of planes, that provide additional information. In the case of the table, note that the feature points are not evenly spaced. Rather, they are found only on objects with visual contrast. There are no feature points on the white tablecloth, but many can be found on the striped green placemats and on the bowl and candle holder (with a number of feature points dotted along their edges). As the smartphone rotates in space and the lighting changes, feature points appear and disappear.&lt;/p&gt;
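&lt;p&gt;A rough sense of why feature points favor high-contrast areas can be had from a toy gradient detector: flag pixels where brightness changes sharply. The threshold and images below are invented for illustration, not taken from either SDK:&lt;/p&gt;

```python
# Toy illustration of why feature points cluster on high-contrast
# areas: flag pixels whose local brightness gradient is large.

def feature_candidates(gray, threshold=50):
    """gray: 2-D list of brightness values (0-255). Returns (row, col)
    positions with a strong horizontal or vertical brightness change."""
    hits = []
    for r in range(1, len(gray) - 1):
        for c in range(1, len(gray[0]) - 1):
            gx = abs(gray[r][c + 1] - gray[r][c - 1])  # horizontal change
            gy = abs(gray[r + 1][c] - gray[r - 1][c])  # vertical change
            if max(gx, gy) >= threshold:
                hits.append((r, c))
    return hits

flat = [[200] * 5 for _ in range(5)]   # uniform white tablecloth patch
striped = [[0, 255, 255, 0, 0]] * 5   # high-contrast striped patch
print(len(feature_candidates(flat)), len(feature_candidates(striped)))
# 0 candidates on the tablecloth, 9 on the stripes
```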

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cbPxfbKI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/jjzevtmvk2q8nod5ue37.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cbPxfbKI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/jjzevtmvk2q8nod5ue37.png" alt="alt text for accessibility"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As one might expect, both ARCore and ARKit algorithms are likely to work best in high-contrast, static environments where only the smartphone does the moving.&lt;/p&gt;

&lt;p&gt;Lighting is not just an input for environmental understanding, however. Both ARCore and ARKit can use information about the source(s) of &lt;a href="https://developer.apple.com/documentation/arkit/arlightestimate"&gt;environmental light from a static image&lt;/a&gt; captured during an AR session to enhance virtual objects placed in the scene with realistic illumination and shadows.&lt;/p&gt;
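&lt;p&gt;At its simplest, light estimation can be imagined as averaging the luminance of a camera frame and scaling virtual content to match; ARKit exposes a comparable per-frame value through ARLightEstimate. The sketch below is our own simplification, not either SDK’s actual method:&lt;/p&gt;

```python
# Hedged sketch of the simplest imaginable light estimation: average
# a frame's brightness, then dim virtual content to match the scene.

def ambient_intensity(gray_frame):
    """Mean brightness of a grayscale frame, 0.0 (dark) to 1.0 (bright)."""
    total = sum(sum(row) for row in gray_frame)
    pixels = len(gray_frame) * len(gray_frame[0])
    return total / (255.0 * pixels)

def shade_virtual_object(base_color, intensity):
    """Dim an (r, g, b) color to match the estimated scene lighting."""
    return tuple(round(ch * intensity) for ch in base_color)

dim_room = [[40] * 4] * 4  # a uniformly dim 4x4 frame
print(ambient_intensity(dim_room))  # about 0.16
print(shade_virtual_object((255, 0, 0), ambient_intensity(dim_room)))
# a bright red virtual object rendered as a dim red: (40, 0, 0)
```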

&lt;h4&gt;Similar, but not Equal, Features&lt;/h4&gt;

&lt;p&gt;Although at the surface level these features appear similar, the reality is that their precision cannot be compared until code is tested across devices. As mentioned in our previous post, Google has been extremely conservative in allowing Android devices to run AR, making ARKit much more accessible on sheer numbers alone. Google faces the challenge of making ARCore compatible with myriad types and qualities of sensors, potentially rendering ARCore imprecise or unworkable on certain lower-end devices. These findings will guide our work developing AR software to collect and visualize magnetic fields sensed directly by the smartphone. In our next post, we will begin to compare the actual functionality of ARCore and ARKit with regard to drift and environmental understanding.&lt;/p&gt;

&lt;p&gt;Contact the authors: Write to &lt;a href="mailto:support@vieyrasoftware.net"&gt;support@vieyrasoftware.net&lt;/a&gt; or visit &lt;a href="http://www.vieyrasoftware.net"&gt;www.vieyrasoftware.net&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This work is funded by NSF Grant #1822728. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.&lt;/p&gt;

</description>
      <category>android</category>
      <category>ios</category>
      <category>arcore</category>
      <category>arkit</category>
    </item>
    <item>
      <title>Comparing Google ARCore and Apple ARKit</title>
      <dc:creator>Chrystian Vieyra</dc:creator>
      <pubDate>Tue, 01 Jan 2019 12:53:13 +0000</pubDate>
      <link>https://dev.to/chrystianv1/comparing-google-arcore-and-apple-arkit-2lin</link>
      <guid>https://dev.to/chrystianv1/comparing-google-arcore-and-apple-arkit-2lin</guid>
      <description>&lt;p&gt;Choosing which platform to use to pursue exploratory work in augmented reality (AR) visualization techniques is critical. As many mobile developers only specialize on one platform, few cross-platform analyses exist for the relatively recent technological advancements like AR. In the following paragraphs, we present some of our own experiences with the investigation of these AR tools as part of a &lt;a href="https://www.nsf.gov/awardsearch/showAward?AWD_ID=1822728&amp;amp;HistoricalAwards=false"&gt;recently-awarded grant&lt;/a&gt; from the National Science Foundation to the &lt;a href="https://modelinginstruction.org/"&gt;American Modeling Teachers Association&lt;/a&gt;and to &lt;a href="https://modelinginstruction.org/"&gt;Vieyra Software&lt;/a&gt;. We present a primer on ARKit and ARCore compatibilities with Apple and Android products. In future entries we’ll take a closer look at some of our own development on each platform, as well as inherent benefits and disadvantages in terms of performance and functionality of each as we work to use AR and real data to display magnetic fields.&lt;/p&gt;

&lt;p&gt;In summer 2017, public AR frameworks were released openly to developers: &lt;a href="https://developer.apple.com/arkit/"&gt;ARKit for iOS&lt;/a&gt; on &lt;a href="https://www.apple.com/newsroom/2017/06/highlights-from-wwdc-2017/"&gt;5 June 2017&lt;/a&gt;, and &lt;a href="https://developers.google.com/ar/"&gt;ARCore for Android&lt;/a&gt; on &lt;a href="https://www.blog.google/products/arcore/arcore-augmented-reality-android-scale/"&gt;29 August 2017&lt;/a&gt;. Linking the smartphone’s internal motion sensors and camera algorithms, these new frameworks allow developers to create AR worlds in which the smartphone has a 3-D environmental awareness. This allows users not only to display images on a 2-D background, but also to interact with them in a 3-D environment without the use of cumbersome, pre-made visual targets. However, these bare-bones technology frameworks were released with extremely limited public documentation or sample applications on which developers could base their work. Nearly a year following their release, these frameworks continue to be poorly understood and underutilized by developers.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rSHIOMdq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/u63wa11ev5qim8zv226s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rSHIOMdq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/u63wa11ev5qim8zv226s.png" alt="ar image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The yellow dots are anchor points for spatial awareness. The two red dots (on the mug and on the stack of napkins) are situated in space, with the distance between them noted in blue (in centimeters).&lt;/p&gt;

&lt;h4&gt;Who is developing with ARKit and ARCore?&lt;/h4&gt;

&lt;p&gt;A simple search of commit repositories shows a strong preference for ARKit among today’s developers. Despite the fact that both platforms were announced within a few months of each other, there is about five times more development activity and three times more discussion around ARKit than ARCore.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--z5oNkg_g--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/8jwixv9an69eun7craqr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--z5oNkg_g--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/8jwixv9an69eun7craqr.png" alt="ar image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Our grant work initially focused on Android ARCore as the platform of choice because of a desire to create something as accessible as possible to users. However, seeing these numbers caused us to question why more developers weren’t investing in Android for AR.&lt;/p&gt;

&lt;h4&gt;Which devices are compatible with ARKit and ARCore?&lt;/h4&gt;

&lt;p&gt;As of 2018, nearly 85% of smartphones around the world use Android, a &lt;a href="https://www.idc.com/promo/smartphone-market-share/os"&gt;percentage that is expected to only grow&lt;/a&gt; in the coming years. However, consistent with a &lt;a href="https://www.businessinsider.com/android-is-for-poor-people-maps-2014-4"&gt;2014 study&lt;/a&gt; of U.S. city maps that showed socioeconomic differences between owners of Android versus iOS devices, a huge portion of the billions of Android users are &lt;a href="https://newzoo.com/insights/articles/insights-into-the-2-3-billion-android-smartphones-in-use-around-the-world/"&gt;purchasing low-end devices&lt;/a&gt; that have significantly different capabilities than their higher-cost counterparts. Unlike Apple, which has control over all models that use iOS, Android phones are manufactured by at least &lt;a href="https://www.appbrain.com/stats/top-manufacturers"&gt;ten major companies&lt;/a&gt;, each of which can select for unique features and acquire disparate hardware components of their choice.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--pLsv_a9c--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/2k1xiudm0zuos5nyrocc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pLsv_a9c--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/2k1xiudm0zuos5nyrocc.png" alt="ar image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Portion of Android devices compatible with ARCore. Source: &lt;a href="https://developers.google.com/ar/discover/supported-devices"&gt;Google Developers&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;What does this mean for compatibility? Because of the hardware uniformity of iOS, all devices back to the iPhone SE and the iPad 5th generation are compatible with ARKit. Two notable estimates were made of the number of ARKit-compatible devices by the end of 2017: &lt;a href="https://blog.mapbox.com/how-i-determined-there-will-be-195m-arkit-devices-by-the-end-of-2017-2e2a99790bda"&gt;195 million&lt;/a&gt;, which did not account for the release of the iPhone 8 or iPhone X, and &lt;a href="https://artillry.co/2017/10/04/434-million-arkit-devices-by-year-end-revised-forecast/"&gt;434 million&lt;/a&gt;, an estimate released after the iPhone 8 had been revealed. Compare this with Google’s December 2017 estimate of &lt;a href="https://www.blog.google/products/arcore/arcore-developer-preview-2/"&gt;100 million&lt;/a&gt; ARCore-compatible devices, a comparative drop in the bucket with respect to Android’s massive global market share.&lt;/p&gt;

&lt;h4&gt;What variables influence the compatibility of devices with AR platforms?&lt;/h4&gt;

&lt;p&gt;Sensor Hardware: While all Apple devices have included the same basic sensor systems for years, Android devices are notoriously variable when it comes to hardware and sensors. ARCore requires a device with a back-facing camera, an accelerometer, and a gyroscope. While nearly all phones have the required camera and accelerometer, the gyroscope requirement removes compatibility for a large portion of low-end Android devices typically purchased by cost-conscious consumers, especially in the developing world. Of all Android tablets, only the Galaxy Tab S4 is currently compatible. (&lt;a href="https://developers.google.com/ar/discover/supported-devices"&gt;Click here&lt;/a&gt; for a complete list of ARCore-compatible Android devices.)&lt;/p&gt;

&lt;p&gt;Operating System: Compared to ARKit’s complete compatibility with devices released since September 2015 that are running iOS 11.0, ARCore supports only a &lt;a href="https://developers.google.com/ar/discover/supported-devices"&gt;limited subset of Android devices&lt;/a&gt; running Android 7 or newer, and users must download the app &lt;a href="https://play.google.com/store/apps/details?id=com.google.ar.core"&gt;ARCore by Google&lt;/a&gt;, unlike Apple devices, where the framework is embedded within the operating system. Even these Android requirements aren’t hard-and-fast rules: several devices require Android 8.0, and a subset of those also require a specific patch (in addition to the download of the ARCore app), as is the case for the Sony Xperia XZ2.&lt;/p&gt;

&lt;p&gt;App Requirements: For any ARCore-compatible device whose user has not already downloaded the ARCore by Google app, an attempt to download an app that uses ARCore capabilities, such as the popular &lt;a href="https://play.google.com/store/apps/details?id=com.inter_ikea.place"&gt;IKEA Place&lt;/a&gt; app, notifies the user that ARCore by Google will also be downloaded.&lt;/p&gt;

&lt;p&gt;Device Release: While all new iOS devices are compatible with ARKit out of the box, the same is not true for even the newest Android products. While this year’s Google smartphones, the Pixel 2 and Pixel 2 XL, were released with ARCore fully installed, they account for only a minimal portion of the market share. Because Google does not control the manufacture of devices produced by other companies, most manufacturers seek &lt;a href="https://www.android.com/certified/"&gt;Android certification&lt;/a&gt; before release. In doing so, they work with Google to make sure that their hardware is compatible with Google Play apps, allowing Google to modify its apps in advance so that they work seamlessly once downloaded. Unfortunately, the ARCore by Google app appears to be excluded from this certification process: a number of Android Certified devices have been released that are not compatible with ARCore, and the app cannot even be downloaded onto them for testing by the user. Google effectively permits the download of its ARCore app only on devices that it has approved.&lt;/p&gt;

&lt;p&gt;This inability to download the ARCore by Google app onto some of the newest and most popular Android devices has proven frustrating. For instance, the Galaxy S9 and Galaxy S9+ are among Samsung’s most popular devices. Despite their &lt;a href="https://en.wikipedia.org/wiki/Samsung_Galaxy_S9"&gt;release on March 16, 2018&lt;/a&gt;, Google did not permit the &lt;a href="https://github.com/google-ar/arcore-android-sdk/issues/367"&gt;download of ARCore until early May&lt;/a&gt;, nearly two months later. Similarly, Sony’s Xperia XZ2 was &lt;a href="https://en.wikipedia.org/wiki/Sony_Xperia_XZ2"&gt;released on April 5, 2018&lt;/a&gt;, and &lt;a href="https://developers.google.com/ar/discover/supported-devices"&gt;wasn’t fully compatible until August 8, 2018&lt;/a&gt;, and even then only with the additionally required security patch.&lt;/p&gt;

&lt;h4&gt;How should one decide which platform to use to pursue development?&lt;/h4&gt;

&lt;p&gt;The answer depends upon your goals.&lt;/p&gt;

&lt;p&gt;Functionality: While we haven’t yet explicitly identified the benefits and limitations of each AR software framework, our initial experimentation suggests that ARKit experiences significantly less positional drift than ARCore. (For example, when “placing” an inanimate AR object in the middle of a room, walking around the house, and returning with the expectation of finding the AR object at the same coordinates where it was placed, ARKit significantly outperformed ARCore, which we tested on two different Android devices.) The reason might be that while all Apple-manufactured devices have sensors calibrated in the same way, even a single Android manufacturer often pulls from different sensor suppliers with varying levels of precision and accuracy. Indeed, Google’s delayed support for new devices might stem from exactly this need to make adjustments for each device’s hardware specs.&lt;/p&gt;

&lt;p&gt;Accessibility: The goal of our recently-awarded &lt;a href="https://www.nsf.gov/awardsearch/showAward?AWD_ID=1822728&amp;amp;HistoricalAwards=false"&gt;National Science Foundation grant&lt;/a&gt; is to use the modern 3-D mapping capabilities of phones, sensor data, and AR to build novel software for teaching and learning about fields that is highly accessible to device owners. Advancement in merging AR capabilities with sensor data has the potential to change the way educators, learners, and members of the workforce interact with their mobile devices, and, in turn, impact the way developers support the needs of education and the workforce. We want to build on a platform that would eventually allow for wide-scale implementation of whatever we produce, especially educational resources for teachers and learners in underfunded schools and underdeveloped areas. However, the reality is that because so many Android devices are low-end, their users simply won’t have access to ARCore at this time. In terms of sheer numbers, iOS and ARKit are much more likely to result in greater access to novel AR developments.&lt;/p&gt;

&lt;p&gt;Although sensor-based software developers have long preferred Android because of its easy access to sensor data and the multitude of sensor options available across devices, in the case of augmented reality, ARKit and Apple seem to win the day.&lt;/p&gt;

&lt;p&gt;Contact the authors: Write to &lt;a href="mailto:support@vieyrasoftware.net"&gt;support@vieyrasoftware.net&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This work is funded by NSF Grant #1822728. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.&lt;/p&gt;

</description>
      <category>android</category>
      <category>ios</category>
      <category>arkit</category>
      <category>arcore</category>
    </item>
  </channel>
</rss>
