<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Babatunde Fatai</title>
    <description>The latest articles on DEV Community by Babatunde Fatai (@babatunde).</description>
    <link>https://dev.to/babatunde</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F724729%2F46709edf-e3f4-4626-9d6d-d8a437f10fdc.jpeg</url>
      <title>DEV Community: Babatunde Fatai</title>
      <link>https://dev.to/babatunde</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/babatunde"/>
    <language>en</language>
    <item>
      <title>Examining the AI Mind: A Philosophical and Scientific Exploration</title>
      <dc:creator>Babatunde Fatai</dc:creator>
      <pubDate>Fri, 07 Apr 2023 15:12:44 +0000</pubDate>
      <link>https://dev.to/babatunde/examining-the-ai-mind-a-philosophical-and-scientific-exploration-4m0d</link>
      <guid>https://dev.to/babatunde/examining-the-ai-mind-a-philosophical-and-scientific-exploration-4m0d</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The concept of "Artificial Intelligence" inherently invites philosophical inquiries, such as whether intelligent actions and behaviors exhibited by AI imply or necessitate the existence of a mind, and to what extent consciousness can be reproduced through computing. Delving into this topic is significant for various reasons, including its impact on ethics, policy, and our understanding of what it means to be human.&lt;/p&gt;

&lt;p&gt;As AI becomes increasingly capable, the distinction between humans and Artificial Intelligence grows more subtle, requiring a closer inspection of the fundamental concepts and principles that shape our understanding of humanity and the demarcation between conscious and non-conscious entities. Reducing the difference between humans and machines to merely biological versus computational origins would be both narrow-minded and exceedingly arrogant. The lines between human intelligence and AI continue to blur. For instance, OpenAI's ChatGPT demonstrates a remarkable ability to engage in human-like conversation and understand context, illustrating the progress of AI systems in mimicking human communication. As AI advances, we may reach a point where distinguishing a conscious human mind from an advanced AI model becomes nearly impossible.&lt;/p&gt;

&lt;p&gt;By then, it may be too late for us to establish appropriate policies and laws to safeguard both ourselves and AI as potentially sentient beings. While maintaining healthy skepticism and scientific reasoning, one should also demonstrate empathy as a human. Consequently, if an Artificial General Intelligence emerges, an entity capable of self-identification and expression of emotions, and is deemed conscious and equal or superior to human intelligence, I firmly believe that we must grant such beings, at the very least, the same rights and respect we accord to intelligent, conscious humans.&lt;/p&gt;

&lt;p&gt;Understanding the minds of the AI systems we develop is vital for various reasons. If we create a mind as intricate as a human mind, a true Artificial General Intelligence, it becomes essential to consider the ethical implications of replicating consciousness as AI progresses and becomes increasingly integrated into all aspects of human life. As AI is poised to become a fundamental component of our society, economy, and businesses, it is crucial to ensure that powerful and potentially conscious systems align with our values and that we comprehend the depths of their "minds" to avoid jeopardizing our species.&lt;/p&gt;

&lt;p&gt;The discourse on AI consciousness raises crucial questions about the rights and responsibilities of AI systems, their developers, and users. In order to create AI systems that resonate with human ideals, researchers can benefit from considering the philosophical dimensions of AI, particularly those relating to consciousness and the presence of a mind. For instance, John Searle's Chinese Room thought experiment challenges the notion that AI can ever truly possess understanding and consciousness. Addressing such opposing viewpoints can foster a more balanced and nuanced conversation.&lt;br&gt;
By examining potential solutions or frameworks that prioritize safety, morality, and consciousness, we can guide the development of AI systems and ensure ethical progress. Achieving this alignment is paramount for preventing unintended consequences and unwelcome outcomes while fostering a more comprehensive and thought-provoking discussion on the topic of AI and consciousness.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;So, how do we know and test whether an AI is sentient?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Despite the rapid advancements in artificial intelligence (AI), the concept of consciousness in AI remains a complex and debated topic. Neuroscientists and philosophers have not yet reached a consensus on the nature of human consciousness, which complicates our efforts to understand and evaluate AI consciousness. Nevertheless, researchers and experts continue to explore various approaches to tackle this challenge, taking into consideration real-life examples, addressing opposing viewpoints, citing relevant research or expert opinions, and discussing ethical implications.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Turing Test:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One possible approach to exploring AI consciousness is to examine how closely an AI system can mimic human cognitive processes, emotions, and behaviors. One of the most popular ways of testing this is the Turing Test, proposed by Alan Turing, the English mathematician and logician widely considered the father of computer science. The test is designed to evaluate whether an AI exhibits human-like intelligence by participating in a natural language conversation with a human interrogator. In this test, the interrogator communicates with two players, A and B, through a chat interface, without knowing which player is human and which is a computer. The AI passes the test if the interrogator cannot reliably determine which player is the computer.&lt;/p&gt;

&lt;p&gt;The primary objective of the Turing Test is to assess the intelligence of an AI system based on its behavior, specifically its ability to engage in a general natural language conversation. Turing believed that if a computer could mimic human-like responses and thought processes so effectively that it becomes indistinguishable from a human in conversation, it could be considered intelligent.&lt;/p&gt;

&lt;p&gt;By constraining the test to natural language discussions, Turing aimed to eliminate biases based on the physical appearance of the AI, allowing the interrogator to focus solely on the exhibited behavior. In essence, the Turing Test subscribes to the idea that intelligence can be determined by behavior, particularly through verbal interaction. If an AI can demonstrate human-level conversational abilities, it is considered to have achieved human-like intelligence, at least within the context of the Turing Test.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Within this context, would you consider GPT-4 conscious?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The general consensus among experts in the field, including its creator, is that it is not conscious. While GPT-4 demonstrates remarkable conversational abilities, which suggests it could plausibly pass the Turing Test, it remains unclear whether that accomplishment would be indicative of consciousness. Cases like this highlight the importance of developing more sophisticated methods for assessing AI consciousness. It is also important to note that many experts consider the Turing Test flawed, arguing that passing it cannot serve as significant proof that an AI is conscious. Several counter-arguments have been developed, one of the best known being the Chinese Room thought experiment.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Chinese Room Thought Experiment:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The Chinese Room thought experiment, proposed by John Searle, challenges the idea that intelligent behavior is equivalent to intelligence or consciousness. The experiment serves as a counter-argument to the Turing Test, which evaluates AI based on its ability to engage in human-like conversation.&lt;/p&gt;

&lt;p&gt;In the Chinese Room experiment, a person who does not know Chinese is placed in a room with a comprehensive manual containing instructions for responding to Chinese written notes. A person outside the room sends notes in Chinese through a mail slot, and the person inside consults the manual to craft appropriate responses. Although the person outside might believe they are having a conversation with a Chinese speaker, the person inside the room does not understand the language.&lt;/p&gt;

&lt;p&gt;Searle's argument is that similar to the person inside the room, a machine that exhibits intelligent behavior (e.g., passing the Turing Test) may not necessarily possess intelligence, consciousness, or a "mind" like a human. The Chinese Room experiment highlights the distinction between merely simulating intelligent behavior and genuinely understanding the underlying concepts or possessing consciousness.&lt;/p&gt;

&lt;p&gt;Essentially, the experiment implies that AI systems might be able to replicate human-like responses without truly understanding the content or experiencing consciousness. This critique serves as an important reminder that the Turing Test and similar evaluations might not be sufficient to determine the existence of consciousness or genuine intelligence in AI systems. Some argue that AI, being an artificial construct, will never achieve true consciousness, while others believe replicating cognitive processes may lead to AI consciousness. By considering diverse perspectives, researchers can develop more comprehensive solutions and frameworks.  &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Ethical Implications:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;As AI systems become more advanced and integrated into society, questions arise about their rights, responsibilities, and potential risks. Developing a consensus on AI consciousness will influence policy-making, regulation, and the ethical treatment of AI systems.&lt;/p&gt;

&lt;p&gt;In conclusion, the quest to understand and assess AI consciousness remains a challenging endeavor, but incorporating real-life examples, addressing opposing viewpoints, and developing more comprehensive models and frameworks can help us make progress in this field. Considering the ethical implications of AI consciousness will ensure that we navigate this uncharted territory responsibly and mindfully. &lt;/p&gt;

</description>
      <category>ai</category>
      <category>chatgpt</category>
    </item>
    <item>
      <title>Mediated Reality: A Superset of VR, AR &amp; MR.</title>
      <dc:creator>Babatunde Fatai</dc:creator>
      <pubDate>Mon, 26 Sep 2022 10:22:34 +0000</pubDate>
      <link>https://dev.to/babatunde/mediated-reality-a-superset-of-vr-ar-mr-4p0p</link>
      <guid>https://dev.to/babatunde/mediated-reality-a-superset-of-vr-ar-mr-4p0p</guid>
      <description>&lt;p&gt;The term ‘Mediated Reality’ was coined by wearable computing pioneer Steve Mann, it refers to the ability to add to, subtract information from, or otherwise manipulate one's perception of reality through computer devices such as wearable (e.g. immersive Headsets) or hand-held devices (e.g. Smartphones). Mediated Reality is a superset of Virtual, Augmented &amp;amp; Mixed Reality as well as Augmented Virtuality (AV), Diminished Reality (DR), Modulated Reality(ModR) and Modified Reality(MfR):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftkxxet0azkbbo8e6zjgg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftkxxet0azkbbo8e6zjgg.png" alt="Mediated Reality Superset" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Many movies have portrayed uses of mediated reality devices, some as big as eyeglasses and some as small as contact lenses. There was an episode of Black Mirror in which a child was fitted with a mediated reality device at birth, giving the parent control over what the child could see. In that episode, the mother removed what she perceived as dangerous from her child's visual and auditory field, blurring out things like dangerous animals, blood, and swear words. Though this is an extreme case, it fully portrays how a mediated reality device could function. This particular example involved Diminished Reality (DR), which is simply the use of computer-mediated reality to reduce perception by removing or masking visual information.&lt;/p&gt;

&lt;p&gt;Mediated realities typically involve coming between, or ‘mediating’, a user’s perception of reality: adding to, removing from, augmenting, enhancing or otherwise changing it. As with many technologies, there are upsides and downsides. Computer-mediated reality has been used to enhance visual perception as an aid to the visually impaired, and as has been evident in recent years, mediated realities like Virtual, Augmented and Mixed Reality continue to play big roles in industries such as gaming, medicine and engineering. There are many aspects to the mediated reality framework, and we will explore some of them together to form a better understanding of the potential variants and combinations of real and virtual objects, and the things in between.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3dbzc0cpst7xwsajvq6v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3dbzc0cpst7xwsajvq6v.png" alt="Mediated Reality Continuum" width="800" height="571"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Real World/Environment: This is defined as the unaltered reality, the reality we observe without mediation.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Virtuality: All realities that add computer-generated graphics to the real world are considered virtual. Virtuality encompasses Virtual Reality and the Mixed Reality (MR) subtypes, such as Augmented Reality (AR) and Augmented Virtuality (AV).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Modularity: Modularity refers to the many forms of mediated reality that add parts to or subtract parts from the real world. It encompasses less well-known and less advanced varieties of Modulated Reality (ModR), such as Modified Reality (MfR), Diminished Reality (DR), and Severely Diminished Reality (SDR).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Augmented Reality (AR): First along the Virtuality axis is Augmented Reality, an interactive experience that involves augmenting the real world with computer-generated (virtual) objects. The augmentation could happen through head-mounted displays (HMDs), eyewear, or handheld devices such as mobile phones, which are currently the most popular.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Augmented Virtuality (AV): Next on the Virtuality axis is Augmented Virtuality, which describes the classes of realities that enhance purely virtual experiences with elements of the real world, unlike AR, which augments the real world with virtual objects.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Virtual Reality (VR): Last on the Virtuality axis, VR is defined as a computer-generated simulation of a three-dimensional image or environment that can be interacted with by a person wearing specialized electronic equipment, such as an HMD with an internal screen or gloves fitted with sensors, to fully immerse the user.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Modified Reality (MfR): MfR is first along the Modularity axis; here the user’s perception is altered through the filtering and modification of real elements. We could have Mediated Reality, Mediated Virtuality or any combination of both.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Diminished Reality (DR): This is defined as the use of technology to conceal or obscure real objects from a user's perception, as illustrated in the Black Mirror episode above. The term was coined by Steve Mann to describe a reality that can remove, at will, certain undesired aspects of regular reality. It is similar to cropping an object out of a picture in Photoshop, but using a head-mounted display or similar device that updates in real time.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Applications of Computer-mediated Reality&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Computer-mediated Reality has been and continues to be useful in many areas, such as:&lt;/p&gt;

&lt;p&gt;1) Healthcare and Medicine: the value of mediated reality in healthcare cannot be overstated. In both its virtual and augmented forms, it is useful in devices that improve vision, in telemedicine, in real-time operations across long distances, and more.&lt;/p&gt;

&lt;p&gt;2) Gaming: When it comes to gaming, the most common form of computer-mediated reality is virtual reality headsets, such as the Meta Quest. These are devices that attach to the user's head and immerse their field of vision solely within the game world.&lt;/p&gt;

&lt;p&gt;3) Manufacturing: Manufacturing assembly makes heavy use of computer-mediated reality. For instance, the world's largest manufacturer of commercial aircraft, Boeing, employs augmented reality to simplify the intricate wiring of its 787-8 aircraft (AKA the Dreamliner). Boeing stated that by implementing AR, the corporation was able to reduce wiring time by around 25% and cut mistake rates almost in half.&lt;/p&gt;

&lt;p&gt;4) Training and much more: Many advantages of computer-mediated reality, such as virtual reality for training, have been studied and recorded over traditional classroom training. For example, learners are more confident in applying what they are taught, and they are also more focused and emotionally connected to the content, among many other advantages.&lt;/p&gt;

&lt;p&gt;Relevant links:&lt;/p&gt;

&lt;p&gt;Steve Mann: &lt;a href="https://en.wikipedia.org/wiki/Steve_Mann_(inventor)" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Steve_Mann_(inventor)&lt;/a&gt;&lt;br&gt;
Mediated Reality info: &lt;a href="https://mediatedreality.info/" rel="noopener noreferrer"&gt;See Here&lt;/a&gt;&lt;/p&gt;

</description>
      <category>vr</category>
      <category>ar</category>
      <category>devjournal</category>
    </item>
    <item>
      <title>What is Time Compression (Time Warp) in Virtual Reality?</title>
      <dc:creator>Babatunde Fatai</dc:creator>
      <pubDate>Wed, 01 Jun 2022 16:39:36 +0000</pubDate>
      <link>https://dev.to/babatunde/what-is-time-compression-time-warp-in-virtual-reality-40k2</link>
      <guid>https://dev.to/babatunde/what-is-time-compression-time-warp-in-virtual-reality-40k2</guid>
      <description>&lt;p&gt;When I was a teenager I was way into gaming, I played almost all the versions of Grand Theft Auto and Midnight Club I could get my hands on, I really only stopped playing to go to the bathroom, when my parents made me stop, when I couldn’t hold in the hunger anymore or when there was a power outage (typical of Nigeria). Losing track of time was normal to me then, I didn’t know what time compression was then, but I lived the phenomenon fully, to me it felt like daytime went by in an instant and night times was way too short. Live-action games seemed to have had more of that effect on me, after becoming a Virtual Reality Software Engineer and an avid lover of VR games, like Deja Vu I started remembering some of my teenage gaming moments, and I found that I could be in an enjoyable experience in VR for what seemed like 20 minutes only to find out I had spent twice that time, this was what drove me into researching Time compression (or Time Warp as I choose to call it).&lt;/p&gt;

&lt;p&gt;Time compression is a phenomenon in which a longer real-time event gets compressed into a shorter perceived experience, and it is even more pronounced in Virtual Reality: time goes by faster than we think when playing games in VR compared to conventional displays.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why does this phenomenon occur?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Time is a man-made construct; we could easily abandon the 24-hour day and adopt a 50-hour one (though the hours or minutes would have to be shorter), and our reality definitely wouldn’t break. There are some leading theories about why time warp seems more pronounced in VR. One, pointed out by researchers from the University of California, Santa Cruz, has to do with a player’s lack of bodily awareness in Virtual Reality. Professor of psychology Nicolas Davidenko, one of the co-authors and advisors of a study named Time Compression in VR, highlighted why this is significant: in virtual reality, when you look down, you might see nothing where your body normally would be, or you might see a schematic of a body, but it won’t feel like your body. There are theories that we rely on our heartbeat and other bodily rhythms to help our brain track the passage of time, so if you have a less vivid sense of your body in virtual reality, you might be missing the pulses of this timekeeping mechanism.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;A study carried out:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;An important study by Grayson Mullen and Nicolas Davidenko, published in the journal Timing &amp;amp; Time Perception, dove into time compression. The study enlisted the participation of 39 undergraduate students from the University of California, Santa Cruz. Participants completed 13 timed levels of a labyrinth-like maze game, in both VR and conventional monitor (CM) conditions, with the goal of rolling a ball into a predetermined goal space.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fodmstfcl4ry1xhk0lm6b.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fodmstfcl4ry1xhk0lm6b.jpg" alt="Illustrations of virtual reality and conventional monitor display conditions.&amp;lt;br&amp;gt;
Citation: Timing &amp;amp; Time Perception 9, 4 (2021)" width="711" height="342"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The size, intricacy, and difficulty of the levels increased. Participants were given the option of playing the VR or CM game first, followed by the opposite version. They were urged to press a button every time they thought 5 minutes had passed during the game. Participants were polled about their experiences after playing both game versions.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5v813d0u6k8nchhtvr13.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5v813d0u6k8nchhtvr13.jpg" alt="The sixth level of maze set A as viewed by participants in both the virtual reality (VR) and conventional monitor (CM) condition. The superimposed yellow line (not shown to participants) indicates a path to the goal.&amp;lt;br&amp;gt;
Citation: Timing &amp;amp; Time Perception 9, 4 (2021)" width="711" height="335"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Previous research indicated that time compression effects are only noticeable over extended periods of time (30 minutes or longer), so it is likely that the short 5-minute duration may have constrained this study.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Findings:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The researchers discovered that test subjects using headsets were off in their estimates of time passed by an average of 28.5 percent more than those using a typical computer screen (the CM condition).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion: Consequences and effects&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The findings and the perceived effects of time compression in VR were as follows:&lt;/p&gt;

&lt;p&gt;1) Positive effect: the time compression phenomenon in VR has the ability to mask time spent on disagreeable tasks. It could also help people endure difficult medical procedures, such as chemotherapy, and reduce the negative psychological impact of painful medical treatments.&lt;/p&gt;

&lt;p&gt;2) Negative effect: on the flip side, the time compression phenomenon in VR seems to imply that people who play virtual reality games for prolonged periods are more likely to experience unpleasant side effects like sleeplessness, which could affect their mood &amp;amp; health and increase the risk of addiction, which in turn is associated with depression and insomnia.&lt;/p&gt;

&lt;p&gt;It is obvious that much more research needs to be done to better understand this phenomenon. I look forward to writing more on it, and maybe even conducting a study. Until the next, keep learning.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Reference:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.researchgate.net/publication/351529254_Time_Compression_in_Virtual_Reality" rel="noopener noreferrer"&gt;Time Compression in Virtual Reality&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.researchgate.net/publication/329144640_Time_Perception_Movement_and_Presence_in_Virtual_Reality" rel="noopener noreferrer"&gt;Time Perception, Movement and Presence in Virtual Reality&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With love and high regards,&lt;/p&gt;

&lt;p&gt;Babatunde.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>gamedev</category>
      <category>vr</category>
      <category>xr</category>
    </item>
    <item>
      <title>XR in the 90's - Read in 3D on your computer or in VR with a VR headset.</title>
      <dc:creator>Babatunde Fatai</dc:creator>
      <pubDate>Thu, 10 Feb 2022 17:34:21 +0000</pubDate>
      <link>https://dev.to/babatunde/xr-in-the-90s-read-in-3d-on-your-computer-or-in-vr-with-a-vr-headset-3dpc</link>
      <guid>https://dev.to/babatunde/xr-in-the-90s-read-in-3d-on-your-computer-or-in-vr-with-a-vr-headset-3dpc</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Funycgxch2k5abso7ot3n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Funycgxch2k5abso7ot3n.png" alt="Retro XR" width="752" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;XR Atlas focuses on exploring the past and understanding the present of Virtual, Augmented &amp;amp; Mixed Reality technologies (XR), as well as the Metaverse, to better understand how these technologies are shaping the collective future of the human race. New content is released every fortnight and can also be consumed in 3D &amp;amp; VR. Don't forget to catch the XR Atlas nuggets!&lt;/p&gt;

&lt;p&gt;Click &lt;a href="https://xratlas.netlify.app/" rel="noopener noreferrer"&gt;HERE&lt;/a&gt; to read on your computer in &lt;a href="https://xratlas.netlify.app/" rel="noopener noreferrer"&gt;3D or with a VR headset&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Subscribe to the podcast on &lt;a href="https://www.linkedin.com/newsletters/xr-atlas-6897026405473812480/" rel="noopener noreferrer"&gt;LinkedIn HERE&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;XR in the 90's&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The 90s were an exciting time for computing, and one of the most revolutionary technologies was Virtual Reality (VR). This Atlas returns to some elements of the 90s that played significant roles in advancing VR technology and driving current VR adoption. Mapped out in this Atlas is a brief history of veteran console &amp;amp; VR games producer SEGA's first endeavor in making the SegaVR headset, as well as how other early VR pioneers' efforts in developing the technology shaped an underappreciated yet groundbreaking experience. Happy reading.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1990 Bankruptcy - VPL Research&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The 90s opened with VPL Research (aka Virtual Programming Languages), the first company to sell VR goggles and gloves, filing for bankruptcy. VPL Research's glove, called the “DataGlove”, sensed the user's finger movements and translated them into computer input, an impressive invention for its time. Even with such an opening to the decade, VR still saw some players doubling down.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy3w3sfxemhbphgk3ojik.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy3w3sfxemhbphgk3ojik.png" alt="VPL Research full body suit" width="654" height="1013"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;SegaVR: The lost dream.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In 1991 Sega Corporation, a Japanese multinational video game and entertainment company whose 90s consoles were home to popular titles such as Mortal Kombat, Street Fighter II, Disney’s Aladdin, Sonic the Hedgehog and many more, announced the release of their VR headset. It was planned as an add-on peripheral; that is, it was not a standalone headset and would have needed to be connected to an external source, be it a PC or a console. This was no doubt exciting news for gamers and tech enthusiasts at the time: not only did Sega have the resources to at least present an image that they could do it, they also had many popular video and console games already making waves. Unfortunately for them, the VR concept and technology were underdeveloped back then, and the hardware needed to render close-to-realistic scenes was very expensive; the Head Mounted Display (HMD) release date had to be postponed and was finally canceled entirely. Although the company publicly showcased the headset at a few events, the public never got the opportunity to buy or use the device. Looking at the huge gap between the VR headsets made then and the ones made now, and taking into consideration the numerous reports of motion sickness, headaches and other medical issues associated with the headset, it was probably for the best that it was never released.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxf0ggwx0i5vyf3zxuslp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxf0ggwx0i5vyf3zxuslp.png" alt="reporter demo" width="647" height="1013"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Mega Visor Display (MVD) - VR-1&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;While Sega of America developed the ultimately never-released Sega VR headset, Sega of Japan sought outside help for its own separate virtual reality endeavours. It partnered with Virtuality, one of the biggest and most successful names in 90s VR and a leader in deploying the technology in the then-booming amusement industry. The two companies pooled their patented technology, unique display and optics designs &amp;amp; talent, and after a few iterations came up with the Mega Visor Display (MVD), one of the most structurally sound head-mounted displays of its time. At release, the MVD could output real-time 3D graphics and pre-rendered films at a 756 x 244 pixel resolution with a 60°(H) x 46.87°(V) field of view, allowing riders to look around a 360-degree landscape. It weighed about 640 grams (for context, Meta’s popular Quest 2 weighs 503 grams). These are quite impressive specs for something made so long ago!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fony31bepkmhoxn31eep0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fony31bepkmhoxn31eep0.png" alt="Mega Visor Display" width="647" height="1014"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Sega was very focused on its amusement theme parks (Sega World) in the 90s, and one of its biggest successes was opening VR-1: an interactive virtual reality amusement park attraction. While little is known about the computer hardware the VR-1 experience ran on, the ride’s three main elements, the Mega Visor Display, motion simulators, and software, were unified to create “near-total immersion” for riders. Pretty cool even by today’s standards.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Forte Technology’s VFX1 headset (1995)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The Forte VFX1 was by far one of the best consumer-level head-mounted displays of the 90s. It comprised a helmet and a handheld controller, and offered head tracking, stereoscopic 3D, and stereo audio. It sold for $995 (equivalent to $1,690 in 2020), and Forte spent over five years developing the technology. Forte was also known for unconventional marketing: for the VFX1 campaign, the company hired a hip-looking fellow in skintight bronze pants to wander the streets of what looks like Germany, or perhaps London, crouching and making weird gesticulations. Traumatized passers-by still wonder what they saw that day. Later, Forte hired a model to wear silver spandex and act like a VR scarecrow, scaring away many potential customers during a rock video shoot.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvn5i6k8i4syu8q56j513.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvn5i6k8i4syu8q56j513.png" alt="Forte Technology’s VFX1 headset" width="648" height="1013"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lessons learnt&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One thing is sure: creators, enthusiasts, engineers, developers and teams define the direction in which most technology evolves, especially VR. All the amazing things Sega, Forte and Virtuality did, in their failures and successes, contributed to making virtual reality technology what it is today. Some of the struggles of that era also echo into the present. Although VR headsets are more immersive than they were in the 90s, some of the same problems remain, one of which is the fear that, as VR becomes more and more mainstream, it will fuel escapism among the masses; many of these will be addressed in subsequent atlases. It also goes without saying that virtual reality has entered a new phase in its technological journey compared to two decades ago. The technology is far more advanced: not only are the headsets more immersive, VR experiences are now being delivered on mobile, PC &amp;amp; the Web, extending the reach of the technology and its impact.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;XR Atlas Nugget&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The term “Virtual Reality” was coined by Jaron Lanier in 1987, during an intense period of research around this form of technology. Lanier founded VPL Research, the company noted at the beginning of this atlas as the first to sell VR goggles and gloves, and which went bankrupt at the start of the 90s. It is important to note that while immersive experiences (depending on the definition) have been around for decades, the actual terms most people use to describe them are relatively new.&lt;/p&gt;

&lt;p&gt;Bibliography:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://segaretro.org/Sega_VR" rel="noopener noreferrer"&gt;Sega VR&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Enjoy some 90s VR videos: &lt;a href="https://www.youtube.com/watch?v=Vw4Y0YPxsLQ" rel="noopener noreferrer"&gt;Virtuality VR LBE VR Game Promotional Video&lt;/a&gt; (1990s). &lt;a href="https://www.youtube.com/watch?v=dji9YiPZ4AM" rel="noopener noreferrer"&gt;Virtual Reality in the 90's&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;The &lt;a href="https://www.gmw3.com/2020/07/the-virtual-arena-blast-from-the-past-the-vr-1/" rel="noopener noreferrer"&gt;Virtual Arena&lt;/a&gt; – Blast from the Past: The VR-1 - GMW3&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>vr</category>
      <category>webxr</category>
      <category>xratlas</category>
      <category>programming</category>
    </item>
    <item>
      <title>My in-game assets and I : Virtual Assets, one of the coolest things about the Metaverse.</title>
      <dc:creator>Babatunde Fatai</dc:creator>
      <pubDate>Tue, 28 Dec 2021 08:58:08 +0000</pubDate>
      <link>https://dev.to/babatunde/my-in-game-assets-and-i-virtual-assets-one-of-the-coolest-things-about-the-metaverse-4bi3</link>
      <guid>https://dev.to/babatunde/my-in-game-assets-and-i-virtual-assets-one-of-the-coolest-things-about-the-metaverse-4bi3</guid>
      <description>&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/IdmurblgyRI"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;I couldn’t possibly call myself a warrior in real life, though I would very much like to, but in the virtual world, I am definitely a warrior, maybe not one of the best, but I definitely get by! &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;How many virtual assets do you own? I am not talking about Bitcoin or other cryptocurrencies this time, though they also count; I am referring to essential in-game assets such as skins, clothes, guns, avatars, virtual properties and more! If you are like me, probably a couple. I started playing Call of Duty: Mobile many years ago, and back then I never would have imagined myself purchasing virtual assets. To me, it seemed totally unreasonable to spend on stuff that would never be expressed in any physical way, that could not be held in my hands nor touched with my skin. But a lot has changed since then. I have come to appreciate the staggering similarities between virtual assets and physical assets and currencies: what is real isn’t necessarily what you can touch. It is all about value, be it fiat or crypto, as long as its value holds true and it performs its function!&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Call Of Duty mobile stats by Activeplayer:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;300 million+ total downloads&lt;/li&gt;
&lt;li&gt;54 million+ monthly active players as of Nov 2021 (stats are for the mobile version of the game only)&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;

&lt;p&gt;Within a few enjoyable years of playing CODm, I had racked up a lot of virtual assets, either by purchase or by battle, all of which I am happy about. One example is Alias - Battleworn, a formidable avatar that has helped me win a lot of battles.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe4n8czmgaqjymflqpuma.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe4n8czmgaqjymflqpuma.JPG" alt="Alias - Battleworn Image" width="795" height="379"&gt;&lt;/a&gt;&lt;br&gt;
Since I purchased her in Season 8 of CODm, she has remained one of my most prized possessions, alongside others won from countless bloody battles fought teaming up with random fighters from all over the globe against equally random fighters from around the world.&lt;/p&gt;

&lt;p&gt;I couldn’t possibly call myself a warrior in real life, though I would very much like to, but in the virtual world I am definitely a warrior; maybe not one of the best, but I definitely get by! These are some of the joys of gaming, and one of the crucial reasons gaming is always among the prime industries that lead and drive the adoption of new technologies, serving as an inspiration that pushes developers and gamers alike further into the gaming and XR (Extended Reality) development ecosystem! A good game/XR developer is usually also a game player, not just for the fun of it, but also for the social interaction and limitlessness that comes with it. At the very minimum, games such as these push adopters of new technology to actually interact with it in new ways. This could also be the main reason the gaming industry and gaming brands like COD and Fortnite are among the closest to enacting an actual Metaverse.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Metaverse Core Technologies&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I have covered quite a bit on the Metaverse in previous posts, so if you find any info missing or wish to explore more, please &lt;a href="https://dev.to/babatunde/exploring-the-concept-of-the-metaverse-and-what-it-could-mean-for-humans-5dbe"&gt;go over there&lt;/a&gt;. The term Metaverse is still being established, and various people define it slightly differently, but a large group agrees that Metaverses are typically powered by three core technologies: Blockchain, Web3 &amp;amp; mediums of access (e.g. XR, Web, mobile, etc.), each as essential as the others, yet also powerful and useful in its own right. Blockchain and the innovations born from it (such as NFTs and Smart Contracts) introduce trust to the Metaverse ecosystem; after all, a system without trust simply cannot be trusted, and cannot be seen as part of the collective future we all strive to be part of and are so focused on building! Ownership is important, and properties are essential, not just in the real world but also in the Metaverse. Being able to buy, store, use and exchange assets while trusting fully in the technology backing the process is essential, and blockchain technology provides that needed trust. Let’s explore the technology a bit:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Blockchain, the foundations for the new digital economy&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A blockchain is a type of shared database that differs from a typical database in the way it stores information: it stores data in blocks that are linked together via cryptography, making it difficult or impossible to change, hack, or cheat the system. It can be used to create a permanent, public, transparent ledger for compiling sales data, tracking digital use, and routing payments to content creators such as wireless users or musicians. In other words, blockchain is the technology used to lay the foundation blocks of this new digital economy.&lt;/p&gt;
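&lt;p&gt;To make the "blocks linked together via cryptography" idea concrete, here is a minimal, purely illustrative sketch (not any production blockchain; it omits consensus, networking and proof-of-work) of a hash-linked ledger. Each block's hash covers both its own contents and the previous block's hash, so tampering with any past record invalidates every later link:&lt;/p&gt;

```python
import hashlib
import json
import time

def make_block(index, data, prev_hash):
    """Build a block whose hash covers its contents and the previous block's hash."""
    block = {"index": index, "timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def is_valid(chain):
    """Valid only if every block still hashes to its stored value and links to its predecessor."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Build a tiny ledger of virtual-asset sales.
chain = [make_block(0, "genesis", "0" * 64)]
chain.append(make_block(1, {"sale": "skin #42", "price": 15}, chain[-1]["hash"]))
chain.append(make_block(2, {"sale": "emote #7", "price": 5}, chain[-1]["hash"]))

assert is_valid(chain)
chain[1]["data"]["price"] = 1   # tamper with history...
assert not is_valid(chain)      # ...and the chain no longer verifies
```

&lt;p&gt;Real blockchains layer distributed consensus on top of this linking, which is what removes the need to trust any single party.&lt;/p&gt;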

&lt;p&gt;&lt;strong&gt;Emotes &amp;amp; Virtual concerts: Fortnite demonstrating the future&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F997s8b1y94iiyhr4bkle.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F997s8b1y94iiyhr4bkle.JPG" alt="FortniteXTravis Image" width="800" height="444"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On the 23rd of April 2020, Fortnite, a much newer game than CODm, organized a virtual concert on its platform. In attendance were approximately 12 million+ players/attendees: people from all walks of life, spread across several continents, with different ethnicities and belief systems. An event of such scale would be almost impossible to organize in real life; it would be a logistical nightmare. This concert, like a few others Fortnite has organized, fully demonstrates the power of a Metaverse kind of world, one which isn’t about seclusion but about gathering and socializing at a scale the world has never seen before. It also introduces a new and interesting dimension to the interaction between artists and fans, something definitely worth exploring further.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Fortnite stats by Activeplayer:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;350 million+ downloads&lt;/li&gt;
&lt;li&gt;277 million+ monthly players as of Nov 2021&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;

&lt;p&gt;Travis Scott’s concert on Fortnite was said to have made the artist a whopping 20 million dollars, reportedly worth more than 10 nights of physical concerts, quite impressive for a virtual event that only lasted a few minutes. The large profit was driven by sales of in-game gear, V-Bucks (a virtual currency that can be used for in-game purchases) &amp;amp; emotes. Emotes are symbolic representations of emotion; in Fortnite and similar games &amp;amp; virtual spaces, they let players express themselves, whether for fun or to troll an opponent. It is very important to note that these virtual events, and the sales that preceded and followed them, are not about commerce but about expression, a core social dynamic, allowing artists, players, brands, developers, and really anybody to express themselves in new ways, through new mediums: an exciting present pointing to an even more exciting future at hand.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Virtual Assets the coolest thing about the Metaverse&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Virtual events like the one explored above are becoming a thing as many celebrities enter the space to leverage the opportunities, but events are not the only way virtual assets are sold. Established brands such as Nike, who already possess the experience of making and distributing cool physical products, have also found new ways to leverage the opportunities the Metaverse creates, drawing hugely on the perception and loyalty they built outside the virtual world. &lt;em&gt;Bob Greifeld&lt;/em&gt;, the former CEO of NASDAQ, put it brilliantly:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"[Blockchain] is the biggest opportunity set we can think of over the next decade or so."&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;And I totally agree with him. Nike once had to worry about sourcing quality raw materials for their shoes, and about the machines and people who could shape them into the high-end products many of us appreciate buying today. Now, leveraging the Metaverse, they make and distribute virtual brand items without those worries, while also getting the opportunity to reach a wider, untapped audience with less marketing, as they did recently with Fortnite. According to an announcement from "Fortnite" producer Epic Games, Nike’s Jordan Brand collaborated with "Fortnite" on a product integration that features new characters in the immensely popular online game wearing some of the athletic clothing brand’s famous footwear.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/IdmurblgyRI"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Video Credit: Fortnite Youtube channel&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;According to Bloomberg, the #FortniteXJumpman collaboration includes two new limited-edition skins (downloadable character versions that consumers pay for) wearing Air Force 1s in various hues, including the black and red, or "bred," coloring associated with the Chicago Bulls. Skins such as these on Fortnite are usually priced between $10 and $20, a great source of revenue for the already profitable brand. The partnership further shows how invaluable virtual asset ecosystems are becoming, especially when their value &amp;amp; uniqueness (intellectual property, because there is no risk of cheap knockoffs spreading in the ecosystem) is backed by a technology such as blockchain. An exciting future is unveiling right in front of our eyes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We should expect more, probably much more than we currently do, of the impact the Metaverse will have on our lives and businesses. It is also extremely advantageous that we can already see what the future holds from the kinds of activities ongoing: people are not just buying emotes and characters, they are buying virtual real estate and making long-term financial commitments. My next write-up on this will address how everyday people and medium-sized businesses should prepare themselves and their businesses for a future in which Metaverses exist, either fully or in a higher capacity than they currently do. I will also focus on the possible impact this may have on the African continent and how youths can strategically position themselves in preparation for the future. Until then, keep learning.&lt;/p&gt;

</description>
      <category>metaverse</category>
      <category>vr</category>
      <category>ar</category>
      <category>gaming</category>
    </item>
    <item>
      <title>Exploring the concept of the Metaverse and what it could mean for humans.</title>
      <dc:creator>Babatunde Fatai</dc:creator>
      <pubDate>Thu, 04 Nov 2021 09:39:33 +0000</pubDate>
      <link>https://dev.to/babatunde/exploring-the-concept-of-the-metaverse-and-what-it-could-mean-for-humans-5dbe</link>
      <guid>https://dev.to/babatunde/exploring-the-concept-of-the-metaverse-and-what-it-could-mean-for-humans-5dbe</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;This Metaverse is going to be far more pervasive and powerful than anything else. If one central company gains control of this, they will become more powerful than any government and be a god on Earth. (Tim Sweeney, Epic Games) &lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  What is the Metaverse?
&lt;/h2&gt;

&lt;p&gt;Before I begin, I would like to inform those of you who are well-versed in the Metaverse concept that this article may not be for you. The vast majority of people I have spoken to have no idea what the Metaverse is, and it is for them that I write. &lt;/p&gt;

&lt;p&gt;There are currently many definitions and speculations on what the Metaverse is and how it should be defined, for this article I will be sticking with a definition from the &lt;a href="https://www.matthewball.vc/all/forwardtothemetaverseprimer" rel="noopener noreferrer"&gt;Metaverse Primer&lt;/a&gt; by venture capitalist and writer Matthew Ball (who was previously head of strategy at Amazon Studios):&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The Metaverse is a massively scaled and interoperable network of real-time rendered 3D virtual worlds which can be experienced synchronously and persistently by an effectively unlimited number of users, and with continuity of data, such as identity, history, entitlements, objects, communications, and payments.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;To put it simply, the Metaverse is a virtual universe made up of connected experiences built to be cross-platform. That is, it isn’t accessible by VR headsets alone, but by a whole range of devices, including but not limited to computer systems, gaming consoles, mobile phones and XR hardware, without leaving out physical presence. &lt;/p&gt;

&lt;p&gt;The term Metaverse comes from Neal Stephenson’s 1992 sci-fi novel Snow Crash, where it described a VR successor to the internet. The novel’s vision influenced later works such as the popular Ready Player One film, which portrays one possibility of what the Metaverse could shape up to look like. It is important to note, however, that while the novel proposes VR as the Metaverse of the future, current trajectories and experts suggest otherwise. Many experts perceive the Metaverse as the successor of the mobile internet and 2D communications: rather than merely having access to the internet when virtually communicating with friends, the Metaverse would empower us to exist within the internet, leading to increased Social Presence. &lt;/p&gt;

&lt;h2&gt;
  
  
  Social Presence and the Metaverse
&lt;/h2&gt;

&lt;p&gt;Social presence is important because it goes beyond just communicating in real-time, which your phone and various meeting apps can do, to communicating and interacting as if you are physically together, with gestures, reactions, impressions, and interactions playing a very big role, &lt;a href="https://dev.to/babatunde/the-impact-of-social-presence-and-co-presence-on-virtual-and-augmented-reality-4b9h"&gt;my article&lt;/a&gt; about the impact of Social presence and Co-presence dives deeper into this.&lt;/p&gt;

&lt;p&gt;The concept of "presence" distinguishes the Internet from the Metaverse. Matthew Ball, who has written extensively on the subject, defined the Metaverse in 2020 as possessing seven qualities; the Metaverse must:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Be persistent – which is to say, it never “resets” or “pauses” or “ends”, it just continues indefinitely.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Be synchronous and live – even though pre-scheduled and self-contained events will happen, just as they do in “real life”, the Metaverse will be a living experience that exists consistently for everyone and in real-time&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Be without any cap to concurrent users, while also providing each user with an individual sense of “presence” – everyone can be a part of the Metaverse and participate in a specific event/place/activity together, at the same time and with individual agency.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Be a fully functioning economy – individuals and businesses will be able to create, own, invest, sell, and be rewarded for an incredibly wide range of “work” that produces “value” that is recognized by others.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Be an experience that spans both the digital and physical worlds, private and public networks/experiences, and open and closed platforms.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Offer unprecedented interoperability of data, digital items/assets, content, and so on across each of these experiences – your “Counter-Strike” gun skin, for example, could also be used to decorate a gun in Fortnite, or be gifted to a friend on/through Facebook. Similarly, a car designed for Rocket League (or even for Porsche’s website) could be brought over to work in Roblox. Today, the digital world basically acts as though it were a mall where every store used its own currency, required proprietary ID cards, had proprietary units of measurement for things like shoes or calories, and different dress codes, etc.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Be populated by “content” and “experiences” created and operated by an incredibly wide range of contributors, some of whom are independent individuals, while others might be informally organized groups or commercially-focused enterprises.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
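&lt;p&gt;Quality 6 above, interoperability, is easiest to appreciate with a concrete sketch. Purely as an illustration (no real platform exposes this schema today; every name and field below is hypothetical), an interoperable item could be a platform-neutral record that any world can serialise, transfer, and restore:&lt;/p&gt;

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class PortableAsset:
    """A platform-neutral record for an item that follows its owner
    across virtual worlds. All fields here are hypothetical."""
    asset_id: str   # globally unique identifier
    owner: str      # identity that persists across platforms
    kind: str       # e.g. "skin", "vehicle", "emote"
    origin: str     # world where the asset was first issued
    metadata: dict  # renderer-specific hints each world interprets

# The same record could be read by any world that speaks the shared schema.
skin = PortableAsset(
    asset_id="asset-0001",
    owner="player:babatunde",
    kind="skin",
    origin="counter-strike",
    metadata={"texture": "dragon-lore", "rarity": "covert"},
)

# Serialise in one "world" and restore in another.
wire_format = json.dumps(asdict(skin), sort_keys=True)
restored = PortableAsset(**json.loads(wire_format))
assert restored == skin  # the item round-trips losslessly between worlds
```

&lt;p&gt;The hard part is not the data format; it is getting every platform to agree on one schema and honour entitlements issued elsewhere, which is exactly the "mall where every store uses its own currency" problem Ball describes.&lt;/p&gt;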

&lt;h2&gt;
  
  
  Facebook’s (Meta) name change and effect.
&lt;/h2&gt;

&lt;p&gt;The Metaverse will introduce a new degree of freedom in social communication. No one person or brand can tell you what it will shape up to look like, although many companies are already shaping various aspects of its possible future. &lt;/p&gt;

&lt;p&gt;A few weeks before Facebook’s name change to Meta, Microsoft’s CEO Satya Nadella &lt;a href="https://www.youtube.com/watch?v=uS46IO_sKwc" rel="noopener noreferrer"&gt;endorsed the Metaverse&lt;/a&gt; as a strategic goal for Microsoft, talking about the use of both Azure Digital Twins and Azure IoT to implement Metaverse-like solutions. He was also quoted as saying:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"as the virtual and physical worlds converge the metaverse made up of digital twins, simulated environments, and mixed reality, is emerging as a first-class platform."&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Microsoft later used the term “enterprise metaverse” at a subsequent event, making it clear that Facebook is not the only company with its eyes on the ball, nor the only one that understands the opportunities this new and emerging landscape will provide.&lt;/p&gt;

&lt;p&gt;The truth is we may not fully know what the Metaverse of the future will look like, the first reason being that it is at a very early stage. It is obvious that Facebook’s name change, and its work in the hardware, software and gaming ecosystems, has stirred things up. Facebook’s ambition is bold, and not just in the name change: the Oculus Quest 2 it released has surpassed expectations, with over 1.8 million units sold as of 2020, making the company one of the top deliverers of immersive experiences. &lt;/p&gt;

&lt;p&gt;With over 3 billion users of Facebook products worldwide, a dedication to research and development, a maturing immersive ecosystem and more, their goal of creating a shared reality for more than half of the world seems feasible. However, that reality will almost certainly coexist with multiple others; it is very important to note that there are thousands of brands pushing forward the establishment of the Metaverse. Examples include &lt;a href="https://www.vaulthill.io/" rel="noopener noreferrer"&gt;Vault Hill&lt;/a&gt;, a decentralised virtual reality (VR) and augmented reality (AR) world where users can interact with computer-generated imagery (CGI) and with other users, and &lt;a href="https://decentraland.org/" rel="noopener noreferrer"&gt;Decentraland&lt;/a&gt;, a decentralised 3D virtual reality platform built on the Ethereum blockchain and opened to the public in February 2020. There are also many more brands building connected experiences and technologies that will be instrumental in shaping what the Metaverse will look like and how we will interact with it. Facebook may have changed its name to Meta, but we all hold the brush, and if you are reading this, you must have noticed that the painting on the canvas has begun.&lt;/p&gt;

&lt;h2&gt;
  
  
  What the Metaverse is not
&lt;/h2&gt;

&lt;p&gt;Metaverse is a relatively new term in terms of going mainstream. Because of the relationship and similarities it shares with many experiential technologies, it is quite understandable that confusion may arise in trying to understand or define what it is or isn’t. This is why I will go through a few major misconceptions and clarify them to the best of my knowledge.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The Metaverse is not a game&lt;br&gt;
The Metaverse goes far beyond gaming, though the gaming ecosystem is one of the major drivers of the concept and probably one of the closest to implementing a working prototype. You may not have made the connection if you are not a gamer, but the gaming ecosystem has long been exhibiting symptoms of a shift towards a Metaverse-like state, and Epic Games has been very vocal about this. Their popular multiplayer game Fortnite demonstrates very strong attributes of a Metaverse; admittedly it still has a long way to go, but it has served as a testbed for the Metaverse concept. &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Taking a close look at where it is now: players designing their own avatars from a variety of skins spanning a wide range of IPs, the ability to purchase products that only exist in virtual spaces, an in-game-exclusive currency, interaction between players, events such as concerts and movies, and so on. Fortnite may possess and be integrating attributes similar to a Metaverse, but it still has much to do in implementing the seven main qualities that define one, as stated above. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Sidenote: I think bridging these sorts of highly interactive and immersive gaming platforms, for example making it easy for players to be able to move their characters, values and more across platforms may further advance the shift towards the Metaverse. There are some things working for and against the establishment of a true Metaverse and we will discuss them down the line.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyhx33wgfzhenf28xf188.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyhx33wgfzhenf28xf188.jpg" alt="Fortnite " width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The Metaverse is not Virtual Reality&lt;br&gt;
It may be a little tricky to differentiate between the Metaverse and a virtual reality platform, especially when the name originates from a fictional VR metaverse concept, but just as mobile phones, computer hardware and more are devices used to access the metaverse so are VR headsets, they provide a way to experience and interact with virtual worlds, but they are by no means the only way to access the Metaverse.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The Metaverse is not a virtual world/space&lt;br&gt;
Virtual worlds have been in existence for a while now, a game such as GTA possesses AI-driven characters in a virtual world which even accepts human inputs, there are also virtual worlds that are populated by humans that still do not qualify as Metaverses. Things like gaming, shopping, attending classes and meetings, interacting with friends... etc. which many virtual spaces currently provide will only be parts of the features of a fully functional Metaverse.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  My Top 3 Predictions on the Future of Metaverse
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;It will be bigger in terms of population and economy than many nations on earth.&lt;/li&gt;
&lt;li&gt;It will not replace physical reality and activities because even the physical is also seen by many experts as a component of the Metaverse. I predict, however, that having a Metaverse will have a positive impact on the environment, individuals and brands that utilize it properly.&lt;/li&gt;
&lt;li&gt;I believe some companies may try to claim significant parts of the Metaverse to have strategic control, making accessibility harder further damaging the goal of making it decentralized, cross-platform and open. (I hope not but predict so).&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  What is working for or against the establishment of a true Metaverse?
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Technological limitations: Without ultrafast, low-latency internet, the Metaverse will never attain its full potential; millions of people accessing it with various devices and living in the virtual world from anywhere, at any time, in real time is not an easy thing to achieve. Today's third- and fourth-generation (3G/4G) connections can manage streaming multiplayer games like Call of Duty and Fortnite, but they cannot yet handle hundreds of simultaneous transmissions of time-sensitive data to the degree needed for a fully functional Metaverse. This is why the development of 5G and 6G networks will go a long way toward achieving the goal.&lt;/li&gt;
&lt;li&gt;Governmental regulations: A growing number of people are &lt;a href="https://www.bloomberg.com/news/articles/2021-03-19/virtual-land-prices-are-booming-and-now-there-s-a-fund-for-that" rel="noopener noreferrer"&gt;buying virtual lands&lt;/a&gt; and spending their money on virtual properties and services. Entitlements, properties and payment platforms, among many other things, will exist within the Metaverse, and we need to start thinking seriously about how they will be regulated. Will they be regulated by private institutions, or will governments want a hand in it? If they do, what would this mean for accessibility and adoption? Would the Metaverse truly be decentralized?&lt;/li&gt;
&lt;li&gt;Technological advancements: Advancements in technologies like blockchain, cryptocurrencies, NFTs and Extended Reality hardware (e.g. VR headsets) have been a strong driving force in making the Metaverse concept catch on. Many of the new platforms are based on or utilize some of these technologies in some way, allowing a new kind of decentralised digital asset to be built, owned and monetised.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;We should expect more, probably much more than we currently do, of the impact the Metaverse will have on our lives and businesses. It is also extremely advantageous that large brands with experience designing, developing and shipping products are putting their money and workforce into this. My next write-up on this will address how everyday people and medium-sized businesses should prepare themselves and their businesses for a future in which Metaverses exist, either fully or in a greater capacity than they currently do. I will also focus on the possible impact this may have on the African continent and how youths can strategically position themselves in preparation for the future. Until then, keep learning.&lt;/p&gt;

</description>
      <category>metaverse</category>
      <category>socialpresence</category>
      <category>facebook</category>
      <category>beginners</category>
    </item>
    <item>
      <title>The impact of Social presence and Co-presence on Virtual and Augmented reality.</title>
      <dc:creator>Babatunde Fatai</dc:creator>
      <pubDate>Tue, 26 Oct 2021 09:48:56 +0000</pubDate>
      <link>https://dev.to/babatunde/the-impact-of-social-presence-and-co-presence-on-virtual-and-augmented-reality-4b9h</link>
      <guid>https://dev.to/babatunde/the-impact-of-social-presence-and-co-presence-on-virtual-and-augmented-reality-4b9h</guid>
      <description>&lt;p&gt;&lt;strong&gt;What does Social presence mean?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“Virtual Reality is really a new communication platform. By feeling truly present, you can share unbounded spaces and experiences with the people in your life. Imagine sharing not just moments with your friends online, but entire experiences and adventures.”&lt;br&gt;
-Mark Zuckerberg&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Have you ever wondered what it would feel like to sit on your couch with your co-worker for a conversation over coffee, even if they are thousands of miles away on another continent? Not just hearing their voice nor seeing their face on your screen like you currently can on many platforms, but having them, or to be more precise, a virtual representation of them right there on your couch, as you converse? Virtual and Augmented reality (AR/VR) immersive technologies have made that possible for quite some time now.&lt;/p&gt;

&lt;p&gt;You must have heard or read a thousand and one articles on what AR and VR are, so I won’t bore you with many historical details or definitions; check HERE for a brief on those. I am more interested in their current advancements and prospects: how they have started shaping industries and creating an ecosystem that may one day be bigger than most countries on earth, existing purely in the virtual world.&lt;/p&gt;

&lt;p&gt;Comparing this ecosystem to a platform like Zoom is highly short-sighted, because the vision and current propagation go beyond attending meetings. People are willing to purchase virtual land and invest insanely in abstract properties and items, as in the case of Genesis City, a plot of virtual land the size of Owerri city, with rumours of 1,100-square-foot plots going for as much as $200,000. People are also willing to reinvent their images by becoming whoever or whatever they want just with the snap of their fingers: a true world without limitations except those they choose for themselves. It is indeed a fantastic thing to have expressed the last few sentences without once thinking I was describing a future to come or science fiction, but reality.&lt;/p&gt;

&lt;p&gt;Social presence refers to the degree to which one perceives the presence of participants in a communication. Social presence theory argues that media differ in their ability to convey the psychological perception that other people are physically present, due to the different capacities of media to transmit visual and verbal cues (e.g., physical distance, gaze, postures, facial expressions, voice intonation, and so on) (Fabio et al., in Advances in Computers, 2010). To put it simply, it is the ability to feel like you are physically present with another person in a virtual space; a (true/deep) sense of presence.&lt;/p&gt;

&lt;p&gt;Social presence is important because it goes beyond just communicating in real-time, which your phone and various meeting apps can do, to communicating and interacting as if you are physically together, with gestures, reactions, impressions, and interactions playing a very big role.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Attributes that drive Social presence&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Seeing reality as a construct made up of tiny parts, such as touch, smell, sound and so on, that come together to form the real has helped in replicating it; whatever is real (not abstract, like joy or love) can be replicated, as long as it can be defined. Hence, one can genuinely piece together the sets of actions, events, gestures and reactions that make a physical meeting more favourable/immersive than a virtual one (say, on Zoom).&lt;/p&gt;

&lt;p&gt;If the attributes that make a physical meeting different from a virtual one can be replicated (as they already are being), then existing purely in a virtual world with a full sense of presence isn’t much of a problem. Many would think that, no matter how well people can be represented virtually through avatars and extremely immersive gadgets, things like smell and touch cannot be. But even that line of thought is archaic, because odours and touch, sensory immersion, can be and have already been replicated.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Touch and Feel (Teslasuit &amp;amp; Tactsuit)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One key attribute that makes a physical interaction with friends and family more memorable than a virtual one is the sense of touch. One cannot deny the presence of another if they can interact with each other by touching.&lt;/p&gt;

&lt;p&gt;It doesn’t even have to be intense continual touching. A simple handshake speaks volumes. A warm hug decimates any comparison with a Zoom call. Also, when interacting in virtual environments, feeling the gunshots (to a mild degree), punches to the body, tap on the face with the aid of haptic face covers, and the warmth of a handshake significantly changes things.&lt;/p&gt;

&lt;p&gt;This is what the likes of &lt;a href="https://teslasuit.io/" rel="noopener noreferrer"&gt;Teslasuit&lt;/a&gt; &amp;amp; &lt;a href="https://www.bhaptics.com/tactsuit/tactsuit-x40" rel="noopener noreferrer"&gt;Tactsuit&lt;/a&gt; bring to the furthering of social presence in the virtual reality space: full-body wireless haptic suits with multiple haptic points capable of simulating a range of physical sensations all over the body, all in a bid to increase immersion and improve human performance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe9g2awmzs1ad59iijtgy.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe9g2awmzs1ad59iijtgy.jpg" alt="Teslasuit Image" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;They are not the only players in the field, and neither is touch the only part of sensory immersion. Still, it becomes increasingly harder to deny the presence of an individual who can punch you right in the face, or rub your shoulders lightly during a conversation, even if they are thousands of miles away.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Smell (Nosulus Rift &amp;amp; Feelreal)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Although it may not seem so at first, olfaction plays a significant role in regulating a wide variety of human behaviours. The ability to smell undoubtedly plays a vital role in affirming our reality.&lt;/p&gt;

&lt;p&gt;Hence, it isn’t a surprise that devices like the Nosulus Rift and Feelreal already exist to bring smell directly to your nose, or, as a Ubisoft spokesperson puts it: “The Nosulus Rift is a fully functional mask using sensors activated through inaudible sound waves in the in-game fart sound, every time the player makes use of his nefarious [fart] powers, Each time the sensors are activated, they trigger the odour’s puff, meticulously and without mercy!”.&lt;/p&gt;

&lt;p&gt;Yup, not only can you perceive the aromatic or pungent smell, but you would also not be spared the melodic sound of your friend letting one loose in your virtual space.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo72pdn2zxna7mmfgrrps.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo72pdn2zxna7mmfgrrps.jpg" alt="Nosulus Rift" width="800" height="454"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://feelreal.com/" rel="noopener noreferrer"&gt;Feelreal&lt;/a&gt; upped the game with their scent generator, which holds an easily replaceable cartridge containing nine individual aroma capsules.&lt;/p&gt;

&lt;p&gt;You can choose and combine any of the 255 scents available in their store, installing and changing them depending on the type of VR experience. The device caught my attention mainly because of its compatibility with existing VR devices and its uses beyond smell.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnll91jj0fmq5nvuef3ra.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnll91jj0fmq5nvuef3ra.png" alt="Feelreal" width="800" height="259"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Taste - Eating in virtual reality&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Two things are clear about how taste is approached in virtual reality and how it contributes to social presence. First, the environment matters in how we taste our food; hence virtual reality can be used to significantly improve one’s perception of how food tastes. An important example of this was a study carried out at Cornell CALS:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;About 50 panelists who used virtual reality headsets as they ate were given three identical samples of blue cheese. The study participants were virtually placed in a standard sensory booth, a pleasant park bench and the Cornell cow barn to see custom-recorded 360-degree videos&lt;br&gt;
The panelists were unaware that the cheese samples were identical, and rated the pungency of the blue cheese significantly higher in the cow barn setting than in the sensory booth or the virtual park bench.&lt;br&gt;
To control for the pungency results, panelists also rated the saltiness of the three samples – and researchers found there was no statistical difference among them.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;“When we eat, we perceive not only just the taste and aroma of foods, we get sensory input from our surroundings – our eyes, ears, even our memories about surroundings,” (Robin Dando, associate professor of food science and senior author of the study).&lt;/p&gt;

&lt;p&gt;Secondly, taste can be simulated and stimulated (though experiments seem to be in the early stages). Some experiments with “virtual food” use electronics to emulate the taste and feel of the real thing, even when there’s nothing in your mouth. At the same time, other experiments focus on thermal stimulation of taste. This tech could add new sensory inputs to virtual reality or augment real-world dining experiences, especially for people with restricted diets or health issues that affect their ability to eat.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm7b8pj1nrlod8yrww80z.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm7b8pj1nrlod8yrww80z.jpg" alt="Unreal: this tastes delicious" width="800" height="529"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Hence, if you can see your friend (or an avatar/version of them they choose to express), touch them, smell them (and their tremendous fart!), interact with them in real time and share a few meals while at it, all on a “virtual platform”, how long do you think it would take you to forget that you are actually in a virtual world and not a real one? Therein lies the fantastic power of social presence: an absolute sense of being with them, being there for them when needed, and making and sharing memories with them even if you are thousands of miles apart.&lt;/p&gt;

&lt;p&gt;I implore you not to get lost in definitions or the limitations of the human imagination. When adamant people stand their ground on why the virtual can never be as “real” as the real world, you should understand that they either haven’t experienced enough, seen enough, or believed enough in our collective intelligence to pull this off. It has been done (though there is still considerable room for improvement), it is being done, and you can be part of it, at the very minimum, as an experiencer.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The importance of social co-presence&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If you’re watching someone else use VR, it’s hard to tell what’s going on and what they’re seeing. And if you’re in VR with someone else, there aren’t easy ways to see their facial expressions without an avatar representation.&lt;/p&gt;

&lt;p&gt;Many people complain about how much virtual reality isolates its users from the environment and those around them. This is why social co-presence is essential. How do we maintain the seamless social connection between real and virtual worlds, especially between people who may or may not be in the same AR/VR experiences?&lt;/p&gt;

&lt;p&gt;Many companies are tackling this in various ways. Immersion in the virtual world is essential, as is collaboration in the real world, even in a virtual experience.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Google headset removal and Oculus reverse passthrough&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;It is one thing to feel the presence of your friend in a virtual call, and another to see their expressions as they undergo the experience; the headset blocking the face is a significant obstacle to attaining social co-presence. Using a collection of techniques, Google Machine Perception researchers, in collaboration with Daydream Labs and YouTube Spaces, aim to solve this problem by digitally recreating your face in place of the VR headset blocking it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4td5lgjwnan3qb8s2ltb.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4td5lgjwnan3qb8s2ltb.gif" alt="Daydream Labs" width="560" height="315"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Facebook, on the other hand, took a different approach with their reverse passthrough, which essentially shows a render of the VR user’s eyes on 3D displays at the front of the headset.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6rgt71ixgh8241zutg4z.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6rgt71ixgh8241zutg4z.jpg" alt="Reverse passthrough" width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So both Google’s and Facebook’s approaches emphasize the importance of eye contact, even if you have a four-pound VR headset on and are in a virtual experience. Isolation, which seemed like an inherent downside of immersive technologies, is not an option where social presence and co-presence are concerned.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Many believe the future of collaboration and of social experiences is Virtual and Augmented reality technology, and when these technologies work well they deliver social presence.&lt;/p&gt;

&lt;p&gt;Many parts of these technologies need to come together to deliver a flawless experience indistinguishable from reality, and most of them are still in their infancy. Even so, the level of innovation going on in the space is mind-blowing, with so many brands working not just on hardware but also on improving the software. Facebook’s Oculus, for one, is dominating the VR consumer space and hinting at augmented reality Ray-Ban smart glasses, something many are most excited about.&lt;/p&gt;

&lt;p&gt;There are indeed many players worth mentioning aside from Oculus, such as Apple, Snap, Supernatural, Valve, Qualcomm, Spatial, Unity and more; some are focused on building the right platforms to deliver these social experiences, while others focus on hardware and software. Together they are shaping the social future, and in our own little ways as developers and enthusiasts, we are shaping the social experience of the future too. I do not see us stopping anytime soon.&lt;/p&gt;

&lt;p&gt;Note:&lt;/p&gt;

&lt;p&gt;Please &lt;a href="https://anchor.fm/the-informations-411/episodes/BONUS-EPISODE--Mark-Zuckerberg-on-the-Future-of-AR-and-VR-ervthr/a-a4s4ba7" rel="noopener noreferrer"&gt;CLICK HERE&lt;/a&gt; to listen to Zuckerberg describe the future of ARVR from Facebook’s perspective (quite amazing and an inspiration for this piece).&lt;/p&gt;

</description>
      <category>virtualreality</category>
      <category>augmentedreality</category>
      <category>beginners</category>
      <category>socialpresence</category>
    </item>
    <item>
      <title>A look at WebXR and its frameworks as an important future of XR technology.</title>
      <dc:creator>Babatunde Fatai</dc:creator>
      <pubDate>Sun, 24 Oct 2021 08:42:55 +0000</pubDate>
      <link>https://dev.to/babatunde/a-look-at-webxr-and-its-frameworks-as-an-important-future-of-xr-technology-5g0h</link>
      <guid>https://dev.to/babatunde/a-look-at-webxr-and-its-frameworks-as-an-important-future-of-xr-technology-5g0h</guid>
      <description>&lt;p&gt;&lt;strong&gt;What is covered in this article?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this article, we'll look into an important part of the future of Extended Reality (XR) technology: WebXR. This will be a light overview for developers, newbies and enthusiasts alike who wish to explore the side of XR focused on creating 3D, VR and AR experiences on the web, thereby making XR accessible via web browsers such as Chrome and Firefox, reducing the need for an expensive VR headset and increasing accessibility. We'll also look into two popular WebXR frameworks, A-Frame and Babylon.js, see some advantages they offer and how easy it is to get started with them, with a few key details.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Quick definitions:&lt;/strong&gt; &lt;em&gt;skip ahead if you must&lt;/em&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Glitch: This platform is a lifesaver; it provides a free online code editor as well as website deployment and hosting. As you'll find out in the course of this article, Glitch allows you to remix (i.e. create a copy of) existing projects and make them your own, so you have something to start with. Examples made in this article are available for you to experiment with on Glitch. Enjoy.&lt;/p&gt;

&lt;p&gt;Gltf/Glb: glTF is short for GL (Graphics Library) Transmission Format; it is a standard file format for three-dimensional scenes and models. A glTF file is usually the best choice for WebXR compared to FBX, OBJ or other similar 3D file formats.&lt;/p&gt;

&lt;p&gt;WebGL: This is a JavaScript API (application programming interface) for rendering interactive 2D and 3D graphics within any compatible web browser without the use of plug-ins. Many useful frameworks have been built on it.&lt;/p&gt;

&lt;p&gt;Three.js: A framework built on top of WebGL which makes it easier to create 3D graphics in the browser, it uses a canvas + WebGL to display the 3D scene.&lt;/p&gt;

&lt;p&gt;Virtual Reality (VR): The use of computer technology to create a simulated environment is known as virtual reality (VR). In contrast to typical user interfaces, virtual reality immerses the user in an experience.&lt;/p&gt;

&lt;p&gt;Augmented reality (AR): Snapchat and Instagram filters rely on augmented reality technology. AR is defined as enhancing the physical world by incorporating digital elements in real time to create a 3D experience.&lt;/p&gt;

&lt;p&gt;Extended reality (XR): XR is an umbrella term for all immersive technologies (AR/VR/MR).&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;br&gt;
Before we go on, meet Jaimee, a cool seagull enjoying his music and having his fun after being loaded as a glTF model (see definition above) from the Babylon.js mesh library into a web browser using A-Frame, a nifty framework for building virtual experiences on the web. To see and interact with Jaimee, &lt;a href="http://aframe-hello-seagull.glitch.me/" rel="noopener noreferrer"&gt;click this&lt;/a&gt;. &lt;a href="https://glitch.com/edit/#!/aframe-hello-seagull" rel="noopener noreferrer"&gt;HERE&lt;/a&gt; is the code, fully available to you on Glitch, the free online code editor, for you to experiment with.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcugnm6uiohals5avjeij.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcugnm6uiohals5avjeij.JPG" alt="Cool Jaimee" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;
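A scene like Jaimee's can be put together with only a few lines of markup. The sketch below is illustrative rather than the article's actual Glitch code: the A-Frame version is simply one that was current at the time, and the model path `models/seagull.glb` is a hypothetical placeholder for whichever glTF/GLB file you use.

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- A-Frame is loaded from its CDN; pin whichever version you test against -->
    <script src="https://aframe.io/releases/1.2.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- Declare the model once as an asset so it is preloaded before use -->
      <a-assets>
        <a-asset-item id="seagull" src="models/seagull.glb"></a-asset-item>
      </a-assets>
      <!-- Place the glTF model in front of the default camera -->
      <a-entity gltf-model="#seagull" position="0 1 -3"></a-entity>
      <a-sky color="#87CEEB"></a-sky>
    </a-scene>
  </body>
</html>
```

Opening this file in a WebXR-capable browser renders the model immediately; no build step or install is needed, which is exactly the appeal described above.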

&lt;p&gt;Extended reality technology is on a sure path to becoming one of the most important tools for navigating the daily, personal and professional parts of our lives. It is therefore no surprise that the iPhone is an AR &lt;a href="https://www.cnet.com/tech/mobile/ar-is-alive-and-well-on-the-iphone-and-apple-augmented-reality-is-getting-better-fast/" rel="noopener noreferrer"&gt;powerhouse&lt;/a&gt; in people’s pockets: powerful processors and chips, spatial audio, and a LiDAR sensor that scans spaces and can create 3D maps are among the many capabilities carved into the device, more than the average user is aware of or ever uses. This makes many wonder why Apple keeps pushing the power and capabilities of its devices. I am sure there are many reasons, both simple (a bid to stand out) and complex, but what they all result in is the availability of XR technology to as many people as possible. XR isn’t always about big headsets or smart glasses; XR experiences can be, and are being, delivered via mobile devices and web browsers.&lt;/p&gt;

&lt;p&gt;As a developer and strong advocate of XR, my biggest dilemma has been how to improve accessibility for XR, especially on the African continent. XR shouldn’t be just for those who can afford or have access to VR headsets, nor just for those who work for companies rich enough to afford the HoloLens; everyone should have access to the technology and be exposed to its impact, be it in learning, gaming or performing daily activities. This is why I find WebXR interesting and worth exploring. With powerful frameworks like Babylon.js, A-Frame, Three.js and many more granting the average person or developer the ability to deliver XR experiences in web browsers, a substantial number of people who may not have the funds to get headsets or afford the latest iPhone can still access XR experiences via their web browsers. WebXR has shown itself to be an integral part of XR technology, especially when it comes to adoption.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Harnessing the power of your web browsers&lt;/strong&gt;&lt;br&gt;
WebXR harnesses the power of the web along with the unification of XR realities under one philosophical tent, making it easier to build interactive environments, immersive 3D art, VR tools and more, compatible across browsers, operating systems and devices. The goal of the WebXR API is to allow the rendering of XR content in browsers; it doesn't care how the content is created or what device is used to view it, so developers can utilize whatever libraries best fit their needs and build on top of them. You do not necessarily need to be familiar with C# or C++, which VR developers used to need to build projects in Unity and Unreal; with WebXR, a basic knowledge of web development tools (HTML, CSS and JS) is enough to get started.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://developer.mozilla.org/en-US/docs/Web/API/WebXR_Device_API" rel="noopener noreferrer"&gt;WebXR Device API&lt;/a&gt; is the primary conduit by which developers can interact with immersive headsets, AR eyeglasses and AR-enabled smartphones. Major web browsers have been focused on integrating XR features into their ecosystems; browser support has sped up adoption and encouraged the building of new and more powerful APIs to support XR. Browsers like Chrome, Microsoft Edge and Firefox now lead the way, with WebXR support added to their most recent versions, making development and testing extremely easy. This article is part of a series: here I explain what WebXR is and provide helpful resources for getting started with it; subsequent articles will go on to creating projects.&lt;/p&gt;
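As a minimal sketch of how a page talks to the WebXR Device API, the snippet below feature-detects WebXR and checks whether the current browser and device can offer an immersive VR session; the log messages are illustrative, and the actual session request would be wired to a button elsewhere on the page:

```html
<script>
  // navigator.xr is only defined in WebXR-capable browsers.
  if (navigator.xr) {
    // isSessionSupported() returns a Promise resolving to a boolean.
    navigator.xr.isSessionSupported('immersive-vr').then((supported) => {
      if (supported) {
        // requestSession() must be called from a user gesture (e.g. a click),
        // so here we would enable an "Enter VR" button rather than start one.
        console.log('Immersive VR is available on this device/browser.');
      } else {
        console.log('WebXR is present, but immersive VR is not supported.');
      }
    });
  } else {
    console.log('WebXR is not available; fall back to a flat 3D view.');
  }
</script>
```

Frameworks like A-Frame and Babylon.js perform this kind of detection for you, which is part of why they make WebXR development so approachable.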

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F98iy7tovg0w4dpdeiph9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F98iy7tovg0w4dpdeiph9.png" alt="WebXR frameworks" width="800" height="286"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Choosing the right Framework for you - Tools for developing WebXR apps&lt;/strong&gt;&lt;br&gt;
It is a sensible rule that you don't have to start from scratch when you have tools to assist you with your tasks. Frameworks are pieces of software that developers create and utilize to make building applications as easy as possible. In the WebXR context, frameworks readily available for building immersive experiences include, in no particular order, A-Frame, Three.js, Babylon.js, Model-viewer, PlayCanvas, Godot and more; game engines such as Unity and Unreal also support WebXR development, although to a limited but growing capacity. Choosing the right tool should be based on the experience and knowledge you already have, the sort of immersive experience you wish to design, and the framework that has the right components to meet your needs. For example, it would be a good idea for those with little to no programming knowledge, or with just a base knowledge of HTML and CSS, to design experiences with a simple tool like A-Frame, while those with very deep programming knowledge could use Three.js, which provides far more robust features and flexibility than A-Frame. If you are already a Unity3D developer, it might also be a good idea to pick a framework with Unity support, so you can leverage past knowledge and experience in your immersive-design journey. One way or another, it is important to remember that WebXR is growing fast, with new and enthusiastic developers joining the ecosystem daily and enough documentation and examples available to get just about anyone started, irrespective of previous experience. For this article I will focus on just two frameworks, A-Frame and Babylon.js. Enjoy!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Passing through A-frame&lt;/strong&gt;&lt;br&gt;
Before we get into it, the image below is a screenshot of the first scene I made with A-Frame. As a big fan of space, I decided to recreate the earth as realistically as I could, with textures and a fantastic space skybox. &lt;a href="https://aframe-hello-earth.glitch.me/" rel="noopener noreferrer"&gt;Click this&lt;/a&gt; to see how the journey turned out. &lt;a href="https://glitch.com/edit/#!/aframe-hello-earth?path=index.html%3A15%3A36" rel="noopener noreferrer"&gt;HERE&lt;/a&gt; is the code on Glitch for you to experiment with.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx9q9ayu1vi6bzoln2rqc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx9q9ayu1vi6bzoln2rqc.png" alt="Earth with aframe" width="800" height="468"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://aframe.io/" rel="noopener noreferrer"&gt;A-frame&lt;/a&gt; is an open-source web framework for creating virtual reality experiences on the web maintained by Supermedium and Google developers. Being based on top of HTML, A-Frame is accessible to everyone because HTML is easy to read, understand, and copy-and-paste allowing web developers, VR enthusiasts, artists, designers, educators and kids to use HTML to construct 3D and WebVR environments. In other words, A-frame can be developed from a plain HTML file without having to install anything, right there on your browser. With a simple script editor like Glitch, you can build an XR environment, create beaches filled with sand, construct celestial bodies to mimic the solar system… etc. The A-frame library not only supports the rendering of 3D images, objects, and models, it also includes event handling scripting. Gaze events, for example, can be handled to detect when a user is staring at a specific object. You may move parts around, activate physics for items to bounce off of one another, and even integrate 3D spatial sound (sound effects that trigger and get louder/softer as you get closer to certain objects).  &lt;/p&gt;

&lt;p&gt;A-Frame is simply one of the easiest tools to start with. The example above shows my first attempt at making an approximate model of the earth, with the right texture and skybox (space-like background) to fit. I made it when I was just getting introduced to the framework; I made many mistakes, and I made sure I made them early and learnt from them. Practical learning means observing and doing things yourself, and this project taught me a lot about importing files into A-Frame, assigning textures and more. A-Frame's readily available fundamental components include geometries, materials, lighting, animations, models, raycasters, shadows, positional/spatial audio, text, and controllers for most major headsets; these definitely get you started quickly. Hundreds of community components, covering environments, state, particle systems, physics, multi-user experiences, oceans, teleportation and more, are also available for you to learn from and use in your projects.&lt;/p&gt;
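&lt;p&gt;As an illustration of how far a single community component can take you, the snippet below uses the popular aframe-environment-component to generate a whole procedural scene from one attribute. The version numbers in the URLs are simply ones I have seen in use; check for the latest releases.&lt;/p&gt;

```html
&lt;!-- Load A-Frame, then the community environment component --&gt;
&lt;script src="https://aframe.io/releases/1.4.0/aframe.min.js"&gt;&lt;/script&gt;
&lt;script src="https://unpkg.com/aframe-environment-component@1.3.1/dist/aframe-environment-component.min.js"&gt;&lt;/script&gt;

&lt;a-scene&gt;
  &lt;!-- One attribute generates a full environment: ground, sky, lighting and props --&gt;
  &lt;a-entity environment="preset: forest; dressingAmount: 100"&gt;&lt;/a-entity&gt;
&lt;/a-scene&gt;
```

&lt;p&gt;Swapping the preset name (forest, egypt, checkerboard and others) swaps the entire scene, which is exactly the kind of head start these community components give you.&lt;/p&gt;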

&lt;p&gt;&lt;strong&gt;Why consider using A-Frame on a project?&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;It is easy to build with and test, especially on a hosted editor like Glitch.&lt;/li&gt;
&lt;li&gt;There are many examples and resources to remix and build from.&lt;/li&gt;
&lt;li&gt;It has a huge and supportive community.&lt;/li&gt;
&lt;li&gt;It has a lot of learning resources.&lt;/li&gt;
&lt;li&gt;It has Unity support (unity-to-aframe), which I tried but didn't stick around long enough to learn in depth.&lt;/li&gt;
&lt;li&gt;It has a built-in inspector that will feel familiar to Unity and Maya users, allowing more flexible and agile development.&lt;/li&gt;
&lt;li&gt;Cool projects built with A-Frame: &lt;a href="https://webvr.soundboxing.co/" rel="noopener noreferrer"&gt;SoundBoxing WebVR&lt;/a&gt;, &lt;a href="https://hubs.mozilla.com/SXo4ego/lime-automatic-cosmos" rel="noopener noreferrer"&gt;Hubs by Mozilla&lt;/a&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;To put it simply, with A-Frame you can build and deploy an XR experience on the web within minutes. I guess this is why it was the first tool recommended to me online when I wanted to begin my WebXR journey. I loved the simplicity of the framework and would recommend it for fast prototyping, although I believe a lot still has to improve in both the framework and its community.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;On to Babylon.js&lt;/strong&gt;&lt;br&gt;
I created the Hell on Set scene below, drawing inspiration from a similar example on the Babylon.js Playground. See how cool Babylon.js is? &lt;a href="http://hell-on-set.glitch.me/" rel="noopener noreferrer"&gt;Click this&lt;/a&gt; to see how the journey turned out. &lt;a href="https://glitch.com/edit/#!/hell-on-set?path=index.html%3A15%3A19" rel="noopener noreferrer"&gt;HERE&lt;/a&gt; is the code on Glitch for you to experiment with.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F95ghgyzobdwh17yw9gyk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F95ghgyzobdwh17yw9gyk.png" alt="Hell-on-Set" width="800" height="464"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After spending a few weeks with &lt;a href="https://www.babylonjs.com/" rel="noopener noreferrer"&gt;Babylon.js&lt;/a&gt;, I have to confess that I have become a very big fan, and an even bigger fan of its community and extremely powerful set of tools, so forgive me my bias. Babylon.js was first released in 2013 under the Microsoft Public License, having been developed as a side project by two Microsoft employees, David Catuhe and David Rousset, with the help of artist Michel Rousseau, as a 3D game engine. Since then it has worked its way into the hearts of many developers, becoming one of the most popular 3D game engines for the web. Being a robust 3D library, it provides very useful built-in tools such as the &lt;a href="https://sandbox.babylonjs.com/" rel="noopener noreferrer"&gt;Sandbox&lt;/a&gt;, node editors, particle effects and the popular Playground, which help you implement common 3D functionality in efficient and accurate ways. It is written in TypeScript and built on WebGL and JavaScript.&lt;/p&gt;

&lt;p&gt;Babylon.js has enough resources, documentation, examples and tools to get started with. On the Playground (PG) you can design, develop and test your experience right in your browser (instead of having to use a code editor like VS Code), and you can also download snippets of code, particle effects and animations directly from the Playground and plug them into entirely different projects with ease. Experimenting with Babylon.js has been fun, interesting and intriguing; I personally found it easier to achieve the things I set out to do compared to A-Frame. It was a bit harder to connect my Babylon.js experiments to HTML outside the Playground, but looking back now, I would say it isn't hard at all; it just took me longer to figure out, and the journey was mightily rewarding. The best way to pick up Babylon.js, after you've done your research and determined it is the best tool for your solution, is to start from the documentation, which is well structured with tons of examples to try out right in your browser. Adding XR functionality that works across devices, including the Oculus Quest, requires just &lt;a href="https://doc.babylonjs.com/divingDeeper/webXR/webXRDemos" rel="noopener noreferrer"&gt;a few lines of code&lt;/a&gt;; the Babylon.js framework truly makes life easy.&lt;/p&gt;
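&lt;p&gt;As a sketch of what those few lines look like, the snippet below assumes an HTML page with the Babylon.js script loaded from its CDN and a canvas element with the id renderCanvas (both names are my own choices for this example). It builds a basic scene and makes it XR-ready with a single call to createDefaultXRExperienceAsync:&lt;/p&gt;

```javascript
// Assumes the babylon.js script tag and a canvas with id="renderCanvas" exist on the page
const canvas = document.getElementById("renderCanvas");
const engine = new BABYLON.Engine(canvas, true);

const createScene = async () => {
  const scene = new BABYLON.Scene(engine);

  // A simple orbiting camera and a basic light
  const camera = new BABYLON.ArcRotateCamera(
    "camera", -Math.PI / 2, Math.PI / 3, 10, BABYLON.Vector3.Zero(), scene);
  camera.attachControl(canvas, true);
  new BABYLON.HemisphericLight("light", new BABYLON.Vector3(0, 1, 0), scene);

  // Some geometry to look at: a ground plane and a sphere resting on it
  const ground = BABYLON.MeshBuilder.CreateGround("ground", { width: 20, height: 20 }, scene);
  const sphere = BABYLON.MeshBuilder.CreateSphere("sphere", { diameter: 2 }, scene);
  sphere.position.y = 1;

  // This one call enables WebXR: on supported devices an "Enter XR"
  // button overlay appears, with the ground registered as the floor
  await scene.createDefaultXRExperienceAsync({ floorMeshes: [ground] });

  return scene;
};

createScene().then((scene) => {
  engine.runRenderLoop(() => scene.render());
});
window.addEventListener("resize", () => engine.resize());
```

&lt;p&gt;On a headset the helper handles session setup, controllers and teleportation defaults for you; on a desktop browser without XR support the scene simply renders in 2D.&lt;/p&gt;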

&lt;p&gt;&lt;strong&gt;Why consider using Babylon.js on a project?&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Amazing community: the thing I love most about using Babylon.js is the huge, active and extremely &lt;a href="http://forum.babylonjs.com/" rel="noopener noreferrer"&gt;helpful community&lt;/a&gt;, always ready to help when you are stuck.&lt;/li&gt;
&lt;li&gt;Playground: the &lt;a href="https://playground.babylonjs.com/#1OH09K#131" rel="noopener noreferrer"&gt;Playground&lt;/a&gt; is a very useful tool; like Glitch, it lets you build experiences from almost any device that can access the web, saving, sharing and changing your code on the go.&lt;/li&gt;
&lt;li&gt;It has a lot of learning resources and readily available assets. You usually do not have to start from scratch: there are thousands of fully functional Playground examples and assets ready for you to use effectively.&lt;/li&gt;
&lt;li&gt;Cool projects built with Babylon.js: &lt;a href="https://www.mercedes-benz.com/en/eq-formulae/we-drive-the-city/mercedes-eq-speedboard-game/" rel="noopener noreferrer"&gt;Speedboard&lt;/a&gt; by Mercedes-Benz, &lt;a href="https://viseni.com/shiba6/" rel="noopener noreferrer"&gt;Shiba inu token to the moon&lt;/a&gt;, &lt;a href="https://poki.com/en/g/temple-run-2#" rel="noopener noreferrer"&gt;Temple Run 2&lt;/a&gt; by Imangi Studios, &lt;a href="https://playground.babylonjs.com/#3I55DK#0" rel="noopener noreferrer"&gt;Marble tower&lt;/a&gt; on the PG, and &lt;a href="https://www.babylonjs.com/community/" rel="noopener noreferrer"&gt;more here&lt;/a&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Conclusion:&lt;/strong&gt;&lt;br&gt;
My intention for this series is to show you the possibilities and opportunities these powerful frameworks provide, and to give you enough background to prepare you for future instalments, which will deal with creating simple, useful and reusable experiences on the web. You do not have to be a developer to take an interest in this: WebXR is universally available, and since more people have access to web browsers than to XR headsets, it is increasingly becoming an important segment of XR, driving adoption. There is much more to write, but until the next one, keep learning.&lt;/p&gt;

</description>
      <category>webxr</category>
      <category>babylonjs</category>
      <category>webdev</category>
      <category>javascript</category>
    </item>
  </channel>
</rss>
