<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Awodi Abdulmujeeb Ayomide</title>
    <description>The latest articles on DEV Community by Awodi Abdulmujeeb Ayomide (@awodi_abdulmujeebayomide).</description>
    <link>https://dev.to/awodi_abdulmujeebayomide</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3691712%2F1f4b8f7e-463a-45a8-829d-89d45530574c.jpg</url>
      <title>DEV Community: Awodi Abdulmujeeb Ayomide</title>
      <link>https://dev.to/awodi_abdulmujeebayomide</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/awodi_abdulmujeebayomide"/>
    <language>en</language>
    <item>
      <title>Vibe Coding XR Applications with Gemini XR Blocks: Lessons from Building a Prompt-Driven XR Biology Lab</title>
      <dc:creator>Awodi Abdulmujeeb Ayomide</dc:creator>
      <pubDate>Mon, 06 Apr 2026 16:23:39 +0000</pubDate>
      <link>https://dev.to/awodi_abdulmujeebayomide/vibe-coding-xr-applications-with-gemini-xr-blocks-lessons-from-building-a-prompt-driven-xr-biology-pco</link>
      <guid>https://dev.to/awodi_abdulmujeebayomide/vibe-coding-xr-applications-with-gemini-xr-blocks-lessons-from-building-a-prompt-driven-xr-biology-pco</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction: XR Development Is Entering a New Workflow Era&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For years, building XR applications meant wrestling with heavy game engines, complex build pipelines, and massive asset workflows. Even simple experiments could take weeks to prototype.&lt;br&gt;
But that model is starting to change.&lt;/p&gt;

&lt;p&gt;We are entering the era of Vibe Coding, where the distance between an idea and a working spatial experience is shrinking from weeks to hours, and sometimes even minutes.&lt;br&gt;
Using Gemini and the XR Blocks framework, I recently built a Mixed Reality XR biology lab where users can walk around in VR, interact with DNA and cell structures, explore human organs, trigger contextual learning hotspots, and hear explanations through integrated text-to-speech.&lt;br&gt;
No heavy downloads. No complex shader pipelines. No asset dependency issues.&lt;br&gt;
Just intent translated into a functional spatial experience.&lt;br&gt;
The project was designed around Google's Material 3 spatial design ideas and targeted the emerging Android XR ecosystem. Whether running on devices like Quest 3S or future hardware like Samsung’s Android XR headset, one thing is becoming clear:&lt;br&gt;
The spatial web is no longer theoretical. It is being built right now, and increasingly it is being built with words.&lt;br&gt;
Demo video – XR Biology Lab Walkthrough: &lt;a href="https://youtube.com/shorts/hLqXIe8XTCc?si=dADIT-BOIddpYLZy" rel="noopener noreferrer"&gt;https://youtube.com/shorts/hLqXIe8XTCc?si=dADIT-BOIddpYLZy&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why I Built This Project&lt;/strong&gt;&lt;br&gt;
The motivation behind this project was simple. I wanted to test whether AI could meaningfully reduce the friction involved in XR prototyping.&lt;br&gt;
Traditionally, building even a small XR demo requires:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Setting up an engine&lt;/li&gt;
  &lt;li&gt;Importing assets&lt;/li&gt;
  &lt;li&gt;Configuring lighting&lt;/li&gt;
  &lt;li&gt;Writing interaction systems&lt;/li&gt;
  &lt;li&gt;Handling performance issues&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This creates a large barrier to experimentation.&lt;br&gt;
I wanted to test a different question:&lt;br&gt;
Can a complete XR learning environment be generated from a structured prompt while maintaining technical clarity and architectural control?&lt;br&gt;
To answer this, I defined strict constraints.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Defining Engineering Constraints Before Writing Code&lt;/strong&gt;&lt;br&gt;
Before generating any code, I defined constraints that would guide the experiment. These constraints were important because they forced architectural discipline.&lt;br&gt;
The project rules were:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;The entire XR experience must exist in one HTML file&lt;/li&gt;
  &lt;li&gt;All models must be procedurally generated&lt;/li&gt;
  &lt;li&gt;No external assets allowed&lt;/li&gt;
  &lt;li&gt;No fetch calls allowed&lt;/li&gt;
  &lt;li&gt;Interaction must use native browser APIs&lt;/li&gt;
  &lt;li&gt;Audio must use built-in speech synthesis&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These decisions were not random. Each one solved a real XR development problem. For example:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Procedural generation prevents missing-asset errors&lt;/li&gt;
  &lt;li&gt;A single-file architecture improves portability&lt;/li&gt;
  &lt;li&gt;Native APIs reduce dependencies&lt;/li&gt;
  &lt;li&gt;Avoiding fetch calls removes hosting issues&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;
These constraints turned the project into a reliability experiment as much as a development experiment.&lt;br&gt;
This is an important mindset for XR developers. Constraints often improve design clarity.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Prompt as a System Specification&lt;/strong&gt;&lt;br&gt;
One interesting realization from this project is that a good XR prompt behaves very similarly to a technical design document.&lt;br&gt;
Instead of just describing visuals, the prompt described:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Environment design&lt;/li&gt;
  &lt;li&gt;Interaction behavior&lt;/li&gt;
  &lt;li&gt;Navigation rules&lt;/li&gt;
  &lt;li&gt;Performance constraints&lt;/li&gt;
  &lt;li&gt;Educational goals&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This effectively turned the prompt into a specification.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffjc7903g62qzutw57eu4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffjc7903g62qzutw57eu4.png" alt=" " width="688" height="495"&gt;&lt;/a&gt;&lt;br&gt;
PROMPT SCREENSHOT&lt;/p&gt;

&lt;p&gt;A useful pattern I discovered is that XR prompts work best when they include five elements:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Environment description&lt;/li&gt;
  &lt;li&gt;Interaction expectations&lt;/li&gt;
  &lt;li&gt;Navigation model&lt;/li&gt;
  &lt;li&gt;Feedback mechanisms&lt;/li&gt;
  &lt;li&gt;Technical constraints&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This structure helps AI generate more usable spatial experiences.&lt;br&gt;
For developers new to AI XR workflows, thinking of prompts as architecture documents instead of simple instructions leads to better results.&lt;/p&gt;
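&lt;p&gt;As a toy illustration of treating the prompt as a specification, the five elements can even be assembled programmatically. This is a hypothetical sketch in plain JavaScript, not code from the project; the field names and wording are my own assumptions.&lt;/p&gt;

```javascript
// Hypothetical sketch: assembling an XR prompt from the five elements
// described above. Field names are illustrative, not from the project.
function buildXrPrompt(spec) {
  const sections = [
    ["Environment", spec.environment],
    ["Interactions", spec.interactions],
    ["Navigation", spec.navigation],
    ["Feedback", spec.feedback],
    ["Constraints", spec.constraints],
  ];
  // Each section becomes a labeled block; blocks are blank-line separated.
  return sections.map((pair) => pair[0] + ":\n" + pair[1]).join("\n\n");
}

const prompt = buildXrPrompt({
  environment: "A bright biology lab with large windows and wooden benches.",
  interactions: "Pointing at a model shows a tooltip with a short explanation.",
  navigation: "Teleportation only; no smooth locomotion.",
  feedback: "Spoken descriptions via the browser speech synthesis API.",
  constraints: "Single HTML file, procedural geometry only, no fetch calls.",
});
```

&lt;p&gt;Keeping the five elements as named fields makes it harder to forget a constraint when iterating on the prompt.&lt;/p&gt;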

&lt;p&gt;&lt;strong&gt;Project Architecture Overview&lt;/strong&gt;&lt;br&gt;
Even though the project exists in one file, it still follows layered architecture thinking.&lt;br&gt;
The main layers were:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Scene setup&lt;/li&gt;
  &lt;li&gt;Procedural modeling&lt;/li&gt;
  &lt;li&gt;Interaction system&lt;/li&gt;
  &lt;li&gt;XR navigation&lt;/li&gt;
  &lt;li&gt;Audio feedback&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A simple way to think about this structure:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;The scene layer builds the world&lt;/li&gt;
  &lt;li&gt;The model layer creates objects&lt;/li&gt;
  &lt;li&gt;The interaction layer enables selection&lt;/li&gt;
  &lt;li&gt;The XR layer handles movement&lt;/li&gt;
  &lt;li&gt;The audio layer reinforces learning&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;
Even in experimental projects, this mental separation improves maintainability.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Scene Construction Strategy&lt;/strong&gt;&lt;br&gt;
The environment was built first because XR experiences depend heavily on spatial context.&lt;br&gt;
The lab includes:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Large window panels&lt;/li&gt;
  &lt;li&gt;A forest backdrop&lt;/li&gt;
  &lt;li&gt;A wood ceiling structure&lt;/li&gt;
  &lt;li&gt;Lab benches&lt;/li&gt;
  &lt;li&gt;Stools&lt;/li&gt;
  &lt;li&gt;Floating model displays&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe012d54usq3wbbzd3bl0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe012d54usq3wbbzd3bl0.png" alt=" " width="800" height="449"&gt;&lt;/a&gt;  XR LAB ENVIRONMENT OVERVIEW&lt;/p&gt;

&lt;p&gt;The goal was not realism. The goal was clarity.&lt;br&gt;
Educational XR benefits more from readable environments than photorealistic ones. Clean lighting and recognizable shapes improve learning interaction.&lt;br&gt;
This is an important lesson for beginners. Start with clarity before realism.&lt;/p&gt;
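&lt;p&gt;One way to get readable layouts like the bench arrangement above is to compute positions procedurally rather than placing objects by hand. The sketch below lays furniture out on a centered grid; the function name and spacing values are my own illustration, not the project's code.&lt;/p&gt;

```javascript
// Hypothetical sketch: compute bench positions on a rows-by-cols grid,
// centered on the world origin. Spacing is in meters and arbitrary.
function benchLayout(rows, cols, spacing) {
  const positions = [];
  for (let r = 0; r !== rows; r += 1) {
    for (let c = 0; c !== cols; c += 1) {
      positions.push([
        (c - (cols - 1) / 2) * spacing, // x, centered across columns
        0,                              // y, sitting on the floor
        (r - (rows - 1) / 2) * spacing, // z, centered across rows
      ]);
    }
  }
  return positions;
}
```

&lt;p&gt;Each position would then anchor one procedurally built bench, so changing the room size is a one-line tweak.&lt;/p&gt;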

&lt;p&gt;&lt;strong&gt;Procedural Modeling Approach&lt;/strong&gt;&lt;br&gt;
Instead of importing 3D assets, all biology models were generated using primitive geometry.&lt;br&gt;
For example:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;DNA can be approximated using spirals and cylinders&lt;/li&gt;
  &lt;li&gt;Cells can use spheres and layered materials&lt;/li&gt;
  &lt;li&gt;Bones can use elongated shapes&lt;/li&gt;
  &lt;li&gt;Organs can use combined primitives&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzic5ccoevhqttj05lp7t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzic5ccoevhqttj05lp7t.png" alt=" " width="800" height="449"&gt;&lt;/a&gt; DNA MODEL&lt;/p&gt;

&lt;p&gt;Procedural modeling provides three benefits:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Reliability&lt;/li&gt;
  &lt;li&gt;Speed&lt;/li&gt;
  &lt;li&gt;Flexibility&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;
Reliability improves because no files can fail to load.&lt;br&gt;
Speed improves because iteration is faster.&lt;br&gt;
Flexibility improves because geometry can be modified programmatically.&lt;br&gt;
For developers starting XR, procedural modeling is a powerful starting point before moving to complex assets.&lt;/p&gt;
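&lt;p&gt;To make the "spirals and cylinders" idea concrete, here is a minimal sketch of the math behind a double helix, assuming two point strands offset by half a turn. In the actual project these positions would seed engine primitives such as spheres and connecting cylinders; this standalone version only computes coordinates.&lt;/p&gt;

```javascript
// Minimal sketch: two interleaved helix strands as arrays of [x, y, z]
// points. Parameters (turns, density, radius, rise) are assumptions.
function helixPoints(turns, pointsPerTurn, radius, rise) {
  const strandA = [];
  const strandB = [];
  const total = turns * pointsPerTurn;
  for (let i = 0; i !== total; i += 1) {
    const angle = (i / pointsPerTurn) * Math.PI * 2;
    const y = i * rise; // each point climbs a fixed amount
    strandA.push([radius * Math.cos(angle), y, radius * Math.sin(angle)]);
    // The second strand is phase-shifted by half a turn.
    strandB.push([
      radius * Math.cos(angle + Math.PI),
      y,
      radius * Math.sin(angle + Math.PI),
    ]);
  }
  return { strandA, strandB };
}
```

&lt;p&gt;Because the geometry is a pure function of a few numbers, scaling the model up for a classroom or down for a tabletop is just a parameter change.&lt;/p&gt;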

&lt;p&gt;&lt;strong&gt;Interaction Architecture&lt;/strong&gt;&lt;br&gt;
Interaction was implemented using raycasting. This is a common approach in XR because it mirrors how users point at objects.&lt;br&gt;
The interaction flow works like this:&lt;/p&gt;

&lt;ol&gt;
  &lt;li&gt;The user points the controller&lt;/li&gt;
  &lt;li&gt;A ray intersects an object&lt;/li&gt;
  &lt;li&gt;The object triggers a tooltip&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This simple pipeline creates a natural learning interaction.&lt;br&gt;
Important lessons here:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Interaction targets must be large&lt;/li&gt;
  &lt;li&gt;Feedback must be immediate&lt;/li&gt;
  &lt;li&gt;Visual confirmation reduces confusion&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;
These small UX details matter more than complex rendering.&lt;/p&gt;
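&lt;p&gt;The pointing pipeline boils down to a ray-versus-sphere test, which a raycaster runs per object under the hood. The framework handles this in the project; the framework-free sketch below shows the underlying math, and also why large targets help: a bigger radius forgives imprecise aim.&lt;/p&gt;

```javascript
// Sketch of the core raycast test: does a ray from `origin` along the
// unit direction `dir` hit a sphere at `center` with `radius`?
function raySphereHit(origin, dir, center, radius) {
  // Vector from ray origin to the sphere center.
  const ox = center[0] - origin[0];
  const oy = center[1] - origin[1];
  const oz = center[2] - origin[2];
  // Distance along the ray to the closest approach point
  // (clamped to 0 so targets behind the controller never hit).
  const t = Math.max(0, ox * dir[0] + oy * dir[1] + oz * dir[2]);
  // Squared distance from the sphere center to that closest point.
  const cx = origin[0] + dir[0] * t - center[0];
  const cy = origin[1] + dir[1] * t - center[1];
  const cz = origin[2] + dir[2] * t - center[2];
  const distSq = cx * cx + cy * cy + cz * cz;
  // Hit if the closest approach falls inside the sphere.
  return radius * radius >= distSq;
}
```

&lt;p&gt;Enlarging the radius used for the hit test, independently of the visible mesh, is a cheap way to make tooltips easier to trigger.&lt;/p&gt;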

&lt;p&gt;&lt;strong&gt;XR Navigation Design Decisions&lt;/strong&gt;&lt;br&gt;
Navigation used teleportation instead of free movement.&lt;br&gt;
This decision was intentional.&lt;br&gt;
Teleportation reduces motion sickness and is widely accepted as a comfortable XR movement method.&lt;br&gt;
Good XR navigation should prioritize:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Comfort&lt;/li&gt;
  &lt;li&gt;Predictability&lt;/li&gt;
  &lt;li&gt;Clarity&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;
Beginners often try complex movement systems first. It is usually better to start with teleportation and expand later.&lt;/p&gt;
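&lt;p&gt;One way to keep teleportation predictable is to validate destinations before moving the user. The sketch below assumes a rectangular walkable floor, which is my simplification rather than the project's actual bounds check: the destination snaps to floor height and is clamped to the room, so the user can never land outside it.&lt;/p&gt;

```javascript
// Hypothetical teleport validation: clamp the pointed-at target to a
// rectangular floor area and snap it to floor height.
function clampTeleport(target, floor) {
  const x = Math.min(floor.maxX, Math.max(floor.minX, target[0]));
  const z = Math.min(floor.maxZ, Math.max(floor.minZ, target[2]));
  return [x, floor.y, z]; // snap y to the floor for predictability
}
```

&lt;p&gt;Snapping to floor height matters as much as clamping: a destination that preserved the ray's hit height would leave the user hovering.&lt;/p&gt;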

&lt;p&gt;&lt;strong&gt;Reliability Engineering Decisions&lt;/strong&gt;&lt;br&gt;
Many decisions in this project were influenced by reliability rather than features.&lt;br&gt;
Key reliability choices included:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Procedural models instead of assets&lt;/li&gt;
  &lt;li&gt;A single file instead of modules&lt;/li&gt;
  &lt;li&gt;Native APIs instead of libraries&lt;/li&gt;
  &lt;li&gt;Simple lighting instead of complex shaders&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;
These decisions reduce failure points.&lt;br&gt;
XR developers often focus on features first. Reliability often matters more, especially in early prototypes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Performance Observations&lt;/strong&gt;&lt;br&gt;
Some useful performance observations from this experiment:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Geometry count matters more than texture quality&lt;/li&gt;
  &lt;li&gt;Lighting complexity affects frame rate quickly&lt;/li&gt;
  &lt;li&gt;Reusing materials improves performance&lt;/li&gt;
  &lt;li&gt;Simple shapes scale better&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;
XR performance is about stability, not visual complexity.&lt;br&gt;
A smooth simple scene is better than a detailed unstable one.&lt;/p&gt;
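&lt;p&gt;The material-reuse observation can be sketched as a simple cache keyed by color. The plain-object "material" below is a stand-in of my own; with a real engine the factory would return an engine material object, but the caching pattern is the same and is what keeps draw-state changes down.&lt;/p&gt;

```javascript
// Sketch of material reuse: one shared material per color key, created
// lazily the first time that key is requested.
const materialCache = new Map();

function getMaterial(colorKey, factory) {
  if (materialCache.has(colorKey) === false) {
    // First request for this color: build and remember the material.
    materialCache.set(colorKey, factory(colorKey));
  }
  // Every later request returns the exact same object.
  return materialCache.get(colorKey);
}
```

&lt;p&gt;Two objects that share a color then share one material instance, so the renderer can batch them instead of switching state per object.&lt;/p&gt;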

&lt;p&gt;&lt;strong&gt;What Worked Well&lt;/strong&gt;&lt;br&gt;
Several things worked better than expected:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;AI generated a usable starting architecture&lt;/li&gt;
  &lt;li&gt;Procedural modeling scaled well&lt;/li&gt;
  &lt;li&gt;Raycast interaction remained simple&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;
Most importantly:&lt;br&gt;
Iteration speed improved dramatically.&lt;br&gt;
Ideas could be tested quickly without heavy setup.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What Did Not Work Perfectly&lt;/strong&gt;&lt;br&gt;
Some areas still required manual refinement:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Tooltip readability required adjustment&lt;/li&gt;
  &lt;li&gt;Object scale needed tuning&lt;/li&gt;
  &lt;li&gt;Interaction distances needed tweaking&lt;/li&gt;
  &lt;li&gt;Lighting needed balancing&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;
This highlights an important reality.&lt;br&gt;
AI accelerates development but does not replace developer judgment.&lt;br&gt;
Developers still guide quality.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lessons for XR Developers&lt;/strong&gt;&lt;br&gt;
Some practical lessons from this experiment:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Start with constraints&lt;/li&gt;
  &lt;li&gt;Prioritize interaction clarity&lt;/li&gt;
  &lt;li&gt;Use procedural models early&lt;/li&gt;
  &lt;li&gt;Keep the architecture simple&lt;/li&gt;
  &lt;li&gt;Focus on the learning experience first&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;
Most importantly:&lt;br&gt;
Build small experiments often.&lt;br&gt;
XR skill grows through iteration.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Implications for Android XR Development&lt;/strong&gt;&lt;br&gt;
Android XR represents a new opportunity for developers entering spatial computing early.&lt;br&gt;
AI-assisted XR workflows may become important because:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Mobile XR requires efficiency&lt;/li&gt;
  &lt;li&gt;Lightweight apps deploy faster&lt;/li&gt;
  &lt;li&gt;AI reduces development friction&lt;/li&gt;
  &lt;li&gt;Web-based XR lowers entry barriers&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;
Developers experimenting now may gain early ecosystem familiarity.&lt;br&gt;
This is similar to early Android or early web development periods.&lt;br&gt;
Early builders often become ecosystem leaders.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Future Experimentation Areas&lt;/strong&gt;&lt;br&gt;
Areas worth exploring next include:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Hand-tracking interaction&lt;/li&gt;
  &lt;li&gt;Gesture input&lt;/li&gt;
  &lt;li&gt;Multi-user XR&lt;/li&gt;
  &lt;li&gt;Physics simulation&lt;/li&gt;
  &lt;li&gt;Android XR device testing&lt;/li&gt;
  &lt;li&gt;Spatial UI improvements&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;
AI XR workflows are still evolving. Experiments like this help developers understand what works in practice.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;br&gt;
This project started as a simple question about AI-assisted XR development. It became a useful exploration of how prompt-driven workflows may change how spatial applications are built.&lt;br&gt;
The biggest takeaway is not that AI writes XR code.&lt;br&gt;
The real shift is that AI reduces the distance between idea and prototype.&lt;br&gt;
Developers can now spend more time designing experiences and less time fighting tooling.&lt;br&gt;
XR development is becoming lighter, faster, and more accessible.&lt;br&gt;
Increasingly, spatial software may begin not with engines or editors, but with intent.&lt;br&gt;
And sometimes, with just a well-structured prompt.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;FAQ&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What is Vibe Coding in XR development?&lt;/strong&gt;&lt;br&gt;
It is a workflow where developers describe XR experiences and AI helps generate implementation scaffolding.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Do I need a game engine to build XR apps?&lt;/strong&gt;&lt;br&gt;
No. WebXR and Three.js allow browser-based XR development.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why use procedural generation?&lt;/strong&gt;&lt;br&gt;
It removes asset dependencies and improves reliability.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Can beginners build XR projects like this?&lt;/strong&gt;&lt;br&gt;
Yes. Starting with simple procedural scenes is a good learning path.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why is Android XR important?&lt;/strong&gt;&lt;br&gt;
It represents a growing ecosystem where early developers can gain experience.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Where can I see the source code?&lt;/strong&gt;&lt;br&gt;
Project repository:&lt;br&gt;
&lt;a href="https://github.com/Platinum04/vibe-coded-xr-biology-lab" rel="noopener noreferrer"&gt;https://github.com/Platinum04/vibe-coded-xr-biology-lab&lt;/a&gt;&lt;/p&gt;

</description>
      <category>android</category>
      <category>gemini</category>
      <category>google</category>
      <category>immersivetech</category>
    </item>
  </channel>
</rss>
