Introduction: XR Development Is Entering a New Workflow Era
For years, building XR applications meant wrestling with heavy game engines, complex build pipelines, and massive asset workflows. Even simple experiments could take weeks to prototype.
But that model is starting to change.
We are entering the era of Vibe Coding, where the distance between an idea and a working spatial experience is shrinking from weeks to hours, and sometimes even minutes.
Using Gemini and the XR Blocks framework, I recently built a mixed reality biology lab where users can walk around in VR, interact with DNA and cell structures, explore human organs, trigger contextual learning hotspots, and hear explanations through integrated text-to-speech.
No heavy downloads. No complex shader pipelines. No asset dependency issues.
Just intent translated into a functional spatial experience.
The project was designed around Google's Material 3 spatial design principles and targeted the emerging Android XR ecosystem. Whether it runs on devices like the Quest 3S or future hardware like Samsung's Android XR headset, one thing is becoming clear:
The spatial web is no longer theoretical. It is being built right now, and increasingly it is being built with words.
Demo video: XR Biology Lab Walkthrough – https://youtube.com/shorts/hLqXIe8XTCc?si=dADIT-BOIddpYLZy
Why I Built This Project
The motivation behind this project was simple. I wanted to test whether AI could meaningfully reduce the friction involved in XR prototyping.
Traditionally, building even a small XR demo requires:
Setting up an engine
Importing assets
Configuring lighting
Writing interaction systems
Handling performance issues
This creates a large barrier for experimentation.
I wanted to test a different question:
Can a complete XR learning environment be generated from a structured prompt while maintaining technical clarity and architectural control?
To answer this, I defined strict constraints.
Defining Engineering Constraints Before Writing Code
Before generating any code, I defined constraints that would guide the experiment. These constraints were important because they forced architectural discipline.
The project rules were:
The entire XR experience must exist in one HTML file
All models must be procedurally generated
No external assets allowed
No fetch calls allowed
Interaction must use native browser APIs
Audio must use built-in speech synthesis
These decisions were not random. Each one solved a real XR development problem.
For example:
Procedural generation prevents missing asset errors
Single file architecture improves portability
Native APIs reduce dependencies
No fetch calls removes hosting issues
These constraints turned the project into a reliability experiment as much as a development experiment.
This is an important mindset for XR developers. Constraints often improve design clarity.
The Prompt as a System Specification
One interesting realization from this project is that a good XR prompt behaves very similarly to a technical design document.
Instead of just describing visuals, the prompt described:
Environment design
Interaction behavior
Navigation rules
Performance constraints
Educational goals
This effectively turned the prompt into a specification.
A useful pattern I discovered is that XR prompts work best when they include five elements:
Environment description
Interaction expectations
Navigation model
Feedback mechanisms
Technical constraints
This structure helps AI generate more usable spatial experiences.
For developers new to AI XR workflows, thinking of prompts as architecture documents instead of simple instructions leads to better results.
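As a sketch, the five-element structure might look like this in practice. The wording below is illustrative, not the exact prompt used for this project:

```text
Environment: A bright biology lab with large windows, a wooden ceiling, benches, and floating model displays.
Interaction: Pointing a controller at a model highlights it, shows a tooltip, and reads a short explanation aloud.
Navigation: Teleportation only; no smooth locomotion.
Feedback: Highlight the hovered object and confirm selection visually before speaking.
Constraints: One HTML file, procedural geometry only, no external assets, no fetch calls, native browser APIs only.
```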
Project Architecture Overview
Even though the project exists in one file, it still follows layered architecture thinking.
The main layers were:
Scene setup
Procedural modeling
Interaction system
XR navigation
Audio feedback
A simple way to think about this structure:
Scene layer builds the world
Model layer creates objects
Interaction layer enables selection
XR layer handles movement
Audio layer reinforces learning
Even in experimental projects, this mental separation improves maintainability.
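The audio layer can be as small as one function. A minimal sketch, assuming a hypothetical `narrationFor` helper that formats a hotspot's name and fact into a spoken sentence; the actual speaking uses the browser's built-in SpeechSynthesis API, so it only runs in a browser context:

```javascript
// Format a hotspot into a narration sentence (pure and testable).
// `name` and `fact` are hypothetical fields on a learning hotspot.
function narrationFor(name, fact) {
  return `${name}. ${fact}`;
}

// Browser-only sketch: hand the sentence to built-in speech synthesis.
// speechSynthesis and SpeechSynthesisUtterance are standard Web APIs.
function speakHotspot(name, fact) {
  const utterance = new SpeechSynthesisUtterance(narrationFor(name, fact));
  speechSynthesis.cancel(); // stop any narration already playing
  speechSynthesis.speak(utterance);
}
```

Because the utterance text is built in a pure function, the learning content stays testable even though the speech output itself is a browser side effect.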
Scene Construction Strategy
The environment was built first because XR experiences depend heavily on spatial context.
The lab includes:
Large window panels
Forest backdrop
Wood ceiling structure
Lab benches
Stools
Floating model displays
The goal was not realism. The goal was clarity.
Educational XR benefits more from readable environments than photorealistic ones. Clean lighting and recognizable shapes improve learning interaction.
This is an important lesson for beginners. Start with clarity before realism.
Procedural Modeling Approach
Instead of importing 3D assets, all biology models were generated using primitive geometry.
For example:
DNA can be approximated using spirals and cylinders
Cells can use spheres and layered materials
Bones can use elongated shapes
Organs can use combined primitives
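As a concrete sketch of the DNA case: a double helix can be reduced to two point spirals offset by half a turn. This is plain geometry math (the names below are my own, not from the project); in a Three.js scene each point would become a small sphere or a cylinder segment:

```javascript
// Generate one strand of a helix as an array of {x, y, z} points.
// radius: helix radius, height: total height, turns: full rotations,
// steps: number of sample points, phase: angular offset (PI for the
// second strand, giving the double-helix look).
function helixPoints(radius, height, turns, steps, phase = 0) {
  const points = [];
  for (let i = 0; i < steps; i++) {
    const t = i / (steps - 1); // 0 → 1 along the strand
    const angle = phase + t * turns * 2 * Math.PI;
    points.push({
      x: radius * Math.cos(angle),
      y: t * height, // rise linearly with t
      z: radius * Math.sin(angle),
    });
  }
  return points;
}

// Two strands half a turn apart approximate a DNA double helix.
const strandA = helixPoints(0.3, 2.0, 4, 64);
const strandB = helixPoints(0.3, 2.0, 4, 64, Math.PI);
```

Because the geometry is just a function of a few parameters, changing the radius, pitch, or resolution is a one-line edit rather than an asset re-export.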
Procedural modeling provides three benefits:
Reliability
Speed
Flexibility
Reliability improves because no files can fail to load.
Speed improves because iteration is faster.
Flexibility improves because geometry can be modified programmatically.
For developers starting XR, procedural modeling is a powerful starting point before moving to complex assets.
Interaction Architecture
Interaction was implemented using raycasting. This is a common approach in XR because it mirrors how users point at objects.
The interaction flow works like this:
User points controller
Ray intersects object
Object triggers tooltip
This simple pipeline creates a natural learning interaction.
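Stripped of engine code, the core of that pipeline is a ray-object intersection test. A minimal sketch of ray-sphere intersection (in a real WebXR app you would typically use Three.js's Raycaster with the controller pose rather than hand-rolling this):

```javascript
// Returns true if a ray (origin o, normalized direction d) hits a
// sphere centered at c with the given radius. Vectors are {x, y, z}.
function rayHitsSphere(o, d, c, radius) {
  // Vector from ray origin to sphere center.
  const oc = { x: c.x - o.x, y: c.y - o.y, z: c.z - o.z };
  // Project oc onto the ray direction to find the closest approach.
  const t = oc.x * d.x + oc.y * d.y + oc.z * d.z;
  if (t < 0) return false; // sphere is behind the controller
  // Squared distance from sphere center to that closest point.
  const closest = {
    x: o.x + t * d.x - c.x,
    y: o.y + t * d.y - c.y,
    z: o.z + t * d.z - c.z,
  };
  const distSq = closest.x ** 2 + closest.y ** 2 + closest.z ** 2;
  return distSq <= radius * radius;
}
```

Giving each model a generous bounding-sphere radius is one concrete way to keep interaction targets large.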
Important lessons here:
Interaction targets must be large
Feedback must be immediate
Visual confirmation reduces confusion
These small UX details matter more than complex rendering.
XR Navigation Design Decisions
Navigation used teleportation instead of free movement.
This decision was intentional.
Teleportation reduces motion sickness and is widely accepted as a comfortable XR movement method.
Good XR navigation should prioritize:
Comfort
Predictability
Clarity
Beginners often try complex movement systems first. It is usually better to start with teleportation and expand later.
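The teleport target itself reduces to intersecting the controller ray with the floor plane (assumed to be y = 0 here). A minimal sketch under that assumption, with a range clamp so users cannot jump arbitrarily far:

```javascript
// Compute a teleport destination from a controller ray, or null if
// the ray does not hit the floor within range. Vectors are {x, y, z};
// the floor is assumed to be the plane y = 0.
function teleportTarget(origin, direction, maxDistance = 8) {
  if (direction.y >= 0) return null; // ray points up or level: no floor hit
  const t = -origin.y / direction.y; // distance along ray to y = 0
  const x = origin.x + t * direction.x;
  const z = origin.z + t * direction.z;
  // Clamp by horizontal distance from the user for comfort.
  const dx = x - origin.x;
  const dz = z - origin.z;
  if (Math.sqrt(dx * dx + dz * dz) > maxDistance) return null;
  return { x, y: 0, z };
}
```

Returning null for out-of-range or upward rays makes the invalid states explicit, which keeps the teleport arc predictable for the user.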
Reliability Engineering Decisions
Many decisions in this project were influenced by reliability rather than features.
Key reliability choices included:
Procedural models instead of assets
Single file instead of modules
Native APIs instead of libraries
Simple lighting instead of complex shaders
These decisions reduce failure points.
XR developers often focus on features first. Reliability often matters more, especially in early prototypes.
Performance Observations
Some useful performance observations from this experiment:
Geometry count matters more than texture quality
Lighting complexity affects frame rate quickly
Reuse of materials improves performance
Simple shapes scale better
XR performance is about stability, not visual complexity.
A smooth simple scene is better than a detailed unstable one.
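Material reuse in particular is easy to implement. A minimal sketch of a shared-material cache; the `createMaterial` factory is a placeholder for something like `new THREE.MeshStandardMaterial({ color })` in a real scene, and the names are mine, not the project's:

```javascript
// Cache materials by color so hundreds of meshes can share a handful
// of materials instead of allocating one each.
const materialCache = new Map();

function getMaterial(color, createMaterial) {
  if (!materialCache.has(color)) {
    materialCache.set(color, createMaterial(color));
  }
  return materialCache.get(color);
}
```

Fewer unique materials means fewer shader programs and state changes per frame, which is usually where the frame-rate wins come from.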
What Worked Well
Several things worked better than expected.
AI generated a usable starting architecture
Procedural modeling scaled well
Raycast interaction remained simple
Most importantly:
Iteration speed improved dramatically.
Ideas could be tested quickly without heavy setup.
What Did Not Work Perfectly
Some areas still required manual refinement.
Tooltip readability required adjustment
Object scale needed tuning
Interaction distances needed tweaking
Lighting needed balancing
This highlights an important reality.
AI accelerates development but does not replace developer judgment.
Developers still guide quality.
Lessons for XR Developers
Some practical lessons from this experiment:
Start with constraints
Prioritize interaction clarity
Use procedural models early
Keep architecture simple
Focus on learning experience first
Most importantly:
Build small experiments often.
XR skill grows through iteration.
Implications for Android XR Development
Android XR represents a new opportunity for developers entering spatial computing early.
AI assisted XR workflows may become important because:
Mobile XR requires efficiency
Lightweight apps deploy faster
AI reduces development friction
Web based XR lowers entry barriers
Developers experimenting now may gain early ecosystem familiarity.
This is similar to early Android or early web development periods.
Early builders often become ecosystem leaders.
Future Experimentation Areas
Areas worth exploring next include:
Hand tracking interaction
Gesture input
Multi user XR
Physics simulation
Android XR device testing
Spatial UI improvements
AI XR workflows are still evolving. Experiments like this help developers understand what works in practice.
Conclusion
This project started as a simple question about AI assisted XR development. It became a useful exploration of how prompt driven workflows may change how spatial applications are built.
The biggest takeaway is not that AI writes XR code.
The real shift is that AI reduces the distance between idea and prototype.
Developers can now spend more time designing experiences and less time fighting tooling.
XR development is becoming lighter, faster, and more accessible.
And increasingly, spatial software may begin not with engines or editors.
But with intent.
And sometimes, just a well structured prompt.
FAQ
What is Vibe Coding in XR development?
It is a workflow where developers describe XR experiences and AI helps generate implementation scaffolding.
Do I need a game engine to build XR apps?
No. WebXR and Three.js allow browser based XR development.
Why use procedural generation?
It removes asset dependencies and improves reliability.
Can beginners build XR projects like this?
Yes. Starting with simple procedural scenes is a good learning path.
Why is Android XR important?
It represents a growing ecosystem where early developers can gain experience.