DEV Community

Yusufktkoglu

How I built a 3D space simulator in one day with Claude (Three.js + shaders)

I'm not a graphics programmer. I had never written a GLSL shader. I'd heard of Three.js but never actually used it. One day later, aethelia.space exists — a browser-based 3D space simulator with 6 real star systems, 12 shader-rendered cosmic anomalies, moons orbiting planets, procedural textures, and ambient audio.
I built it pair-programming with Claude (Anthropic's AI). This post is the honest technical story: what worked, what didn't, the architecture decisions I made, and the exact prompts that turned "idea" into "shipped product" — in a single day.
If you're curious about AI-assisted development in 2026 — or you just want to see how a non-expert can ship a production-ready Three.js app in one day — read on.

What I was trying to build
I've had a quiet obsession with space visualization for years. Every time I watched Interstellar, I'd wonder: could a browser do that? Not the full Kip Thorne ray-traced Gargantua — but something close enough to be evocative.
The pitch in my head:

An interactive 3D website where you can explore real star systems — our solar system, TRAPPIST-1, Alpha Centauri, Sirius. And real cosmic anomalies — Sagittarius A*, M87*, the Crab Pulsar. And also design your own planet with custom composition.

No install, no signup, no ads. Just open a URL and explore.

I had no idea how to do any of this. One morning I decided to just try — opened a chat with Claude and started.

The scaffolding
I started with the simplest possible prompt:

"I want to build a 3D solar system in the browser with Three.js. Can you give me a minimal starting point that shows the Sun and Earth orbiting it?"

Claude gave me back a single HTML file with an importmap for Three.js from a CDN, a basic scene, perspective camera, and two spheres: a yellow one and a blue one, with the blue one rotating around the yellow. No build step. Opened it in the browser — it just worked.
This single decision — one HTML file, no bundler, no build step — turned out to be the most important architectural choice of the whole project. Every time I wanted to add something, I just edited the same file. No npm, no webpack, no dist folder. The final app is still one file, about 127KB, deployed on Vercel with zero configuration.

```html
<script type="importmap">
{
  "imports": {
    "three": "https://cdnjs.cloudflare.com/ajax/libs/three.js/0.160.0/three.module.min.js"
  }
}
</script>
<script type="module">
  import * as THREE from 'three';
  // ... scene setup
</script>
```

That's the entire dependency system. In 2026, this is underrated.

The planet problem
Within the first hour, I wanted real-looking planets. My prompt:

"I don't want boring flat-colored spheres. I want Earth-like planets with actual continents and oceans that I can procedurally generate. Gas giants should have bands. Ice worlds should have polar caps. How do I do this without downloading texture files?"

Claude suggested generating the textures procedurally using a Canvas 2D context with value noise + FBM, then mapping that canvas onto the sphere as a texture. This was the first "oh wow" moment — I expected a complicated answer involving pre-made textures from NASA's blue marble dataset or something. Instead:

```javascript
function createPlanetTexture(palette, water, land, ice, seed) {
  const canvas = document.createElement('canvas');
  canvas.width = 1024; canvas.height = 512;
  const ctx = canvas.getContext('2d');
  const imgData = ctx.createImageData(1024, 512);

  for (let y = 0; y < 512; y++) {
    for (let x = 0; x < 1024; x++) {
      // sample FBM noise
      const n = fbm(x * 0.008 + seed, y * 0.008);
      // threshold into water/land/ice based on composition
      const color = pickColor(n, palette, water, land, ice);
      const idx = (y * 1024 + x) * 4;
      imgData.data[idx] = color.r;
      imgData.data[idx + 1] = color.g;
      imgData.data[idx + 2] = color.b;
      imgData.data[idx + 3] = 255;
    }
  }
  ctx.putImageData(imgData, 0, 0);
  return new THREE.CanvasTexture(canvas);
}
```

The fbm (Fractional Brownian Motion) function was 10 lines of value noise. Claude wrote it without me asking what FBM was — I had to go look it up after.
What I learned: Claude volunteers technical context I don't know I need. When I asked about procedural textures, it introduced me to noise functions, octaves, the difference between Perlin and value noise, and why you pick one over the other.
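For reference, a minimal value-noise FBM in plain JavaScript looks roughly like this. This is a sketch, not the project's exact code — the hash constants (borrowed from the classic GLSL one-liner) and the octave count are illustrative:

```javascript
// Hash-based value noise: a deterministic pseudo-random value per lattice point.
function hash2(x, y) {
  const s = Math.sin(x * 12.9898 + y * 78.233) * 43758.5453;
  return s - Math.floor(s); // fractional part, in [0, 1)
}

// Bilinear interpolation between the four surrounding lattice values,
// smoothed with the classic 3t^2 - 2t^3 fade curve.
function valueNoise(x, y) {
  const xi = Math.floor(x), yi = Math.floor(y);
  let tx = x - xi, ty = y - yi;
  tx = tx * tx * (3 - 2 * tx);
  ty = ty * ty * (3 - 2 * ty);
  const a = hash2(xi, yi),     b = hash2(xi + 1, yi);
  const c = hash2(xi, yi + 1), d = hash2(xi + 1, yi + 1);
  const top = a + (b - a) * tx;
  const bot = c + (d - c) * tx;
  return top + (bot - top) * ty;
}

// FBM: sum octaves of noise, each at double frequency and half amplitude.
function fbm(x, y, octaves = 5) {
  let sum = 0, amp = 0.5, freq = 1;
  for (let i = 0; i < octaves; i++) {
    sum += amp * valueNoise(x * freq, y * freq);
    freq *= 2;
    amp *= 0.5;
  }
  return sum; // roughly in [0, 1)
}
```

Each octave adds finer detail at lower amplitude, which is why FBM terrain reads as "coastlines with bays with inlets" instead of smooth blobs.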
The final terrestrial planet generator mixes water regions, landmasses, polar ice caps based on latitude, and a separate cloud layer that rotates slightly faster than the planet surface. Every planet in the solar system is generated this way at runtime — there are zero image files in the project.

The black hole
This was the part I was terrified of. Every Three.js tutorial I'd seen for black holes was either (a) a flat circle or (b) a full ray-traced shader that required a CS degree.
I tried this prompt:

"I want a black hole that looks like the one in Interstellar — event horizon, accretion disk that glows, light bending around it so you can see the back of the disk over the top. It should run in a browser at 60fps. It doesn't need to be physically accurate, just visually convincing. How?"

Claude walked me through the approximation strategy:

Not ray tracing. Full GR ray tracing is infeasible in a browser shader.
Fragment shader on a camera-facing billboard quad. A single flat square, always rotated toward the viewer, with a custom fragment shader painting the black hole entirely per-pixel.
Fake the lensing. When sampling background UVs, pull them radially toward the center based on distance from the center. Close to the event horizon, stars smear into arcs. Far away, normal.
Accretion disk math. Polar coordinates, temperature gradient from inner (white hot) to outer (red). Doppler beaming: multiply brightness by 1 + cos(angle) so one side is brighter than the other.
Photon ring. A narrow bright band at ~1.5 Schwarzschild radii.
Over-the-top disk. A second copy of the disk sampled at the reflected angle, making the Interstellar halo effect.

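The lensing trick in step 3 boils down to one line of math. As a pure-function sketch (the `1 + strength / d²` falloff is an artistic choice, not general relativity, and `strength` is a hypothetical knob, not a value from the site):

```javascript
// Fake gravitational lensing: pull the background sample point radially
// toward the center. The pull is ~1 far away and diverges near d = 0,
// which smears background stars into arcs close to the horizon.
function lensUV(x, y, strength = 1.0) {
  const d = Math.hypot(x, y);          // distance from the hole's center
  const pull = 1 + strength / (d * d); // divisor grows as you approach
  return [x / pull, y / pull];         // sample the background closer in
}
```

Far from the hole the divisor is nearly 1 and the starfield is undistorted; near the horizon the sample point collapses toward the center, producing the smeared-arc look.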
Here's a stripped-down version of the actual fragment shader it wrote:

```glsl
uniform float uTime;
varying vec2 vUv;

void main() {
  vec2 uv = (vUv - 0.5) * 4.0;
  float d = length(uv);
  float angle = atan(uv.y, uv.x);

  // Event horizon
  if (d < 0.3) { gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0); return; }

  vec3 col = vec3(0.0);

  // Accretion disk
  if (d > 0.5 && d < 2.0 && abs(uv.y) < 0.15 * (1.0 + 0.3 * cos(angle * 2.0))) {
    float temp = 1.0 - (d - 0.5) / 1.5;
    vec3 diskCol = mix(vec3(1.0, 0.5, 0.2), vec3(1.0, 1.0, 1.0), temp);

    // Doppler beaming
    float doppler = 1.0 + 0.7 * cos(angle);
    diskCol *= doppler;

    // Rotating stripes for motion
    float stripes = 0.5 + 0.5 * sin(angle * 20.0 + uTime * 2.0 - d * 10.0);
    col += diskCol * stripes;
  }

  // Photon ring
  float ring = exp(-pow((d - 0.45) * 30.0, 2.0));
  col += vec3(1.0, 0.9, 0.7) * ring * 3.0;

  // Gravitational lensing on background
  vec2 lensUv = uv / (1.0 + 1.0 / (d * d));
  float stars = fract(sin(dot(lensUv * 50.0, vec2(12.9898, 78.233))) * 43758.5453);
  stars = pow(stars, 100.0);
  col += vec3(stars) * smoothstep(0.5, 2.0, d);

  gl_FragColor = vec4(col, 1.0);
}
```

I'll be honest: I wrote none of this. Claude wrote it. I tested it, said "the disk is too flat, make it tilted," and iterated. After a handful of iterations, Sgr A* looked convincing enough that I was genuinely startled — the Interstellar halo effect was there, the Doppler asymmetry was there, the photon ring was there. It looked right.
I then duplicated the shader for:

M87* — slightly different mass ratio, wider disk
TON 618 — ultramassive, different color temperature
Quasar 3C 273 — added relativistic jets
Neutron stars / pulsars — magnetic field lines, different core
Wormhole — spiral distortion, glimpse of "other side"

Each one was a quick modification of the base black hole shader.

The moons problem
After a few hours I had real star systems working. Users could warp between Sol, Alpha Centauri, TRAPPIST-1, Sirius, etc. But Earth had no moon. Jupiter had no Galilean satellites. That felt wrong.
The naive approach would be to just add moons as extra planets — but they need to orbit their parent planet, not the star. So I needed a hierarchy.
My prompt:

"I have planets orbiting a star in my scene. Now I want moons that orbit their parent planets. What's the cleanest way to do this without breaking my existing planet system?"

Claude's answer used a simple mental model: moons are planets with a parent reference. Instead of building a true scene graph hierarchy in Three.js (which gets tricky with rotations), store the parent reference in userData and update the moon's position relative to the parent every frame:

```javascript
// When loading a system
system.planets.forEach(p => {
  const planet = createPlanet(p);
  scene.add(planet);

  if (p.moons) {
    p.moons.forEach(moonConfig => {
      const moon = createPlanet(moonConfig);
      moon.userData.isMoon = true;
      moon.userData.parentPlanet = planet;
      moon.userData.moonDistance = moonConfig.distance;
      scene.add(moon);
      planets.push(moon);
    });
  }
});

// In the animation loop
planets.forEach(p => {
  const d = p.userData;
  d.angle += dt * d.speed;

  if (d.isMoon && d.parentPlanet) {
    const parentPos = d.parentPlanet.position;
    p.position.x = parentPos.x + Math.cos(d.angle) * d.moonDistance;
    p.position.z = parentPos.z + Math.sin(d.angle) * d.moonDistance;
  } else {
    p.position.x = Math.cos(d.angle) * d.distance;
    p.position.z = Math.sin(d.angle) * d.distance;
  }
});
```

Elegant, 20 lines, works perfectly. Phobos now rises and sets more than twice per Martian day — which matches reality, since Phobos orbits Mars faster than Mars rotates.
The solar system now includes Moon, Phobos, Deimos, Io, Europa, Ganymede, Callisto, Titan, Enceladus, Mimas, Triton, Charon — plus dwarf planets Ceres, Pluto, Eris. All with real orbital data.

Ambient audio without external files
Later in the day I wanted sound. Space simulators are 10x more immersive with ambient drone audio. But I didn't want to include MP3 files — that would require licensing, bandwidth, asset management, loading states. Ugh.

"Can I generate ambient space drone audio procedurally in the browser with zero external files?"

Claude introduced me to the Web Audio API. Turns out you can create three sine wave oscillators at low frequencies (55 Hz, 82 Hz, 110 Hz — actual musical notes: A1, E2, A2), route them through a lowpass filter, and modulate the filter's cutoff with a very slow LFO. The result is a dreamy ambient drone that sounds surprisingly like an Interstellar soundtrack.

```javascript
const audioContext = new AudioContext();
const filter = audioContext.createBiquadFilter();
filter.type = 'lowpass';
filter.frequency.value = 400;
filter.Q.value = 1;

// Slow LFO sweeping the filter
const lfo = audioContext.createOscillator();
lfo.frequency.value = 0.07;
const lfoGain = audioContext.createGain();
lfoGain.gain.value = 150;
lfo.connect(lfoGain);
lfoGain.connect(filter.frequency);

// Three oscillators making the drone
[55, 82.4, 110].forEach(freq => {
  const osc = audioContext.createOscillator();
  osc.type = 'sine';
  osc.frequency.value = freq;
  osc.connect(filter);
  osc.start();
});

filter.connect(audioContext.destination);
lfo.start();
```

Plus a warp sound effect (sawtooth sweep through a bandpass filter) when changing systems. Zero audio files in the project. Zero bytes of audio bandwidth. Just math.
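The sweep itself is just an exponential glide between two frequencies — the same curve that `AudioParam.exponentialRampToValueAtTime` produces when you schedule it on an oscillator. A pure-math sketch of that curve (the 800 → 60 Hz range and 1.5 s duration are illustrative guesses, not the site's actual values):

```javascript
// Exponential frequency glide from f0 to f1 over `duration` seconds --
// the same shape Web Audio's exponentialRampToValueAtTime schedules.
function sweepFrequency(f0, f1, duration, t) {
  const k = Math.min(Math.max(t / duration, 0), 1); // clamp progress to [0, 1]
  return f0 * Math.pow(f1 / f0, k);
}

// Illustrative warp sweep: 800 Hz down to 60 Hz over 1.5 seconds.
// → 800 at t = 0, ≈ 60 at t = duration, geometric (not linear) in between.
```

In the browser you'd set the oscillator's frequency to `f0`, then call `osc.frequency.exponentialRampToValueAtTime(f1, audioContext.currentTime + duration)` and let the audio thread do the gliding.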

The deployment stack
By evening I wanted this live at a real domain. Final stack:

Source: Single index.html, committed to a GitHub repo
Hosting: Vercel, automatic deploy on every push, free tier
Domain: aethelia.space purchased from GoDaddy
DNS: A record pointing @ to Vercel (216.198.79.1), CNAME www to cname.vercel-dns.com
SSL: Let's Encrypt, auto-provisioned by Vercel
Analytics: Google Analytics 4 gtag.js
SEO: Schema.org JSON-LD structured data, Open Graph, Twitter cards, canonical URL, sitemap.xml, robots.txt

Total monthly cost: $0 (domain amortizes to ~$1/month).
From zero to live production site took about 45 minutes of actual deploy work. The first push was in the late afternoon. Less than an hour later, the site was accessible at https://aethelia.space with valid SSL.
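The structured-data item in that list is just a JSON-LD block in the page head. A minimal sketch of what that looks like for an app like this — the field values here are illustrative, not copied from the site:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebApplication",
  "name": "AETHELIA",
  "url": "https://aethelia.space",
  "applicationCategory": "EducationalApplication",
  "operatingSystem": "Any (browser-based)",
  "offers": { "@type": "Offer", "price": "0", "priceCurrency": "USD" }
}
</script>
```

Search engines read this to show rich results; it costs nothing at runtime since it's inert JSON.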

What this workflow actually felt like
I've been trying to put the experience into words. Here's my best attempt.
It did NOT feel like:

Copy-pasting code from Stack Overflow.
Using a visual builder or no-code tool.
"Prompting" in the 2024 sense of "write me X, the end."

It DID feel like:

Working with a very fast pair programmer who happens to know every library, every API, every math trick — but has no opinions until you give them constraints.
Iterating on feel. I'd say "the disk is too flat, tilt it" and get back a corrected shader. I'd say "the moons move too fast" and get back a new speed curve.
Learning by absorption. I didn't study Three.js first. I learned it by writing prompts and reading the responses. By the end of the day, I was catching Claude's mistakes before running the code.

The biggest surprise: the AI volunteered context I didn't ask for. When I mentioned the Crab Pulsar, it casually added the detail that Chinese astronomers recorded its supernova in 1054 CE. Now that fact is a trivia card in the app. When I asked about Europa's appearance, it mentioned the subsurface ocean — so that's in the info panel.
It felt less like "AI wrote my code" and more like "I described a world out loud for a day and it materialized."

What I shipped
aethelia.space — free, no install, no signup.
Technical stats:

1 HTML file, 127KB gzipped
Zero build step
Zero asset files (all textures procedural, all audio synthesized)
Three.js 0.160.0 via CDN importmap
GLSL fragment shaders for all cosmic anomalies
Canvas 2D + value noise FBM for all planet textures
Web Audio API for ambient drone + warp effects
6 real star systems, 12 cosmic anomalies, 12 moons, 3 dwarf planets
31 trivia entries with real astronomical context
Deployed on Vercel with auto-deploys from GitHub

The broader point
I'm sharing this because I think people underestimate what non-expert developers can ship with AI assistance in 2026. A year ago, this project would have been months of work for me — if I'd ever finished it. Realistically, I'd have given up at the first shader compilation error.
Instead: a single day of focused work, and a shipped product.
The code isn't academically perfect. The black hole is a visual approximation, not physically correct. The planets aren't to scale. But it works, it's beautiful, and people can use it right now.
That's the thing that shifted for me. The barrier to shipping has dropped dramatically. It hasn't disappeared — you still need taste, direction, persistence, and the ability to critique output. But the purely technical barriers (knowing the APIs, writing the boilerplate, remembering the syntax) are increasingly optional.
If you've been sitting on an idea because "I don't know how to do X" — try describing X to Claude and see what comes back. You might ship your thing in a day.

Try it
aethelia.space
Controls: drag to orbit, scroll to zoom, click any celestial body. Gallery at the bottom warps between star systems and anomalies. Audio toggle in the bottom right. There's a planet designer too — try building something uninhabitable on purpose.
I'd love feedback in the comments — especially if you spot data errors, broken shaders, or features you'd want to see. Bug reports most welcome.
If you build something with Claude, drop a link — I'd love to see it.

Built with Claude (Anthropic). Code, deploy, and all decisions human. Cover image is the AETHELIA logo generated during the design phase.
