Richard Fu

Posted on • Originally published at richardfu.net

Building “Unmask the City” – A Solo Game Jam Journey with AI Pair Programming

Another Year, Another Solo Jam

It’s Global Game Jam season again, and once again I found myself diving in solo. But this year was different – I had a new coding partner: Claude Code.

The theme for GGJ 2026 was “Mask”. While others might build games about literal masks – masquerades, disguises, hidden identities – I saw something different. What if the mask wasn’t on a person, but on an entire city? What if you had to “unmask” a fog-shrouded metropolis by exploring it?

That’s how Unmask the City was born.

The Challenge: Vibe Coding from Zero

I set myself an ambitious constraint: build everything from scratch. No templates. No pre-made 3D models. No asset packs. Just code, procedural generation, and pure vibes.

Why? Because I wanted to see how far modern web technologies could take me in a game jam timeframe when paired with AI assistance. Could we create something that feels complete, polished, and technically impressive using only:

  • Three.js primitives (boxes, cylinders, spheres)
  • Procedural audio (Web Audio API)
  • Custom shaders
  • A lot of mathematical creativity

Spoiler: We did. And it was wild.


The Game: Exploring the Unknown

Concept: You’re dropped into a procedurally generated city consumed by a malevolent fog. Ancient fragments of light are scattered throughout – collect them all to unmask the city and reveal its secrets.

Core Mechanic: Permanent fog clearing. Everywhere you walk, the fog disappears forever, creating a visual record of your exploration. It’s like drawing a map with your presence.

Goal: Find all fragments in the shortest time while exploring as much of the city as possible. Three difficulty levels scale the challenge (5/7/10 fragments).

Simple premise, but the execution is where things got interesting.

Aerial view showing the procedurally generated city with fog-covered and revealed areas


Tech Stack: Modern Web at Its Best

Three.js – The Rendering Engine

I chose Three.js because it’s the mature, battle-tested 3D engine for the web. Version 0.170.0 gave me:

  • WebGL 2.0 rendering
  • Built-in shadow mapping
  • Excellent primitive geometries
  • PointerLockControls for FPS gameplay

But the real power move? InstancedMesh. Instead of rendering 300 buildings with 300 draw calls, I render them all with 1-2 draw calls. That's roughly a 150x reduction in draw calls right there.

TypeScript – Sanity in the Chaos

Game jams are chaotic. Code gets messy fast. TypeScript was my safety net:

  • Catch bugs at compile time, not at 3 AM during playtesting
  • IDE autocomplete saves so much time
  • Interfaces make the codebase self-documenting

Every game entity (Building, Tree, Fragment, Park) has a typed interface. When you’re iterating fast, this type safety prevents entire categories of bugs.
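For illustration, here's a hypothetical sketch of what one of those interfaces might look like (the field names are mine, not necessarily the repo's):

```typescript
// Sketch of a typed game entity; fields are illustrative assumptions.
interface Building {
  position: { x: number; z: number };   // grid placement on the ground plane
  width: number;                         // 8-20 units
  height: number;                        // 15-100+ units
  kind: "box" | "cylinder" | "l-shape"; // which primitive composition to use
}

// The compiler now rejects a whole category of jam-night bugs:
// passing a Tree where a Building is expected, typo'd field names, etc.
function footprintArea(b: Building): number {
  return b.width * b.width; // square footprint for box-ish buildings
}
```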

Vite – The Secret Weapon

Vite’s Hot Module Replacement is insane for game development:

  • Change shader code → see results in 100ms
  • Adjust particle parameters → particles update live
  • Modify audio synthesis → sounds regenerate instantly

No rebuild. No refresh. Just pure flow state.

Web Audio API – Zero Asset Files

Here’s where it gets interesting: every sound in the game is procedurally generated. No MP3s. No WAV files. Pure synthesis:

  • Footsteps: Filtered noise with different characteristics per surface (concrete, grass, water)
  • Fragment collection: Musical arpeggios (different chord progressions per fragment type)
  • Ambient sounds: Wind, water, traffic, night creatures (crickets, owls)
  • Spatial audio: Echo and reverb that adapts to building proximity

Why procedural? Because:

  1. Instant iteration (tweak a frequency, hear it immediately)
  2. Zero load times (no files to download)
  3. Infinite variation (sounds never repeat exactly)
  4. Smaller bundle size

The entire audio system fits in one TypeScript file and sounds better than most asset-pack audio.
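As a rough sketch of the filtered-noise idea behind the footsteps (the real system writes into a Web Audio `AudioBuffer`; the cutoff values and envelope constants here are my own assumptions, kept as pure math so the idea stands alone):

```typescript
// Sketch: surface-specific footstep samples from filtered white noise.
// Duller surfaces (grass) get a lower one-pole low-pass cutoff than
// harder ones (concrete); a fast decay envelope shapes the "tap".
function footstepSamples(surface: "concrete" | "grass", sampleRate = 44100): Float32Array {
  const n = Math.floor(sampleRate * 0.1); // 100 ms tap
  const out = new Float32Array(n);
  const alpha = surface === "grass" ? 0.05 : 0.3; // low-pass smoothing factor
  let prev = 0;
  for (let i = 0; i < n; i++) {
    const white = Math.random() * 2 - 1;        // white noise sample
    prev = prev + alpha * (white - prev);       // one-pole low-pass filter
    const env = Math.exp(-i / (n * 0.2));       // fast exponential decay
    out[i] = prev * env;
  }
  return out;
}
```

In the game, samples like these would be copied into an `AudioBuffer` and played through an `AudioBufferSourceNode`, so every step sounds slightly different for free.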


The Build: From Void to Playable in Hours

Day 1: Foundation & Core Loop

Morning: Set up the Three.js scene, camera, and basic player controls. Implemented the fog-of-war system using a DataTexture – a 512×512 texture that tracks which areas you’ve explored.

Why DataTexture? Alternative approaches:

  • Per-vertex fog checks: Too expensive
  • Raymarching in shader: GPU bottleneck
  • DataTexture: Paint on CPU, sample on GPU. Perfect.
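The CPU-side painting boils down to stamping circles into a byte array. A minimal sketch (the radius and values are assumptions, not the game's exact numbers; the array would back a `THREE.DataTexture` that the shaders sample):

```typescript
// 512×512 single-channel fog map: 255 = fully fogged, 0 = revealed.
const SIZE = 512;
const fogData = new Uint8Array(SIZE * SIZE).fill(255);

// Clear a circle around the player's normalized (u, v) position.
function clearFogAt(u: number, v: number, radiusTexels: number): void {
  const cx = Math.floor(u * SIZE);
  const cy = Math.floor(v * SIZE);
  const r2 = radiusTexels * radiusTexels;
  for (let y = cy - radiusTexels; y <= cy + radiusTexels; y++) {
    for (let x = cx - radiusTexels; x <= cx + radiusTexels; x++) {
      if (x < 0 || y < 0 || x >= SIZE || y >= SIZE) continue; // stay in bounds
      const dx = x - cx, dy = y - cy;
      if (dx * dx + dy * dy <= r2) fogData[y * SIZE + x] = 0;  // revealed forever
    }
  }
  // In the game: fogTexture.needsUpdate = true after painting.
}
```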

Afternoon: Procedural city generation. Grid-based layout with randomized:

  • Building dimensions (8-20 units wide, 15-100+ tall)
  • Building types (box, cylinder, L-shaped)
  • Rooftop details (antennas, water towers, helipads, gardens)
  • Buildings get taller toward the center for visual interest

Evening: Collectibles and collision detection. Simple 2D AABB collision (buildings are axis-aligned, player is a capsule). Fragment spawning with validation to avoid placing them inside buildings.
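Seen from above, a capsule against an axis-aligned box is just circle-vs-rectangle, which is a clamp and a distance check. A sketch of that test (interface and names are mine):

```typescript
// 2D collision: player circle (the capsule from above) vs. an
// axis-aligned building footprint.
interface AABB { minX: number; minZ: number; maxX: number; maxZ: number; }

function circleHitsAABB(px: number, pz: number, radius: number, box: AABB): boolean {
  // Clamp the circle centre to the box to find the closest point on it,
  // then compare that distance against the radius.
  const cx = Math.max(box.minX, Math.min(px, box.maxX));
  const cz = Math.max(box.minZ, Math.min(pz, box.maxZ));
  const dx = px - cx, dz = pz - cz;
  return dx * dx + dz * dz <= radius * radius;
}
```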

Status: Technically playable, but ugly and silent.

Day 2: Polish, Polish, Polish

This is where Claude Code really shined. Instead of spending hours debugging or looking up APIs, I could:

Me: “Add surface-specific footstep sounds”

Claude: Generates complete Web Audio implementation with concrete, grass, and water variations

Me: “The trees look static, add some life”

Claude: Implements vertex shader wind animation with multi-frequency sine waves

Me: “Fragments need more juice when collected”

Claude: Adds particle burst, screen color tint, slow-motion effect, and camera shake

By afternoon, I had:

  • Four visual themes (Day → Dusk → Night → Neon) on auto-cycle
  • Particle systems (birds, clouds, leaves, steam, embers)
  • Spatial audio with echo effects
  • A complete UI with loading screen, start menu, and game info modal

Night theme with neon-lit buildings and atmospheric moon lighting

By evening:

  • Glowing breadcrumb trail showing your path
  • Fireworks victory sequence
  • Local leaderboard with multiple scoring bonuses
  • Screenshot capture (P key)

The game went from “functional prototype” to “feels AAA” in a single day.


Technical Deep Dives & Code Highlights

Custom Shader Injection

Buildings need to react to the fog of war texture, but I didn’t want to write a full custom shader (losing Three.js’s nice PBR lighting). Solution? onBeforeCompile:


```typescript
material.onBeforeCompile = (shader) => {
  // Add custom uniforms
  shader.uniforms.fogMap = { value: fogTexture };

  // Inject custom fragment shader code
  shader.fragmentShader = shader.fragmentShader.replace(
    '#include <fog_fragment>',
    `
    // Sample fog texture (texture2D already returns normalized 0.0-1.0 values)
    vec2 fogUV = (worldPos.xz + cityBounds.xy) / cityBounds.zw;
    float fogDensity = texture2D(fogMap, fogUV).r;

    // Darken buildings in fogged areas
    gl_FragColor.rgb = mix(gl_FragColor.rgb, vec3(0.3), fogDensity * 0.9);

    #include <fog_fragment>
    `
  );
};
```

This lets me keep Three.js’s lighting while adding custom fog behavior. And it works with InstancedMesh!

Instanced Rendering – One Draw Call for 300 Buildings

Instead of:


```typescript
// BAD: 300 draw calls
buildings.forEach(building => {
  const mesh = new THREE.Mesh(geometry, material);
  mesh.position.copy(building.position);
  scene.add(mesh);
});
```

I use:


```typescript
// GOOD: 1 draw call
const instancedMesh = new THREE.InstancedMesh(geometry, material, 300);
const rotation = new THREE.Quaternion(); // buildings are axis-aligned
buildings.forEach((building, i) => {
  const matrix = new THREE.Matrix4();
  matrix.compose(building.position, rotation, building.scale);
  instancedMesh.setMatrixAt(i, matrix);
  instancedMesh.setColorAt(i, building.color);
});
instancedMesh.instanceMatrix.needsUpdate = true;
scene.add(instancedMesh);
```

Result: Smooth 60 FPS with 300+ buildings, shadows, and particles.

Compass Logic – Finding Nearest Fragment

The HUD compass always points to the nearest uncollected fragment:


```typescript
// Find nearest fragment (assumes at least one uncollected fragment remains)
let nearest = uncollected[0];
let minDist = Infinity;

uncollected.forEach(fragment => {
  const dist = playerPos.distanceTo(fragment.getPosition());
  if (dist < minDist) {
    minDist = dist;
    nearest = fragment;
  }
});

// Calculate angle and rotate compass arrow
const dx = nearest.getPosition().x - playerPos.x;
const dz = nearest.getPosition().z - playerPos.z;
const angle = Math.atan2(dx, dz);

compassElement.style.transform = `rotate(${angle}rad)`;
```

No pathfinding needed – just point toward the goal. Players love it.

Procedural Thunder – Layered Noise Synthesis

Thunder isn’t just random noise. It’s carefully crafted with rumble + crack:


```typescript
playThunder() {
  // 2-second mono buffer at the audio context's sample rate
  const sampleRate = this.ctx.sampleRate;
  const buffer = this.ctx.createBuffer(1, sampleRate * 2, sampleRate);
  const data = buffer.getChannelData(0);

  for (let i = 0; i < data.length; i++) {
    const t = i / sampleRate;

    // Envelope: quick attack, slow decay
    const env = Math.exp(-t * 1.5) * (1 - Math.exp(-t * 20));

    // Low-frequency rumble (~5 Hz wobble with slight jitter)
    const rumble = Math.sin(t * 30 + Math.random() * 0.5) * 0.5;

    // High-frequency crack (white noise)
    const crack = Math.random() * 2 - 1;

    // Mix and apply envelope
    data[i] = (rumble + crack * 0.5) * env;
  }

  // Low-pass filter for deep rumble
  const filter = this.ctx.createBiquadFilter();
  filter.type = 'lowpass';
  filter.frequency.value = 200 + Math.random() * 100;
  // (then: source → filter → destination)
}
```

Real thunder has that rumble + crack quality. This captures it with pure math.

Minimap – Coordinate Mapping

Map world coordinates to minimap percentage for CSS positioning:


```typescript
function worldToMinimapPercent(worldPos: Vector3): { x: number, y: number } {
  const citySize = 400; // World spans -200 to +200

  // Normalize to 0-1, then convert to percentage
  const normalizedX = (worldPos.x + citySize / 2) / citySize;
  const normalizedZ = (worldPos.z + citySize / 2) / citySize;

  return { x: normalizedX * 100, y: normalizedZ * 100 };
}

// Position fragment dots
dot.style.left = `${pos.x}%`;
dot.style.top = `${pos.y}%`;
```

Same technique used for fog texture UVs. Master coordinate mapping once, use it everywhere.

Fireworks Physics – Spherical Particle Distribution


```typescript
explode(position: Vector3) {
  for (let i = 0; i < particleCount; i++) {
    // Random direction on a sphere (approximate; the slight polar bias
    // is invisible in a firework burst)
    const theta = Math.random() * Math.PI * 2;
    const phi = Math.random() * Math.PI;

    const velocity = new Vector3(
      Math.sin(phi) * Math.cos(theta),
      Math.sin(phi) * Math.sin(theta),
      Math.cos(phi)
    ).multiplyScalar(8 + Math.random() * 4);

    // Clone so every particle gets its own position vector
    particles.push({ position: position.clone(), velocity, life: 1.0 });
  }
}

update(delta: number) {
  particles.forEach(p => {
    p.position.add(p.velocity.clone().multiplyScalar(delta));
    p.velocity.y += -9.8 * delta; // Gravity
    p.life -= delta * 0.5; // Fade
  });
}
```

Spherical distribution + gravity = convincing fireworks. No physics engine needed.


Challenges & Lessons Learned

Challenge 1: The Tree Gap Bug

Problem: Trees had separate trunk and crown meshes. When I added wind animation to crowns (vertex shader), visible gaps appeared during sway.

Attempts:

  1. Increase overlap → Still visible
  2. Merge geometries with vertex attributes → Broke colors
  3. Reduce animation → Still janky

Solution: Disable wind animation entirely. Static trees look fine and the gap is gone. Sometimes the best solution is the simplest one.

Lesson: Don’t over-engineer. If a feature causes more problems than it solves, cut it.

Challenge 2: Fragment Spawning

Problem: Fragments occasionally spawned inside buildings or in unreachable locations.

Evolution:

  • v1: 2-unit clearance → Too close to buildings
  • v2: 12-unit clearance → Better but still issues
  • v3: 20-unit clearance + disabled problematic building types (pyramids, domes)
  • v4: 30-unit clearance + increased collection radius

Lesson: Incremental fixes are okay. Don’t wait for the “perfect” solution.
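The approach the fixes converged on amounts to rejection sampling with a clearance check. A sketch of that idea, using the v4 numbers (the helper names and footprint shape are my assumptions, not the repo's code):

```typescript
// Pick a spawn point at least `clearance` units from every building footprint.
interface Footprint { x: number; z: number; halfSize: number; }

function findSpawn(buildings: Footprint[], cityHalf = 200, clearance = 30,
                   maxTries = 200): { x: number; z: number } | null {
  for (let i = 0; i < maxTries; i++) {
    const x = (Math.random() * 2 - 1) * cityHalf;
    const z = (Math.random() * 2 - 1) * cityHalf;
    const clear = buildings.every(b => {
      // Distance from the candidate to the building's square footprint
      const dx = Math.max(Math.abs(x - b.x) - b.halfSize, 0);
      const dz = Math.max(Math.abs(z - b.z) - b.halfSize, 0);
      return Math.hypot(dx, dz) >= clearance;
    });
    if (clear) return { x, z };
  }
  return null; // caller can relax the clearance and retry
}
```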

Challenge 3: Performance vs. Visual Fidelity

Constraint: Browser game running at 60 FPS with no stutters.

Decisions:

  • Instanced rendering for buildings (✓ massive win)
  • Shadow map resolution: 2048×2048 (sweet spot)
  • Particle count: Dynamic based on system type
  • Fog texture: 512×512 (could go lower, but 256 KB is negligible)
  • LOD system: Not needed (instancing is enough)

Lesson: Profile before optimizing. Instancing alone solved 90% of performance concerns.


Working with Claude Code

This was my first game jam with AI pair programming. Here’s what that looked like:

What Worked Really Well

1. Rapid prototyping

Me: “Add thunder sounds for lightning”

Claude: Complete Web Audio implementation with rumble, crack, and realistic delay

Time saved: 30-60 minutes per feature

2. Bug fixing

Me: “Fireworks particles aren’t cleaning up”

Claude: Identifies the issue, implements proper disposal in filter callback

3. Polish iterations

Me: “Make the breadcrumb trail look nicer”

Claude: Converts from thin lines to glowing tube geometry with particles

What Required Guidance

1. Creative decisions

Claude can implement, but vision is still human. I had to decide:

  • How fog should look (teal with corruption hints)
  • What themes to include (day/dusk/night/neon)
  • Which features to cut when time was tight

2. Bug investigation

Complex visual bugs (like the tree gap) required back-and-forth. Claude would suggest fixes, I’d test, we’d iterate.

3. Performance tuning

Deciding what to optimize required understanding the bottleneck. Claude implemented solutions once I identified the problem.

The Workflow

Typical flow:

  1. I describe what I want (feature or fix)
  2. Claude implements it
  3. I test in the browser (Vite HMR makes this instant)
  4. If issues, I describe what’s wrong
  5. Claude iterates

It’s like having a senior developer who:

  • Never gets tired
  • Remembers every file in the codebase
  • Writes clean, well-commented code
  • Doesn’t argue about architecture decisions

But I’m still the director. The creative vision, gameplay feel, and final polish decisions are mine.


The Numbers

Development Stats:

  • Time: 1.5 intensive days (+ polish sessions)
  • Code: ~3,700 lines of TypeScript
  • Commits: 15+ (after cleaning up the messy ones)
  • External dependencies: 1 (Three.js)
  • External assets: 0 (everything procedural)

Technical Achievements:

  • 300+ procedurally generated buildings
  • 100% procedural audio (zero sound files)
  • 60 FPS on modern hardware
  • ~500 KB bundle size (gzipped)
  • Zero load times (no assets to fetch)

Game Content:

  • 3 difficulty levels
  • 4 dynamic visual themes
  • 5 particle systems
  • 20+ game systems/classes
  • Multiple scoring bonuses
  • Local leaderboard

Key Takeaways

1. Procedural Generation is Powerful

No 3D models meant:

  • Instant iteration (change a parameter, see results)
  • Infinite variety (every city is unique)
  • Tiny bundle size
  • Creative constraints that forced innovation

The visual aesthetic emerged from the limitations. Low-poly geometric buildings with procedural color variation created a distinctive look.

2. Web Audio API is Underrated

Game devs sleep on Web Audio API. Yes, it’s more work than dropping in an MP3. But:

  • Sounds can react to gameplay dynamically
  • Zero licensing concerns
  • No asset management overhead
  • Perfect for game jams where time > polish

Surface-specific footsteps and spatial echo effects make the world feel alive.

3. Vite’s HMR is a Game Changer

The feedback loop is everything in game development. Vite made it:

  • Edit shader → 100ms to see change
  • Adjust physics → instant update
  • Modify UI → no page refresh

Traditional build tools would have killed my momentum.

4. AI Pair Programming Accelerates Flow

Claude Code didn’t replace my skills – it amplified them. I could:

  • Stay in creative flow (no context switching to docs)
  • Iterate faster (implement → test → refine)
  • Tackle ambitious features (spatial audio, custom shaders)
  • Focus on design while AI handles implementation

The result: A scope I would normally consider impossible for a solo jam.

5. Constraints Breed Creativity

“No external assets” forced me to:

  • Master procedural generation
  • Learn Web Audio API deeply
  • Think in primitives and compositions
  • Build systems instead of placing objects

These constraints made the game more interesting, not less.


What I’d Do Differently

1. Test on Different Machines Sooner

I developed on a beefy machine. Didn’t test on lower-end hardware until late. Luckily, instanced rendering meant performance was fine, but that was lucky – not planned.

2. Implement Settings Menu

Audio volume, mouse sensitivity, graphics quality – these should have been in from the start. Players expect them. I shipped without them due to time constraints.

3. Better Spawn Validation from Day One

The fragment spawning issues ate more time than they should have. A robust validation system upfront would have saved iterations.


Unexpected Wins

1. The Breadcrumb Trail

Initially just a debug feature to see where I’d been. Players loved it so much I polished it:

  • Thin lines → Glowing tube geometry
  • Static → Particle sparkles
  • Flat → Elevated with smooth curves

Became one of the game’s signature visual elements.

2. Automatic Theme Cycling

Originally, themes were manual (press T to cycle). Making it automatic with smooth cross-fades created this living, breathing atmosphere. The city feels different every few minutes.
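The cross-fade itself can be as simple as linearly blending each theme's key colours over the fade window. A sketch (the colour values and names are illustrative, not the game's actual palette):

```typescript
// Blend two RGB colours; t ramps 0→1 over the fade window each cycle,
// then the target theme becomes "current" and t resets.
type RGB = [number, number, number];

function lerpColor(a: RGB, b: RGB, t: number): RGB {
  return [a[0] + (b[0] - a[0]) * t,
          a[1] + (b[1] - a[1]) * t,
          a[2] + (b[2] - a[2]) * t];
}

const daySky: RGB = [0.53, 0.81, 0.92];
const duskSky: RGB = [0.98, 0.55, 0.35];
const blended = lerpColor(daySky, duskSky, 0.5); // halfway through the fade
```

Apply the same blend to fog colour, light intensity, and emissive strength, and the whole city drifts between moods instead of snapping.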

3. Procedural Audio Reactions

Making audio react to environment (echo in tight spaces, wind in open areas) was a last-minute addition. It’s subtle but makes the world feel responsive and real.


The Verdict: Did Vibe Coding Work?

Yes.

I shipped a complete 3D exploration game with:

  • ✅ Procedurally generated world
  • ✅ Complete audio design (100% procedural)
  • ✅ Four visual themes with smooth transitions
  • ✅ Multiple particle systems
  • ✅ Professional UI with loading screens, menus, and HUD
  • ✅ Scoring system with bonuses
  • ✅ Local leaderboards
  • ✅ Victory celebration sequence
  • ✅ Comprehensive documentation
  • ✅ Clean codebase (~3,700 lines)

Zero external assets. Zero templates. Just code, creativity, and AI assistance.

Would it have been possible solo without Claude? Sure – but it would have taken a week, not two days. And I would have cut half the features.


For Future Game Jammers

If you’re considering AI-assisted development for your next jam:

Do This:

  • Use AI for implementation, not vision
  • Iterate fast (test → feedback → refine)
  • Let AI handle boilerplate and documentation
  • Stay in flow state (avoid context switching)
  • Focus on creative decisions

Don’t Do This:

  • Accept AI suggestions blindly
  • Skip testing (“it compiled, ship it”)
  • Outsource all problem-solving
  • Forget that you’re still the designer

The Real Benefit

It’s not that AI writes code faster (though it does). It’s that you can stay in creative flow. No googling APIs. No context switching to documentation. No “how do I implement X” rabbit holes.

You think of a feature, describe it, and boom – it exists. Then you playtest, refine, polish.

That’s the game jam superpower.


Play the Game

Live demo: https://furic.github.io/unmask-the-city/

GGJ page: https://globalgamejam.org/games/2026/unmask-city-4

Source code: https://github.com/furic/unmask-the-city

Tech stack docs: See TECH_STACK.md in the repo

Built for Global Game Jam 2026 | Theme: Mask

Developer: Richard Fu / Raw Fun Gaming

AI Pair Programming: Claude Code


Final Thoughts

Game jams are about constraints, creativity, and controlled chaos. This year, I added a new constraint: no external assets. And a new tool: AI pair programming.

The result surprised me. Not just in scope (way bigger than I expected), but in polish. The game feels complete. Menus, sound, particles, themes, documentation – all the things you usually sacrifice in a jam.

Is this the future of solo game development? Maybe. At minimum, it’s a glimpse of how AI tools can amplify individual creativity instead of replacing it.

Would I do it again? Absolutely.

Next jam, I’m going even bigger. 🎮


P.S. – If you’re curious about specific technical implementations (shader code, audio synthesis, procedural generation algorithms), check out the full technical documentation in the repo. It’s a deep dive into every technique used.

