DEV Community

Ben Santora
Watching Entropy Happen: A Go Program

Entropy is the central element in my philosophy. In physics, the Second Law of Thermodynamics describes its one-way direction — from organized to spread out, from structured to diffuse. The universe has a strong preference for disorder, not order. Things fall apart on their own. They don't spontaneously reassemble.

Order is the exception. It has to be built and maintained. Left alone, everything drifts toward disorder. Stars burn through their fuel and go cold. Sandcastles erode. Civilizations eventually scatter. This isn't pessimism — it's just physics. The Second Law of Thermodynamics says that in any closed system, entropy always increases over time.


Why Write Code About It?

Philosophy is great for talking about big ideas, but science wants to measure them. If entropy always increases, can we actually watch that happen? Can we put numbers on it?

That's exactly what a simulation lets us do. Instead of a real gas in a real box, we create a virtual one — a bunch of particles bouncing around in a cube that lives inside the computer. We give each particle a position and a speed, let them bounce off the walls, and measure the total energy in the system over time.

A simulation doesn't replace an experiment — it's a way to make a theory concrete enough to run, inspect, and question.

The goal here is to try to ground this worldview in actual physics. Writing entropy.go is a way of saying: this isn't just a belief, it's something we can model, run, and check against what the equations predict.


Why Go?

There are lots of languages you could write this in. So why Go specifically?

  • explicit — Go does what it says
  • readable — no hidden layers, no magic
  • concurrent — built-in goroutines when you need scale
  • fast — compiles and runs quickly
  • verifiable — you can understand it line by line

Go is a language that does what it says. There are no hidden layers, no framework doing mysterious things behind the scenes. If you write a loop, it runs. If you write a struct, it's just data. You can read Go code and understand it line by line — which is exactly what you want when you're trying to model something you care about understanding. For me, the Go language matches my mindset.

And when you eventually want to simulate millions of particles instead of a thousand, Go has goroutines — a built-in way to run parts of your program in parallel, without a lot of complicated setup. The scale-up path is already there.


What the program does: We create 1,000 invisible particles inside an invisible box, then give each one a random push.

What we watch: Every step, each particle moves and bounces off the walls. At each bounce its velocity is reversed and scaled by 0.9, so it loses a little speed every time. Over 100 steps, the particles slow down.

What this shows: The total energy never increases; it drains away with every bounce. In the sample run below, about 77% of the original energy remained after 100 steps, and running longer drains more. That's entropy in action — movement from order to disorder.

The program doesn't prove the Second Law, but it helps make it concrete. You can change the rules — bouncier walls, more particles — and see what happens. That's the difference between a thought experiment and something you can run.

Entropy gives time a direction. Systems move from ordered states to disordered ones — never the reverse. The simulation mirrors this: it starts ordered, ends disordered, never reverses. That's not opinion — that's what the code does, every time.

NOTE: This isn't a full statistical mechanics simulation — true thermodynamic entropy, as Boltzmann defined it, requires counting the number of possible microstates a system can occupy. But this little program does demonstrate the same irreversible tendency: energy spreads, order decays, and the process never runs backward.

entropy.go - The Go Program

The program models 1,000 particles bouncing around inside a 1×1×1 cube. Here's what happens step by step:

Step 1: Create the particles

Each particle gets a random starting position inside the cube, and a small random velocity — how fast it's moving in each direction (x, y, and z). This is the "ordered" starting state: every particle has a nice, tight velocity distribution.

Step 2: Move every particle

Each step, the program adds the velocity to the position. If a particle hits a wall, it bounces back — but loses a little energy each time (its velocity is multiplied by -0.9 instead of -1.0). That energy loss is the key: it's how disorder creeps in.

Step 3: Measure the total energy

After each step, the program adds up all the kinetic energy in the system — energy of motion. This number gets saved. Over 100 steps, you get a history of how the total energy changes.

Step 4: Report the result

At the end, the program prints the starting energy, the final energy, the ratio between them, and a log of that ratio — which is a simple stand-in for entropy change. If the number goes negative, energy was lost. The system became more disordered.

The output looks something like this:

Initial energy: 5.0231
Final energy:   3.8847
Ratio:          0.7734
Entropy trend:  -0.2573 -> more diffuse

The ratio is less than 1 — energy went down. The log of the ratio is negative — which, in our simplified model, signals that entropy increased. The system moved in exactly the direction physics says it should.


The Code

Here's the complete, runnable Go program:

package main

import (
    "fmt"
    "math"
    "math/rand"
)

type Particle struct {
    x, y, z    float64
    vx, vy, vz float64
}

func kineticEnergy(p Particle) float64 {
    return 0.5 * (p.vx*p.vx + p.vy*p.vy + p.vz*p.vz)
}

func totalSystemEnergy(particles []Particle) float64 {
    var e float64
    for _, p := range particles {
        e += kineticEnergy(p)
    }
    return e
}

func simulate(n int, steps int) []float64 {
    // The "ordered" starting state: random positions in the unit cube,
    // small normally distributed velocities.
    particles := make([]Particle, n)
    for i := 0; i < n; i++ {
        particles[i] = Particle{
            x:  rand.Float64(),
            y:  rand.Float64(),
            z:  rand.Float64(),
            vx: rand.NormFloat64() * 0.1,
            vy: rand.NormFloat64() * 0.1,
            vz: rand.NormFloat64() * 0.1,
        }
    }

    // Each step: drift, then bounce off any wall with a 10% speed loss.
    // (Positions aren't clamped, so a particle can sit just outside the
    // box for a step or two; harmless for this demonstration.)
    history := make([]float64, steps)
    for step := 0; step < steps; step++ {
        for i := 0; i < n; i++ {
            particles[i].x += particles[i].vx
            particles[i].y += particles[i].vy
            particles[i].z += particles[i].vz

            if particles[i].x < 0 || particles[i].x > 1 {
                particles[i].vx *= -0.9
            }
            if particles[i].y < 0 || particles[i].y > 1 {
                particles[i].vy *= -0.9
            }
            if particles[i].z < 0 || particles[i].z > 1 {
                particles[i].vz *= -0.9
            }
        }
        // Record the total kinetic energy after every step.
        history[step] = totalSystemEnergy(particles)
    }
    return history
}

func main() {
    history := simulate(1000, 100)
    fmt.Printf("Initial energy: %.4f\n", history[0])
    fmt.Printf("Final energy:   %.4f\n", history[len(history)-1])
    ratio := history[len(history)-1] / history[0]
    fmt.Printf("Ratio:          %.4f\n", ratio)
    fmt.Printf("Entropy trend:  %.4f -> more diffuse\n", math.Log(ratio))
}

The Key Pieces

A Particle is just a position and a velocity — three numbers for where it is, three for how fast it's moving:

type Particle struct {
    x, y, z    float64
    vx, vy, vz float64
}

Kinetic energy = ½mv². We treat mass as 1, so it's just ½v²:

func kineticEnergy(p Particle) float64 {
    return 0.5 * (p.vx*p.vx + p.vy*p.vy + p.vz*p.vz)
}

The inelastic bounce — particle hits a wall and loses a little energy:

if particles[i].x < 0 || particles[i].x > 1 {
    particles[i].vx *= -0.9  // reverses direction, damps speed by 10%
}

The bounce line is where the physics enters the code. With -1.0 instead (a perfectly elastic bounce), energy would stay constant forever. Scaling the speed by 0.9 cuts the kinetic energy to 0.81 of its previous value, so each wall collision bleeds off about 19% of that particle's energy along that axis. That steady loss is what drives the system toward disorder.


The Takeaway

Entropy can be an overwhelming concept. But it also means order is rare, and that rarity is exactly what makes it worth building.

Run the code yourself with go run entropy.go and watch entropy happen in real time. Then try changing the numbers — more particles, more steps, different damping — and see what happens. That's the beauty of simulation: you can test ideas, not just talk about them.
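One way to run that experiment, as a sketch: expose the damping factor as a parameter and compare runs. The variant below mirrors the loop in entropy.go; the damping values and the fixed seed are my additions, chosen so runs are repeatable.

```go
package main

import (
	"fmt"
	"math/rand"
)

type Particle struct {
	x, y, z    float64
	vx, vy, vz float64
}

// simulateDamped mirrors entropy.go's loop, but takes the damping
// factor as a parameter and seeds its own RNG for repeatability.
// It returns the initial and final total kinetic energy.
func simulateDamped(n, steps int, damping float64, seed int64) (float64, float64) {
	rng := rand.New(rand.NewSource(seed))
	ps := make([]Particle, n)
	for i := range ps {
		ps[i] = Particle{
			x: rng.Float64(), y: rng.Float64(), z: rng.Float64(),
			vx: rng.NormFloat64() * 0.1,
			vy: rng.NormFloat64() * 0.1,
			vz: rng.NormFloat64() * 0.1,
		}
	}
	total := func() float64 {
		var e float64
		for _, p := range ps {
			e += 0.5 * (p.vx*p.vx + p.vy*p.vy + p.vz*p.vz)
		}
		return e
	}
	initial := total()
	for s := 0; s < steps; s++ {
		for i := range ps {
			ps[i].x += ps[i].vx
			ps[i].y += ps[i].vy
			ps[i].z += ps[i].vz
			if ps[i].x < 0 || ps[i].x > 1 {
				ps[i].vx *= -damping
			}
			if ps[i].y < 0 || ps[i].y > 1 {
				ps[i].vy *= -damping
			}
			if ps[i].z < 0 || ps[i].z > 1 {
				ps[i].vz *= -damping
			}
		}
	}
	return initial, total()
}

func main() {
	for _, d := range []float64{0.5, 0.9, 0.99} {
		e0, e1 := simulateDamped(1000, 100, d, 42)
		fmt.Printf("damping %.2f: %5.1f%% of energy remains\n", d, 100*e1/e0)
	}
}
```

The stronger the damping (the further the factor is from 1.0), the less energy survives — the knob and the outcome line up exactly the way the Second Law framing predicts.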


Ben Santora - April 2026 - www.bensantora.com
