DEV Community

Pranjal Sailwal

I couldn’t understand my MRI, so I built something that could

On 3 February 2025, I had an MRI done — and even after sitting through explanations, I still didn’t really get what I was looking at. There was a lot of data, a lot of terminology, and almost no way to explore it on my own.

That stuck with me.

So after things settled, I built NeuroTract.

The idea was simple: take diffusion MRI data and turn it into something I could actually interact with. Not just reports or static visuals, but something where you can follow pathways, see how regions connect, and get a feel for what's happening instead of guessing.

It processes the scan data step by step and produces two things I found useful:

  • a 3D view where you can rotate and inspect white matter pathways
  • a network view where the brain is treated like a graph (regions and connections)

That second part changed how I looked at it. Instead of thinking “this is a scan,” it became “this is a network,” with hubs, clusters, and connections you can actually reason about.
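To make that "brain as a network" idea concrete, here's a tiny Python sketch of treating regions as nodes and connections as weighted edges. The region names and matrix values are made up for illustration; a real connectome would come from counting tractography streamlines between parcellated regions, and this isn't NeuroTract's actual code.

```python
# Hypothetical 4-region connectivity matrix: conn[i][j] is the
# number of streamlines linking region i and region j (symmetric;
# values invented for illustration).
regions = ["A", "B", "C", "D"]
conn = [
    [0, 12, 8, 5],
    [12, 0, 3, 0],
    [8, 3, 0, 9],
    [5, 0, 9, 0],
]

def degree(conn, i):
    """Count how many other regions a given region connects to."""
    return sum(1 for w in conn[i] if w > 0)

# Regions with the most connections act as "hubs" in the network.
hubs = sorted(regions, key=lambda r: degree(conn, regions.index(r)), reverse=True)
print(hubs[0])  # prints "A": it connects to all three other regions
```

Once the scan is in this form, questions like "which regions are hubs?" or "which regions cluster together?" become ordinary graph queries instead of something you squint at in a viewer.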

I also kept the interface flexible on purpose — same data, but you can look at it in a simpler way or go deeper depending on how much you want to understand.

This wasn’t built to replace medical advice. It was built because I wanted to understand what I was being shown, without feeling lost.

If you’ve ever seen an MRI report and felt the same, you’ll get why I made this.

Repo: NeuroTract

Open to remote tech consulting, SDE, and Senior Developer roles.
