Heap dump analysis is one of those tasks that usually pulls you out of your editor. You generate an .hprof file, open a separate tool, wait for it to parse, and then navigate an unfamiliar UI to find what's eating memory.
I built HeapLens to bring that workflow into VS Code. It's a free, open-source extension that lets you open .hprof files directly in your editor and get interactive analysis — no context switching, no separate tool.
As of today, HeapLens is the only VS Code extension on the marketplace that does heap dump analysis.
What it actually does
You open a .hprof file in VS Code. HeapLens takes over with a custom editor and a Rust-powered backend that memory-maps the file and starts analyzing. Within seconds, you get a multi-tab interface:
Overview — Heap statistics at a glance with a treemap visualization of your largest objects. You immediately see where memory is going.
Class Histogram — Every class in the heap, sortable by instance count, shallow size, or retained size. Click a class to drill into individual instances.
Dominator Tree — The hierarchy of object ownership. Expand nodes to explore who's holding onto memory and why. Each node shows retained size bars so you can visually spot the heaviest branches.
Leak Suspects — Objects or classes retaining a disproportionate share of the heap, flagged automatically. You can adjust the detection threshold.
Waste Analysis — HeapLens detects duplicate strings, empty collections, over-allocated arrays, and boxed primitives — the kind of low-hanging fruit that quietly bloats your heap.
Compare — Load two heap dumps and diff them. See which classes grew, which shrank, which leak suspects appeared or resolved. Export the diff as markdown or CSV.
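To make "retained size" concrete: once a dominator tree exists, an object's retained size is its own shallow size plus the retained sizes of everything it dominates — i.e., everything that would become garbage if that object were collected. Here is a minimal sketch of that bottom-up accumulation over a parent-array dominator tree (hypothetical types and names, not HeapLens's actual code):

```rust
// Dominator tree as a parent array: dom[i] is the immediate dominator
// of object i, and the root dominates itself. Retained size of i =
// shallow size of i + retained sizes of the objects i immediately
// dominates, so we accumulate from the deepest nodes upward.
fn retained_sizes(dom: &[usize], shallow: &[u64]) -> Vec<u64> {
    let n = dom.len();
    let mut retained = shallow.to_vec();

    // Depth of each node, so children can be processed before parents.
    let mut depth = vec![0usize; n];
    for i in 0..n {
        let (mut d, mut j) = (0, i);
        while dom[j] != j {
            d += 1;
            j = dom[j];
        }
        depth[i] = d;
    }

    // Visit deepest nodes first, folding each subtree into its parent.
    let mut order: Vec<usize> = (0..n).collect();
    order.sort_by(|a, b| depth[*b].cmp(&depth[*a]));
    for i in order {
        if dom[i] != i {
            retained[dom[i]] += retained[i];
        }
    }
    retained
}
```

This is the quantity behind the retained-size bars in the dominator tree view: a node's bar reflects its whole subtree, which is why the heaviest branches stand out visually.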
AI-assisted analysis
HeapLens has a built-in chat tab where you can ask questions about your heap dump in plain English. It works with 10 LLM providers — Anthropic, OpenAI, Google Gemini, Ollama for local models, and others. You bring your own API key (or use Ollama for free).
The chat isn't just a wrapper around a prompt. When the LLM suggests a query, you can run it directly against your heap data from the chat interface. It bridges the gap between "I think there's a leak in my connection pool" and actually finding the objects responsible.
There's also a Copilot Chat integration if you use GitHub Copilot — you can @heaplens /leaks right from the Copilot panel.
Why Rust
The backend is written in Rust with async I/O. It memory-maps HPROF files and uses a compressed sparse row (CSR) graph representation for the object reference graph. On my M2 Mac, it parses a 1.5 GB heap dump in under a second.
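The appeal of CSR for a heap graph is that the entire reference graph lives in two flat arrays — one pass of counting, one pass of filling — so traversal is cache-friendly and there are no per-object allocations. A simplified sketch of the idea (hypothetical field and function names, not HeapLens's internal layout):

```rust
/// Compressed sparse row (CSR) adjacency: the outgoing references of
/// object i are targets[offsets[i]..offsets[i + 1]]. Two flat arrays
/// replace millions of tiny per-object vectors.
struct RefGraph {
    offsets: Vec<u32>, // length = object count + 1
    targets: Vec<u32>, // length = total reference count
}

impl RefGraph {
    /// Build from an unordered (source, target) edge list:
    /// a counting pass fixes the offsets, a second pass scatters edges.
    fn from_edges(num_objects: usize, edges: &[(u32, u32)]) -> Self {
        let mut offsets = vec![0u32; num_objects + 1];
        for &(src, _) in edges {
            offsets[src as usize + 1] += 1;
        }
        for i in 0..num_objects {
            offsets[i + 1] += offsets[i]; // prefix sums -> row starts
        }
        let mut cursor = offsets.clone();
        let mut targets = vec![0u32; edges.len()];
        for &(src, dst) in edges {
            targets[cursor[src as usize] as usize] = dst;
            cursor[src as usize] += 1;
        }
        RefGraph { offsets, targets }
    }

    /// Outgoing references of object `i`, as a contiguous slice.
    fn refs(&self, i: u32) -> &[u32] {
        let (a, b) = (
            self.offsets[i as usize] as usize,
            self.offsets[i as usize + 1] as usize,
        );
        &self.targets[a..b]
    }
}
```

Combined with memory-mapping the HPROF file itself, this kind of layout is what makes graph-wide passes (dominators, retained sizes, leak detection) feasible on multi-gigabyte dumps.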
Performance matters here because heap dumps from real production systems are big — often 2-10 GB. A tool that takes minutes to open a file doesn't get used. HeapLens is designed to feel instant for the files you actually encounter.
Who this is for
If you're a Java developer who works with heap dumps — whether you're debugging an OutOfMemoryError in production, profiling memory usage during development, or comparing heap snapshots across releases — HeapLens is built to make that work faster without leaving your editor.
It also supports Android HPROF files from adb, so Android developers can use it too.
Try it
Install from the VS Code Marketplace or Open VSX. It's free and open source.
If you work with heap dumps and have feedback, I'd appreciate hearing from you — open an issue on GitHub or drop a comment below.
HeapLens is open source under the Apache 2.0 license. If you find it useful, a star on GitHub or a rating on the marketplace goes a long way.