The Virtual DOM has shaped frontend development for more than a decade, but its coarse-grained reconciliation model introduces unnecessary performance overhead. An alternative is emerging: fine-grained reactivity powered by signals. To explore this, I built a benchmark suite comparing two identical applications — one with React (Virtual DOM) and one with Solid.js (signals) — across six common scenarios. The results were striking: signals reduced DOM mutations by up to 99.9%, lowered heap usage by more than 70%, and cut update latency by as much as 94%. These findings suggest that signal-based reactivity isn’t just an optimization, but a fundamental architectural evolution in how we build modern user interfaces.
Virtual DOM: Today’s Standard, Tomorrow’s Bottleneck
The Virtual DOM (VDOM) was popularized by React in 2013 and quickly adopted by frameworks like Vue. Instead of updating the browser’s DOM directly, frameworks maintain an in-memory tree that represents the UI. Whenever state changes, the framework re-renders the affected components, produces a new virtual tree, and reconciles it against the previous one, applying only the differences to the real DOM. This abstraction made large-scale, declarative UI development practical — and it became the dominant model across the industry.
But like any abstraction, the Virtual DOM comes with hidden costs:
- Re-render by assumption: On every state change, a component and its descendants re-run, even if only a single binding was relevant.
- Unknown change scope: Because the framework can’t see which binding changed, it regenerates and diffs entire subtrees “just in case.”
- Reconciliation overhead: Tree diffing is O(n) relative to the size of the subtree, not O(1) to the actual change.
- Developer optimization burden: Without tools like useMemo, useCallback, or manual component splitting, entire trees re-render unnecessarily. Many new React developers are surprised to learn that updating a parent state can re-render all children, even if those children don’t depend on that state.
This approach works well enough for small apps, but at scale it often leads to thousands of redundant DOM mutations, excessive garbage collection, and long main-thread tasks that harm responsiveness.
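To make the cost concrete, here is a toy model of tree diffing in plain JavaScript. This is a hypothetical sketch (the `h` and `diff` helpers are invented for illustration, far simpler than React's fiber reconciler), but it shows the structural point: even when a single row changed, the diff must visit every node to find it.

```javascript
// Toy virtual-DOM model: a node is just a tag plus children.
function h(tag, children = []) { return { tag, children }; }

// Naive diff: visits every node "just in case", collecting patches.
function diff(oldNode, newNode, visited = { count: 0 }) {
  visited.count += 1; // every node is inspected, changed or not
  const patches = [];
  if (oldNode.tag !== newNode.tag) patches.push({ type: "replace" });
  const len = Math.max(oldNode.children.length, newNode.children.length);
  for (let i = 0; i < len; i++) {
    const a = oldNode.children[i], b = newNode.children[i];
    if (a && b) patches.push(...diff(a, b, visited));
    else patches.push({ type: a ? "remove" : "insert" });
  }
  return patches;
}

// A 1,000-row table in which exactly one row differs.
const rows = (n, changed = -1) =>
  h("table", Array.from({ length: n }, (_, i) =>
    h(i === changed ? "tr-updated" : "tr")));

const visited = { count: 0 };
const patches = diff(rows(1000), rows(1000, 0), visited);
// One patch is emitted, but 1,001 nodes were visited to find it.
```

The work is O(n) in the subtree size even though the actual change is O(1) — precisely the mismatch the benchmarks below quantify.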
Signals: A Fine-Grained Alternative
Signals flip the Virtual DOM model on its head. Instead of assuming that “everything might have changed,” they operate on a much simpler principle: only update exactly what did change.
A signal is a small reactive primitive that stores a value and tracks the code or DOM bindings that read it. When the signal updates, only those specific consumers re-run. There’s no global re-render, no tree diffing, and no wasted work. Updates are constant-time and fully localized.
This reflects a fundamental mental shift in how we think about reactivity:
- Virtual DOM: “We don’t know what changed, so let’s re-render the component tree and reconcile it to check all bindings.”
- Signals: “We know exactly what changed, so we update only those bindings that depend on that signal.”
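The signal idea fits in a few lines of plain JavaScript. The sketch below is illustrative only — the names `createSignal` and `createEffect` mirror Solid's API, but this is not Solid's actual implementation, which handles batching, cleanup, and re-subscription far more carefully.

```javascript
// Minimal signal sketch: a signal stores a value plus the set of
// subscriber functions that have read it.
let currentSubscriber = null;

function createSignal(value) {
  const subscribers = new Set();
  const read = () => {
    // Dependency edge is recorded at read time.
    if (currentSubscriber) subscribers.add(currentSubscriber);
    return value;
  };
  const write = (next) => {
    value = next;
    // Only recorded subscribers re-run; nothing else is touched.
    for (const fn of [...subscribers]) fn();
  };
  return [read, write];
}

function createEffect(fn) {
  currentSubscriber = fn;
  fn(); // first run records which signals this effect reads
  currentSubscriber = null;
}

// Two signals, one effect. Only writes to `count` re-run the effect.
const [count, setCount] = createSignal(0);
const [name, setName] = createSignal("Ada");

let effectRuns = 0;
createEffect(() => {
  count();        // establishes the count → effect dependency
  effectRuns += 1;
});

setCount(1);      // effect re-runs: effectRuns === 2
setName("Grace"); // no dependents of `name`: effectRuns unchanged
```

There is no tree to diff: the write knows its exact consumers, so the update cost is proportional to the number of dependents, not the size of the UI.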
Solid.js embodies this model end-to-end, showing what’s possible when fine-grained reactivity is the default. Angular has also begun a major transition, introducing signals in v16, stabilizing them further in v20, and explicitly positioning them as the foundation of its future reactivity system. The momentum is clear: signals aren’t just an experiment — they’re shaping up to be the next standard.
Internal Mechanics Compared
Virtual DOM (React, Vue)
When state changes, a Virtual DOM framework kicks off a render phase: component functions re-run to produce a new virtual tree. A reconciliation algorithm then compares this new tree to the previous one, and a commit phase applies the minimal set of patches to the real DOM.
Keys (key in React/Vue) help guide reconciliation so lists aren’t torn down unnecessarily. To keep things efficient, developers often need to split components into smaller pieces and use memoization (useMemo, useCallback, etc.) to avoid excessive re-renders.
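What keyed reconciliation computes can be sketched as a set comparison. The `diffKeys` helper below is hypothetical and drastically simplified — real algorithms in React and Vue also compute minimal move sequences — but it shows why stable keys let existing DOM nodes be reused rather than torn down.

```javascript
// Simplified keyed-list diff: classify each key so nodes that survived
// the update are reused instead of destroyed and recreated.
function diffKeys(oldKeys, newKeys) {
  const oldSet = new Set(oldKeys);
  const newSet = new Set(newKeys);
  return {
    create: newKeys.filter((k) => !oldSet.has(k)), // mount new nodes
    remove: oldKeys.filter((k) => !newSet.has(k)), // unmount stale nodes
    keep:   newKeys.filter((k) => oldSet.has(k)),  // reuse (maybe move)
  };
}

// Reordering with stable keys plus one addition:
const ops = diffKeys(["a", "b", "c"], ["c", "a", "b", "d"]);
// ops.create === ["d"], ops.remove === [], ops.keep === ["c", "a", "b"]
```

Without keys (or with index keys under reordering), the same update degrades into wholesale removals and creations — one source of the mutation counts measured later.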
Signals (Solid, Angular Signals)
Signals work at a much finer level of granularity. Each signal stores a value and tracks the computations that depend on it. When a computation (like a DOM text binding) reads a signal, a dependency edge is recorded.
When that signal updates, only the dependent computations re-run, updating the exact DOM nodes or values they control. No virtual tree is generated, no reconciliation pass is needed. Correctness is guaranteed by design: only the parts of the UI that actually changed get updated.
A Quantitative Benchmark Suite for Comparative Analysis
Philosophically and technically, the Virtual DOM and signals represent very different approaches. But how do they perform in practice? To find out, I built two identical applications — a React dashboard (Virtual DOM) and a Solid.js dashboard (signals) — and measured their behavior under controlled workloads.
The benchmarking harness, built with Puppeteer, simulated six real-world scenarios: filtering, incremental updates, bulk insertions, bulk removals, sorting, and idle time. For each scenario, I collected metrics on DOM mutations, update latency, heap size, and long task duration. Every test was repeated 10 times per framework, for a total of 120 runs.
A key methodological choice was to benchmark React in its default state — without optimizations like useMemo or useCallback. This isolates the architectural efficiency of the Virtual DOM itself, rather than the skill of the developer applying micro-optimizations. Solid.js, in turn, was built with its native signal primitives (createSignal, createMemo), with no Virtual DOM overhead. Both implementations shared the same UI, styling, and data structures (filters, KPI header, 10k-row grid, sorting, and log).
Testing Scenarios:
- S1_FILTER – Region Filter Change
  - Operation: Apply a single dropdown filter.
  - Purpose: Measure simple filtering and re-rendering.
- S2_UPDATE_1PCT – Incremental Updates
  - Operation: 50 consecutive 1% row updates at 100ms intervals.
  - Purpose: Test continuous updates under frequent state changes.
- S3_INSERT_1K – Bulk Insertion
  - Operation: Insert 1,000 rows into the grid.
  - Purpose: Measure large-scale DOM expansion.
- S4_REMOVE_1K – Bulk Removal
  - Operation: Remove 1,000 rows.
  - Purpose: Measure large-scale DOM reduction.
- S5_SORT_COL – Column Sorting
  - Operation: Sort the price column 5 times consecutively.
  - Purpose: Stress-test dataset reordering.
- S6_IDLE_30S – Idle Monitoring
  - Operation: Let the app sit idle for 30 seconds.
  - Purpose: Detect memory leaks and measure baseline behavior.
Metrics Collected
- DOM Mutations (via MutationObserver)
- Update Latency (time from action → settled DOM)
- Heap Size (MB, via Chrome Memory API)
- Long Tasks (count of >50ms blocks)
- Total Long Task Duration (aggregate ms blocked)
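The per-run samples were reduced to the median and P95 figures reported below. A sketch of that aggregation step (the `percentile` and `summarize` helpers are illustrative; the harness's actual code is not shown in this article) looks like this:

```javascript
// Nearest-rank percentile over a list of numeric samples.
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  const idx = Math.min(
    sorted.length - 1,
    Math.ceil((p / 100) * sorted.length) - 1
  );
  return sorted[Math.max(0, idx)];
}

// Collapse one scenario's runs into the two reported statistics.
function summarize(samples) {
  return {
    median: percentile(samples, 50),
    p95: percentile(samples, 95),
  };
}

// Example: 10 latency samples (ms) from a single scenario.
const latencies = [1010, 1035, 990, 1100, 1323, 1005, 1042, 1060, 1028, 1017];
const stats = summarize(latencies);
```

Median plus P95 is a deliberate pairing: the median shows typical behavior, while P95 exposes tail spikes like the 15.36 s removal outlier discussed below.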
Benchmark Results
The data clearly shows the performance implications of both models. The quantitative results for each metric follow.
Update Latency
- React median latencies ranged from 1,035 ms to 8,298 ms.
- Solid.js reduced this to between 473 ms and 3,083 ms.
- The largest gap appeared in continuous updates (S2_UPDATE_1PCT): Solid.js was 93.5% faster.
Long Task Count & Duration
- React often generated dozens of long tasks, with blocking durations over 8s.
- Solid.js kept this to 1–2 tasks with durations under 1s in most scenarios.
Memory Usage
- React’s heap usage peaked at 2.4 GB during idle.
- Solid.js stayed around 675 MB.
- Consistently, Solid.js reduced heap usage by 70–75%.
DOM Mutations
- React generated tens of thousands of DOM mutations per operation (e.g., 101,997 for sorting).
- Solid.js reduced this to single digits (7).
- Overall, Solid.js achieved a 99.9% reduction in DOM mutations.
Results
Sample Size
10 runs × 6 scenarios × 2 frameworks = 120 total measurements.
Executive Summary
- DOM Mutations: Solid.js reduced mutations by 99.9% vs React.
- Memory Usage: Solid.js consumed 70–75% less heap memory.
- Update Latency: Solid.js improved operation speed by 30–94%.
- Long Tasks: Solid.js reduced blocking task frequency and duration by up to 98%.
DOM Mutations Comparison
| Scenario | React (Median) | Solid (Median) | Improvement |
| --- | --- | --- | --- |
| S1_FILTER | 10,007 | 3 | 3,336× fewer mutations |
| S2_UPDATE_1PCT | 25,168 | 52 | 484× fewer mutations |
| S3_INSERT_1K | 11,007 | 3 | 3,669× fewer mutations |
| S4_REMOVE_1K | 11,010 | 3 | 3,670× fewer mutations |
| S5_SORT_COL | 101,997 | 7 | 14,571× fewer mutations |
| S6_IDLE_30S | 10,005 | 2 | 5,003× fewer mutations |
Insight: Solid achieves near-minimal DOM mutations, while React's reconciliation triggers tens of thousands - even when idle.
Update Latency
| Scenario | React Median | React P95 | Solid Median | Solid P95 | Improvement |
| --- | --- | --- | --- | --- | --- |
| S1_FILTER | 1,035 ms | 1,323 ms | 473 ms | 560 ms | 54.3% faster |
| S2_UPDATE_1PCT | 8,298 ms | 9,371 ms | 541 ms | 669 ms | 93.5% faster |
| S3_INSERT_1K | 1,554 ms | 1,653 ms | 759 ms | 857 ms | 51.2% faster |
| S4_REMOVE_1K | 1,361 ms | 15,360 ms | 661 ms | 837 ms | 51.4% faster |
| S5_SORT_COL | 4,948 ms | 5,688 ms | 3,083 ms | 3,358 ms | 37.7% faster |
| S6_IDLE_30S | 1,036 ms | 1,091 ms | 720 ms | 973 ms | 30.5% faster |
Insight: The biggest gap is in continuous updates (S2): Solid is ~15× faster. React's P95 latency spikes dramatically in removals (15.36 s).
Memory Consumption (Heap)
| Scenario | React Median | React P95 | Solid Median | Solid P95 | Reduction |
| --- | --- | --- | --- | --- | --- |
| S1_FILTER | 264.68 MB | 441.04 MB | 75.80 MB | 121.55 MB | 71.4% less |
| S2_UPDATE_1PCT | 748.22 MB | 955.88 MB | 199.74 MB | 248.16 MB | 73.3% less |
| S3_INSERT_1K | 1,235.10 MB | 1,417.45 MB | 322.25 MB | 374.85 MB | 73.9% less |
| S4_REMOVE_1K | 1,659.76 MB | 1,817.85 MB | 441.23 MB | 485.65 MB | 73.4% less |
| S5_SORT_COL | 2,133.75 MB | 2,329.08 MB | 559.64 MB | 605.78 MB | 73.8% less |
| S6_IDLE_30S | 2,472.70 MB | 2,568.36 MB | 675.05 MB | 722.94 MB | 72.7% less |
Insight: Solid uses roughly ¼ the memory of React across all scenarios.
Long Task Performance
| Scenario | React Tasks | React Duration | Solid Tasks | Solid Duration | Reduction |
| --- | --- | --- | --- | --- | --- |
| S1_FILTER | 2 | 1,035 ms | 1 | 473 ms | 50% fewer |
| S2_UPDATE_1PCT | 51 | 8,298 ms | 1 | 541 ms | 98% fewer |
| S3_INSERT_1K | 2 | 1,554 ms | 2 | 759 ms | 51% shorter |
| S4_REMOVE_1K | 2 | 1,361 ms | 1 | 661 ms | 50% fewer |
| S5_SORT_COL | 6 | 4,948 ms | 7 | 3,083 ms | ≈38% shorter |
| S6_IDLE_30S | 1 | 1,036 ms | 1 | 720 ms | 31% shorter |
Insight: Continuous updates (S2) show the most dramatic difference - React blocked the main thread with 51 long tasks, Solid reduced this to 1.
Architectural Deep Dive: Why Signals Outperform the Virtual DOM
1. Virtual DOM: A Useful but Costly Abstraction
The Virtual DOM (VDOM) transformed frontend development by making declarative UI practical. Frameworks like React and Vue popularized the idea of rendering an in-memory tree and then reconciling it against the previous tree to update only the differences in the real DOM. This two-step cycle—render and reconcile—solves complexity at scale but introduces structural inefficiencies.
When a root state changes, React often re-renders entire component subtrees, even if only a single binding actually changed. Reconciliation is O(n) relative to subtree size, not O(1) to the real change. To mitigate this, developers reach for micro-optimizations like useMemo, useCallback, or React.memo. But in practice, many engineers overlook these patterns, leading to widespread full-tree re-renders and performance surprises. The result is a model that works, but only with careful tuning.
2. Signals: Surgical Update Precision
Signals flip the model. Instead of regenerating trees and diffing them, a signal tracks exactly which computations or DOM bindings read its value. When the signal updates, only those dependents re-run. No re-renders, no reconciliation, and no wasted work.
This makes updates constant-time, memory-efficient, and predictable. Developers think in terms of what changed rather than how to optimize updates. Hooks like useMemo or dependency arrays become unnecessary; the framework itself guarantees precision.
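Derived values follow the same principle. The sketch below is a toy illustration of the idea behind Solid's createMemo, not its real implementation (`createMemoLike` is an invented name): the derived computation caches its result and recomputes only when a signal it reads changes, never on read.

```javascript
// Toy signal core (same idea as earlier sketches, kept self-contained).
let currentSubscriber = null;

function createSignal(value) {
  const subs = new Set();
  const read = () => {
    if (currentSubscriber) subs.add(currentSubscriber);
    return value;
  };
  const write = (next) => {
    value = next;
    for (const f of [...subs]) f();
  };
  return [read, write];
}

// Derived value: recomputes only when a dependency is written.
function createMemoLike(fn) {
  let cached;
  let recomputes = 0;
  const recompute = () => { cached = fn(); recomputes += 1; };
  currentSubscriber = recompute;
  recompute(); // initial run records dependencies
  currentSubscriber = null;
  return { read: () => cached, stats: () => recomputes };
}

const [price, setPrice] = createSignal(10);
const [qty, setQty] = createSignal(3);
const total = createMemoLike(() => price() * qty());

// Repeated reads are cache hits; only writes trigger recomputation.
total.read();
total.read();
setQty(4); // recomputes once: total.read() === 40
```

Compare this with React, where the equivalent caching requires the developer to hand-maintain a `useMemo` dependency array; here the dependency graph is discovered automatically at read time.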
3. Developer Experience: Simplicity by Default
Signals also simplify the mental model.
- No need to sprinkle memoization hooks or refactor components to avoid redundant renders.
- UI updates are localized: change a signal, and only its dependents update.
- Code reflects intent directly, instead of defensive optimization.
In signal-based frameworks, dependency-bookkeeping hooks like useEffect, useMemo, and useCallback are largely unnecessary, reducing both complexity and cognitive load.
4. Ecosystem Momentum
The shift toward signals is already underway.
- Solid.js was built from the ground up on fine-grained reactivity, avoiding a Virtual DOM entirely.
- Angular introduced signals in v16, is stabilizing them in v20, and has positioned them as the foundation of its future reactivity model.
- Preact Signals and frameworks like Qwik are also leaning heavily on fine-grained reactivity.
Momentum is building across the ecosystem, suggesting signals are more than a niche experiment—they are becoming the expected baseline.
5. Benchmarks in Practice
The architectural differences show up clearly in benchmarks:
- S2_UPDATE_1PCT – Continuous Updates
  - React: ~25k DOM mutations, 51 long tasks, 8.3s median latency.
  - Solid: ~52 mutations, completes in ~541ms.
  - Insight: React re-renders subtrees; signals update only the affected rows.
- S5_SORT_COL – Heavy Reordering
  - React: ~102k mutations.
  - Solid: 7 mutations.
  - Insight: Signals reorder efficiently without rebuilding or diffing the table.
- S6_IDLE_30S – Idle State
  - React: still ~10k background mutations.
  - Solid: near-perfect stability.
  - Insight: Signals avoid unnecessary background churn, delivering true idleness.
6. Scientific Rigor and Limitations
- Sample size: 10 independent runs per scenario (120 total).
- Reproducibility: Deterministic datasets, public code.
- Metrics: DOM, latency, memory, and long tasks measured precisely with browser APIs.
Limitations:
- Tests were desktop Chromium only (no mobile browsers).
- React was benchmarked without optimizations like useMemo (a choice reflecting common real-world usage, not best-case).
- No virtualization was applied; the goal was to measure reactivity overhead, not rendering tricks.
Final Thoughts
The data points to a clear trend: fine-grained signals deliver orders-of-magnitude improvements in DOM efficiency, memory use, and latency compared to the Virtual DOM—especially in data-heavy scenarios where reconciliation overhead dominates.
React and other VDOM-based frameworks remain influential and have shaped the last decade of frontend development. But as our benchmarks show, frameworks like Solid.js—and now Angular with its signal adoption—achieve much greater precision by default. They eliminate unnecessary mutations, cut memory use by more than two-thirds, and deliver idle states that are actually idle.
Perhaps most importantly, signals achieve this without placing the optimization burden on developers. Performance is the default, not something unlocked through careful memoization.
Signals represent more than a micro-optimization. They are an architectural evolution—simplifying developer experience while unlocking performance headroom that will matter even more as applications grow richer, more data-driven, and more mobile. The question is no longer whether signals work, but how quickly teams and frameworks across the ecosystem embrace them.