We use React every day, but many of us haven't paid enough attention to one of its most important aspects: how React has evolved since v16 and how its concurrent model actually works.
The Great Rewrite of v16
Before we talk about Lanes, we must acknowledge the foundation. In 2017, React v16 introduced Fiber, a complete rewrite of the reconciliation engine. Before Fiber, React used a "stack-based" reconciler that processed updates synchronously - once a render started, it couldn't be stopped until it finished.
Fiber changed the game by introducing a linked-list tree structure that allowed React to break rendering into small units of work. This was the birth of Incremental Rendering: the ability to pause work, yield to the browser for a high-priority task (like a user click), and then resume.
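As a rough mental model (illustrative only; real fibers carry many more fields), each fiber is a plain object linked to its first child, its next sibling, and its parent, which is what lets React stop after any single unit of work and pick up where it left off:

const fiberNode = {
  type: 'div',     // component function/class or host element tag
  child: null,     // first child fiber
  sibling: null,   // next sibling fiber
  return: null,    // parent fiber (the "return" pointer)
  lanes: 0,        // pending work scheduled directly on this fiber
  childLanes: 0,   // pending work somewhere in this fiber's subtree
};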
From Expiration Times to Lanes
Initially, Fiber used a linear "Expiration Time" model to prioritize work: the longer an update waited, the higher its priority. However, this model struggled with "IO-bound" updates where different tasks needed to be handled in separate "streams" without blocking each other.
In 2026, we don't just use Fiber; we use React Lanes. This is a 31-bit bitmask system that allows React to manage multiple updates with granular precision. Instead of a single "time to expire," React assigns updates to specific lanes (like a highway), allowing it to:
- Interrupt a low-priority "Transition Lane" (e.g., loading a search result) to process a "Sync Lane" (e.g., the user typing).
- Merge tasks to ensure that the UI remains consistent across disparate state changes.
But why bitmasks? Because they let React use bitwise operators (&, |, ~) to merge or filter updates in constant time, O(1), rather than iterating through arrays (a small sketch follows the list below):
- OR (|) for merging: When multiple state updates occur, React doesn't store them in an array. It merges them into a single pendingLanes integer using a | b.
- AND (&) for filtering: To check whether a specific work cycle includes a particular priority, React uses (pendingLanes & SyncLane) !== NoLane.
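To make that bit math concrete, here is a tiny, self-contained sketch. The constants are illustrative (React's real lane values live in ReactFiberLane.js and span the full 31 bits), but the operations are exactly the ones described above:

const NoLane          = 0b0000;
const SyncLane        = 0b0001;
const TransitionLane1 = 0b0100;

// Merging: two pending updates collapse into one integer with a single OR.
let pendingLanes = NoLane;
pendingLanes |= SyncLane;
pendingLanes |= TransitionLane1;

// Filtering: does the pending work include urgent (sync) work?
const hasSyncWork = (pendingLanes & SyncLane) !== NoLane; // true

// Clearing: once the sync work has committed, drop its bit.
pendingLanes &= ~SyncLane;
console.log(hasSyncWork, pendingLanes === TransitionLane1); // true true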
Because updates are grouped, React can pause a low-priority render (like a heavy data list) to handle a high-priority "SyncLane" event immediately. Once the urgent work is done, it resumes the lower-priority work using the same lane state.
A common issue with priority-based systems is starvation, where low-priority tasks (like an idle background task) never run because the user is constantly interacting with high-priority tasks (SyncLanes).
- React tracks how long each lane has been pending. If a lane remains in the pendingLanes mask for too long, it "expires".
- React then automatically merges that expired lane into the next SyncLane pass. This "promotes" the task, forcing it to render even if higher-priority work is still coming in, ensuring the UI eventually becomes consistent (a rough sketch of this idea follows below).
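Here is a loose sketch of that anti-starvation logic. The names are loosely modeled on React's internals (markStarvedLanesAsExpired, expirationTimes), but the timeout value, the Map-based bookkeeping, and the details are simplifications, not the actual implementation:

const NoLane = 0;
const LANE_TIMEOUT_MS = 5000; // illustrative deadline, not React's real values

function markStarvedLanesAsExpired(root, currentTime) {
  let lanes = root.pendingLanes;
  while (lanes !== NoLane) {
    const lane = lanes & -lanes; // isolate the lowest pending bit
    const deadline = root.expirationTimes.get(lane);
    if (deadline === undefined) {
      // First time this lane shows up as pending: stamp a deadline for it.
      root.expirationTimes.set(lane, currentTime + LANE_TIMEOUT_MS);
    } else if (deadline <= currentTime) {
      // The lane has starved long enough: force it into the next sync pass.
      root.expiredLanes |= lane;
    }
    lanes &= ~lane; // move on to the next pending lane
  }
}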
Hierarchical Propagation (childLanes)
Merging isn't just about combining updates; it's also about signaling. When a child component has a pending update, its lane is merged into its parent's childLanes field.
This propagates all the way up to the Root Fiber.
During a render pass, if a parent component's lanes and childLanes bitwise intersection with the current "render lane" is zero (NoLanes), React can instantly skip (bail out) the entire subtree. It knows with 100% certainty that no urgent work exists below that point.
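A minimal sketch of that bail-out check, using the simplified fiber shape from earlier (illustrative; React's actual logic lives in beginWork and its bail-out path):

const NoLanes = 0;

function includesSomeLane(a, b) {
  return (a & b) !== NoLanes;
}

function canSkipSubtree(fiber, renderLanes) {
  // No work on this fiber AND no work anywhere below it for the lanes
  // being rendered right now -> React can reuse the whole subtree as-is.
  return (
    !includesSomeLane(fiber.lanes, renderLanes) &&
    !includesSomeLane(fiber.childLanes, renderLanes)
  );
}

// Example: a subtree with only a pending transition is skipped entirely
// while React renders an urgent SyncLane pass.
const SyncLane = 0b0001;
const TransitionLane1 = 0b0100;
console.log(canSkipSubtree({ lanes: 0, childLanes: TransitionLane1 }, SyncLane)); // true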
The Hierarchy of Lanes
React defines several categories of lanes to handle different types of user and system work (a simplified sketch of these constants follows the list):
- SyncLane: Highest priority. Reserved for discrete user events like clicks and key presses that require immediate feedback.
- InputContinuousLane: Used for continuous interactions like scrolling or mouse movement.
- DefaultLane: The standard priority for general setState updates.
- TransitionLanes: A set of 16 different lanes specifically for useTransition and startTransition. These are interruptible and can be processed in parallel.
- IdleLane: Lowest priority for background tasks or offscreen content.
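Roughly, the hierarchy maps to bit positions like this: a lower bit means a higher priority. The values below are simplified and illustrative; the real 31-bit constants live in React's ReactFiberLane.js and change between versions:

const SyncLane            = 0b0000000000000001; // discrete events: click, keydown
const InputContinuousLane = 0b0000000000000100; // continuous events: scroll, pointermove
const DefaultLane         = 0b0000000000010000; // ordinary setState updates
const TransitionLanes     = 0b0111111111100000; // pool shared by startTransition updates
const IdleLane            = 0b1000000000000000; // offscreen / background work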
useTransition vs. useDeferredValue
As engineers, we must view these hooks not as "performance boosters," but as manual lane selectors for React's scheduler. Both hooks leverage Transition Lanes to move non-urgent work out of the SyncLane, preventing heavy renders from blocking user input.
- useTransition (Imperative Orchestration): Use this when you have direct access to the state-updating code (e.g., setSearchTerm). By wrapping the update in startTransition, you explicitly tell React to assign this task a lower-priority lane. This is ideal for major UI shifts, like switching tabs or filtering lists, where you can also use the isPending flag to show a "Loading..." state. (https://react.dev/reference/react/useTransition)
The problem: The Heavy Data Dashboard
Imagine an enterprise dashboard where clicking a "Global Analytics" tab triggers a massive re-render involving thousands of data points or complex SVG charts.
Without useTransition (Blocking):
- User clicks the "Analytics" tab.
- React triggers an urgent state update.
- The JavaScript main thread is locked for 500ms while React calculates the massive chart.
The Problem: If the user suddenly realizes they clicked the wrong tab and clicks "Home," the browser won't respond until the Analytics render finishes. The UI feels "frozen".
With useTransition (Concurrent/Interruptible):
We wrap the tab-switching logic in startTransition. This moves the "Analytics" render into a lower-priority Transition Lane.
import { useState, useTransition } from 'react';

// TabButton, Spinner, and the tab panels are assumed to be defined elsewhere.
function TabContainer() {
  const [isPending, startTransition] = useTransition();
  const [tab, setTab] = useState('home');

  function handleTabChange(nextTab) {
    // Non-urgent: the heavy "analytics" render is scheduled in a Transition
    // Lane, so urgent input (the user's next click) can interrupt it.
    startTransition(() => {
      setTab(nextTab);
    });
  }

  return (
    <nav>
      <TabButton onClick={() => handleTabChange('analytics')}>
        {isPending ? <Spinner /> : 'Analytics'}
      </TabButton>
      {/* Cheap update: switching back to "home" stays a plain, urgent setState. */}
      <TabButton onClick={() => setTab('home')}>Home</TabButton>
    </nav>
  );
}
- useDeferredValue (Declarative Buffering): Use this when a value is passed down as a prop and you have no control over the source setState call. It creates a "lagging mirror" of a value. React will first re-render the UI with the old value (keeping the input responsive) and then, in the background, schedule a separate render for the new value in a Transition Lane. (https://react.dev/reference/react/useDeferredValue)
query: r → re → rea → reac → react
deferredQuery: "" → "" → r → re → react
The problem: expensive filtering blocks typing
Imagine a list with 10,000 items.
import { useState } from "react";

// `items` (a large array, e.g. 10,000 objects) and `List` are assumed
// to be defined elsewhere.
function Search() {
  const [query, setQuery] = useState("");

  // Runs synchronously on every keystroke, before the input can repaint.
  const filteredItems = items.filter(item =>
    item.name.toLowerCase().includes(query.toLowerCase())
  );

  return (
    <>
      <input
        value={query}
        onChange={e => setQuery(e.target.value)}
      />
      <List items={filteredItems} />
    </>
  );
}
What goes wrong?
- setQuery triggers a re-render on every keystroke
- filter() runs every time
- Large lists = input lag
- Typing feels “sticky”
Now let’s introduce useDeferredValue.
import { useDeferredValue, useState } from "react";

// `items` and `List` are assumed to be defined elsewhere, as above.
function Search() {
  const [query, setQuery] = useState("");
  // A "lagging mirror" of query: it updates later, in a low-priority
  // Transition Lane, while query itself updates urgently.
  const deferredQuery = useDeferredValue(query);

  const filteredItems = items.filter(item =>
    item.name.toLowerCase().includes(deferredQuery.toLowerCase())
  );

  return (
    <>
      <input
        value={query}
        onChange={e => setQuery(e.target.value)}
      />
      <List items={filteredItems} />
    </>
  );
}
What changed? Only one thing:
- filter(query)
+ filter(deferredQuery)
While the user is typing:
- Query updates immediately (high priority)
- Input stays smooth
Meanwhile:
- deferredQuery lags behind
- Filtering runs less often
- React schedules it as low priority
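One caveat worth adding (it goes beyond the snippet above): the deferral only pays off if React can reuse the previous result of the expensive work during the urgent keystroke render. A hedged sketch of that, using useMemo plus memo; `items` is still assumed to be a module-level array and each item is assumed to have an id:

import { memo, useDeferredValue, useMemo, useState } from "react";

function Search() {
  const [query, setQuery] = useState("");
  const deferredQuery = useDeferredValue(query);

  // The filter only re-runs when deferredQuery changes, so the urgent
  // render triggered by each keystroke reuses the previous array.
  const filteredItems = useMemo(
    () =>
      items.filter(item =>
        item.name.toLowerCase().includes(deferredQuery.toLowerCase())
      ),
    [deferredQuery]
  );

  return (
    <>
      <input value={query} onChange={e => setQuery(e.target.value)} />
      <List items={filteredItems} />
    </>
  );
}

// With a stable filteredItems reference, the heavy subtree is skipped
// entirely during urgent renders and only re-renders for the deferred value.
const List = memo(function List({ items }) {
  return (
    <ul>
      {items.map(item => (
        <li key={item.id}>{item.name}</li>
      ))}
    </ul>
  );
});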
The Concurrent Model: Standardized in 2026
While Concurrent Mode was once an experimental opt-in, as we move through 2026, it is the default paradigm for high-performance React architecture. With React 19 now fully established, features like startTransition, useDeferredValue, and automatic batching aren't just "cool tricks" - they are the primary tools we use to orchestrate the "Update Streams" that Lanes manage under the hood.
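As a quick illustration of one of those defaults, automatic batching: since React 18, multiple state updates are merged into a single re-render even inside timeouts and promise callbacks, not just inside event handlers. A minimal sketch:

import { useState } from 'react';

function Counter() {
  const [count, setCount] = useState(0);
  const [flag, setFlag] = useState(false);

  function handleClick() {
    setTimeout(() => {
      // React 18+: both updates below are batched into ONE re-render,
      // even though we are inside a timeout, not the event handler itself.
      setCount(c => c + 1);
      setFlag(f => !f);
    }, 100);
  }

  return <button onClick={handleClick}>{count} {String(flag)}</button>;
}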
What’s Next: The Priority Trilogy
Understanding Lanes is the first step in mastering React’s internal orchestration. However, Lanes are only the "logic" layer. To get the full picture of how React maintains 60fps in complex applications, we need to look at the systems on either side of the Lane bitmask.
In my upcoming articles, I will be doing deep-dives into:
- The Event Priority System: How React translates physical user intent (clicks vs. scrolls) into Lane requests.
- The Scheduler: How React negotiates with the browser's main thread to execute Lane-prioritized work without dropping frames.


