When I build full-stack JavaScript applications today, I don't just think about databases and user interfaces. I think about rivers. I picture data as water, flowing from a source, through various channels, and finally reaching a destination where it's useful. The architecture I design determines whether that water flows smoothly, gets stuck in a dam, or floods the user's screen.
The old way was to build a single, massive reservoir of data on the client, often called a global store. Every component would drink from this same big pool. It was simple at first. But as the app grew, it became hard to know which component was polluting the water or why the reservoir suddenly ran dry. We spent more time managing the reservoir than building features.
Now, I keep the water closer to the garden that needs it. Instead of one central pool, I have many small, dedicated wells. A component gets its data from a well right next to it. If the component is removed, the well can be covered up. This makes things simpler and cleaner.
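For plain UI state, keeping the water next to the garden is as simple as using React's built-in component state. A quick sketch (the component and props here are just for illustration):
import { useState } from 'react';
// The "small well": this state lives inside the component that needs it
// and disappears when the component unmounts. No global store involved.
function SearchBox({ onSearch }) {
  const [query, setQuery] = useState('');
  return (
    <form onSubmit={(e) => { e.preventDefault(); onSearch(query); }}>
      <input
        value={query}
        onChange={(e) => setQuery(e.target.value)}
        placeholder="Search products..."
      />
    </form>
  );
}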
Here's a practical way I do this with server data. I use a library like TanStack Query (React Query) to handle the boring stuff: caching, retrying failed requests, and keeping the UI up to date. That lets me focus on what the data is, not how to get it.
// A component that shows a product
// (fetchProduct and updateProductOnServer are the app's own API helpers)
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';

function ProductPage({ productId }) {
  const queryClient = useQueryClient();

  // This hook manages everything about the product data
  const { data: product, isLoading } = useQuery({
    queryKey: ['product', productId], // A unique key for this data
    queryFn: () => fetchProduct(productId), // How to fetch it
    staleTime: 30000, // Data is considered fresh for 30 seconds
  });

  // A hook to handle updating the product
  const updateProduct = useMutation({
    mutationFn: (updates) => updateProductOnServer(productId, updates),
    // Before the server confirms, optimistically update the UI
    onMutate: async (updates) => {
      // Cancel any ongoing fetches for this product
      await queryClient.cancelQueries({ queryKey: ['product', productId] });
      // Save a snapshot of the old data, just in case
      const previousProduct = queryClient.getQueryData(['product', productId]);
      // Update the cached data immediately
      queryClient.setQueryData(['product', productId], old => ({ ...old, ...updates }));
      // Return the snapshot for potential rollback
      return { previousProduct };
    },
    // If the server update fails, put the old data back
    onError: (err, updates, context) => {
      queryClient.setQueryData(['product', productId], context.previousProduct);
    },
    // After the update succeeds or fails, refetch to be sure
    onSettled: () => {
      queryClient.invalidateQueries({ queryKey: ['product', productId] });
    },
  });

  if (isLoading) return <div>Please wait...</div>;

  return (
    <div>
      <h1>{product.name}</h1>
      <p>Price: ${product.price}</p>
      <button onClick={() => updateProduct.mutate({ price: product.price * 0.9 })}>
        Apply 10% Discount
      </button>
    </div>
  );
}
The magic here is the optimistic update. When a user clicks the discount button, the price changes on their screen immediately. They get that instant feedback. In the background, my code talks to the server. If the server says "yes," we're all set. If the server says "no," the code quietly changes the price back. The user never sees a loading spinner for this action, which makes the app feel incredibly fast and responsive.
Now, what about features where multiple people are editing the same document at the same time, like Google Docs? This is a much harder problem. I can't just optimistically update and hope for the best. Two people might delete the same word at the same time. I need a way for their changes to merge automatically, without conflict.
For this, I use a concept called Conflict-Free Replicated Data Types, or CRDTs. The underlying theory is complex, but a library like Yjs makes it manageable. It guarantees that no matter what order changes arrive in, every user converges on the same document.
// A simplified class to manage a collaborative document
import * as Y from 'yjs';
import { WebsocketProvider } from 'y-websocket';

class CollaborativeDocument {
  constructor(documentId) {
    this.documentId = documentId;
    // Create a new Yjs document (a CRDT implementation)
    this.doc = new Y.Doc();
    // Connect it to a websocket server for real-time sync
    this.provider = new WebsocketProvider(
      'wss://our-sync-server.com',
      documentId,
      this.doc
    );
    // Get the shared text element
    this.sharedText = this.doc.getText('mainContent');
    // Listen for sync events
    this.provider.on('sync', isSynced => {
      console.log(`Document is ${isSynced ? 'fully synced' : 'syncing...'}`);
    });
  }

  // Insert text at a specific position. The CRDT handles the rest.
  insertText(position, newText) {
    this.sharedText.insert(position, newText);
  }

  // Enable offline support
  enableOfflineSupport() {
    // Load any state saved in a previous session first
    const savedState = localStorage.getItem(`doc-${this.documentId}`);
    if (savedState) {
      const updateArray = new Uint8Array(
        atob(savedState).split('').map(char => char.charCodeAt(0))
      );
      Y.applyUpdate(this.doc, updateArray);
    }

    // Then save on every change
    const saveState = () => {
      // Convert the document state to a saveable format
      const documentState = Y.encodeStateAsUpdate(this.doc);
      // Store it locally (base64-encoded)
      localStorage.setItem(`doc-${this.documentId}`,
        btoa(String.fromCharCode(...documentState)));
    };
    this.doc.on('update', saveState);
  }
}
In a React component, using this would look like connecting to a live text stream. The user types, and their words are instantly part of a shared stream with everyone else. The CRDT library is the unsung hero, constantly resolving tiny conflicts before they ever become a problem the user sees.
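Here is a minimal sketch of that wiring, assuming the CollaborativeDocument class above; the hook name useCollaborativeText is just an illustration, not part of Yjs.
// A custom hook that subscribes a component to the shared text
import { useEffect, useState } from 'react';

function useCollaborativeText(documentId) {
  const [text, setText] = useState('');

  useEffect(() => {
    const collab = new CollaborativeDocument(documentId);
    collab.enableOfflineSupport();

    // Re-render whenever the shared text changes, locally or remotely
    const handleChange = () => setText(collab.sharedText.toString());
    collab.sharedText.observe(handleChange);
    handleChange();

    // Clean up the subscription and connection on unmount
    return () => {
      collab.sharedText.unobserve(handleChange);
      collab.provider.destroy();
      collab.doc.destroy();
    };
  }, [documentId]);

  return text;
}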
Speed is another huge focus. Users hate waiting. A major breakthrough has been moving my server logic closer to them, to the "edge" of the network. Think of it like having a coffee stand on every block instead of one big café downtown. An Edge Function runs in a data center that's geographically near the user. It can personalize a response before sending it, cutting down travel time.
// This function runs on the network edge, close to the user
// (checkUserWishlist, getUserPurchaseHistory, convertPrice and getCurrentUserId
//  are the app's own helpers)
export const config = { runtime: 'edge' };

export default async function handleRequest(request) {
  // Get user info from the request
  const userId = request.headers.get('x-user-id');
  const userCountry = request.headers.get('cf-ipcountry');

  // Fetch the basic product data from my main API
  const apiResponse = await fetch(`https://api.myapp.com/products/123`);
  let product = await apiResponse.json();

  // --- Personalize the response right here at the edge ---
  if (userId) {
    // Fetch user-specific data in parallel
    const [wishlist, history] = await Promise.all([
      checkUserWishlist(userId, product.id),
      getUserPurchaseHistory(userId, product.id)
    ]);
    // Add personal flags to the product data
    product.inYourWishlist = wishlist;
    product.youBoughtThisBefore = history.count > 0;
    product.yourDiscount = history.count > 2 ? 0.1 : 0; // 10% off for loyal customers
  }

  // Adjust price for the user's country
  product.localPrice = convertPrice(product.basePrice, userCountry);

  // Tell the browser and cache servers how long to keep this,
  // and vary by user so personalized copies aren't shared
  const headers = new Headers({
    'Cache-Control': 'public, max-age=60',
    'CDN-Cache-Control': 'public, max-age=300',
    'Vary': 'x-user-id'
  });
  return new Response(JSON.stringify(product), { headers });
}

// The client code doesn't change much, but the response is faster and tailored.
async function getProductPageData(productId) {
  const response = await fetch(`/api/products/${productId}`, {
    headers: { 'x-user-id': getCurrentUserId() }
  });
  return response.json(); // Gets data already personalized at the edge
}
The user gets a page that feels made for them, and it loads incredibly fast because the personalization happened on a server just a few miles away, not on my central server across the country.
Not all updates are created equal. Changing your username is different from pausing a music subscription. I need different strategies. I think of them as three modes: Optimistic, Pessimistic, and Background.
// createUniqueId, emitEvent and callServer are the app's own helpers;
// applyChangeToCache and revertChangeInCache are implemented by its cache layer.
class UpdateManager {
  constructor() {
    this.queuedUpdates = new Map(); // Pending background updates
    this.strategies = {
      optimistic: this.doOptimisticUpdate,
      pessimistic: this.doPessimisticUpdate,
      background: this.doBackgroundUpdate
    };
  }

  // Pick the requested strategy and run it.
  execute(action, newData, strategy = 'pessimistic') {
    const run = this.strategies[strategy];
    if (!run) throw new Error(`Unknown update strategy: ${strategy}`);
    return run.call(this, action, newData);
  }

  // MODE 1: OPTIMISTIC - Assume success, update UI immediately.
  async doOptimisticUpdate(action, newData) {
    const updateId = createUniqueId();
    // 1. Tell the UI: "An update is starting!"
    emitEvent('updateStarted', { id: updateId, action, newData });
    // 2. Immediately change the data in the UI cache. The screen updates now.
    this.applyChangeToCache(action, newData);
    try {
      // 3. *After* the UI updates, try the real server call.
      const serverResult = await callServer(action, newData);
      emitEvent('updateSucceeded', { id: updateId, serverResult });
    } catch (error) {
      // 4. If the server fails, revert the UI change. User sees a quick correction.
      this.revertChangeInCache(action, newData);
      emitEvent('updateFailed', { id: updateId, error });
    }
  }

  // MODE 2: PESSIMISTIC - Wait for server confirmation first.
  async doPessimisticUpdate(action, newData) {
    const updateId = createUniqueId();
    emitEvent('updateStarted', { id: updateId, action, newData });
    // Show a loading spinner...
    emitEvent('updateLoading', { id: updateId });
    try {
      // 1. Call the server FIRST.
      const serverResult = await callServer(action, newData);
      // 2. Only update the UI after the server says "OK."
      this.applyChangeToCache(action, serverResult);
      emitEvent('updateSucceeded', { id: updateId, serverResult });
    } catch (error) {
      emitEvent('updateFailed', { id: updateId, error });
    }
  }

  // MODE 3: BACKGROUND - Do it silently, notify later.
  doBackgroundUpdate(action, newData) {
    const updateId = createUniqueId();
    // Just store it. Don't block the user or change the UI yet.
    this.queuedUpdates.set(updateId, { action, newData });
    // Fire and forget.
    callServer(action, newData)
      .then(result => {
        this.queuedUpdates.delete(updateId);
        emitEvent('backgroundUpdateDone', { id: updateId, result });
      })
      .catch(error => {
        emitEvent('backgroundUpdateFailed', { id: updateId, error });
      });
    return updateId; // So we can track it if needed
  }
}
// Using the right strategy for the job:
const manager = new UpdateManager();
// Good for likes, comments, UI toggles: FEEL fast.
manager.execute('likePost', { postId: 123 }, 'optimistic');
// Good for payments, deleting accounts: BE safe.
manager.execute('chargeCreditCard', paymentData, 'pessimistic');
// Good for saving settings, analytics: STAY out of the way.
manager.execute('updateProfilePreference', settings, 'background');
Choosing the right mode is a design decision. It balances speed, safety, and user distraction. A social media app might be mostly optimistic. A banking app would be mostly pessimistic.
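To keep that decision explicit rather than scattered across call sites, I can codify it in one place. This is a sketch built on the UpdateManager above; the action names are just examples.
// Map each kind of action to the update mode it deserves
const updatePolicy = {
  likePost: 'optimistic',
  chargeCreditCard: 'pessimistic',
  updateProfilePreference: 'background'
};

function runUpdate(manager, action, payload) {
  // Fall back to the safest mode when an action has no explicit policy
  const strategy = updatePolicy[action] ?? 'pessimistic';
  return manager.execute(action, payload, strategy);
}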
The final piece of the puzzle is understanding how data depends on other data. If I view a user's profile, I'll probably want to see their recent orders. If I look at an order, I'll likely need the product details for that order. Modeling these relationships lets my app get smarter.
I can build a dependency graph. It's a map that says "Thing B needs Thing A." When Thing A changes, the system knows that Thing B is now outdated and should be refetched. It can also prefetch Thing B because it's likely needed next.
// Treat cached data as fresh for 30 seconds (tune this per data type)
const isFresh = (lastFetched) => lastFetched && (Date.now() - lastFetched) < 30000;

class DataGraph {
  constructor() {
    this.nodes = new Map(); // Stores each piece of data and its info
    this.dependencies = new Map(); // Maps a key to the keys that depend on it, e.g. "user:789" -> ["orders:789"]
  }

  register(dataKey, fetchFunction, options = {}) {
    // Store the node
    this.nodes.set(dataKey, {
      fetch: fetchFunction,
      cachedData: null,
      lastFetched: null,
      dependsOn: new Set(options.dependsOn || []) // What other data it needs
    });
    // Update the dependency map for each parent
    options.dependsOn?.forEach(parentKey => {
      if (!this.dependencies.has(parentKey)) {
        this.dependencies.set(parentKey, new Set());
      }
      this.dependencies.get(parentKey).add(dataKey);
    });
  }

  async get(dataKey) {
    const node = this.nodes.get(dataKey);
    if (!node) throw new Error(`Unknown data: ${dataKey}`);
    // If cached data is fresh, return it.
    if (node.cachedData && isFresh(node.lastFetched)) {
      // Smart prefetch: since we used this, fetch its likely neighbors.
      this.prefetchRelated(dataKey);
      return node.cachedData;
    }
    // First, make sure all the data it DEPENDS ON is fetched.
    const parentPromises = [...node.dependsOn].map(key => this.get(key));
    await Promise.all(parentPromises);
    // Now fetch this data itself.
    node.cachedData = await node.fetch();
    node.lastFetched = Date.now();
    return node.cachedData;
  }

  // If a user's profile changes, invalidate all data that depends on it.
  markStale(dataKey) {
    const node = this.nodes.get(dataKey);
    if (node) node.cachedData = null;
    // Recursively mark children as stale.
    const children = this.dependencies.get(dataKey) || new Set();
    children.forEach(childKey => this.markStale(childKey));
  }

  prefetchRelated(dataKey) {
    // Simple logic: prefetch the first-level dependencies of this node's children.
    const children = this.dependencies.get(dataKey) || new Set();
    children.forEach(childKey => {
      const childNode = this.nodes.get(childKey);
      childNode?.dependsOn.forEach(depKey => {
        this.get(depKey).catch(() => { /* Ignore prefetch errors */ });
      });
    });
  }
}
// Using the graph
const appDataGraph = new DataGraph();

// Define the relationships
appDataGraph.register('user:789', () => fetchUser(789));
appDataGraph.register('orders:789', () => fetchOrders(789), {
  dependsOn: ['user:789'] // Can't fetch orders without knowing the user
});
appDataGraph.register('products:from:orders:789', () => fetchProductsFromOrders(789), {
  dependsOn: ['orders:789'] // Need orders before getting their products
});

// In a component
async function UserPage({ userId }) {
  // Getting the user also primes the cache for their orders.
  const user = await appDataGraph.get(`user:${userId}`);
  // This call might be instant if prefetched.
  const orders = await appDataGraph.get(`orders:${userId}`);

  // If the user's name is updated elsewhere:
  const updateUserName = (newName) => {
    updateServer(userId, newName).then(() => {
      // This will cause orders and products to be refetched when needed.
      appDataGraph.markStale(`user:${userId}`);
    });
  };
}
This graph approach makes data flow predictable. It removes a whole class of bugs where the UI shows an old user name on a new order because the order cache wasn't cleared. The dependencies are explicit, and the system manages the cascade.
Putting it all together, modern data flow is about intentional design. It's choosing to keep state local, to sync conflicts intelligently, to run logic closer to users, to match update strategies to user expectations, and to map out how data connects. These patterns handle the complexity so the user gets a simple experience: an app that feels alive, responsive, and trustworthy, no matter how much data is flowing behind the scenes.