Hrishikesh Dalal

System Design - EP 3.1 - Latency

Time is Money

In 2006, Amazon found that every 100ms of latency cost them 1% in sales. Google found that an extra 0.5 seconds in search page generation time dropped traffic by 20%. In the world of systems, speed isn't just a feature; it's a requirement.

What is Latency?

Latency is the time it takes for a single packet of data to travel from the source to the destination. Think of it as the "delay" between a user clicking a button and the action actually happening. (What the user actually feels is usually the round trip: the request going out plus the response coming back.)

The "Restaurant" Analogy

Imagine you are at a restaurant.

  • Latency is the time it takes for the waiter to walk from your table to the kitchen to deliver your order.
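You can get a feel for latency by simply timing a request from the client side. Here's a minimal sketch in Python using only the standard library (the URL is just a placeholder, and note this measures the full round trip, not a single one-way hop):

```python
import time
import urllib.request

def measure_latency(url: str, samples: int = 5) -> float:
    """Return the average round-trip time (in ms) for a simple GET request."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as response:
            response.read()  # wait for the full response body
        timings.append((time.perf_counter() - start) * 1000)
    return sum(timings) / len(timings)

if __name__ == "__main__":
    # example.com is a placeholder; swap in the endpoint you care about
    print(f"Average round-trip latency: {measure_latency('https://example.com'):.1f} ms")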

What Causes Latency?

  1. Propagation Delay: The time it takes for a signal to physically travel through the medium (roughly two-thirds the speed of light in fiber).
  2. Transmission Delay: The time to push all the bits of a packet onto the wire.
  3. Processing Delay: The time a server takes to read the request and decide what to do.
  4. Queueing Delay: The time a request spends waiting in a line (buffer) because the server, or a router along the path, is busy.
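These four add up. A rough back-of-the-envelope sketch (the numbers below are illustrative assumptions, not measurements):

```python
# One-way latency is approximately the sum of the four delays above.

SPEED_IN_FIBER_KM_PER_MS = 200  # light travels ~200 km per ms in fiber optic cable

def one_way_latency_ms(distance_km: float,
                       packet_bits: int,
                       bandwidth_bps: float,
                       processing_ms: float,
                       queueing_ms: float) -> float:
    propagation_ms = distance_km / SPEED_IN_FIBER_KM_PER_MS
    transmission_ms = (packet_bits / bandwidth_bps) * 1000
    return propagation_ms + transmission_ms + processing_ms + queueing_ms

# Example: a 1500-byte packet from New York to London (~5,600 km) over a 1 Gbps link
print(one_way_latency_ms(
    distance_km=5600,
    packet_bits=1500 * 8,
    bandwidth_bps=1e9,
    processing_ms=0.5,
    queueing_ms=2.0,
))  # ~30.5 ms, dominated by propagation delay
```

Notice that over long distances, propagation delay dominates. That is exactly why moving data closer to the user (next section) helps so much.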

How to Reduce It?

  • CDNs: Move the data physically closer to the user.
  • Caching: Store frequent data in memory (RAM) so you don't have to hit the slow database.
  • Optimization: Write more efficient code to reduce processing time.
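To make the caching point concrete, here's a minimal sketch using Python's in-process functools.lru_cache. The 200 ms "database" is simulated; real setups typically cache in something like Redis or Memcached, but the effect is the same:

```python
import time
from functools import lru_cache

# Hypothetical slow lookup standing in for a database query.
def fetch_user_from_db(user_id: int) -> dict:
    time.sleep(0.2)  # simulate ~200 ms of database latency
    return {"id": user_id, "name": f"user-{user_id}"}

@lru_cache(maxsize=1024)
def get_user(user_id: int) -> dict:
    # First call pays the full database latency; repeat calls are served
    # from in-process memory in microseconds.
    return fetch_user_from_db(user_id)

start = time.perf_counter()
get_user(42)                      # cache miss -> hits the "database"
miss_ms = (time.perf_counter() - start) * 1000

start = time.perf_counter()
get_user(42)                      # cache hit -> served from RAM
hit_ms = (time.perf_counter() - start) * 1000

print(f"miss: {miss_ms:.1f} ms, hit: {hit_ms:.3f} ms")
```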
