Concurrency and Asynchronous Programming Part-1

What is Parallel Programming?

  • Tasks run at the same time.
  • Progress happens simultaneously at any given instant.
  • Requires sufficient hardware resources (e.g., multiple CPU cores).

Example

  • Walking and talking:
    • Both actions occur at the same time.
    • At every moment, both are progressing.

What is Concurrent Programming?

  • Tasks make progress over time, but not at the same instant.
  • The system switches between tasks.
  • Over a time interval, all tasks move forward.

Example

  • Talking and drinking:
    • You alternate between the two.
    • At any moment, you’re doing one or the other, not both.

What Does “Asynchronous” Really Mean?

Tasks don’t block while waiting for something (I/O, timers, network).

Key idea:

Asynchrony is about waiting efficiently.

  • A task starts some work.
  • It pauses when it has to wait.
  • It resumes later via callbacks, promises, or similar continuations (a minimal sketch follows).
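To make the pause-and-resume idea concrete, here is a minimal sketch using CompletableFuture. The fetchPrice method and its 500 ms delay are made up for illustration; supplyAsync starts the slow work, and thenAccept registers a callback so the calling thread never sits blocked waiting for the result.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

public class AsyncSketch {
    // Hypothetical slow operation standing in for a network call.
    static String fetchPrice() {
        try {
            TimeUnit.MILLISECONDS.sleep(500); // simulate I/O latency
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "42.00";
    }

    public static void main(String[] args) {
        // Start the work; the call returns immediately with a future.
        CompletableFuture<String> future = CompletableFuture.supplyAsync(AsyncSketch::fetchPrice);

        // Register a callback; no thread sits blocked waiting for the result.
        future.thenAccept(price -> System.out.println("Price arrived: " + price));

        System.out.println("Main thread is free to do other work...");
        future.join(); // only here so the demo JVM doesn't exit early
    }
}
```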

Common Misconception

  • Asynchronous ≠ no waiting
  • Tasks still wait for data.

The Real Question

Does the thread block while waiting?

  • Blocking threads = poor scalability.
  • Free threads = better resource utilization.

The twist: they’re not opposites

You can have:

  • Concurrent + synchronous (multiple threads blocking)
  • Single-threaded + asynchronous (Node.js style)
  • Concurrent + asynchronous (modern web servers)
  • Parallel (true simultaneous execution on multiple cores)
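As a rough illustration of two of these combinations (a sketch, not a definitive pattern), the code below contrasts concurrent + synchronous (a pool of threads that each block inside a call) with concurrent + asynchronous (a CompletableFuture callback that frees the calling thread). The slowCall helper is a made-up stand-in for a blocking I/O operation.

```java
import java.util.concurrent.*;

public class CombinationsSketch {
    // Hypothetical blocking call (e.g. a database query); sleeps to fake latency.
    static String slowCall(int id) {
        try {
            TimeUnit.MILLISECONDS.sleep(200);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "result-" + id;
    }

    public static void main(String[] args) throws Exception {
        // Concurrent + synchronous: four pooled threads each block inside slowCall.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 4; i++) {
            int id = i;
            pool.submit(() -> System.out.println("blocking style: " + slowCall(id)));
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);

        // Concurrent + asynchronous: the result is handled by a callback, not a blocking wait.
        CompletableFuture
                .supplyAsync(() -> slowCall(99))
                .thenAccept(r -> System.out.println("async style: " + r))
                .join();
    }
}
```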

Parallelism = literally running at the same time

Concurrency = dealing with multiple things

Asynchrony = not blocking while waiting

Real-world analogy

  • Concurrency: A chef cooking 3 dishes at once
  • Asynchronous: Putting something in the oven and doing other prep instead of staring at it
  • Parallel: Multiple chefs cooking at the same time

Why Are Threads Expensive?

Threads are considered expensive because each thread consumes significant system resources, even when it’s idle.

Main reasons:

  1. Memory usage (stack space)
    • Each thread has its own stack
    • Typical stack size:
      • ~1 MB per thread (common default on 64-bit systems)
      • Can range from 256 KB to several MB, depending on OS and JVM/runtime configuration
    • 1,000 threads ≈ ~1 GB of memory for stacks alone (see the sketch after this list)
  2. Context switching overhead
    • The CPU must save and restore thread state when switching
    • Frequent context switches reduce CPU efficiency
    • Becomes costly at high thread counts
  3. Scheduling overhead
    • The OS scheduler must manage runnable threads
    • Large numbers of threads increase scheduling complexity and latency
  4. Synchronization costs
    • Threads often require locks, mutexes, or other coordination mechanisms
    • Leads to contention, deadlocks, and performance bottlenecks
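A minimal way to feel the stack cost yourself: the sketch below starts a batch of plain platform threads that do nothing but sleep, using the Thread constructor that accepts an explicit stack size. The thread count and stack size are arbitrary demo numbers; push the count high enough on a real machine and thread creation eventually fails with an OutOfMemoryError ("unable to create native thread").

```java
import java.util.ArrayList;
import java.util.List;

public class ThreadCostSketch {
    public static void main(String[] args) throws InterruptedException {
        int count = 1_000;                 // arbitrary; raise it to probe the limits
        long stackSize = 1024 * 1024;      // request ~1 MB of stack per thread

        List<Thread> threads = new ArrayList<>();
        for (int i = 0; i < count; i++) {
            // (ThreadGroup, Runnable, name, stackSize) constructor; stackSize is a hint to the JVM/OS.
            Thread t = new Thread(null, () -> {
                try {
                    Thread.sleep(10_000);  // idle thread: does nothing but still reserves its stack
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }, "idle-" + i, stackSize);
            t.start();
            threads.add(t);
        }

        System.out.println("Started " + count + " idle platform threads");
        for (Thread t : threads) {
            t.interrupt();                 // wake them up so the demo finishes quickly
            t.join();
        }
    }
}
```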

Why Blocking Threads Hurt Scalability

When threads block on I/O:

  • They sit idle.
  • More threads are created to compensate.
  • Thread counts are limited by:
    • CPU cores (only so many threads can actually run at once).
    • Memory (each thread reserves stack space).

This leads to:

  • More machines.
  • More architectural complexity.
  • Higher costs (and environmental impact).

Evolution of Java Multithreading

Java 5 – ExecutorService

  • Introduced thread pools.
  • Solved:
    • Uncontrolled thread creation.
  • New problem:
    • Thread-pool–induced deadlocks.
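A minimal ExecutorService sketch of the Java 5 model: a fixed pool bounds thread creation, and extra tasks queue up behind it. The pool size and task count are arbitrary; submitting tasks that wait on each other into the same small pool is exactly how the thread-pool-induced deadlocks mentioned above can arise.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ExecutorServiceSketch {
    public static void main(String[] args) throws InterruptedException {
        // A bounded pool: at most 4 threads, no matter how many tasks are submitted.
        ExecutorService pool = Executors.newFixedThreadPool(4);

        for (int i = 0; i < 10; i++) {
            int taskId = i;
            pool.submit(() -> {
                System.out.println("Task " + taskId + " on " + Thread.currentThread().getName());
            });
        }

        pool.shutdown();                       // stop accepting new tasks
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```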

Java 7 – Fork/Join Framework

  • Introduced work stealing.
  • Reduced pool starvation issues.
  • Well-suited for CPU-bound parallelism.
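A compact Fork/Join sketch: a RecursiveTask that sums an array by splitting it in half until the chunks are small, letting idle workers steal the forked halves. The array size and threshold are arbitrary demo values.

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;
import java.util.stream.LongStream;

public class ForkJoinSketch extends RecursiveTask<Long> {
    private static final int THRESHOLD = 10_000;   // arbitrary cut-off for splitting
    private final long[] data;
    private final int from, to;

    ForkJoinSketch(long[] data, int from, int to) {
        this.data = data;
        this.from = from;
        this.to = to;
    }

    @Override
    protected Long compute() {
        if (to - from <= THRESHOLD) {
            long sum = 0;
            for (int i = from; i < to; i++) sum += data[i];
            return sum;
        }
        int mid = (from + to) / 2;
        ForkJoinSketch left = new ForkJoinSketch(data, from, mid);
        ForkJoinSketch right = new ForkJoinSketch(data, mid, to);
        left.fork();                               // hand one half to the pool (work stealing)
        return right.compute() + left.join();      // compute the other half directly
    }

    public static void main(String[] args) {
        long[] numbers = LongStream.rangeClosed(1, 1_000_000).toArray();
        long sum = ForkJoinPool.commonPool().invoke(new ForkJoinSketch(numbers, 0, numbers.length));
        System.out.println("Sum = " + sum);        // 500000500000
    }
}
```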

Java 8 – Expressive Concurrency

Introduced:

  • Parallel Streams
  • CompletableFuture

Enabled:

  • More declarative parallelism.
  • Better asynchronous composition.
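Two tiny Java 8 sketches in that spirit: a parallel stream for declarative data parallelism and a CompletableFuture chain for asynchronous composition. The loadUser and loadOrders names are hypothetical placeholders for slow lookups.

```java
import java.util.concurrent.CompletableFuture;
import java.util.stream.LongStream;

public class Java8Sketch {
    // Hypothetical lookups standing in for remote calls.
    static String loadUser(int id)        { return "user-" + id; }
    static String loadOrders(String user) { return "orders-of-" + user; }

    public static void main(String[] args) {
        // Declarative parallelism: the stream splits the range across the common Fork/Join pool.
        long evens = LongStream.rangeClosed(1, 1_000_000)
                               .parallel()
                               .filter(n -> n % 2 == 0)
                               .count();
        System.out.println("Even numbers: " + evens);

        // Asynchronous composition: each stage runs when the previous one completes.
        CompletableFuture.supplyAsync(() -> loadUser(7))
                         .thenApply(Java8Sketch::loadOrders)
                         .thenAccept(orders -> System.out.println("Got " + orders))
                         .join();
    }
}
```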

Java 21 – Virtual Threads (previewed in Java 19 and 20)

  • Another major step forward.
  • Makes blocking cheap and scalable.
  • Simplifies concurrency for many workloads.

What Are Virtual Threads?

  • Virtual threads are a lightweight threading model introduced with Project Loom in Java.
  • They are managed by the JVM, not the operating system.
  • Designed to support massive concurrency with minimal resource usage.

Key Characteristics

  • Extremely lightweight compared to platform (OS) threads.
  • Millions of virtual threads can be created safely.
  • Allow developers to write simple, blocking-style code while remaining highly scalable (see the sketch below).
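A minimal sketch of the "huge numbers of cheap threads" claim, assuming a Java 21+ runtime: each submitted task gets its own virtual thread from Executors.newVirtualThreadPerTaskExecutor, and every one of them blocks in sleep without tying up an OS thread. The 100,000 task count is an arbitrary demo number.

```java
import java.time.Duration;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;

public class VirtualThreadsSketch {
    public static void main(String[] args) {
        AtomicInteger completed = new AtomicInteger();

        // One new virtual thread per task; requires Java 21+.
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < 100_000; i++) {
                executor.submit(() -> {
                    Thread.sleep(Duration.ofSeconds(1)); // blocking is fine here
                    completed.incrementAndGet();
                    return null;                         // Callable, so sleep's checked exception is allowed
                });
            }
        } // close() waits for all submitted tasks to finish

        System.out.println("Completed tasks: " + completed.get());
    }
}
```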

The Problem Virtual Threads Aim to Solve

Traditional Java Concurrency Issues

  • Thread-per-request model:
    • Threads block during I/O (DB, network, file).
    • Blocking threads consume memory and OS resources.
    • Scalability is limited by thread count.
  • To scale:
    • More threads → more memory.
    • More machines → higher cost and complexity.

Rise of Reactive Programming

  • Reactive frameworks (Reactor, RxJava) emerged to:
    • Avoid blocking OS threads.
    • Handle high concurrency using non-blocking I/O.
  • Downsides:
    • Complex APIs.
    • Harder to read, debug, and reason about.

How Do Virtual Threads Work?

  • Virtual threads are scheduled by the JVM, not the OS.
  • When a virtual thread blocks on I/O:
    • It is unmounted from its carrier (platform) thread.
    • The carrier thread is reused for other work.
    • The virtual thread is remounted once I/O completes.
  • Result:
    • Blocking no longer wastes threads.
    • High scalability with familiar programming models (sketched below).
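A small way to observe mounting, again assuming Java 21+: printing Thread.currentThread() from inside a virtual thread includes the carrier it is currently mounted on, and after a blocking sleep the virtual thread may be remounted on a different carrier.

```java
import java.time.Duration;

public class CarrierSketch {
    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            // toString() of a virtual thread shows the carrier it is mounted on.
            System.out.println("before sleep: " + Thread.currentThread());
            try {
                Thread.sleep(Duration.ofMillis(100));   // unmounts from the carrier while waiting
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            // After the wait, the virtual thread may be remounted on a different carrier.
            System.out.println("after sleep:  " + Thread.currentThread());
        };

        Thread vt = Thread.ofVirtual().name("demo-vt").start(task);
        vt.join();
    }
}
```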

Virtual Threads vs Reactive Programming

Shared Goal

Both aim to:

  • Maximize concurrency.
  • Avoid wasting threads during I/O.
  • Improve scalability of server-side applications.

Key Differences

| Virtual Threads | Reactive Programming |
| --- | --- |
| Blocking code is acceptable | Requires non-blocking code |
| Simple, imperative style | Functional, stream-based style |
| Easier to read and debug | Steeper learning curve |
| Works with existing APIs | Requires reactive-compatible APIs |
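For a feel of the stylistic difference, here is the same (hypothetical) lookup written both ways. The findUser call is a made-up blocking stand-in, and the reactive version assumes Project Reactor (reactor-core) is on the classpath.

```java
import reactor.core.publisher.Mono;
import reactor.core.scheduler.Schedulers;

public class StyleComparisonSketch {
    // Made-up blocking lookup standing in for a database call.
    static String findUser(int id) {
        return "user-" + id;
    }

    public static void main(String[] args) throws InterruptedException {
        // Imperative style (fits virtual threads): just call and block.
        String user = findUser(42);
        System.out.println("imperative: Hello " + user);

        // Reactive style (Project Reactor): the blocking call is pushed onto a scheduler
        // and the result flows through operators instead of a blocking wait.
        Mono.fromCallable(() -> findUser(42))
            .subscribeOn(Schedulers.boundedElastic())
            .map(u -> "Hello " + u)
            .subscribe(line -> System.out.println("reactive:   " + line));

        Thread.sleep(500); // crude wait so the async pipeline can finish before the JVM exits
    }
}
```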

Core Argument

  • Virtual threads make thread-per-task scalable again.
  • This reduces the need for reactive frameworks in many common cases.

Benefits of Virtual Threads

Simplicity

  • Write synchronous, blocking code.
  • No need to manage callbacks or reactive pipelines.

Scalability

  • Blocking no longer ties up OS threads.
  • Supports very high concurrency with low overhead.

Compatibility

  • Works with existing Java libraries and APIs.
  • No need to rewrite large codebases (see the sketch below).
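A sketch of the compatibility point, assuming Java 21+: the standard java.net.http.HttpClient is an ordinary blocking API, and it can be used as-is from virtual threads, one thread per request. The URL is just a placeholder.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class CompatibilitySketch {
    public static void main(String[] args) {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create("https://example.com")) // placeholder URL
                                         .build();

        // The same blocking client.send(...) call as always, just run on cheap virtual threads.
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < 10; i++) {
                executor.submit(() -> {
                    HttpResponse<String> response =
                            client.send(request, HttpResponse.BodyHandlers.ofString());
                    System.out.println("status: " + response.statusCode());
                    return null; // Callable, so send()'s checked exceptions are allowed
                });
            }
        } // waits for all requests to finish
    }
}
```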

Do Virtual Threads Kill Reactive Programming?

Short Answer: No, But They Shrink Its Use Cases

Reactive programming still matters when:

  • Fine-grained event streams are required.
  • Backpressure control is critical.
  • Complex data-flow transformations are needed.

Virtual threads:

  • Do not automatically provide:
    • Backpressure.
    • Stream composition.
    • Reactive operators.
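As a small illustration of what virtual threads do not give you out of the box, the Reactor sketch below uses limitRate to request the upstream in bounded batches (backpressure); with plain threads you would have to build that throttling yourself. This again assumes reactor-core is on the classpath.

```java
import reactor.core.publisher.Flux;

public class BackpressureSketch {
    public static void main(String[] args) {
        Flux.range(1, 1_000)
            .limitRate(100)                      // request the upstream in batches of 100 (backpressure)
            .map(i -> i * 2)
            .doOnNext(i -> { /* pretend to do slow work here */ })
            .blockLast();                        // block only for the demo

        System.out.println("done");
    }
}
```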

Takeaways


Why Reactive Programming Existed

  • To avoid blocking OS threads and improve scalability.

What Changed

  • Virtual threads make blocking cheap and scalable.
  • Structured concurrency brings better control and safety.

Practical Impact

  • Many server-side workloads can return to:
    • Simple, imperative code.
    • Thread-per-request style.
  • Reactive programming remains relevant for specialized scenarios, but is no longer mandatory for scalability.
