Danyson

Staged Event Driven Architecture for highly concurrent systems

Brief:

Anything connected to the internet can have its resources accessed from anywhere, at any time, by any number of users.

"Any number of users" means the load can range from a single user to billions at any given moment.

This raises the question: how do we allocate resources so they remain accessible to any number of users?

Let's break this down.

Since any number of users can access resources at any time, the demand for resources clearly arises dynamically.

So let's say we can dynamically increase our resources at any time.

Imagine a fictitious controller that is responsible for growing the resource pool according to the load.
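As a rough sketch of such a controller (all names hypothetical), it might watch the task backlog and add workers to the pool whenever the queue backs up, capped at a maximum:

```python
import queue
import threading

class PoolController:
    """Hypothetical sketch: a controller that grows a worker pool
    whenever the task backlog exceeds a threshold."""

    def __init__(self, handler, max_threads=8, backlog_threshold=4):
        self.tasks = queue.Queue()
        self.handler = handler
        self.max_threads = max_threads
        self.backlog_threshold = backlog_threshold
        self.threads = []
        self._spawn()  # start with a single worker

    def submit(self, task):
        self.tasks.put(task)
        # Grow the pool only when the queue is backing up
        # and we are still under the thread cap.
        if (self.tasks.qsize() > self.backlog_threshold
                and len(self.threads) < self.max_threads):
            self._spawn()

    def _spawn(self):
        t = threading.Thread(target=self._worker, daemon=True)
        self.threads.append(t)
        t.start()

    def _worker(self):
        while True:
            task = self.tasks.get()
            self.handler(task)
            self.tasks.task_done()
```

This is only illustrative; a real controller would also shrink the pool when load drops.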

The Staged Event-Driven system:

Consider every call towards the system as an Event.

Each Event is processed by a Network of Stages.

Why should Events be processed by a "Network of Stages"?

Let's take an HTTP request as an example:

  1. A Socket connection is established with the server and the request is read from the Socket
  2. The HTTP request packet is parsed by the HTTP parser
  3. If the data requested by the HTTP request is available in the Cache, it is a Cache hit, and the server responds by writing the data to the Socket
  4. If the requested data is not in the Cache, it is a Cache miss; the server handles it by issuing an I/O Event to the Database to fetch the missing data, and then responds by writing the data to the Socket

The four steps above are generic, which is why Events can be processed in stages.
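The cache hit/miss logic in steps 3 and 4 can be sketched as a cache-aside lookup (hypothetical names; the socket read in step 1 and the socket write in step 5 are elided):

```python
CACHE = {}  # in-memory cache stage
DATABASE = {"/index.html": "<h1>Hello</h1>"}  # stands in for the database / file I/O stage

def parse_request(raw: str) -> str:
    # Step 2: extract the requested path from a raw HTTP request line.
    method, path, _ = raw.split(" ", 2)
    return path

def fetch(path: str) -> str:
    # Step 3: a cache hit returns immediately.
    if path in CACHE:
        return CACHE[path]
    # Step 4: a cache miss falls through to the database
    # and populates the cache on the way back.
    data = DATABASE[path]
    CACHE[path] = data
    return data

def handle(raw: str) -> str:
    return fetch(parse_request(raw))
```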

A Staged Event-Driven HTTP server will look as follows:

Figure 1: Staged event driven HTTP server[1]

Each stage from the above example:

  1. Socket IN
  2. Data Parser
  3. Cache
  4. Database (File I/O)
  5. Socket OUT

As each stage handles its own Events, each stage has an Event Queue, an Event Handler, and a Resource Pool Controller:

  1. The Event Queue holds events
  2. The Event Handler manages the events, performing enqueue and dequeue operations on the Event Queue
  3. The Resource Pool Controller allocates the resources (CPU, Network I/O, Memory) required by each Stage

Figure 2: SEDA Stage[1]
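A minimal sketch of one such stage in Python, assuming each stage runs a single handler thread over its Event Queue and forwards results to the next stage (the Resource Pool Controller is elided here):

```python
import queue
import threading

class Stage:
    """Sketch of one SEDA stage: an Event Queue, an Event Handler,
    and an optional downstream stage to forward results to."""

    def __init__(self, name, handler, next_stage=None):
        self.name = name
        self.events = queue.Queue()   # Event Queue
        self.handler = handler        # Event Handler
        self.next_stage = next_stage  # where processed events go next
        threading.Thread(target=self._run, daemon=True).start()

    def enqueue(self, event):
        self.events.put(event)

    def _run(self):
        while True:
            event = self.events.get()
            result = self.handler(event)
            if self.next_stage is not None:
                self.next_stage.enqueue(result)
            self.events.task_done()
```

Stages are then wired into a network by passing each one its successor, e.g. `Stage("parser", parse, next_stage=cache_stage)`.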

The Resource Pool Controller can be classified into two types:

  1. Thread pool controller
  2. Batching controller

As the name suggests, the Thread Pool Controller controls the number of threads running in the pool.

The Batching Controller controls the number of events processed by the Event Handler in each iteration.

Figure 3: Thread pool controller[1] and Batching controller[1]
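A sketch of a naive Batching Controller, assuming a simple grow-on-full, shrink-on-partial policy (the SEDA paper adapts the batch size to measured throughput; this policy is only illustrative):

```python
import queue

class BatchingController:
    """Sketch: caps how many events the handler takes per iteration,
    growing the batch while the queue keeps it full and shrinking otherwise."""

    def __init__(self, handler, batch_size=4, min_size=1, max_size=64):
        self.handler = handler
        self.batch_size = batch_size
        self.min_size = min_size
        self.max_size = max_size

    def run_iteration(self, events: "queue.Queue") -> int:
        # Dequeue at most batch_size events for this iteration.
        batch = []
        while len(batch) < self.batch_size and not events.empty():
            batch.append(events.get())
        if batch:
            self.handler(batch)
        # Naive adaptation: grow while we fill whole batches, shrink otherwise.
        if len(batch) == self.batch_size:
            self.batch_size = min(self.batch_size * 2, self.max_size)
        else:
            self.batch_size = max(self.batch_size // 2, self.min_size)
        return len(batch)
```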

Reference:

  1. Matt Welsh, David Culler, and Eric Brewer. SEDA: An Architecture for Well-Conditioned, Scalable Internet Services. SOSP 2001. http://www.sosp.org/2001/papers/welsh.pdf

Hope you find this useful!

Support us by buying us some cookies.

Visit our site Doge Algo
