Ashutosh Sarangi

(aiohttp & asyncio) vs Requests: Comparing Python HTTP Libraries

1. Requests - The Simple Synchronous Library

What it is:

  • Synchronous blocking HTTP library
  • Simple, intuitive API
  • Most popular for basic HTTP operations
  • Blocks execution until response is received

When to use:

  • Simple scripts
  • Sequential API calls
  • Learning/prototyping
  • When performance isn't critical

Example:

import requests
import time

# Basic GET request
response = requests.get('https://api.github.com/users/github')
print(response.json())

# Making multiple requests (BLOCKING - one at a time)
def fetch_multiple_sync():
    urls = [
        'https://api.github.com/users/github',
        'https://api.github.com/users/google',
        'https://api.github.com/users/microsoft'
    ]

    start = time.time()
    results = []

    for url in urls:
        response = requests.get(url)
        results.append(response.json())

    print(f"Time taken: {time.time() - start:.2f} seconds")
    # Output: ~3 seconds (1 second per request, sequential)
    return results

# POST request with headers and data
response = requests.post(
    'https://httpbin.org/post',
    json={'key': 'value'},
    headers={'Authorization': 'Bearer token'},
    timeout=5
)

# Session for connection pooling
session = requests.Session()
session.headers.update({'User-Agent': 'MyApp'})
response = session.get('https://api.github.com/users/github')

Pros:

  • ✅ Dead simple API
  • ✅ Excellent documentation
  • ✅ Perfect for beginners
  • ✅ Built-in JSON decoding
  • ✅ Session management

Cons:

  • ❌ Synchronous/blocking
  • ❌ Slow for multiple requests
  • ❌ Not suitable for high-concurrency

2. asyncio - The Foundation (Not an HTTP Library!)

What it is:

  • Asynchronous I/O framework built into Python 3.4+
  • NOT an HTTP library itself
  • Provides the foundation for async programming
  • Event loop for concurrent operations

Key Concepts:

  • async def - Defines coroutine functions
  • await - Waits for async operations
  • asyncio.gather() - Runs multiple coroutines concurrently

Example (without HTTP):

import asyncio
import time

# Basic async function
async def say_hello(name, delay):
    await asyncio.sleep(delay)  # Non-blocking sleep
    print(f"Hello, {name}!")
    return name

# Running async functions
async def main():
    start = time.time()

    # Sequential execution (3 seconds total)
    await say_hello("Alice", 1)
    await say_hello("Bob", 1)
    await say_hello("Charlie", 1)

    print(f"Sequential time: {time.time() - start:.2f}s")

    # Concurrent execution (1 second total!)
    start = time.time()
    await asyncio.gather(
        say_hello("Alice", 1),
        say_hello("Bob", 1),
        say_hello("Charlie", 1)
    )

    print(f"Concurrent time: {time.time() - start:.2f}s")

# Run the async function
asyncio.run(main())

Why asyncio alone isn't enough for HTTP:

# This WON'T work - requests is synchronous!
import requests
import asyncio

async def fetch_url(url):
    # ❌ requests.get() blocks the event loop!
    response = requests.get(url)  
    return response.json()

# Even with asyncio.gather(), this is still sequential
# because requests.get() blocks
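
Side note: if you are stuck with requests inside async code, one workaround (a minimal sketch, assuming Python 3.9+ for asyncio.to_thread; not part of the original example) is to push the blocking call onto a worker thread so the event loop stays free:

import asyncio
import requests

async def fetch_url(url):
    # Runs the blocking requests.get() in a worker thread,
    # so the event loop can keep scheduling other coroutines.
    response = await asyncio.to_thread(requests.get, url)
    return response.json()

async def main():
    urls = [
        'https://api.github.com/users/github',
        'https://api.github.com/users/google'
    ]
    results = await asyncio.gather(*(fetch_url(url) for url in urls))
    print(len(results))

asyncio.run(main())

This relies on threads rather than true non-blocking I/O; aiohttp, covered next, is the more natural fit for concurrent HTTP.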

3. aiohttp - Async HTTP Client/Server

What it is:

  • Asynchronous HTTP client and server
  • Built on top of asyncio
  • Non-blocking I/O for concurrent requests
  • Supports WebSockets

When to use:

  • Multiple concurrent HTTP requests
  • High-performance web scraping
  • API integrations with many endpoints
  • WebSocket connections
  • Building async web servers
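
Example (a minimal sketch; the GitHub URLs mirror the requests example above and are just illustrative):

import aiohttp
import asyncio

async def fetch_user(session, url):
    # session.get() returns an async context manager;
    # awaiting response.json() reads and decodes the body without blocking.
    async with session.get(url) as response:
        return await response.json()

async def main():
    urls = [
        'https://api.github.com/users/github',
        'https://api.github.com/users/google',
        'https://api.github.com/users/microsoft'
    ]

    # One ClientSession is reused for every request (connection pooling)
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(*(fetch_user(session, url) for url in urls))

    for user in results:
        print(user['login'])

asyncio.run(main())

Note that a single ClientSession is created once and reused for every request; creating a new session per request defeats connection pooling.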

📊 Side-by-Side Comparison

  Aspect         Requests                    asyncio                         aiohttp
  What it is     Synchronous HTTP client     Async I/O framework (no HTTP)   Async HTTP client/server
  Concurrency    Blocking, one at a time     Event loop for coroutines       Non-blocking, concurrent
  Best for       Simple scripts, prototypes  Foundation for async code       Many concurrent requests
  WebSockets     No                          N/A                             Yes

🔥 Real Performance Example

Let me show you a real-world scenario comparing both:

import requests
import aiohttp
import asyncio
import time

# Scenario: Fetch data from 20 different API endpoints

urls = [f'https://jsonplaceholder.typicode.com/posts/{i}' for i in range(1, 21)]

# ============ REQUESTS (SYNCHRONOUS) ============
def fetch_with_requests():
    start = time.time()
    results = []

    for url in urls:
        response = requests.get(url)
        results.append(response.json())

    elapsed = time.time() - start
    print(f"Requests (sync): {elapsed:.2f} seconds")
    return results

# ============ AIOHTTP (ASYNCHRONOUS) ============
async def fetch_with_aiohttp():
    start = time.time()

    async with aiohttp.ClientSession() as session:
        async def fetch_one(url):
            async with session.get(url) as response:
                return await response.json()

        tasks = [fetch_one(url) for url in urls]
        results = await asyncio.gather(*tasks)

    elapsed = time.time() - start
    print(f"aiohttp (async): {elapsed:.2f} seconds")
    return results

# Run both
fetch_with_requests()  # ~5-10 seconds
asyncio.run(fetch_with_aiohttp())  # ~0.5-1 second

Results:

  • Requests: 8.5 seconds (sequential)
  • aiohttp: 0.9 seconds (concurrent)
  • Speedup: ~9.4x faster! 🚀
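
In practice, firing all 20 requests at once can trip rate limits on real APIs. A common pattern (a sketch under that assumption, not part of the benchmark above) is to cap how many requests are in flight at once with asyncio.Semaphore:

import aiohttp
import asyncio

async def fetch_limited(session, url, semaphore):
    # At most N coroutines hold the semaphore at a time,
    # so no more than N requests are in flight simultaneously.
    async with semaphore:
        async with session.get(url) as response:
            return await response.json()

async def main():
    urls = [f'https://jsonplaceholder.typicode.com/posts/{i}' for i in range(1, 21)]
    semaphore = asyncio.Semaphore(5)  # limit to 5 concurrent requests

    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(
            *(fetch_limited(session, url, semaphore) for url in urls)
        )
    print(len(results))

asyncio.run(main())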

🔍 What exactly happens with session.get(url)? Does this line already call the API?

tasks = [fetch_one(url) for url in urls]

Answer: NO, the API is NOT called yet!

Here's what's actually happening step by step:

Step 1: Understanding what fetch_one(url) returns

async def fetch_one(url):
    async with session.get(url) as response:
        return await response.json()

# When you call this:
task = fetch_one(url)  # ❌ API NOT called yet!

When you call an async def function without await, it returns a coroutine object (not the result). The function body doesn't execute yet!

import asyncio

async def fetch_one(url):
    print(f"Fetching {url}")  # This won't print yet!
    return "data"

# This creates a coroutine object but doesn't run it
task = fetch_one("https://example.com")
print(type(task))  # <class 'coroutine'>

# The function body hasn't run yet!
# "Fetching https://example.com" hasn't printed!

# To actually run it, await it inside a coroutine (or hand it to the event loop):
result = asyncio.run(task)  # NOW it runs and prints "Fetching..."

Step 2: What's in the tasks list?

tasks = [fetch_one(url) for url in urls]
# tasks = [<coroutine>, <coroutine>, <coroutine>, ...]
# These are "suspended" function calls waiting to be executed

The tasks list contains coroutine objects - basically "promises" or "blueprints" of work to be done, but the work hasn't started yet.

Step 3: What does asyncio.gather() do?

results = await asyncio.gather(*tasks)

NOW the magic happens! asyncio.gather():

  1. Takes all the coroutine objects
  2. Schedules them to run concurrently on the event loop
  3. Waits for all of them to complete
  4. Returns their results as a list
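
One detail gather() guarantees (worth knowing, though not shown above): results come back in the same order as the coroutines you passed in, regardless of which finishes first. A minimal sketch:

import asyncio

async def slow(value, delay):
    await asyncio.sleep(delay)
    return value

async def main():
    # 'first' takes the longest but still comes back first in the results list
    results = await asyncio.gather(
        slow('first', 2),
        slow('second', 1),
        slow('third', 0.5)
    )
    print(results)  # ['first', 'second', 'third']

asyncio.run(main())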

* and ** Unpacking

* (Single asterisk) - Unpacking Iterables

# Unpacks lists, tuples, sets, etc.
tasks = [task1, task2, task3]

# These are equivalent:
asyncio.gather(*tasks)
asyncio.gather(task1, task2, task3)

# Another example:
numbers = [1, 2, 3]
print(*numbers)  # Same as: print(1, 2, 3)
# Output: 1 2 3

It's not just for lists - it works with ANY iterable (lists, tuples, sets, generators, etc.)
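
For instance, you can skip building the tasks list entirely and unpack a generator expression straight into gather() (a small variation on the earlier snippet, shown as a fragment that would live inside an async function):

# Same as before, but with a generator expression instead of a list
results = await asyncio.gather(*(fetch_one(url) for url in urls))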

** (Double asterisk) - Unpacking Dictionaries

# Unpacks dictionaries as keyword arguments
config = {'timeout': 10, 'headers': {'User-Agent': 'MyApp'}}

# These are equivalent:
session.get(url, **config)
session.get(url, timeout=10, headers={'User-Agent': 'MyApp'})

# Another example:
def greet(name, age):
    print(f"{name} is {age} years old")

person = {'name': 'Alice', 'age': 30}
greet(**person)  # Same as: greet(name='Alice', age=30)
