DEV Community

Maksym

Posted on

Combining Async and Sync Code in Python

Mixing async and sync code is a common challenge in Python. Here's how to do it effectively:

Running Async Code from Sync Code

Use asyncio.run() (Python 3.7+) for the top-level entry point:

import asyncio

async def fetch_data():
    await asyncio.sleep(1)
    return "Data fetched"

# Sync code calling async
def main():
    result = asyncio.run(fetch_data())
    print(result)

main()

In older codebases you'll also see asyncio.get_event_loop().run_until_complete(). It still works, but newer Python versions deprecate calling get_event_loop() when no loop is running, so prefer asyncio.run() in new code:

import asyncio

async def async_task():
    await asyncio.sleep(1)
    return "Done"

def sync_function():
    loop = asyncio.get_event_loop()
    result = loop.run_until_complete(async_task())
    return result
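If you do need an explicit loop in sync code (for example, to reuse it across calls), a safer sketch is to create and close one yourself rather than rely on get_event_loop():

```python
import asyncio

async def async_task():
    await asyncio.sleep(0.1)
    return "Done"

def sync_function():
    # Create a loop explicitly instead of relying on get_event_loop()
    loop = asyncio.new_event_loop()
    try:
        return loop.run_until_complete(async_task())
    finally:
        loop.close()  # always release the loop's resources

result = sync_function()
print(result)  # Done
```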

Running Sync Code from Async Code

For blocking sync code, use loop.run_in_executor(). Note that passing None uses the default thread pool, which works best for blocking calls; because of the GIL, truly CPU-bound work is better served by passing a concurrent.futures.ProcessPoolExecutor instead:

import asyncio
import time

def blocking_task(n):
    # Simulates blocking work (a real CPU-bound task would burn CPU here)
    time.sleep(2)
    return n * 2

async def main():
    loop = asyncio.get_running_loop()  # preferred inside coroutines (Python 3.7+)

    # Run blocking code in a thread pool
    result = await loop.run_in_executor(None, blocking_task, 5)
    print(f"Result: {result}")

asyncio.run(main())
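On Python 3.9+, asyncio.to_thread() is a shorthand for the same thread-pool offload, with no loop object to manage; a minimal sketch:

```python
import asyncio
import time

def blocking_task(n):
    time.sleep(0.2)  # stands in for blocking work
    return n * 2

async def main():
    # Runs the sync function in a worker thread, awaitable like any coroutine
    return await asyncio.to_thread(blocking_task, 5)

result = asyncio.run(main())
print(result)  # 10
```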

For I/O-bound sync code, wrap it similarly:

import asyncio
import requests

def fetch_sync(url):
    response = requests.get(url)
    return response.text

async def fetch_multiple():
    loop = asyncio.get_running_loop()

    # Run multiple sync requests concurrently
    urls = ['https://api.example.com/1', 'https://api.example.com/2']
    tasks = [loop.run_in_executor(None, fetch_sync, url) for url in urls]
    results = await asyncio.gather(*tasks)
    return results

Practical Example: Web Scraper

Here's a real-world example combining both:

import asyncio
import aiohttp
from bs4 import BeautifulSoup

# Sync function (CPU-bound parsing)
def parse_html(html):
    soup = BeautifulSoup(html, 'html.parser')
    return soup.find_all('h2')

# Async function (I/O-bound fetching)
async def fetch_page(session, url):
    async with session.get(url) as response:
        return await response.text()

async def scrape_website(urls):
    async with aiohttp.ClientSession() as session:
        # Fetch all pages concurrently
        html_pages = await asyncio.gather(
            *[fetch_page(session, url) for url in urls]
        )

        # Parse each page (sync work) in executor
        loop = asyncio.get_running_loop()
        results = await asyncio.gather(
            *[loop.run_in_executor(None, parse_html, html) for html in html_pages]
        )
        return results

# Run from sync code
urls = ['https://example.com/page1', 'https://example.com/page2']
results = asyncio.run(scrape_website(urls))

Important Considerations

Avoid blocking the event loop: Never call time.sleep() or blocking I/O in async functions. Use await asyncio.sleep() instead.
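A quick way to see why this matters: two awaited asyncio.sleep() calls overlap, so total wall time stays near a single sleep (the timings below are illustrative):

```python
import asyncio
import time

async def good_wait():
    await asyncio.sleep(0.2)  # yields to the loop; other tasks keep running

async def main():
    start = time.monotonic()
    # Both sleeps run concurrently because neither blocks the loop;
    # time.sleep(0.2) in their place would serialize them (~0.4s total)
    await asyncio.gather(good_wait(), good_wait())
    return time.monotonic() - start

elapsed = asyncio.run(main())
print(f"{elapsed:.2f}s")  # ~0.2s, not 0.4s
```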

Thread safety: When using run_in_executor(), be aware of thread safety issues if sharing state.
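For example, executor tasks that mutate shared state should guard it with a threading.Lock (the counter here is just an illustration):

```python
import asyncio
import threading

counter = 0
lock = threading.Lock()

def increment_many(n):
    global counter
    for _ in range(n):
        # Without the lock, concurrent threads could lose updates
        with lock:
            counter += 1

async def main():
    loop = asyncio.get_running_loop()
    # Two executor tasks mutate the same module-level state
    await asyncio.gather(
        loop.run_in_executor(None, increment_many, 10_000),
        loop.run_in_executor(None, increment_many, 10_000),
    )

asyncio.run(main())
print(counter)  # 20000
```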

Performance: Only offload truly blocking operations to executors. Async overhead isn't worth it for quick sync functions.

Nested event loops: You can't call asyncio.run() from within an already-running event loop. Use await instead or consider libraries like nest_asyncio if absolutely necessary.
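A small sketch of what happens if you try (coroutine names are illustrative):

```python
import asyncio

async def inner():
    return "inner result"

async def outer():
    coro = inner()
    try:
        # A second asyncio.run() inside a running loop is rejected
        asyncio.run(coro)
    except RuntimeError as err:
        coro.close()  # discard the coroutine that never got to run
        message = str(err)
    # The right move inside async code is simply to await
    return message, await inner()

message, result = asyncio.run(outer())
print(message)
print(result)  # inner result
```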

The key principle is: async functions should await other async functions, and sync blocking code should be offloaded to executors when called from async contexts.

Hopefully this article helps you better understand how to structure code that mixes async and sync. As always, leave your thoughts in the comments!
