How to speed up python3 code in one line

Tanner Burns
I enjoy creating simple solutions to hard problems.

In this short article, we will use asyncio to speed up our code. We will use a function from the modutils library to show the effectiveness and ease of adding async to your code.


Here is example code we might want to speed up. It is a basic example meant only to demonstrate the point of asyncio: it sends a request to Google and returns the response. We will do this 32 times and see how long it takes.

import requests

from time import time

def task(url):
    """ a simple task to return the response of a given url

    :param url: {str} -- url to send requests

    :return: Response object from requests
    """
    return requests.get(url, headers={'User-Agent':'Chrome; Python'})

# timing execution
start = time()
# create list to store Response objects
responses = []
for _ in range(0, 32):
    # add Response to list
    responses.append(task('https://www.google.com'))
print(f'{time()-start} to get all responses')

Output:

2.498640775680542 to get all responses


Now we will update this example to use asyncio to retrieve 16 responses at a time instead of 1. This is controlled by the max_async_pool argument, which we set to 16; it can be raised or lowered depending on the desired speed and available system resources.

import requests

from time import time
from modutils import aioloop


def task(url: str):
    """ a simple task to return the response of a given url

    :param url: {str} -- url to send requests

    :return: Response object from requests
    """
    return requests.get(url, headers={'User-Agent': 'Chrome; Python'})

# timing execution
start = time()
# create a list of arguments for the task function
args = [['https://www.google.com'] for _ in range(0, 32)]
# sending 16 requests at one time by setting max_async_pool to 16
# responses is a list of Response objects
responses = aioloop(task, args, max_async_pool=16)
print(f'{time()-start} to get all responses')

Output:

0.2680017948150635 to get all responses


Using the aioloop function from modutils we were able to cut the runtime from roughly 2.5 seconds to about a quarter of a second.
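The timing above suggests aioloop caps how many calls are in flight at once while handing the blocking requests.get calls off to worker threads. As a rough sketch only (this is not the actual modutils implementation, just one way such a helper could work, assuming dictionaries in each argument list become keyword arguments):

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor


def aioloop_sketch(func, args_list, max_async_pool=16):
    """Run a blocking function over a list of argument lists,
    with at most max_async_pool calls in flight at once.
    Results are returned in the same order as args_list."""

    async def runner():
        semaphore = asyncio.Semaphore(max_async_pool)
        loop = asyncio.get_running_loop()
        with ThreadPoolExecutor(max_workers=max_async_pool) as pool:

            async def bounded(args):
                # split each inner list: dicts become keyword arguments,
                # everything else is passed positionally
                positional, kwargs = [], {}
                for a in args:
                    if isinstance(a, dict):
                        kwargs.update(a)
                    else:
                        positional.append(a)
                async with semaphore:
                    # run the blocking call in a worker thread
                    return await loop.run_in_executor(
                        pool, lambda: func(*positional, **kwargs))

            return await asyncio.gather(*(bounded(a) for a in args_list))

    return asyncio.run(runner())
```

With a helper like this, the earlier call site would read the same way: aioloop_sketch(task, args, max_async_pool=16). The semaphore is what keeps only 16 requests active at a time; the thread pool is what lets the synchronous requests library run concurrently at all.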

Note: the aioloop function takes a list of argument lists and unpacks each inner list into the given function. We can also pass named arguments by including a dictionary in an inner list. Example below:

import requests

from time import time
from modutils import aioloop


def task(url: str, params: dict=None):
    """ a simple task to return the response of a given url

    :param url: {str} -- url to send requests
    :param params: {dict} -- optional named argument for requests parameters

    :return: Response object from requests
    """
    return requests.get(url, headers={'User-Agent': 'Chrome; Python'}, params=params)

# timing execution
start = time()
# create a list of arguments for the task function
args = [['https://www.google.com', {'params':{'q':'testing'}}] for _ in range(0, 32)]
# sending 16 requests at one time by setting max_async_pool to 16
# responses is a list of Response objects
responses = aioloop(task, args, max_async_pool=16)
print(f'{time()-start} to get all responses')

Learn more about modutils here.

Discussion (2)

VikasNeha

Hello Tanner
Nice post. Just wanted to point out that in the last example, it seems the task function needs to accept two arguments, because you are passing a url and a named dict.

Tanner Burns Author

Hello, thank you very much for catching this. I have added an update to address this error.