Async programming: AsyncIO

Asyncio in Python is about writing concurrent code using the async/await syntax. It’s not parallelism (that’s multiprocessing or threads), but it lets you handle many I/O-bound tasks without blocking.

Think: downloading 100 web pages, handling thousands of socket connections, or waiting on a database — asyncio makes it scalable and efficient.

Core Concepts

  • Event loop: the “engine” that runs your coroutines.

  • Coroutine: an async function you can pause (await) and resume.

  • Task: a coroutine scheduled on the event loop.
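To make that concrete, here is a tiny sketch of my own (not from the sections below): calling an async function gives you a coroutine object, asyncio.create_task wraps it in a Task on the running event loop, and asyncio.run starts that loop for you.

import asyncio

async def greet():
    await asyncio.sleep(0)
    return "hi"

async def main():
    coro = greet()                    # coroutine object: nothing has run yet
    task = asyncio.create_task(coro)  # Task: scheduled on the running event loop
    print(await task)                 # await pauses main() until the task finishes

asyncio.run(main())                   # asyncio.run creates the event loop and runs main()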

Basic Coroutine

import asyncio

async def say_hello():
    print("Hello...")
    await asyncio.sleep(1)  # non-blocking sleep
    print("...World!")

asyncio.run(say_hello())

Key things here:

  • async def defines a coroutine.
  • await suspends execution until the awaited task is done.
  • asyncio.sleep is non-blocking, unlike time.sleep.
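To see why that last point matters, here is a minimal sketch of my own (blocking_worker and friendly_worker are made-up names): a coroutine that calls time.sleep stalls the whole event loop, while one that awaits asyncio.sleep lets other coroutines run in the meantime.

import asyncio
import time

async def blocking_worker():
    # time.sleep blocks the whole event loop; no other coroutine
    # can make progress while it runs.
    time.sleep(1)

async def friendly_worker():
    # asyncio.sleep suspends only this coroutine and hands control
    # back to the event loop.
    await asyncio.sleep(1)

async def main():
    start = time.perf_counter()
    await asyncio.gather(friendly_worker(), friendly_worker())
    print(f"two asyncio.sleep(1): ~{time.perf_counter() - start:.1f}s")  # ~1s

    start = time.perf_counter()
    await asyncio.gather(blocking_worker(), blocking_worker())
    print(f"two time.sleep(1): ~{time.perf_counter() - start:.1f}s")  # ~2s

asyncio.run(main())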

Running Multiple Coroutines

import asyncio

async def worker(name, delay):
    print(f"{name} started")
    await asyncio.sleep(delay)
    print(f"{name} finished after {delay}s")

async def main():
    # Schedule tasks concurrently
    task1 = asyncio.create_task(worker("A", 2))
    task2 = asyncio.create_task(worker("B", 1))
    await task1
    await task2

asyncio.run(main())

Even though worker A takes 2 seconds and worker B takes 1 second, the total runtime is about 2 seconds, not 3. That’s concurrency.
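A subtle point, sketched below with my own timing code (the nap helper is mine): create_task is what makes this concurrent. If you await the coroutines one after another without wrapping them in tasks, they run sequentially and take about 3 seconds.

import asyncio
import time

async def nap(delay):
    await asyncio.sleep(delay)

async def sequential():
    # Awaiting coroutines directly runs them one after the other: ~3s.
    await nap(2)
    await nap(1)

async def concurrent():
    # create_task schedules both on the event loop immediately: ~2s.
    t1 = asyncio.create_task(nap(2))
    t2 = asyncio.create_task(nap(1))
    await t1
    await t2

async def main():
    for label, coro in (("sequential", sequential()), ("concurrent", concurrent())):
        start = time.perf_counter()
        await coro
        print(f"{label}: {time.perf_counter() - start:.1f}s")

asyncio.run(main())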

Gathering Tasks

Instead of awaiting them one by one:

await asyncio.gather(worker("A", 2), worker("B", 1)) 

This runs both concurrently and returns results in order.
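For example (a small sketch of mine, not from the original snippet), if the workers return values, gather collects them into a list in the order the coroutines were passed in, no matter which one finishes first:

import asyncio

async def worker(name, delay):
    await asyncio.sleep(delay)
    return f"{name} done"

async def main():
    results = await asyncio.gather(worker("A", 2), worker("B", 1))
    print(results)  # ['A done', 'B done']: argument order, not completion order

asyncio.run(main())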

Example: Fetching URLs

import asyncio
import aiohttp  # pip install aiohttp

async def fetch(session, url):
    async with session.get(url) as resp:
        return await resp.text()

async def main():
    urls = [
        "https://example.com",
        "https://httpbin.org/get",
        "https://python.org",
    ]
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(*(fetch(session, u) for u in urls))
        for url, result in zip(urls, results):
            print(f"Fetched {url[:30]}... {len(result)} bytes")

asyncio.run(main())

Here, instead of fetching one page at a time, all requests are fired together. Big win in speed.
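One caveat once the URL list grows (my own addition, not from the example above): firing hundreds of requests at once can overwhelm a server or your connection pool. A common pattern is to cap concurrency with asyncio.Semaphore, roughly like this:

import asyncio
import aiohttp

MAX_CONCURRENT = 10  # hypothetical cap; tune for your workload

async def fetch(session, url, sem):
    # At most MAX_CONCURRENT coroutines get past the semaphore at once;
    # the rest wait here for a free slot.
    async with sem:
        async with session.get(url) as resp:
            return await resp.text()

async def main(urls):
    sem = asyncio.Semaphore(MAX_CONCURRENT)
    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(*(fetch(session, u, sem) for u in urls))

# asyncio.run(main(["https://example.com"] * 50))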

Things to remember

  • Use asyncio.run() to start your program.
  • Use await only inside async def.
  • CPU-bound tasks don’t belong in asyncio. For those, use concurrent.futures with threads or processes (see the sketch after this list).
  • asyncio is single-threaded: it pauses and resumes tasks around I/O rather than running code on multiple CPU cores.
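Here is a minimal sketch of that hand-off (illustrative only; the fib helper is made up): loop.run_in_executor pushes the CPU-bound function onto a ProcessPoolExecutor so the event loop stays free to serve other tasks.

import asyncio
from concurrent.futures import ProcessPoolExecutor

def fib(n):
    # Deliberately CPU-bound; running this inline would freeze the event loop.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        # fib(32) runs in a separate process while the loop stays responsive.
        result = await loop.run_in_executor(pool, fib, 32)
        print(result)

if __name__ == "__main__":
    asyncio.run(main())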

Conclusion

Using asyncio feels a lot like writing JavaScript or TypeScript: you see async/await there as well, and the mechanism is the same, with an event loop under the hood.
For Python developers, the usual approach in the past was a thread pool for concurrent programming, which meant managing the pool and deciding up front how big it should be (how many threads). With asyncio we don't have to anymore, and I really love this: I just write the code without worrying about pool size.
