Vladimir Chavkov

A Practical Guide to Python async/await


Python’s async/await syntax, built on top of asyncio, lets you write concurrent code that handles thousands of I/O-bound operations without the complexity of threads. But it is not a universal performance tool — understanding when it helps and when it gets in the way is critical.

The Core Model

Traditional synchronous code blocks on every I/O call. When you fetch a URL, the entire thread sits idle waiting for the response. asyncio uses an event loop to multiplex many I/O operations onto a single thread. While one coroutine waits for a network response, others can run.

import asyncio

async def fetch_data(url: str) -> str:
    """Simulate an I/O-bound operation."""
    print(f"Starting fetch: {url}")
    await asyncio.sleep(1)  # Simulates network delay
    print(f"Finished fetch: {url}")
    return f"Data from {url}"

async def main():
    urls = ["https://api.example.com/a", "https://api.example.com/b", "https://api.example.com/c"]
    results = await asyncio.gather(*(fetch_data(url) for url in urls))
    print(results)

asyncio.run(main())

All three fetches run concurrently — total time is roughly 1 second, not 3. The key insight: await yields control back to the event loop, allowing other coroutines to progress.

Practical Example: Concurrent HTTP Requests

Using aiohttp for real HTTP calls:

import asyncio
import aiohttp

async def fetch_url(session: aiohttp.ClientSession, url: str) -> dict:
    async with session.get(url) as response:
        return {"url": url, "status": response.status, "body": await response.text()}

async def main():
    urls = [
        "https://httpbin.org/get",
        "https://httpbin.org/delay/1",
        "https://httpbin.org/status/200",
    ]
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_url(session, url) for url in urls]
        results = await asyncio.gather(*tasks, return_exceptions=True)
        for result in results:
            if isinstance(result, Exception):
                print(f"Error: {result}")
            else:
                print(f"{result['url']} -> {result['status']}")

asyncio.run(main())

The return_exceptions=True flag prevents one failed request from canceling the rest — a pattern you will want in production code.
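One caveat with gather: it launches every task at once, which can overwhelm a server or exhaust local sockets when the URL list is large. A common remedy is to cap concurrency with asyncio.Semaphore. Below is a minimal sketch of that pattern; the limit of 5, the example.com URLs, and the asyncio.sleep standing in for a real request are all illustrative assumptions, not part of the original article:

```python
import asyncio

async def fetch_limited(sem: asyncio.Semaphore, url: str) -> str:
    # The semaphore allows at most N coroutines past this point at once;
    # the rest wait here until a slot frees up.
    async with sem:
        await asyncio.sleep(0.1)  # stand-in for a real network request
        return f"Data from {url}"

async def main() -> list[str]:
    sem = asyncio.Semaphore(5)  # at most 5 requests in flight
    urls = [f"https://example.com/{i}" for i in range(20)]
    # gather still schedules all 20 coroutines, but only 5 run concurrently.
    return await asyncio.gather(*(fetch_limited(sem, u) for u in urls))

results = asyncio.run(main())
print(len(results))  # -> 20
```

gather preserves input order in its result list, so throttling this way does not change which result corresponds to which URL.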

async/await vs Threading

| Factor | asyncio | threading |
| --- | --- | --- |
| Best for | I/O-bound (network, disk) | I/O-bound or C-extension work |
| Overhead | Very low (coroutines are cheap) | Higher (OS threads) |
| Scaling | 10,000+ concurrent tasks easily | Hundreds of threads max |
| Complexity | Explicit yield points | Race conditions, locks |
| CPU-bound work | No benefit (blocks the loop) | Limited by GIL |

For CPU-bound work, use multiprocessing or concurrent.futures.ProcessPoolExecutor. You can bridge the two worlds:

import asyncio
from concurrent.futures import ProcessPoolExecutor

def cpu_heavy(n: int) -> int:
    return sum(i * i for i in range(n))

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        result = await loop.run_in_executor(pool, cpu_heavy, 10_000_000)
        print(result)

asyncio.run(main())

Common Pitfalls

Blocking the event loop. Calling synchronous I/O (like requests.get() or time.sleep()) inside a coroutine blocks the entire loop. Every library in your async call chain must be async-aware.
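When a needed library has no async equivalent, you can push the blocking call onto a worker thread so the loop keeps running. A minimal sketch using asyncio.to_thread (Python 3.9+); slow_sync_call and its time.sleep are hypothetical stand-ins for something like requests.get():

```python
import asyncio
import time

def slow_sync_call() -> str:
    # Stand-in for blocking I/O, e.g. requests.get()
    time.sleep(0.2)
    return "done"

async def main() -> str:
    # Wrong: slow_sync_call() called directly here would freeze the
    # entire event loop for 0.2 s, stalling every other coroutine.
    # Right: hand it to a worker thread and await the result.
    return await asyncio.to_thread(slow_sync_call)

result = asyncio.run(main())
print(result)  # -> done
```

On older Pythons, loop.run_in_executor(None, slow_sync_call) achieves the same thing with the default thread pool.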

Forgetting to await. Calling an async function without await returns a coroutine object, not the result. Python will emit a runtime warning, but the mistake is easy to miss.
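A short sketch of what the mistake looks like in practice; compute is an arbitrary illustrative coroutine:

```python
import asyncio

async def compute() -> int:
    return 42

async def main() -> int:
    wrong = compute()                 # no await: a coroutine object, not 42
    print(type(wrong).__name__)       # -> coroutine
    wrong.close()                     # close it to suppress the "never awaited" warning
    right = await compute()           # awaited: the actual result
    print(right)                      # -> 42
    return right

value = asyncio.run(main())
```

The "coroutine ... was never awaited" RuntimeWarning only appears when the object is garbage-collected, which is why the bug often surfaces far from where it was introduced.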

Fire-and-forget tasks. Creating a task with asyncio.create_task() without storing a reference means the task can be garbage-collected before completion. Always keep a reference:

background_tasks = set()

async def schedule_work():
    task = asyncio.create_task(some_coroutine())
    background_tasks.add(task)
    task.add_done_callback(background_tasks.discard)

Mixing sync and async code. If your application is mostly synchronous with one async library, the integration cost may exceed the benefit. You will end up wrapping everything in asyncio.run() calls, which defeats the purpose.

When async Is NOT the Right Choice

If your workload is CPU-bound, if your key dependencies are synchronous, or if you only ever run a handful of concurrent operations, async adds ceremony without a payoff: a blocked event loop erases the concurrency benefit, and retrofitting await through a mostly synchronous codebase usually costs more than it returns.

Summary

Use async/await when you have many concurrent I/O operations — web scrapers, API aggregators, WebSocket servers, or high-concurrency web frameworks like FastAPI. For everything else, synchronous Python is simpler and often fast enough. The right tool depends on your workload, not on what is trending.

