Asynchronous vs. Synchronous Functions in FastAPI: When to Pick Which
Olivia Novak
Dev Intern · Leapcell

Introduction
Building efficient and scalable web applications is a constant pursuit for developers. In the Python ecosystem, FastAPI has emerged as a powerhouse for creating high-performance APIs, largely thanks to its asynchronous capabilities. However, a common point of confusion for newcomers and even experienced developers transitioning to FastAPI is understanding when to use async def and when to stick with a traditional def function. This decision isn't merely stylistic; it has profound implications for your application's responsiveness, resource utilization, and overall performance. This article will demystify the differences between asynchronous and synchronous function definitions in FastAPI, providing clear guidance on when to employ each, ultimately helping you construct more robust and efficient web services.
Understanding the Core Concepts
Before diving into the specifics of FastAPI, it's essential to grasp the fundamental concepts of synchronous and asynchronous programming in Python.
Synchronous Functions (def)
When you define a function using def, it's considered synchronous. This means that when the function is called, it executes its operations sequentially, one after another. If a synchronous function encounters an operation that takes a long time to complete (e.g., waiting for a database query, an external API call, or file I/O), the entire program will block at that point, waiting for the operation to finish before moving on to the next line of code. In the context of a web server, this means that while one request is performing a blocking I/O operation, the server cannot process other incoming requests on that same worker thread.
```python
import time

def synchronous_task(task_id: int):
    print(f"Synchronous Task {task_id}: Starting CPU-bound work...")
    # Simulate a CPU-bound operation
    count = 0
    for _ in range(1_000_000_000):
        count += 1
    print(f"Synchronous Task {task_id}: CPU-bound work finished.")

    print(f"Synchronous Task {task_id}: Starting I/O-bound wait...")
    # Simulate a blocking I/O operation
    time.sleep(2)  # This will block the thread
    print(f"Synchronous Task {task_id}: I/O-bound wait finished.")
    return f"Result from synchronous task {task_id}"

# When run, synchronous_task(1) would complete entirely before synchronous_task(2) starts.
```
Asynchronous Functions (async def)
Functions defined with async def are asynchronous, meaning they are designed to perform operations concurrently without blocking the main execution thread. The await keyword is crucial within async def functions. When an async def function encounters an await expression for an "awaitable" object (like an asyncio.sleep, an asynchronous HTTP request using httpx, or an async database driver call), it pauses its execution at that point and returns control to the event loop. The event loop can then switch to another task that is ready to run. Once the awaited operation completes, the async def function can resume its execution from where it left off. This non-blocking behavior is particularly advantageous for I/O-bound tasks.
```python
import asyncio

async def asynchronous_task(task_id: int):
    print(f"Asynchronous Task {task_id}: Starting I/O-bound wait...")
    # Simulate a non-blocking I/O operation
    await asyncio.sleep(2)  # This will yield control to the event loop
    print(f"Asynchronous Task {task_id}: I/O-bound wait finished.")

    print(f"Asynchronous Task {task_id}: Starting CPU-bound work...")
    # Simulate a CPU-bound operation (still blocks if not offloaded)
    count = 0
    for _ in range(1_000_000_000):
        count += 1
    print(f"Asynchronous Task {task_id}: CPU-bound work finished.")
    return f"Result from asynchronous task {task_id}"

# In an async context, multiple calls to asynchronous_task might run
# 'side-by-side' during their await periods.
```
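To make that "side-by-side" behavior concrete, here is a minimal, self-contained sketch using only the standard library: two awaited 0.2-second waits run under `asyncio.gather`, so their sleeps overlap and the total elapsed time is roughly 0.2 seconds rather than 0.4.

```python
import asyncio
import time

async def io_task(task_id: int) -> str:
    # A non-blocking wait: control returns to the event loop here,
    # so other tasks can run while this one sleeps.
    await asyncio.sleep(0.2)
    return f"task {task_id} done"

async def run_concurrently() -> list:
    # Both tasks' sleeps overlap instead of running back to back.
    return await asyncio.gather(io_task(1), io_task(2))

start = time.perf_counter()
results = asyncio.run(run_concurrently())
elapsed = time.perf_counter() - start

print(results)                    # ['task 1 done', 'task 2 done']
print(f"elapsed: {elapsed:.2f}s")  # roughly 0.2s, not 0.4s
```

Replacing `asyncio.gather` with two sequential `await` calls would double the elapsed time, which is exactly the difference between concurrent and sequential waiting.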
FastAPI and Function Execution
FastAPI, built on Starlette and Pydantic, leverages Python's asyncio library to enable asynchronous request handling. This allows it to achieve high concurrency with a relatively small number of worker processes.
How FastAPI Handles async def
When FastAPI receives a request and routes it to an async def endpoint function, it executes this function directly within its event loop. If the async def function awaits an I/O-bound operation, it yields control, allowing the event loop to switch to processing other incoming requests or other tasks that are ready. This is where the power of async def shines: for I/O-bound operations (database calls, external API calls, file reads/writes, network requests), your application can handle many simultaneous clients efficiently without having a dedicated thread for each waiting operation.
Example: async def for I/O-bound operations
Consider an API endpoint that fetches data from an external service.
```python
from fastapi import FastAPI
import httpx  # An async HTTP client

app = FastAPI()

@app.get("/items_async/{item_id}")
async def get_item_async(item_id: int):
    print(f"Request for item {item_id}: Starting external API call asynchronously...")
    async with httpx.AsyncClient() as client:
        # Simulate an external API call that takes time
        response = await client.get(f"https://jsonplaceholder.typicode.com/todos/{item_id}")
        data = response.json()
    print(f"Request for item {item_id}: External API call finished.")
    return {"item_id": item_id, "data": data}

# To test this, you could make multiple concurrent requests to /items_async/1,
# /items_async/2, etc. You'd observe that they complete in an interleaved
# fashion, not strictly one after another.
```
In this example, await client.get(...) pauses the execution of get_item_async without blocking the main event loop. FastAPI can then handle other incoming requests or perform other tasks.
How FastAPI Handles def
When FastAPI encounters a def endpoint function, it smartly recognizes it as a synchronous function. To prevent synchronous functions from blocking its main asynchronous event loop, FastAPI automatically runs synchronous endpoint functions in a separate thread pool. This means that if your def function performs a blocking I/O operation or a long CPU-bound computation, it will block a thread from this thread pool, but it won't block the main event loop itself.
Example: def for synchronous, potentially blocking operations
Imagine an endpoint that performs a complex, CPU-intensive calculation that cannot be easily made asynchronous.
```python
from fastapi import FastAPI

app = FastAPI()

def perform_heavy_computation(number: int):
    print(f"Synchronous Computation for {number}: Starting CPU-bound work...")
    # Simulate a CPU-bound operation
    result = 0
    for i in range(number * 10_000_000):  # A big loop
        result += i
    print(f"Synchronous Computation for {number}: CPU-bound work finished.")
    return result

@app.get("/compute_sync/{number}")
def compute_sync(number: int):
    print(f"Request for computation {number}: Received.")
    computation_result = perform_heavy_computation(number)
    return {"input_number": number, "result": computation_result}

# If multiple requests hit /compute_sync concurrently, each will run in a separate
# thread from FastAPI's thread pool. The number of concurrent synchronous
# operations is limited by the size of this thread pool.
```
In this case, perform_heavy_computation is a blocking function. FastAPI runs it in a worker thread, preventing it from blocking the main event loop. However, the number of such concurrent blocking operations is limited by the thread pool's size (Starlette's AnyIO-backed pool defaults to 40 threads), and creating and managing threads incurs overhead.
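The effect of a bounded thread pool can be demonstrated with the standard library alone. In this sketch (with hypothetical numbers: a pool of 2 workers receiving 4 blocking tasks), the third and fourth tasks must wait for a free thread, so the total wall time is roughly two "batches" of sleeping rather than one:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def blocking_task(task_id: int) -> int:
    time.sleep(0.2)  # stand-in for a blocking I/O call or heavy computation
    return task_id

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as pool:
    # 4 tasks, 2 workers: tasks 2 and 3 queue until a thread frees up.
    results = list(pool.map(blocking_task, range(4)))
elapsed = time.perf_counter() - start

print(results)  # [0, 1, 2, 3]
print(elapsed)  # ~0.4s: two batches of 0.2s, not one
```

The same queuing happens to def endpoints when concurrent requests exceed FastAPI's pool size: excess requests wait for a thread, adding latency even though the event loop itself stays free.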
When to Use async def vs. def
The choice between async def and def hinges primarily on the nature of the operations your endpoint performs.
Use async def when:
- Your function involves I/O-bound operations that can be awaited. This is the primary use case. Examples include:
  - Making HTTP requests to external APIs using httpx.
  - Interacting with asynchronous database drivers (e.g., asyncpg for PostgreSQL, aioodbc, SQLModel with asyncio).
  - Reading/writing files asynchronously (e.g., aiofiles).
  - Waiting for messages from an asynchronous queue.
  - Any operation that involves waiting for an external resource without intense CPU usage.
- You need to leverage other awaitable utilities or libraries. If you are integrating with libraries that are inherently asynchronous, async def is required to await their operations.
- You want to maximize concurrency for I/O-bound tasks. async def allows your application to handle a high volume of concurrent requests efficiently, as long as those requests spend most of their time waiting for I/O.
Rule of thumb: If your function contains an await keyword, it must be async def.
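This rule is enforced by Python itself: `await` inside a plain def is rejected at compile time, before the function is ever called. A small sketch using `compile()` on a source string makes this visible without crashing the program:

```python
# `await` is only valid inside `async def`; in a plain `def` it is a
# SyntaxError, raised when the code is compiled, not when it runs.
bad_source = """
def handler():
    await some_coroutine()
"""

try:
    compile(bad_source, "<example>", "exec")
    outcome = "compiled"
except SyntaxError as exc:
    outcome = f"SyntaxError: {exc.msg}"

print(outcome)
```

So if you add an `await` to an endpoint and forget to change def to async def, the failure is immediate and unambiguous rather than a subtle runtime bug.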
Use def when:
- Your function performs purely CPU-bound operations. If your function spends most of its time performing calculations, processing data in memory, or looping extensively without waiting for external resources, it's CPU-bound. Making it async def won't magically make the CPU computation non-blocking; the actual computation will still block the event loop (if not awaited) or a background thread (if FastAPI offloads a def function). Examples: complex mathematical computations, heavy data transformations, image processing, video encoding.
- Your function interacts with synchronous-only libraries or drivers. Many older Python libraries, especially database drivers (like psycopg2 for PostgreSQL or the traditional SQLAlchemy ORM), are synchronous. If your endpoint needs to use these, defining it as def allows FastAPI to handle the blocking nature by running it in a thread pool.
- Simplicity and familiarity. For very simple endpoints that don't involve I/O or complex logic, a def function might be marginally simpler to write and understand, especially if you're not deeply familiar with asyncio best practices. However, always consider future scalability.
Important Note on CPU-Bound from async def: If an async def function includes a piece of CPU-bound code that doesn't actively await anything, that CPU-bound code will still block the event loop. To handle CPU-bound work within an async def endpoint without blocking the event loop, you would typically offload it to a separate process or a thread pool using loop.run_in_executor() (or libraries built upon this, like starlette.concurrency.run_in_threadpool). FastAPI does this automatically for def functions, but for async def functions, you have to manage it explicitly if you have long-running CPU code.
```python
import asyncio
from concurrent.futures import ThreadPoolExecutor
from functools import partial

# ... (assume app = FastAPI() and the perform_heavy_computation function from above)

executor = ThreadPoolExecutor(max_workers=4)  # A thread pool for CPU-bound tasks

@app.get("/compute_async_offloaded/{number}")
async def compute_async_offloaded(number: int):
    print(f"Request for computation {number}: Received, offloading CPU work...")
    # Offload the intensive CPU computation to a thread pool
    loop = asyncio.get_running_loop()
    computation_result = await loop.run_in_executor(
        executor, partial(perform_heavy_computation, number)
    )
    return {"input_number": number, "result": computation_result}
```
This is a more advanced pattern and signals that sometimes even async def functions might need to explicitly manage CPU-bound work.
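The pattern can be verified outside FastAPI with a minimal standard-library sketch: while blocking work runs in the executor, an awaited "heartbeat" coroutine keeps ticking, showing the event loop stays responsive (the sleep here is a stand-in for real CPU-bound work):

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

def blocking_work() -> str:
    time.sleep(0.3)  # stands in for a long CPU-bound computation
    return "heavy result"

heartbeats = 0

async def heartbeat() -> None:
    # Keeps running while blocking_work executes in the pool,
    # proving the event loop is not blocked.
    global heartbeats
    for _ in range(3):
        await asyncio.sleep(0.05)
        heartbeats += 1

async def main() -> str:
    loop = asyncio.get_running_loop()
    with ThreadPoolExecutor(max_workers=1) as executor:
        offloaded = loop.run_in_executor(executor, blocking_work)
        result, _ = await asyncio.gather(offloaded, heartbeat())
    return result

result = asyncio.run(main())
print(result, heartbeats)  # heartbeat ticked while the blocking call ran
```

If `blocking_work` were awaited inline as ordinary code (no executor), the heartbeat could not advance until it finished, which is exactly the event-loop starvation this pattern avoids.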
Conclusion
Choosing between async def and def in FastAPI is a critical decision that impacts your application's performance characteristics. For I/O-bound tasks, async def is almost always the superior choice, allowing for high concurrency and efficient resource utilization by leveraging Python's asyncio event loop. Conversely, for CPU-bound operations or interactions with synchronous libraries, def is appropriate, with FastAPI intelligently offloading these to a thread pool to prevent blocking its main event loop. By understanding the underlying mechanics and considering the nature of your operations, you can effectively leverage both paradigms to build highly performant and scalable FastAPI applications. When in doubt, prefer async def if any part of your function is I/O-bound and can be awaited asynchronously.