Python Snippets

Asynchronous HTTP Requests with aiohttp for Concurrent API Calls

import aiohttp
import asyncio

async def fetch_data(session, url):
    # The response context manager guarantees the connection is released.
    async with session.get(url) as response:
        return await response.json()

async def fetch_multiple_urls(urls):
    # One shared ClientSession reuses connections across all requests
    # (aiohttp recommends against creating a new session per request).
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_data(session, url) for url in urls]
        # gather runs the coroutines concurrently and preserves input order.
        return await asyncio.gather(*tasks)

async def main():
    api_urls = [
        'https://api.example.com/data/1',
        'https://api.example.com/data/2',
        'https://api.example.com/data/3'
    ]

    results = await fetch_multiple_urls(api_urls)
    for i, result in enumerate(results, 1):
        print(f"Data from API {i}: {result}")

if __name__ == "__main__":
    asyncio.run(main())

Explanation

This code demonstrates how to make multiple HTTP requests concurrently using Python’s standard-library asyncio and the third-party aiohttp package. It solves a common problem: fetching data from several API endpoints efficiently, instead of waiting for each request to finish before starting the next.
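
As a rough check on that claim, the sketch below times fetch_multiple_urls against httpbin.org’s /delay/2 endpoint (an assumption: any reachable endpoints that return JSON slowly would do). Three two-second requests run concurrently should finish in roughly two seconds total, not six:

import time

async def timed_demo():
    urls = ['https://httpbin.org/delay/2'] * 3  # each response takes ~2 s
    start = time.perf_counter()
    results = await fetch_multiple_urls(urls)
    elapsed = time.perf_counter() - start
    print(f"Fetched {len(results)} responses in {elapsed:.2f} s")

asyncio.run(timed_demo())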

Key Features:

  1. Asynchronous execution: requests run concurrently, so total wait time is roughly that of the slowest request rather than the sum of all of them
  2. Resource cleanup: the async context managers close the session and every response, even when a request raises (the snippet does not catch the failures themselves; see the sketch after this list)
  3. Scalability: can handle dozens or hundreds of requests efficiently
  4. Ordered output: asyncio.gather returns results in the same order as the input URLs
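
As written, a single failed request makes asyncio.gather raise, and the successful results never reach the caller. A minimal sketch of one common remedy, passing return_exceptions=True and capping in-flight requests with asyncio.Semaphore (the limit of 10 is an arbitrary assumption):

async def fetch_with_limit(session, semaphore, url):
    # The semaphore caps how many requests are in flight at once.
    async with semaphore:
        async with session.get(url) as response:
            return await response.json()

async def fetch_many_safely(urls, limit=10):
    semaphore = asyncio.Semaphore(limit)
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_with_limit(session, semaphore, url) for url in urls]
        # With return_exceptions=True, failures come back in the result list
        # as exception objects instead of raising out of gather().
        return await asyncio.gather(*tasks, return_exceptions=True)

Callers can then test each entry with isinstance(result, Exception) to separate successes from failures.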

Why it’s useful:

Fetching several endpoints one after another takes the sum of their response times, while fetching them concurrently takes roughly as long as the slowest single request. For I/O-bound work such as API calls, this is a large speedup without reaching for threads or processes.

How to run:

  1. Install the required package: pip install aiohttp
  2. Replace the example URLs with your actual API endpoints
  3. Run the script: python script_name.py
  4. asyncio.run() was added in Python 3.7; on Python 3.6 or earlier, replace asyncio.run(main()) with:
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
    

Customization:
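
As a starting point (a sketch; the 10-second timeout and the User-Agent value are placeholder assumptions, not part of the original snippet), a session-wide timeout and default headers can be set when the ClientSession is created:

async def fetch_multiple_urls(urls):
    # The timeout and headers apply to every request made through the session.
    timeout = aiohttp.ClientTimeout(total=10)   # placeholder: 10 s overall cap
    headers = {'User-Agent': 'my-client/1.0'}   # placeholder header value
    async with aiohttp.ClientSession(timeout=timeout, headers=headers) as session:
        tasks = [fetch_data(session, url) for url in urls]
        return await asyncio.gather(*tasks)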