Tech Tutorial - February 18, 2026: Building a Real-Time Chat App with FastAPI and WebSockets


Welcome back, Codeyaan explorers! Today we’re diving into the world of real‑time web applications by building a fully functional chat platform using FastAPI and WebSockets. By the end of this tutorial you’ll not only have a working prototype but also a solid grasp of the concepts that power live updates across browsers, mobile apps, and even IoT devices.

Why Real‑time Communication Matters

In 2026, users expect instantaneous feedback—whether they’re collaborating on a document, tracking a delivery, or gaming with friends. Traditional HTTP request/response cycles introduce latency that feels sluggish for these scenarios. WebSockets flip the script: they open a persistent, full‑duplex connection, allowing the server to push data the moment something changes.

Beyond chat, WebSockets power live dashboards, collaborative editors, notification systems, and multiplayer games. Understanding how to harness them gives you a competitive edge and opens doors to a whole class of interactive experiences.

Choosing the Right Stack

FastAPI has become the go‑to framework for high‑performance Python APIs thanks to its async‑first design and automatic OpenAPI docs. Pair it with uvicorn as the ASGI server, and you have a lightweight, production‑ready environment that natively supports WebSockets.

On the client side, plain JavaScript with the native WebSocket API is sufficient for a tutorial, but you can later swap it for libraries like socket.io or frameworks such as React, Vue, or Svelte for richer UI interactions.

Setting Up FastAPI

First, create a fresh virtual environment and install the dependencies:

python -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate
pip install fastapi "uvicorn[standard]" python-multipart

Now, scaffold the basic FastAPI app. This file, main.py, will host both HTTP routes and WebSocket endpoints.

from fastapi import FastAPI, WebSocket, WebSocketDisconnect
from fastapi.responses import HTMLResponse
from typing import List

app = FastAPI()

# In‑memory store for active connections
class ConnectionManager:
    def __init__(self):
        self.active_connections: List[WebSocket] = []

    async def connect(self, websocket: WebSocket):
        await websocket.accept()
        self.active_connections.append(websocket)

    def disconnect(self, websocket: WebSocket):
        self.active_connections.remove(websocket)

    async def broadcast(self, message: str):
        for connection in self.active_connections:
            await connection.send_text(message)

manager = ConnectionManager()

@app.get("/", response_class=HTMLResponse)
async def get():
    # Simple HTML page for quick testing
    return """
    
    
    
    
        
        
        
    """ @app.websocket("/ws/chat") async def websocket_endpoint(websocket: WebSocket): await manager.connect(websocket) try: while True: data = await websocket.receive_text() await manager.broadcast(f"User says: {data}") except WebSocketDisconnect: manager.disconnect(websocket) await manager.broadcast("A user has left the chat")

    The ConnectionManager class abstracts the boilerplate of tracking connections and broadcasting messages. For production you’d replace the in‑memory list with Redis or another distributed store to support multiple server instances.

    Implementing WebSocket Endpoints

    Our /ws/chat endpoint does three things: accepts the connection, continuously reads incoming messages, and broadcasts each message to every connected client. The while True loop keeps the coroutine alive as long as the client stays connected.

    Notice the use of await websocket.receive_text() and await connection.send_text(...). These are async calls that free the event loop while waiting for I/O, allowing thousands of concurrent sockets on a single CPU core.

    Pro tip: Always wrap your receive loop in a try/except WebSocketDisconnect block. If you forget, a dropped client raises an unhandled exception and its socket never gets removed from active_connections, so the next broadcast blows up when it tries to send to the dead connection.

    Building the Frontend

    The client side is intentionally minimal. It establishes a WebSocket connection, sends user input, and appends incoming messages to a list. Save the following script as static/client.js and make sure FastAPI serves static files (we’ll add the mount later).

    const ws = new WebSocket(`ws://${location.host}/ws/chat`);
    
    ws.onmessage = function(event) {
        const messages = document.getElementById('messages');
        const li = document.createElement('li');
        li.textContent = event.data;
        messages.appendChild(li);
    };
    
    function sendMessage() {
        const input = document.getElementById('msgInput');
        if (input.value.trim() !== '') {
            ws.send(input.value);
            input.value = '';
        }
    }
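
    Because the page returned from / references /static/client.js, main.py also has to serve that directory. A minimal mount using FastAPI's built-in StaticFiles, assuming client.js lives in a static/ folder next to main.py:

    from fastapi.staticfiles import StaticFiles

    # Serve everything under ./static at /static, so /static/client.js resolves
    app.mount("/static", StaticFiles(directory="static"), name="static")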
    

    Start the server with uvicorn main:app --reload, then open http://localhost:8000 in several browser tabs. Each tab will receive every message typed in any other tab, which is exactly what a chat app should do.

    Handling Broadcasts and Rooms

    Real‑world chat systems rarely broadcast to every user. Instead they support “rooms” or “channels.” Let’s extend ConnectionManager to manage multiple rooms without overhauling the whole architecture.

    class RoomManager:
        def __init__(self):
            self.rooms: dict[str, List[WebSocket]] = {}
    
        async def join(self, room: str, websocket: WebSocket):
            await websocket.accept()
            self.rooms.setdefault(room, []).append(websocket)
    
        def leave(self, room: str, websocket: WebSocket):
            if room in self.rooms:
                self.rooms[room].remove(websocket)
                if not self.rooms[room]:
                    del self.rooms[room]
    
        async def broadcast(self, room: str, message: str):
            for ws in self.rooms.get(room, []):
                await ws.send_text(message)
    
    room_manager = RoomManager()
    
    @app.websocket("/ws/{room_name}")
    async def room_endpoint(websocket: WebSocket, room_name: str):
        await room_manager.join(room_name, websocket)
        try:
            while True:
                data = await websocket.receive_text()
                await room_manager.broadcast(room_name, f"[{room_name}] {data}")
        except WebSocketDisconnect:
            room_manager.leave(room_name, websocket)
            await room_manager.broadcast(room_name, f"A user left {room_name}")
    

    Clients now specify a room in the URL, e.g., ws://localhost:8000/ws/sports. This pattern scales nicely: you can add authentication, permission checks, or even persistence per room without touching the core broadcast logic.
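
    If you want to smoke-test a room without writing any frontend code, a short Python script will do. This is just a sketch using the third-party websockets client library (pip install websockets), which is otherwise not part of this tutorial:

    # test_room.py - quick manual check of the /ws/{room_name} endpoint
    import asyncio
    import websockets

    async def main():
        async with websockets.connect("ws://localhost:8000/ws/sports") as ws:
            await ws.send("Hello from the sports room!")
            # The server broadcasts to everyone in the room, including the sender.
            print(await ws.recv())  # -> "[sports] Hello from the sports room!"

    asyncio.run(main())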

    Deploying to Production

    Running uvicorn main:app --reload is perfect for development, but production demands a more robust setup. Use gunicorn with multiple worker processes, each running an uvicorn worker class:

    pip install gunicorn
    gunicorn -k uvicorn.workers.UvicornWorker -w 4 -b 0.0.0.0:8000 main:app
    

    Four workers give you parallelism across CPU cores while still leveraging async I/O inside each worker. Keep in mind, though, that each worker process holds its own in-memory ConnectionManager, so as soon as you run more than one worker you need a shared message bus for broadcasts to reach clients connected to other workers. For horizontal scaling across multiple machines, place a reverse proxy (NGINX or Traefik) in front, enable sticky sessions, and move the connection state to a shared store like Redis Pub/Sub.

    Performance Tuning

    WebSocket throughput is bound by three factors: network latency, message serialization, and the event loop’s ability to process I/O. Here are quick wins you can apply:

    • Binary payloads: Switch from send_text to send_bytes for large data blobs (e.g., images, binary game state). Binary frames are smaller and avoid UTF‑8 encoding overhead.
    • Message batching: If you need to push dozens of updates per second, bundle them into a single JSON array instead of sending individual messages.
    • Back‑pressure handling: Wrap each send in a timeout; if a client is consistently too slow to keep up, drop non‑essential updates for it, or disconnect it outright, to keep the server responsive (a sketch follows below).
    Pro tip: Enable uvicorn --log-level debug while profiling. Look for “websocket disconnect” logs; frequent disconnects often indicate client‑side network issues or server overload.
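
    To make the back‑pressure bullet concrete, here is a rough sketch that reuses the manager instance from earlier; the timeout value is arbitrary and a per-send timeout is just one way to do it:

    import asyncio

    SEND_TIMEOUT = 2.0  # seconds a slow client gets before we give up (arbitrary)

    async def broadcast_with_backpressure(message: str):
        # Variant of ConnectionManager.broadcast: each send gets a timeout so one
        # slow or dead client cannot stall delivery to everyone else.
        for ws in list(manager.active_connections):
            try:
                await asyncio.wait_for(ws.send_text(message), timeout=SEND_TIMEOUT)
            except asyncio.TimeoutError:
                # Client cannot keep up; drop it rather than block the loop.
                manager.disconnect(ws)
                await ws.close()
            except Exception:
                # Socket is already dead; just forget about it.
                manager.disconnect(ws)

    Note that the operating system buffers outgoing data, so the timeout only fires once a client has fallen well behind, not on the first slow frame.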

    Common Pitfalls and How to Avoid Them

    1. Forgetting to accept the connection. If you call await websocket.accept() after the first receive, the client will time out. Always accept immediately after the handshake.

    2. Mixing sync and async code. Calling a blocking function (e.g., a heavy DB query) inside the WebSocket loop will stall the entire event loop. Offload such work to a thread pool with run_in_executor or use async‑compatible libraries (a small example follows these pitfalls).

    3. Memory leaks from stale connections. When a client crashes or loses network, the server may not receive a WebSocketDisconnect. Implement a heartbeat ping/pong mechanism and prune connections that miss several heartbeats; a sketch of this follows as well.
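
    For pitfall 2, the event loop can hand blocking work to a thread pool. A minimal sketch, where load_history_from_db stands in for whatever blocking call you actually have (it is not part of the code above):

    import asyncio

    def load_history_from_db(room: str) -> list[str]:
        # Placeholder for a blocking call, e.g. a synchronous ORM query.
        return [f"Welcome to {room}!"]

    @app.websocket("/ws/history/{room_name}")
    async def history_endpoint(websocket: WebSocket, room_name: str):
        await websocket.accept()
        loop = asyncio.get_running_loop()
        # Run the blocking query in the default thread pool so the event loop
        # stays free to service the other sockets.
        history = await loop.run_in_executor(None, load_history_from_db, room_name)
        for line in history:
            await websocket.send_text(line)

    For pitfall 3, an application-level heartbeat can be a background task that closes sockets that have been silent too long. This sketch assumes the chat endpoint records last_seen[websocket] = time.monotonic() after connecting and after every received message; the interval and timeout values are arbitrary:

    import asyncio
    import time

    HEARTBEAT_INTERVAL = 15  # seconds between sweeps (arbitrary)
    HEARTBEAT_TIMEOUT = 60   # seconds of silence before a socket is pruned (arbitrary)

    last_seen: dict[WebSocket, float] = {}

    async def prune_stale_connections():
        while True:
            await asyncio.sleep(HEARTBEAT_INTERVAL)
            now = time.monotonic()
            for ws in list(manager.active_connections):
                if now - last_seen.get(ws, now) > HEARTBEAT_TIMEOUT:
                    last_seen.pop(ws, None)
                    try:
                        # Closing here makes the endpoint's receive loop raise
                        # WebSocketDisconnect, which runs the normal cleanup path.
                        await ws.close()
                    except Exception:
                        pass  # socket already gone

    @app.on_event("startup")
    async def start_heartbeat():
        asyncio.create_task(prune_stale_connections())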

    Pro Tips for Scaling Beyond the Basics

    • Redis Pub/Sub as a message bus: Replace the in‑memory rooms dict with a Redis channel per room. Each FastAPI instance subscribes to its relevant channels, guaranteeing that messages cross all server nodes (sketched after these tips).
    • Authentication tokens in query string: Pass a JWT when opening the socket (ws://host/ws/chat?token=…) and validate it inside the endpoint before accepting.
    • Graceful shutdown: Register a shutdown event handler that closes all websockets cleanly, preventing “broken pipe” errors during container restarts.
    Pro tip: When deploying to Kubernetes, set terminationGracePeriodSeconds to at least 30 seconds and use a pre‑stop hook that sends a broadcast “Server is going down” message. Clients can then attempt reconnection automatically.
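
    To make the first tip concrete, here is a rough sketch of the Pub/Sub pattern using the redis-py asyncio client (pip install redis, version 4.2 or newer); it reuses the RoomManager from earlier and leaves out starting and stopping the relay tasks:

    import redis.asyncio as aioredis

    redis_client = aioredis.from_url("redis://localhost:6379")

    async def publish_to_room(room: str, message: str):
        # Every FastAPI instance publishes here instead of calling
        # room_manager.broadcast() directly.
        await redis_client.publish(f"chat:{room}", message)

    async def relay_room(room: str):
        # Each instance runs one of these tasks per room it has local members in
        # and fans incoming messages out to its own sockets.
        pubsub = redis_client.pubsub()
        await pubsub.subscribe(f"chat:{room}")
        async for msg in pubsub.listen():
            if msg["type"] == "message":
                data = msg["data"]
                text = data.decode() if isinstance(data, bytes) else data
                await room_manager.broadcast(room, text)

    And the graceful-shutdown tip can be a small shutdown handler that pairs with the Kubernetes pre-stop hook above:

    @app.on_event("shutdown")
    async def close_all_sockets():
        # Tell clients we are going away, then close each socket cleanly so the
        # reverse proxy does not log broken-pipe errors during restarts.
        for ws in list(manager.active_connections):
            try:
                await ws.send_text("Server is going down")
                await ws.close()
            except Exception:
                pass  # socket already gone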

    Putting It All Together: A Minimal Production‑Ready Example

    Below is a concise Dockerfile that packages everything we’ve built. It pulls a slim Python image, installs dependencies, copies the code, and runs Gunicorn with Uvicorn workers.

    # Dockerfile
    FROM python:3.12-slim
    
    WORKDIR /app
    
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt
    
    COPY . .
    
    EXPOSE 8000
    
    CMD ["gunicorn", "-k", "uvicorn.workers.UvicornWorker", "-w", "4", "-b", "0.0.0.0:8000", "main:app"]
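
    The Dockerfile expects a requirements.txt next to main.py. A minimal one matching what we installed earlier, plus gunicorn, might look like this (pin exact versions in a real project):

    # requirements.txt
    fastapi
    uvicorn[standard]
    gunicorn
    python-multipart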
    

    Build and run:

    docker build -t fastapi-chat .
    docker run -d -p 8000:8000 fastapi-chat
    

    Visit http://localhost:8000 from any browser, open multiple tabs, and watch the real‑time magic happen. You now have a production‑grade, containerized chat service ready for further feature work.

    Conclusion

    Real‑time communication is no longer a niche skill—it’s a core competency for modern web developers. By mastering FastAPI’s async capabilities, WebSocket fundamentals, and the patterns for rooms, scaling, and deployment, you’ve earned a powerful toolset that can be repurposed for dashboards, collaborative editors, gaming back‑ends, and more.

    Remember to keep your connection logic lightweight, offload heavy work to async‑compatible services, and always plan for horizontal scaling from day one. With the code snippets, pro tips, and deployment checklist in this guide, you’re well‑equipped to turn a simple chat demo into a robust, production‑ready real‑time platform.
