Tech Tutorial - February 24, 2026
Welcome to today’s deep‑dive tutorial! In the next few minutes, we’ll walk through the entire lifecycle of a real‑time chat application built with FastAPI, WebSockets, and a lightweight front‑end. By the end, you’ll have a production‑ready prototype you can extend, deploy, and showcase in your portfolio. Grab a cup of coffee, fire up your IDE, and let’s turn those ideas into live, interactive code.
Why Real‑Time Communication Matters
Instant messaging, live dashboards, and collaborative editors have become the backbone of modern web experiences. Users expect sub‑second feedback, and traditional HTTP request‑response cycles simply can’t keep up. WebSockets provide a persistent, bi‑directional channel over a single TCP connection, eliminating the overhead of repeated handshakes and allowing the server to push updates whenever new data arrives.
Beyond chat, WebSockets power multiplayer games, IoT telemetry, and real‑time analytics. Choosing FastAPI as the backend gives you async‑first performance, automatic validation, and seamless integration with modern Python tooling—all while keeping the codebase clean and readable.
Key Benefits of FastAPI + WebSockets
- Asynchronous I/O: Handle thousands of concurrent connections without blocking.
- Built‑in validation: Pydantic models keep payloads safe and well‑typed.
- OpenAPI docs: Auto‑generated docs for your HTTP routes, while WebSocket routes stay documented in code comments.
- Extensibility: Easy to plug in authentication, Redis pub/sub, or background tasks.
Project Structure & Prerequisites
Before diving into code, let’s outline the directory layout. Keeping a tidy structure helps you scale the project later.
chat_app/
│
├─ app/
│ ├─ __init__.py
│ ├─ main.py # FastAPI entry point
│ ├─ router.py # WebSocket routes
│ └─ auth.py # JWT utilities (optional)
│
├─ static/
│ └─ index.html # Simple front‑end client
│
├─ requirements.txt
└─ README.md
Make sure you have Python 3.11+ installed. Install dependencies with:
pip install "fastapi[all]" uvicorn python-multipart "python-jose[cryptography]" redis
Understanding the Event Loop
FastAPI runs on top of Starlette, which leverages the asyncio event loop. When a WebSocket connection is accepted, the server creates a coroutine that runs for the lifetime of that connection. Each message you receive or send is an awaitable operation, meaning the loop can serve other clients while waiting for I/O. This model is the secret sauce behind the scalability of real‑time apps.
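To see this cooperative scheduling in action, here is a minimal, self-contained asyncio sketch (no FastAPI required): two simulated "connections" each wait on I/O, yet the total wall-clock time is roughly one wait, not two, because the loop serves one while the other sleeps.

```python
import asyncio
import time

async def handle_client(name: str, delay: float) -> str:
    # Simulates waiting on a WebSocket receive; the event loop is free meanwhile.
    await asyncio.sleep(delay)
    return f"{name} done"

async def main() -> list:
    start = time.perf_counter()
    # Both "connections" are served concurrently on a single thread.
    results = await asyncio.gather(
        handle_client("client-1", 0.1),
        handle_client("client-2", 0.1),
    )
    elapsed = time.perf_counter() - start
    # ~0.1s total rather than 0.2s: the two waits overlap.
    assert elapsed < 0.19
    return list(results)

print(asyncio.run(main()))
```

The same principle is why one worker process can juggle thousands of idle chat connections: each `await websocket.receive_text()` parks the coroutine until data arrives.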
Implementing the WebSocket Server
Let’s start with the core server logic. We’ll maintain an in‑memory set of connected WebSocket objects and broadcast incoming messages to all participants. In a production setting you’d replace the set with a Redis channel, but for this tutorial the simple approach keeps things clear.
# app/main.py
from fastapi import FastAPI, WebSocket, WebSocketDisconnect
from typing import List

app = FastAPI()

class ConnectionManager:
    def __init__(self):
        self.active_connections: List[WebSocket] = []

    async def connect(self, websocket: WebSocket):
        await websocket.accept()
        self.active_connections.append(websocket)

    def disconnect(self, websocket: WebSocket):
        self.active_connections.remove(websocket)

    async def broadcast(self, message: str):
        for connection in self.active_connections:
            await connection.send_text(message)

manager = ConnectionManager()

@app.websocket("/ws/chat")
async def chat_endpoint(websocket: WebSocket):
    await manager.connect(websocket)
    try:
        while True:
            data = await websocket.receive_text()
            await manager.broadcast(f"User says: {data}")
    except WebSocketDisconnect:
        manager.disconnect(websocket)
        await manager.broadcast("A user has left the chat.")
This snippet does three things: accepts a new connection, continuously reads incoming text, and broadcasts it to every connected client. The WebSocketDisconnect exception gracefully handles client disconnects, ensuring our connection list stays accurate.
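One caveat: the broadcast loop above raises if any single socket has died mid-send, which would interrupt delivery to the remaining clients. A hedged sketch of a more tolerant variant follows, using plain async callables as stand-ins for WebSocket `send_text` so the pattern can be shown without a running server (the `SafeBroadcaster` name and stand-in functions are illustrative, not part of the tutorial's API):

```python
import asyncio
from typing import Awaitable, Callable, List

Send = Callable[[str], Awaitable[None]]

class SafeBroadcaster:
    """Like ConnectionManager.broadcast, but drops connections whose send fails."""

    def __init__(self) -> None:
        self.sends: List[Send] = []

    async def broadcast(self, message: str) -> int:
        alive: List[Send] = []
        for send in self.sends:
            try:
                await send(message)
                alive.append(send)
            except Exception:
                pass  # Dead socket: forget it instead of aborting the loop
        self.sends = alive
        return len(alive)

async def demo() -> int:
    received: List[str] = []

    async def ok(msg: str) -> None:
        received.append(msg)

    async def broken(msg: str) -> None:
        raise ConnectionError("client went away")

    b = SafeBroadcaster()
    b.sends = [ok, broken]
    remaining = await b.broadcast("hello")
    assert received == ["hello"]  # healthy client still got the message
    return remaining

print(asyncio.run(demo()))
```

In the real manager you would apply the same try/except around `connection.send_text(message)`.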
Adding Simple Message History
Users love seeing recent messages when they join. Let’s extend ConnectionManager with a short in‑memory buffer.
class ConnectionManager:
    def __init__(self, history_limit: int = 20):
        self.active_connections: List[WebSocket] = []
        self.message_history: List[str] = []
        self.history_limit = history_limit

    async def connect(self, websocket: WebSocket):
        await websocket.accept()
        # Send recent history to the new client
        for msg in self.message_history:
            await websocket.send_text(msg)
        self.active_connections.append(websocket)

    # disconnect() is unchanged from the previous version

    async def broadcast(self, message: str):
        self.message_history.append(message)
        # Trim history to the configured limit
        if len(self.message_history) > self.history_limit:
            self.message_history.pop(0)
        for connection in self.active_connections:
            await connection.send_text(message)
Now every newcomer receives the last 20 messages instantly, creating a smoother chat experience.
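As a small aside, Python's standard library can express this bounded buffer more directly: a `collections.deque` with `maxlen` evicts the oldest entry automatically, replacing the manual `pop(0)` trimming above. A minimal stdlib sketch:

```python
from collections import deque

# A bounded history buffer: maxlen makes the deque drop the oldest
# message automatically whenever a new one pushes it past the limit.
history = deque(maxlen=3)

for i in range(5):
    history.append(f"message {i}")

# Only the 3 most recent messages survive.
print(list(history))
```

Swapping the list for `deque(maxlen=self.history_limit)` would let `broadcast` simply append and forget about trimming.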
Pro tip: When scaling beyond a single process, replace the in‑memory message_history with a distributed cache like Redis. That way every instance shares the same chat log.
Creating the Front‑End Client
Our client will be a single HTML file that opens a WebSocket connection, displays incoming messages, and sends user input. Keep the UI minimal to focus on the WebSocket mechanics.
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>FastAPI Chat</title>
  <style>
    body {font-family: Arial, sans-serif; margin: 2rem;}
    #messages {border: 1px solid #ccc; height: 300px; overflow-y: auto; padding: .5rem;}
    #input {width: 80%; padding: .5rem;}
    #send {padding: .5rem 1rem;}
  </style>
</head>
<body>
  <h2>FastAPI Real‑Time Chat</h2>
  <div id="messages"></div>
  <input id="input" type="text" placeholder="Type a message..." />
  <button id="send">Send</button>
  <script>
    const ws = new WebSocket(`ws://${location.host}/ws/chat`);
    const messagesDiv = document.getElementById('messages');
    const input = document.getElementById('input');
    const sendBtn = document.getElementById('send');

    ws.onmessage = (event) => {
      const msg = document.createElement('div');
      msg.textContent = event.data;
      messagesDiv.appendChild(msg);
      messagesDiv.scrollTop = messagesDiv.scrollHeight;
    };

    sendBtn.onclick = () => {
      if (input.value) {
        ws.send(input.value);
        input.value = '';
      }
    };

    // Optional: send on Enter key
    input.addEventListener('keypress', (e) => {
      if (e.key === 'Enter') sendBtn.click();
    });
  </script>
</body>
</html>
Save this file as static/index.html and serve it via FastAPI’s static files middleware. The JavaScript establishes a WebSocket connection, appends incoming messages to the DOM, and sends user input on button click or Enter key.
Serving Static Files with FastAPI
# app/main.py (add to existing file)
from fastapi.responses import RedirectResponse
from fastapi.staticfiles import StaticFiles

app.mount("/static", StaticFiles(directory="static"), name="static")

# Optional: redirect root to the chat UI
@app.get("/")
async def get_root():
    return RedirectResponse(url="/static/index.html")
Now running uvicorn app.main:app --reload launches both the API and the UI on http://localhost:8000. Open two browser tabs, type a message in one, and watch it appear instantly in the other—real‑time magic in action.
Securing the Chat with JWT Authentication
Public chat rooms are fun, but many applications need user identity. Let’s add a lightweight JWT flow that issues a token on login and validates it before allowing a WebSocket upgrade.
# app/auth.py
from datetime import datetime, timedelta, timezone
from typing import Optional

from jose import JWTError, jwt

SECRET_KEY = "super-secret-key-change-me"  # load from the environment in production
ALGORITHM = "HS256"
ACCESS_TOKEN_EXPIRE_MINUTES = 30

def create_access_token(data: dict, expires_delta: Optional[timedelta] = None):
    to_encode = data.copy()
    expire = datetime.now(timezone.utc) + (expires_delta or timedelta(minutes=ACCESS_TOKEN_EXPIRE_MINUTES))
    to_encode.update({"exp": expire})
    return jwt.encode(to_encode, SECRET_KEY, algorithm=ALGORITHM)

def verify_token(token: str):
    try:
        payload = jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM])
        return payload  # contains "sub" (username) and "exp"
    except JWTError:
        return None
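To demystify what python-jose does under the hood, here is a stdlib-only sketch of HS256 signing and verification as specified by RFC 7519. This is purely illustrative (function names like sign_hs256 are made up for this sketch); use a vetted library in real code.

```python
import base64
import hashlib
import hmac
import json
import time

def b64url(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_hs256(payload: dict, secret: str) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    sig = b64url(hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_hs256(token: str, secret: str):
    header, body, sig = token.split(".")
    expected = b64url(hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None  # tampered payload or wrong key
    payload = json.loads(base64.urlsafe_b64decode(body + "=" * (-len(body) % 4)))
    if payload.get("exp", float("inf")) < time.time():
        return None  # expired
    return payload

token = sign_hs256({"sub": "alice", "exp": time.time() + 60}, "super-secret-key-change-me")
print(verify_hs256(token, "super-secret-key-change-me"))
```

The key takeaway: a JWT is just two base64url-encoded JSON blobs plus an HMAC; anyone can read the payload, but only the holder of SECRET_KEY can forge the signature.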
Next, expose a simple login endpoint that returns a token. In a real app you’d check a database; here we accept any username/password for brevity.
# app/main.py (add below imports)
from fastapi import Depends, HTTPException, status
from fastapi.security import OAuth2PasswordBearer, OAuth2PasswordRequestForm
from .auth import create_access_token, verify_token

# Handy if you later protect regular HTTP routes with the same tokens
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")

@app.post("/token")
async def login(form_data: OAuth2PasswordRequestForm = Depends()):
    # In production, validate against a user store
    if not form_data.username or not form_data.password:
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="Invalid credentials")
    access_token = create_access_token({"sub": form_data.username})
    return {"access_token": access_token, "token_type": "bearer"}
Now protect the WebSocket route. Browsers can't attach an Authorization header to the WebSocket handshake, so the simplest approach is to pass the token as a query parameter and verify it before accepting the connection.
# app/main.py (replace the previous @app.websocket endpoint)
from fastapi import Query

@app.websocket("/ws/chat")
async def chat_endpoint(websocket: WebSocket, token: str = Query(...)):
    payload = verify_token(token)
    if not payload:
        await websocket.close(code=1008)  # Policy Violation
        return
    username = payload.get("sub", "Anonymous")
    await manager.connect(websocket)
    await manager.broadcast(f"🔔 {username} joined the chat.")
    try:
        while True:
            data = await websocket.receive_text()
            await manager.broadcast(f"{username}: {data}")
    except WebSocketDisconnect:
        manager.disconnect(websocket)
        await manager.broadcast(f"🔔 {username} left the chat.")
Update the front‑end to include the token in the connection URL. After a successful login request, store the token in localStorage and append it as a query parameter.
<script>
  async function login() {
    const resp = await fetch('/token', {
      method: 'POST',
      headers: {'Content-Type': 'application/x-www-form-urlencoded'},
      body: new URLSearchParams({username: 'alice', password: 'wonderland'})
    });
    const data = await resp.json();
    localStorage.setItem('token', data.access_token);
    initWebSocket();
  }

  function initWebSocket() {
    const token = localStorage.getItem('token');
    const ws = new WebSocket(`ws://${location.host}/ws/chat?token=${token}`);
    // ... same message handling as before ...
  }

  // Auto-login on page load for demo purposes
  login();
</script>
Pro tip: Use Secure and HttpOnly cookies for JWTs in production. Storing tokens in localStorage is convenient for demos but vulnerable to XSS attacks.
Scaling Beyond a Single Process
Our in‑memory manager works fine for a single‑process dev server, but production environments often run multiple workers behind a load balancer. To keep all instances in sync, we need a shared pub/sub layer. Redis is the go‑to choice because it offers low‑latency message broadcasting and persistence options.
Replace the broadcast method with a Redis publisher, and have each worker subscribe to the same channel. When a message arrives, the subscriber forwards it to its local WebSocket connections.
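Before wiring in Redis itself, the fan-out pattern is worth seeing in isolation. Here is a hedged in-process sketch using asyncio.Queue, where each queue stands in for one worker's subscription to the shared channel (the LocalPubSub name is invented for this sketch; Redis plays this role in the real deployment):

```python
import asyncio
from typing import List

class LocalPubSub:
    """In-process stand-in for Redis pub/sub: publish fans out to every subscriber."""

    def __init__(self) -> None:
        self.subscribers: List[asyncio.Queue] = []

    def subscribe(self) -> asyncio.Queue:
        q: asyncio.Queue = asyncio.Queue()
        self.subscribers.append(q)
        return q

    async def publish(self, message: str) -> None:
        # Every subscriber gets its own copy of the message.
        for q in self.subscribers:
            await q.put(message)

async def demo() -> List[str]:
    bus = LocalPubSub()
    # Two "workers", each with its own subscription (like two uvicorn processes).
    worker_a, worker_b = bus.subscribe(), bus.subscribe()
    await bus.publish("hi everyone")
    # Each worker would forward its copy to its local WebSocket connections.
    return [await worker_a.get(), await worker_b.get()]

print(asyncio.run(demo()))
```

Redis does exactly this across process and machine boundaries: every worker subscribed to the channel receives each published message and relays it to the WebSockets it owns.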
# app/redis_manager.py
import asyncio
from typing import List

from fastapi import WebSocket
from redis import asyncio as aioredis  # redis-py's asyncio API (successor to the aioredis package)

class RedisConnectionManager:
    CHANNEL = "chat"

    def __init__(self):
        self.active_connections: List[WebSocket] = []
        self.redis = None

    async def startup(self):
        self.redis = aioredis.from_url("redis://localhost")
        asyncio.create_task(self.listener())

    async def listener(self):
        pubsub = self.redis.pubsub()
        await pubsub.subscribe(self.CHANNEL)
        async for message in pubsub.listen():
            if message['type'] == 'message':
                data = message['data'].decode()
                for ws in self.active_connections:
                    await ws.send_text(data)

    async def broadcast(self, message: str):
        await self.redis.publish(self.CHANNEL, message)

    async def connect(self, websocket: WebSocket):
        await websocket.accept()
        self.active_connections.append(websocket)

    def disconnect(self, websocket: WebSocket):
        self.active_connections.remove(websocket)
Initialize the manager on startup and shut it down gracefully.
# app/main.py (add lifecycle events)
from .redis_manager import RedisConnectionManager

redis_manager = RedisConnectionManager()

@app.on_event("startup")
async def on_startup():
    await redis_manager.startup()

@app.on_event("shutdown")
async def on_shutdown():
    await redis_manager.redis.close()
Finally, replace references to manager with redis_manager in the WebSocket endpoint. With Redis in place, you can horizontally scale the FastAPI workers, and every chat participant sees the same stream of messages.
Deploying with Docker Compose
Containerization ensures consistent environments across dev, staging, and production. Below is a minimal docker-compose.yml that spins up the FastAPI app, a Redis instance, and an optional Nginx reverse proxy.
version: "3.9"

services:
  web:
    build: .
    command: uvicorn app.main:app --host 0.0.0.0 --port 8000
    ports:
      - "8000:8000"
    depends_on:
      - redis

  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"

  # Optional Nginx for TLS termination
  nginx:
    image: nginx:stable-alpine
    ports:
      - "80:80"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf