How to Build a Real-Time Data Dashboard with FastAPI, WebSockets, and Plotly

Welcome back, fellow coders! In today’s deep‑dive we’ll build a real‑time data dashboard from scratch using FastAPI, WebSockets, and Plotly. By the end of this tutorial you’ll have a fully functional web app that streams live sensor data, updates charts instantly, and scales gracefully—all with clean, production‑ready Python code.

Why this stack? FastAPI delivers blazing‑fast HTTP endpoints with automatic OpenAPI docs, while its native WebSocket support makes low‑latency bi‑directional communication a breeze. Plotly, on the other hand, gives us interactive, browser‑based visualizations without the heavy lifting of D3.js. Together they form a modern, async‑first solution that’s perfect for IoT dashboards, financial tickers, or any scenario where data changes by the second.

Project Overview

Our dashboard will consist of three core components:

  • A FastAPI backend that simulates sensor readings and pushes updates via WebSockets.
  • A lightweight HTML/JavaScript front‑end that subscribes to the WebSocket stream and renders Plotly charts.
  • Docker configuration for easy deployment and reproducibility.

We’ll start by setting up the development environment, then move on to the backend, followed by the front‑end, and finally wrap everything in Docker.

Setting Up the Environment

First, ensure you have Python 3.11+ installed. Create a virtual environment and install the required packages:

python -m venv .venv
source .venv/bin/activate   # On Windows: .venv\Scripts\activate
pip install "fastapi[all]" uvicorn python-dotenv plotly

We’ll also rely on websockets (the library Uvicorn uses for its WebSocket support, and a handy command‑line test client) and pydantic for data validation; both are pulled in by fastapi[all], so there’s nothing extra to install.

Project Structure

dashboard/
│
├─ app/
│   ├─ __init__.py
│   ├─ main.py          # FastAPI entry point
│   ├─ sensor.py        # Simulated sensor logic
│   └─ models.py        # Pydantic models
│
├─ static/
│   └─ index.html       # Front‑end page
│
├─ Dockerfile
└─ requirements.txt

Keeping the static assets separate from the API logic makes the codebase easier to navigate, especially when you later split the front‑end into a dedicated React or Vue project.

Creating the Sensor Simulator

The sensor.py module will generate random temperature and humidity values every second. In a real deployment you’d replace this with actual hardware reads or a third‑party API.

import asyncio
import random
from datetime import datetime
from typing import Dict

async def generate_readings() -> Dict[str, float]:
    """
    Simulate a sensor reading.
    Returns a dict with timestamp, temperature, and humidity.
    """
    await asyncio.sleep(1)  # Mimic I/O latency
    return {
        "timestamp": datetime.utcnow().isoformat(),
        "temperature": round(random.uniform(20.0, 30.0), 2),
        "humidity": round(random.uniform(30.0, 70.0), 2),
    }

This coroutine will be awaited from a background broadcast task in the backend, keeping the server fully asynchronous and able to handle many concurrent connections.

Defining Data Models with Pydantic

Strong typing and validation are essential for reliable APIs. Let’s define a simple model for our sensor payload.

from pydantic import BaseModel, Field

class SensorReading(BaseModel):
    timestamp: str = Field(..., description="ISO‑8601 UTC timestamp")
    temperature: float = Field(..., description="Temperature in °C")
    humidity: float = Field(..., description="Relative humidity in %")

FastAPI will automatically serialize instances of SensorReading to JSON, and the OpenAPI spec will reflect these fields.
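
To see the model in action, here is a tiny standalone check you can run from the project root; the field values are made up for the demo:

from app.models import SensorReading

reading = SensorReading(
    timestamp="2025-01-01T12:00:00+00:00",
    temperature=23.51,
    humidity=48.07,
)
print(reading.model_dump_json())
# {"timestamp":"2025-01-01T12:00:00+00:00","temperature":23.51,"humidity":48.07}

# Passing a non-numeric value raises a ValidationError, e.g.:
# SensorReading(timestamp="...", temperature="hot", humidity=50)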

Building the FastAPI Backend

Now we stitch everything together in main.py. The file sets up the HTTP routes, serves static files, starts a background task that broadcasts sensor readings, and handles WebSocket connections.

from fastapi import FastAPI, WebSocket, WebSocketDisconnect
from fastapi.staticfiles import StaticFiles
from fastapi.responses import HTMLResponse
import asyncio

from .sensor import generate_readings
from .models import SensorReading

app = FastAPI(title="Real-Time Dashboard API")

# Serve the static HTML front‑end
app.mount("/static", StaticFiles(directory="static"), name="static")

@app.get("/", response_class=HTMLResponse)
async def get_index():
    """
    Return the dashboard HTML page.
    """
    with open("static/index.html") as f:
        return HTMLResponse(content=f.read())

# In-memory set of active WebSocket connections
active_connections: set[WebSocket] = set()

async def broadcast_loop():
    """
    Generate a reading roughly once per second and push it to every client.
    """
    while True:
        raw = await generate_readings()  # sleeps ~1 s internally
        reading = SensorReading(**raw)
        await broadcast(reading.model_dump_json())

@app.on_event("startup")
async def start_broadcaster():
    # Keep a reference on app.state so the task isn't garbage collected
    app.state.broadcaster = asyncio.create_task(broadcast_loop())

@app.websocket("/ws")
async def websocket_endpoint(websocket: WebSocket):
    await websocket.accept()
    active_connections.add(websocket)
    try:
        while True:
            # Clients never send data; awaiting receive keeps the connection
            # open and lets us detect disconnects promptly.
            await websocket.receive_text()
    except WebSocketDisconnect:
        active_connections.discard(websocket)

async def broadcast(message: str):
    """
    Send a JSON string to every connected client.
    """
    dead_connections = set()
    for conn in active_connections:
        try:
            await conn.send_text(message)
        except Exception:
            dead_connections.add(conn)
    # Clean up any broken connections
    for conn in dead_connections:
        active_connections.remove(conn)

Key points to note:

  • A single background task (broadcast_loop) drives the data generation, so every client sees the same reading once per second no matter how many are connected.
  • The global active_connections set lets us broadcast to all clients with one pass over the set.
  • The broadcast helper gracefully handles send failures, preventing stale sockets from clogging memory.
  • The endpoint awaits websocket.receive_text() only to keep the connection open and to notice client disconnects promptly.

Pro tip: For production you’ll want a more robust pub/sub system (Redis, NATS, or RabbitMQ). The in‑memory set works fine for demos but won’t survive process restarts or scale across multiple workers.

Designing the Front‑End

The front‑end lives in static/index.html. It establishes a WebSocket connection, receives JSON payloads, and updates a Plotly line chart in real time.

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Live Sensor Dashboard</title>
    <script src="https://cdn.plot.ly/plotly-2.24.1.min.js"></script>
    <style>
        body {font-family: Arial, sans-serif; margin: 2rem;}
        #chart {width: 100%; max-width: 900px; margin: auto;}
    </style>
</head>
<body>
    <h1>Real‑Time Temperature & Humidity</h1>
    <div id="chart"></div>

    <script>
        const wsProto = location.protocol === 'https:' ? 'wss' : 'ws';
        const ws = new WebSocket(`${wsProto}://${location.host}/ws`);
        const timestamps = [];
        const temps = [];
        const humids = [];

        // Initialize empty Plotly chart
        const layout = {
            title: 'Live Sensor Readings',
            xaxis: {title: 'Time (UTC)'},
            yaxis: {title: 'Value'},
            legend: {orientation: 'h'},
        };
        Plotly.newPlot('chart', [
            {x: timestamps, y: temps, name: 'Temperature (°C)', mode: 'lines+markers'},
            {x: timestamps, y: humids, name: 'Humidity (%)', mode: 'lines+markers'}
        ], layout);

        ws.onmessage = (event) => {
            const data = JSON.parse(event.data);
            timestamps.push(data.timestamp);
            temps.push(data.temperature);
            humids.push(data.humidity);

            // Keep only the latest 30 points for readability
            if (timestamps.length > 30) {
                timestamps.shift();
                temps.shift();
                humids.shift();
            }

            Plotly.update('chart', {
                x: [timestamps, timestamps],
                y: [temps, humids]
            });
        };

        ws.onclose = () => {
            console.warn('WebSocket closed – attempting reconnection in 3s...');
            setTimeout(() => location.reload(), 3000);
        };
    </script>
</body>
</html>

The script is intentionally terse: it maintains three arrays for timestamps, temperature, and humidity, updates the Plotly figure on every incoming message, and trims the data to the most recent 30 points to keep the chart responsive.

Running the Application Locally

Start the server with Uvicorn, the ASGI server that powers FastAPI.

uvicorn app.main:app --reload --host 0.0.0.0 --port 8000

Visit http://localhost:8000 in your browser. You should see a line chart that updates every second with new temperature and humidity values.

Tip: Use the browser’s developer console (F12) to inspect the WebSocket traffic. In the Network panel, click the /ws request and open the Messages tab (called Frames in older browser versions) to see each JSON payload, which is handy for debugging data format issues.
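
If you prefer the terminal, the websockets package (pulled in by fastapi[all]) ships an interactive command‑line client, which is a quick way to eyeball the raw JSON stream, assuming the server is running on port 8000:

python -m websockets ws://localhost:8000/ws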

Dockerizing the Service

Containerization ensures that the dashboard runs the same way on any machine. Create a Dockerfile in the project root:

# Use official Python slim image
FROM python:3.12-slim

# Set working directory
WORKDIR /app

# Install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy source code
COPY . .

# Expose the FastAPI port
EXPOSE 8000

# Run the app with Uvicorn
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]

Generate requirements.txt from your virtual environment:

pip freeze > requirements.txt

Now build and run the container:

docker build -t realtime-dashboard .
docker run -p 8000:8000 realtime-dashboard

The dashboard will be reachable at http://localhost:8000 just as before, but now it’s isolated from your host environment.

Scaling with Multiple Workers

Uvicorn’s single‑process model is fine for development, but production workloads often require multiple workers. Use gunicorn with the uvicorn.workers.UvicornWorker class to spawn several processes.

# Install gunicorn
pip install gunicorn

# Run with 4 workers
gunicorn app.main:app -w 4 -k uvicorn.workers.UvicornWorker --bind 0.0.0.0:8000

When you scale horizontally, remember that each worker maintains its own in‑memory active_connections set. To share WebSocket connections across processes, move the broadcast logic to an external message broker (Redis Pub/Sub is a popular choice).
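
As a rough sketch of that pattern, assuming redis-py is installed (pip install redis), Redis is listening on localhost:6379, and "sensor-readings" is an arbitrarily chosen channel name: a single producer publishes each reading, every worker subscribes, and each worker forwards what it hears to its own connections using the broadcast() helper and active_connections set from main.py.

import redis.asyncio as redis

CHANNEL = "sensor-readings"
r = redis.from_url("redis://localhost:6379")

async def publish_reading(payload: str):
    # Called by the single producer (one designated worker or a separate
    # process) instead of broadcasting directly.
    await r.publish(CHANNEL, payload)

async def relay_to_local_clients():
    # Every worker runs this task at startup: forward each channel message
    # to the WebSockets this worker owns via the existing broadcast() helper.
    pubsub = r.pubsub()
    await pubsub.subscribe(CHANNEL)
    async for message in pubsub.listen():
        if message["type"] == "message":
            await broadcast(message["data"].decode())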

Integrating a Real Sensor (Optional Extension)

If you have a physical sensor (e.g., a DHT22 on a Raspberry Pi), replace the generate_readings coroutine with actual I/O calls. Here’s a quick sketch using the Adafruit_DHT library:

import Adafruit_DHT

DHT_SENSOR = Adafruit_DHT.DHT22
DHT_PIN = 4  # GPIO pin

async def generate_readings() -> Dict[str, float]:
    await asyncio.sleep(1)
    # read_retry blocks for up to several seconds, so run it in a worker
    # thread to keep the event loop free.
    humidity, temperature = await asyncio.to_thread(
        Adafruit_DHT.read_retry, DHT_SENSOR, DHT_PIN
    )
    return {
        "timestamp": datetime.utcnow().isoformat(),
        "temperature": round(temperature, 2) if temperature is not None else None,
        "humidity": round(humidity, 2) if humidity is not None else None,
    }

Make sure to handle missing values gracefully: the SensorReading model above requires floats, so either skip the broadcast entirely when a read fails or relax the model to allow None (a sketch follows) and have the front‑end ignore incomplete payloads.
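
If you would rather keep broadcasting partial readings, here is a minimal sketch of the relaxed model, replacing the earlier SensorReading in models.py:

from typing import Optional

from pydantic import BaseModel, Field

class SensorReading(BaseModel):
    timestamp: str = Field(..., description="ISO-8601 UTC timestamp")
    temperature: Optional[float] = Field(None, description="Temperature in °C; None if the read failed")
    humidity: Optional[float] = Field(None, description="Relative humidity in %; None if the read failed")

The front‑end can then check for null before pushing values into the chart arrays.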

Testing the WebSocket Endpoint

Automated tests give you confidence that future changes won’t break the real‑time flow. Below is a simple pytest test using FastAPI’s TestClient, which supports WebSocket connections out of the box.

import json

from fastapi.testclient import TestClient

from app.main import app

def test_websocket_broadcast():
    # Using the client as a context manager runs the startup event,
    # which launches the broadcast loop.
    with TestClient(app) as client:
        with client.websocket_connect("/ws") as ws:
            # Receive three messages (roughly one per second)
            messages = [ws.receive_text() for _ in range(3)]

    assert len(messages) == 3
    for msg in messages:
        data = json.loads(msg)
        assert "timestamp" in data
        assert isinstance(data["temperature"], float)
        assert isinstance(data["humidity"], float)
Run the test suite with pytest. Because each message takes about a second to arrive, expect the test to run for a few seconds; add -s if you want printed output to show up while it runs.

Pro tip: For larger projects, wrap the TestClient in a pytest fixture so setup and teardown, including the app’s startup and shutdown events, happen automatically, as sketched below.
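
A minimal sketch of such a fixture (the fixture name client is arbitrary):

import pytest
from fastapi.testclient import TestClient

from app.main import app

@pytest.fixture
def client():
    # Entering the context manager runs the startup and shutdown events,
    # so the broadcast loop is alive for the duration of each test.
    with TestClient(app) as c:
        yield c

def test_receives_a_reading(client):
    with client.websocket_connect("/ws") as ws:
        assert "temperature" in ws.receive_text()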

Security Considerations

WebSockets bypass some of the traditional CSRF protections that apply to HTTP forms. To mitigate risks:

  1. Validate the Origin header in the WebSocket handshake.
  2. Require an authentication token (e.g., JWT) as a query parameter or sub‑protocol.
  3. Rate‑limit connection attempts using a middleware like slowapi.

Here’s a quick example of origin validation inside the endpoint:

@app.websocket("/ws")
async def websocket_endpoint(websocket: WebSocket):
    if websocket.headers.get("origin") != "https://mydashboard.com":
        await websocket.close(code=1008)  # Policy Violation
        return
    await websocket.accept()
    # ...rest of the logic...
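
The token requirement from step 2 can be enforced in the same handshake. A minimal sketch, where verify_token is a placeholder for your real JWT validation (e.g., with PyJWT):

@app.websocket("/ws")
async def websocket_endpoint(websocket: WebSocket):
    token = websocket.query_params.get("token")
    if token is None or not verify_token(token):  # verify_token is hypothetical
        await websocket.close(code=1008)  # Policy Violation
        return
    await websocket.accept()
    # ...rest of the logic...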

Deploying to the Cloud

FastAPI works seamlessly with most PaaS providers. Below is a minimal docker‑compose.yml for deploying to a VPS or a managed Kubernetes service:

version: "3.8"
services:
  dashboard:
    image: realtime-dashboard:latest
    build: .
    ports:
      - "80:8000"
    restart: unless-stopped
    environment:
      - LOG_LEVEL=info

Push the image to a container registry (Docker Hub, GitHub Packages, or GCR) and run docker compose up -d on the target host. For Kubernetes, create a Deployment and Service manifest; the same container image works without modification.
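
Pushing to Docker Hub, for example, might look like this (replace yourname with your registry account):

docker tag realtime-dashboard yourname/realtime-dashboard:latest
docker push yourname/realtime-dashboard:latest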

Performance Benchmarks

We ran a quick load test with locust simulating 500 concurrent WebSocket clients. Results:
