Nginx Unit: Dynamic Application Server Guide
March 15, 2026, 5:30 a.m.

Nginx Unit is a modern, lightweight application server that lets you run code written in multiple languages without restarting the service. It blends the classic reverse‑proxy strengths of Nginx with a dynamic configuration API, making deployments faster and more flexible. In this guide we’ll walk through installing Unit, wiring up a Python Flask app, scaling to multiple runtimes, and leveraging its live‑reconfiguration capabilities for real‑world workloads.

What Is Nginx Unit?

Unlike traditional web servers that serve static files and proxy to separate app servers, Unit is an application server built from the ground up to host dynamic code directly. It supports Python, PHP, Ruby, Go, JavaScript (Node.js), and even Java, all through a unified JSON‑based control API.

Because the configuration lives in memory, you can add, remove, or modify routes on the fly without any downtime. This makes Unit especially attractive for micro‑service architectures, CI/CD pipelines, and environments where rapid iteration is a must.

Core Concepts

  • Listeners: Network sockets (HTTP, HTTPS, or raw TCP) that accept incoming connections.
  • Applications: Language‑specific runtimes (e.g., python, php) with their own process pools.
  • Routes: Mapping rules that direct requests from listeners to the appropriate application based on URL, host, or other criteria.
  • Control API: A REST‑like interface served on a local control socket (on the Linux packages, the Unix socket /var/run/control.unit.sock) used to push JSON configuration.
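These four concepts map directly onto the top‑level keys of the configuration object, with each layer naming the next via "pass". A minimal sketch in Python of the JSON document you would push to the control API (the names demo-app, /srv/demo, and wsgi are placeholders, not anything Unit requires):

```python
import json

# Listener -> routes -> application: each layer points at the next via "pass".
config = {
    "listeners": {"*:80": {"pass": "routes"}},
    "routes": [
        {"match": {"uri": ["/demo*"]},
         "action": {"pass": "applications/demo-app"}}
    ],
    "applications": {
        "demo-app": {"type": "python", "path": "/srv/demo", "module": "wsgi"}
    },
}

# This structure, serialized, is exactly what the control API accepts.
print(json.dumps(config, indent=4))
```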

Installing Nginx Unit

The easiest way to get Unit is via the official packages. Below is a quick install for Ubuntu 22.04. Adjust the OS name for other distributions.

# Add the official repository
curl -s https://packages.nginx.org/unit/ubuntu/gpg.key | sudo apt-key add -
sudo add-apt-repository "deb https://packages.nginx.org/unit/ubuntu/ $(lsb_release -cs) unit"

# Install the core and language modules you need
sudo apt-get update
sudo apt-get install unit unit-python3 unit-php unit-go

After installation, the unitd daemon starts automatically and accepts configuration commands on its control socket, which for the Linux packages is the Unix socket /var/run/control.unit.sock. Verify it’s running:

curl -s --unix-socket /var/run/control.unit.sock http://localhost/status | jq .

If you see a JSON payload of live metrics, you’re ready to go.

Your First Python Application

Let’s spin up a tiny Flask app and expose it through Unit. This example demonstrates the minimal JSON needed to get a working endpoint.

Flask App (app.py)

from flask import Flask, jsonify

app = Flask(__name__)

@app.route('/hello')
def hello():
    return jsonify(message='Hello from Nginx Unit!')

if __name__ == '__main__':
    # Unit never calls this block; it's only for local debugging
    app.run(host='0.0.0.0', port=5000)

Now create a simple Unit configuration that tells Unit to launch a Python 3 runtime, point it at the directory containing app.py, and expose it on port 80. One detail matters: Unit looks for a WSGI callable named application by default, and our Flask object is named app, so the config sets callable explicitly.

Unit Configuration (unit-config.json)

{
    "listeners": {
        "*:80": {
            "pass": "routes"
        }
    },
    "routes": [
        {
            "match": { "uri": ["/hello*"] },
            "action": { "pass": "applications/python-app" }
        }
    ],
    "applications": {
        "python-app": {
            "type": "python",
            "path": "/var/www/unit-demo",
            "module": "app",
            "callable": "app"
        }
    }
}

Upload the configuration with a single curl command:

curl -X PUT -d @unit-config.json \
     --unix-socket /var/run/control.unit.sock \
     http://localhost/config

Visit http://your-server/hello and you should see the JSON response from Flask. No separate WSGI server, no gunicorn, just Unit handling the request end‑to‑end.

Running Multiple Languages Side‑by‑Side

One of Unit’s strongest selling points is the ability to host different runtimes under the same domain. Imagine a legacy PHP shop that’s being modernized with a new Go micro‑service. Unit can route traffic to both without a reverse‑proxy chain.

Directory Layout

  • /var/www/php-legacy/ – contains index.php
  • /var/www/go-service/ – contains the compiled service binary
  • /var/www/unit-demo/ – holds the Flask app from the previous section

Extended Configuration

{
    "listeners": {
        "*:80": {
            "pass": "routes"
        }
    },
    "routes": [
        {
            "match": { "uri": ["/api/go*"] },
            "action": { "pass": "applications/go-app" }
        },
        {
            "match": { "uri": ["/legacy*"] },
            "action": { "pass": "applications/php-app" }
        },
        {
            "match": { "uri": ["/hello*"] },
            "action": { "pass": "applications/python-app" }
        }
    ],
    "applications": {
        "python-app": {
            "type": "python",
            "path": "/var/www/unit-demo",
            "module": "app",
            "callable": "app"
        },
        "php-app": {
            "type": "php",
            "root": "/var/www/php-legacy",
            "script": "index.php"
        },
        "go-app": {
            "type": "external",
            "executable": "/var/www/go-service/service",
            "working_directory": "/var/www/go-service",
            "processes": 4
        }
    }
}

Notice the type: "external" block for Go. Unit runs Go binaries as “external” apps: the program must be built against Unit’s Go package (imported as unit.nginx.org/go) and call unit.ListenAndServe instead of http.ListenAndServe, so Unit can hand it requests over its internal IPC rather than a TCP port. For PHP we only need to point at the document root and the entry script.
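Route order matters: Unit evaluates the routes array top‑down and takes the first match. A simplified sketch of that first‑match‑wins logic, using fnmatch for the URI wildcards (an illustration only, not Unit’s actual matcher):

```python
from fnmatch import fnmatch

def resolve(routes, uri):
    """Return the pass target of the first route whose URI pattern matches."""
    for route in routes:
        if any(fnmatch(uri, pattern) for pattern in route["match"]["uri"]):
            return route["action"]["pass"]
    return None  # Unit answers 404 when no route matches

# The same routes as the extended configuration above.
routes = [
    {"match": {"uri": ["/api/go*"]}, "action": {"pass": "applications/go-app"}},
    {"match": {"uri": ["/legacy*"]}, "action": {"pass": "applications/php-app"}},
    {"match": {"uri": ["/hello*"]}, "action": {"pass": "applications/python-app"}},
]
```

Because evaluation stops at the first match, more specific patterns should come before broader ones.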

Pro tip: Keep your JSON tidy by using a tool like jq or a YAML‑to‑JSON converter. Small syntax errors are the most common cause of failed config pushes.
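That validation step is easy to automate. A minimal pre‑push check in Python (validate_config is a helper name of our own, not a Unit tool):

```python
import json

def validate_config(path):
    """Parse the file as JSON; report the exact location of any syntax error."""
    try:
        with open(path) as f:
            json.load(f)
    except json.JSONDecodeError as err:
        print(f"{path}: line {err.lineno}, column {err.colno}: {err.msg}")
        return False
    return True
```

Run it in CI before every config push; a failing exit code is far cheaper than a rejected PUT in production.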

Dynamic Reconfiguration Without Downtime

Traditional web servers require a graceful reload to apply changes, which can still drop a few connections. Unit’s in‑memory config means you can add a new route or scale a process pool instantly.

Adding a New Endpoint on the Fly

Suppose you want to expose a health‑check endpoint /healthz that returns a static JSON payload. Create a tiny Python module:

# health.py
def application(environ, start_response):
    start_response('200 OK', [('Content-Type', 'application/json')])
    return [b'{"status":"ok"}']
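Because this is plain WSGI, you can exercise the callable without Unit at all. The sketch below duplicates the health.py callable so it is self‑contained, then drives it the way any WSGI server would:

```python
def application(environ, start_response):
    # Same callable as health.py
    start_response('200 OK', [('Content-Type', 'application/json')])
    return [b'{"status":"ok"}']

# Minimal stand-in for a WSGI server: capture the status, collect the body.
captured = {}

def fake_start_response(status, headers):
    captured['status'] = status
    captured['headers'] = headers

body = b''.join(application({'REQUEST_METHOD': 'GET', 'PATH_INFO': '/healthz'},
                            fake_start_response))
print(captured['status'], body.decode())  # prints: 200 OK {"status":"ok"}
```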

Now register the new application and route without touching the rest of the configuration. The control API addresses every part of the config tree by URL, so two targeted requests suffice: PUT creates or replaces an object at a path, and POST appends to a JSON array.

# Register the application
curl -X PUT -d '{
    "type": "python",
    "path": "/var/www/unit-demo",
    "module": "health"
}' --unix-socket /var/run/control.unit.sock \
   http://localhost/config/applications/health-app

# Append a route that targets it
curl -X POST -d '{
    "match": { "uri": ["/healthz"] },
    "action": { "pass": "applications/health-app" }
}' --unix-socket /var/run/control.unit.sock \
   http://localhost/config/routes

Within milliseconds /healthz becomes live, and existing connections continue uninterrupted.
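If you push such updates from scripts, a tiny stdlib‑only helper that speaks HTTP over the control socket can replace repeated curl calls. This is a sketch, not an official client; UnixHTTPConnection and put_config are names of our own invention:

```python
import http.client
import json
import socket

class UnixHTTPConnection(http.client.HTTPConnection):
    """An http.client.HTTPConnection that connects over a Unix domain socket."""

    def __init__(self, socket_path):
        super().__init__('localhost')  # host only feeds the Host: header
        self.socket_path = socket_path

    def connect(self):
        self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        self.sock.connect(self.socket_path)

def put_config(socket_path, api_path, payload):
    """PUT a JSON payload to a control-API path; return (status, body bytes)."""
    conn = UnixHTTPConnection(socket_path)
    try:
        conn.request('PUT', api_path, body=json.dumps(payload),
                     headers={'Content-Type': 'application/json'})
        resp = conn.getresponse()
        return resp.status, resp.read()
    finally:
        conn.close()
```

For example, put_config('/var/run/control.unit.sock', '/config/applications/health-app', {...}) performs the same update as the curl command above.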

Scaling Process Pools Dynamically

If traffic spikes, you can raise the worker count for any application without restarting Unit by writing the new value straight to its path in the config tree:

curl -X PUT -d '8' \
     --unix-socket /var/run/control.unit.sock \
     http://localhost/config/applications/python-app/processes

Unit will spawn additional processes on demand, then gracefully retire excess workers when you scale back down.

Remember: Keep an eye on your system’s file descriptor limits. Each Unit worker opens sockets; hitting the OS limit can cause silent request failures.
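You can inspect those limits from Python’s standard library (POSIX‑only; a quick diagnostic, not part of Unit itself):

```python
import resource

# The soft limit is enforced now; the hard limit is the ceiling
# an unprivileged process can raise the soft limit to.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"open-file-descriptor limit: soft={soft}, hard={hard}")
```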

Monitoring and Observability

Unit ships with a built‑in /status endpoint on the control socket that reports live metrics. Query it like so:

curl -s --unix-socket /var/run/control.unit.sock http://localhost/status | jq .

The JSON payload includes a connections object (accepted, active, idle, closed), a requests total, and an applications object with per‑app process counts (running, starting, idle) and active request counts. Integrate this into Prometheus with a small exporter script that scrapes the endpoint and re‑exposes the data on /metrics.

Sample Exporter (exporter.py)

# Assumes the control API is reachable over TCP, e.g. unitd started with
# --control 127.0.0.1:8080, since requests cannot use Unix sockets directly.
import time

import requests
from prometheus_client import start_http_server, Gauge

REQUESTS_TOTAL = Gauge('unit_requests_total', 'Total requests handled by Unit')
ACTIVE_CONNECTIONS = Gauge('unit_active_connections', 'Current active connections')
APP_PROCESSES = Gauge('unit_app_processes', 'Unit app processes by state',
                      ['app', 'state'])

def collect():
    data = requests.get('http://127.0.0.1:8080/status').json()
    REQUESTS_TOTAL.set(data['requests']['total'])
    ACTIVE_CONNECTIONS.set(data['connections']['active'])

    for app_name, app_info in data.get('applications', {}).items():
        processes = app_info.get('processes', {})
        for state in ('running', 'starting', 'idle'):
            APP_PROCESSES.labels(app=app_name, state=state).set(
                processes.get(state, 0))

if __name__ == '__main__':
    start_http_server(9100)
    while True:
        collect()
        time.sleep(5)

Run the exporter alongside Unit, and configure Prometheus to scrape localhost:9100/metrics. You’ll now have real‑time visibility into how each runtime behaves under load.

Security Best Practices

Because Unit directly executes code, hardening the host environment is essential. Below are a few practical steps.

  • Run each application under a dedicated system user. Use the user field in the application definition to drop privileges.
  • Limit file system access. Bind‑mount only the required code directories and use a read‑only root filesystem (for example, docker run --read-only or Kubernetes’ readOnlyRootFilesystem) when deploying via containers.
  • Enable TLS at the listener level. Unit can terminate HTTPS natively, removing the need for an external reverse proxy.

TLS Example

Unit keeps certificates in its own certificate storage rather than reading file paths from the listener config. First upload a PEM bundle (certificate chain and private key concatenated) to the /certificates section:

cat cert.pem key.pem > bundle.pem
curl -X PUT --data-binary @bundle.pem \
     --unix-socket /var/run/control.unit.sock \
     http://localhost/certificates/demo-cert

Then reference the bundle by name in the listener (the routes section stays the same as before):

{
    "listeners": {
        "*:443": {
            "pass": "routes",
            "tls": {
                "certificate": "demo-cert"
            }
        }
    },
    "routes": [ ... ]
}

After this update, Unit serves HTTPS on the listener without a service restart.

Pro tip: Rotate your TLS certs with a CI job: upload the new bundle under a new name in /certificates, repoint the listener’s tls.certificate at it with a small PUT, then DELETE the old bundle. From the client’s perspective the swap is seamless.

Deploying Unit in Containers

Containerization is a natural fit for Unit’s lightweight footprint. Below is a minimal Dockerfile that bundles Unit with the Python runtime and our demo app.

FROM ubuntu:22.04

# Install Unit core, the Python module, and Flask for the demo app
RUN apt-get update && apt-get install -y \
    curl gnupg2 ca-certificates python3-pip \
    && curl -s https://packages.nginx.org/unit/ubuntu/gpg.key | apt-key add - \
    && echo "deb https://packages.nginx.org/unit/ubuntu/ jammy unit" > /etc/apt/sources.list.d/unit.list \
    && apt-get update && apt-get install -y unit unit-python3 \
    && rm -rf /var/lib/apt/lists/* \
    && pip3 install flask

# Copy app
WORKDIR /app
COPY app.py /app/
COPY unit-config.json /app/

# Expose the HTTP listener; the control socket stays internal to the container
EXPOSE 80

# unitd has no flag for loading a config file directly, so start the daemon,
# push the saved config through the control socket, then follow the log
CMD unitd --control unix:/var/run/control.unit.sock && \
    sleep 1 && \
    curl -s -X PUT -d @/app/unit-config.json \
         --unix-socket /var/run/control.unit.sock http://localhost/config && \
    exec tail -f /var/log/unit.log

Build and run:

docker build -t unit-demo .
docker run -d -p 80:80 unit-demo

The container will serve the Flask endpoint on port 80, and you can still reach the control socket via docker exec for live updates.

Real‑World Use Cases

1. API Gateway for Polyglot Micro‑services
Enterprises often have a mix of legacy Java services, new Node.js functions, and Python data pipelines. Unit can act as a single entry point, routing /v1/* to Java, /v2/* to Node, and /ml/* to Python, all while providing TLS termination and request logging.

2. Serverless‑like Function Execution
Because Unit can spawn external processes on demand, you can implement a lightweight function‑as‑a‑service platform. Each function lives in its own directory, and a small controller updates Unit’s routes whenever a new function is deployed.

3. Edge‑Optimized Content Delivery
Unit’s static file handling is fast, but its real edge comes from serving dynamic content directly at the edge node. Deploy Unit on CDN edge servers, keep the configuration JSON in a distributed key‑value store, and push updates globally in seconds.

Pro Tips & Common Pitfalls

  • Validate JSON before pushing. A stray comma will cause the entire config to be rejected.
  • Use the Unix socket for control. It’s faster and avoids exposing the control API on a network port.
  • Separate request logs from the daemon log. Unit writes its own messages (and captured application stdout/stderr) to /var/log/unit.log; set the access_log option in the JSON config to send request logging to its own file for easier debugging.
  • Watch file descriptor limits. Each worker consumes a FD; increase ulimit -n on high‑traffic servers.
Remember: Unit’s dynamic nature is powerful, but it also means configuration drift can happen quickly. Adopt a version‑controlled JSON source (Git) and automate pushes via CI to keep your production state reproducible.

Conclusion

Nginx Unit bridges the gap between traditional web servers and modern application runtimes, offering a single, dynamic platform for polyglot services. By leveraging its JSON control API, you can add routes, scale processes, and rotate TLS certificates without ever touching the underlying daemon. Whether you’re modernizing a legacy stack, building a lightweight API gateway, or experimenting with serverless‑style functions, Unit provides the flexibility and performance you need. Start with a simple Flask app, explore multi‑language configurations, and let Unit’s live‑reconfiguration capabilities accelerate your deployment pipeline.
