Directus 11: Headless CMS on Any Database
PROGRAMMING LANGUAGES March 29, 2026, 5:30 a.m.

Directus 11 takes the concept of a headless CMS and makes it truly database‑agnostic, letting you turn any SQL‑compatible store into a flexible content API. Whether you’re building a storefront, a mobile app, or an internal dashboard, Directus lets you focus on data, not on custom back‑ends.

What Makes Directus Different?

Traditional headless CMS platforms often lock you into a specific database or schema, which can become a bottleneck when you need to integrate legacy data. Directus, on the other hand, sits *on top* of your existing database, exposing tables as collections without altering the underlying structure.

Because it works with MySQL, PostgreSQL, SQLite, MariaDB, MSSQL, and even Oracle (via community adapters), you can migrate projects between environments without re‑architecting the data layer. This “data‑first” philosophy is the core reason many teams adopt Directus for rapid prototyping and production‑grade APIs.

Getting Started: Installing Directus 11

Directus ships as a Node.js application, and the easiest way to spin it up is with Docker. The following command pulls the latest 11.x image and runs it with a PostgreSQL backend:

docker run -d \
  --name directus \
  -p 8055:8055 \
  -e SECRET=YOUR_RANDOM_SECRET \
  -e ADMIN_EMAIL=admin@example.com \
  -e ADMIN_PASSWORD=SuperSecret123 \
  -e DB_CLIENT=pg \
  -e DB_HOST=postgres \
  -e DB_PORT=5432 \
  -e DB_DATABASE=directus \
  -e DB_USER=directus \
  -e DB_PASSWORD=directuspwd \
  directus/directus:11

Replace the environment variables with values that match your infrastructure. Once the container is healthy, open http://localhost:8055 and finish the setup wizard. The wizard will automatically create the required system tables inside the database you pointed to.
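If you script your provisioning, it helps to wait for Directus to come up before issuing any API calls. Below is a small, generic polling helper; the HTTP probe shown in the comment (against the server health endpoint) is illustrative and should be adapted to your deployment:

```python
import time

def wait_until_healthy(probe, timeout=60, interval=2):
    """Poll `probe()` (any callable returning True once the service is up)
    until it succeeds or `timeout` seconds elapse. Returns True on success,
    False if the deadline passes first."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if probe():
            return True
        time.sleep(interval)
    return False

# In practice the probe would be an HTTP check, for example:
# probe = lambda: requests.get("http://localhost:8055/server/health").ok
```

Injecting the probe as a callable keeps the helper easy to reuse (and to test) independently of the HTTP client you choose.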

Connecting to Any Supported Database

Directus does not impose a specific schema, but it does expect a few system tables (e.g., directus_users, directus_collections) to manage permissions and UI configuration. You can either let Directus create these tables automatically, or you can provision them manually if you prefer a more controlled deployment.

Here’s a quick example of how to connect Directus to a MySQL database that already hosts a legacy products table:

# .env file for a Directus instance
SECRET=YOUR_RANDOM_SECRET
ADMIN_EMAIL=admin@example.com
ADMIN_PASSWORD=SuperSecret123

DB_CLIENT=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=legacy_db
DB_USER=legacy_user
DB_PASSWORD=legacy_pwd

After setting the environment variables, restart the Directus container. Directus will introspect the existing tables and list them in the admin UI, ready for you to expose as collections.
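When scripting against the /collections endpoint, remember that Directus lists its own system tables (all prefixed with directus_) alongside your application tables. A small helper to separate the two is handy; the function itself is illustrative, while the directus_ prefix is the documented system-table convention:

```python
def partition_collections(names):
    """Split collection names into (system, user) lists based on the
    reserved directus_ prefix used by Directus system tables."""
    system = [n for n in names if n.startswith("directus_")]
    user = [n for n in names if not n.startswith("directus_")]
    return system, user

# With a live instance, names would come from GET /collections:
# names = [c["collection"] for c in resp.json()["data"]]
system, user = partition_collections(
    ["directus_users", "directus_collections", "products", "orders"]
)
```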

Modeling Content: Collections and Fields

In Directus terminology, a *collection* maps to a database table (or view), and a *field* corresponds to a column. The admin UI lets you add metadata—labels, validation rules, interface types—without touching the database directly.

For developers who prefer code, the Collections API provides a programmatic way to define schema. Below is a Python snippet that creates a new blog_posts collection with a rich‑text body and a tags field:

import requests

BASE_URL = "http://localhost:8055"
TOKEN = "YOUR_AUTH_TOKEN"

headers = {"Authorization": f"Bearer {TOKEN}"}

# Define the collection: column-level settings live under each field's
# "schema" key, UI settings under its "meta" key
collection_payload = {
    "collection": "blog_posts",
    "meta": {
        "icon": "article",
        "note": "Posts for the public blog"
    },
    "schema": {"name": "blog_posts"},
    "fields": [
        {"field": "id", "type": "integer",
         "schema": {"is_primary_key": True, "has_auto_increment": True}},
        {"field": "title", "type": "string",
         "schema": {"is_nullable": False, "max_length": 255}},
        {"field": "slug", "type": "string",
         "schema": {"is_unique": True, "max_length": 150}},
        {"field": "content", "type": "text",
         "meta": {"interface": "input-rich-text-html"}},
        {"field": "published_at", "type": "timestamp"},
        {"field": "author", "type": "uuid",
         "meta": {"interface": "select-dropdown-m2o"}},
        {"field": "tags", "type": "json",
         "meta": {"interface": "tags"}}
    ]
}

response = requests.post(f"{BASE_URL}/collections", headers=headers, json=collection_payload)
response.raise_for_status()
print(response.json())

Running this script adds the collection to Directus, and the UI instantly reflects the new fields, ready for content editors.

Fetching Data: REST vs GraphQL

Directus shines by offering both a powerful REST API and a fully‑featured GraphQL endpoint. Choose REST for quick, curl‑friendly calls, or GraphQL when you need precise payloads and nested relationships.

REST Example – Getting Published Blog Posts

import requests

TOKEN = "YOUR_AUTH_TOKEN"

resp = requests.get(
    "http://localhost:8055/items/blog_posts",
    params={"filter[published_at][_gt]": "2023-01-01", "limit": 10},
    headers={"Authorization": f"Bearer {TOKEN}"}
)

posts = resp.json()["data"]
for p in posts:
    print(f"{p['title']} – {p['published_at']}")

This call returns a JSON array of posts published after January 1, 2023, limited to ten items. The filter syntax follows Directus’s query language, making complex queries readable.
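For more complex conditions, it can help to build the bracketed filter keys programmatically instead of hand-writing them. A small illustrative helper that flattens a nested filter dict into the query-parameter form used above:

```python
def flatten_filter(filter_dict, prefix="filter"):
    """Turn a nested Directus-style filter dict into flat query params,
    e.g. {"published_at": {"_gt": "2023-01-01"}} ->
         {"filter[published_at][_gt]": "2023-01-01"}."""
    params = {}
    for key, value in filter_dict.items():
        path = f"{prefix}[{key}]"
        if isinstance(value, dict):
            params.update(flatten_filter(value, path))
        else:
            params[path] = value
    return params

params = flatten_filter({
    "published_at": {"_gt": "2023-01-01"},
    "status": {"_eq": "published"},
})
# params can be passed straight to requests.get(..., params=params)
```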

GraphQL Example – Fetching Posts with Tags

import requests, json

TOKEN = "YOUR_AUTH_TOKEN"

query = """
{
  blog_posts(limit: 5, filter: {published_at: {_gt: "2023-01-01"}}) {
    title
    slug
    content
    tags
  }
}
"""

resp = requests.post(
    "http://localhost:8055/graphql",
    json={"query": query},
    headers={"Authorization": f"Bearer {TOKEN}"}
)

print(json.dumps(resp.json(), indent=2))

GraphQL lets you request exactly the fields you need, reducing bandwidth for mobile clients. Directus also supports real‑time subscriptions over WebSockets, a handy feature for collaborative editing tools.
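The WebSocket interface exchanges JSON messages: you authenticate first, then subscribe to change events. The message shapes below follow the documented auth/subscribe message types, but treat them as a sketch to verify against your Directus version:

```python
import json

def auth_message(token):
    """Authenticate the WebSocket connection before subscribing."""
    return json.dumps({"type": "auth", "access_token": token})

def subscribe_message(collection, event=None):
    """Subscribe to change events on a collection; `event` optionally
    narrows the subscription to "create", "update", or "delete"."""
    msg = {"type": "subscribe", "collection": collection}
    if event:
        msg["event"] = event
    return json.dumps(msg)
```

With any WebSocket client you would open ws://localhost:8055/websocket, send auth_message(TOKEN), then subscribe_message("blog_posts", "update"), and handle the event payloads as they arrive.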

Real‑World Use Cases

E‑commerce catalog – A retailer can point Directus at an existing MySQL product table, expose it via API, and let the front‑end team fetch products, categories, and inventory without building a separate service.

Corporate intranet – HR can maintain employee records in a PostgreSQL database; Directus provides a UI for non‑technical staff to edit bios, upload photos, and manage department hierarchies, all while preserving the original data model.

IoT telemetry dashboard – Sensor data often lands in a time‑series database like TimescaleDB (PostgreSQL extension). Directus can expose a read‑only collection for dashboards, letting engineers query recent measurements without exposing raw SQL queries.

Pro Tips for Production Deployments

Tip: Always enable role‑based access control (RBAC) before exposing the API publicly. Create a public role with read‑only permissions for the collections you want to share, and a separate editor role with create/update rights. This prevents accidental data mutations.
Tip: Use the built‑in hooks system to run custom logic on create, update, or delete events. For example, you can trigger a webhook to clear a CDN cache whenever a blog post is published.
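The cache-clearing hook itself runs inside Directus, but the logic that decides what to purge is easy to isolate and test on its own. A sketch, where the URL scheme and the cdn.example.com base are hypothetical values for illustration:

```python
def cdn_paths_for_post(post, base="https://cdn.example.com"):
    """Return the CDN paths worth purging after a blog post changes:
    the post page itself plus listing pages that embed it."""
    slug = post["slug"]
    return [
        f"{base}/blog/{slug}",  # the post page
        f"{base}/blog/",        # blog index page
        f"{base}/feed.xml",     # RSS feed
    ]

paths = cdn_paths_for_post({"slug": "directus-11-release"})
```

Your hook would then POST these paths to your CDN's purge API whenever a publish event fires.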

Security Best Practices

Directus ships with JWT authentication, but you should enforce HTTPS, rotate the SECRET environment variable regularly, and limit token lifetimes. Additionally, enable CORS only for trusted origins and configure a rate‑limiting reverse proxy (e.g., Nginx) to mitigate abuse.

For database‑level security, grant the Directus user only the privileges it needs: SELECT, INSERT, UPDATE, DELETE on application tables, plus CREATE and ALTER on the Directus system tables during initial setup. Once the schema is stable, you can revoke DDL rights.

Scaling Directus

Because Directus is stateless, you can horizontally scale it behind a load balancer. Store session data (if you enable session‑based auth) in Redis, and point all instances to the same database. For high‑traffic APIs, consider enabling the built‑in caching layer or using an external CDN for static assets.

When using PostgreSQL, leverage read replicas for heavy read workloads, for example by routing queries through a replica‑aware proxy so that write operations always reach the primary node.

Extending Directus with Custom Endpoints

Sometimes the out‑of‑the‑box API isn’t enough. Directus lets you add custom endpoints via the extensions folder; each endpoint extension is mounted under its extension name. Below is a minimal Node.js example (an extension named visits) that returns a simple JSON payload.

// /directus/extensions/endpoints/visits/index.js
module.exports = function registerEndpoint(router, { services }) {
  const { ItemsService } = services;

  router.get('/', async (req, res) => {
    const service = new ItemsService('page_visits', {
      schema: req.schema,
      accountability: req.accountability,
    });
    const visits = await service.readByQuery({ limit: -1 });
    const total = visits.reduce((sum, v) => sum + (v.count || 0), 0);
    res.json({ totalVisits: total });
  });
};

After placing the file, restart Directus and the new endpoint becomes available at http://localhost:8055/visits. This extensibility is perfect for integrating third‑party analytics or custom business logic.

Deploying to the Cloud

Most cloud providers support Docker, making Directus deployment straightforward. Below is a concise docker‑compose.yml for a production‑grade stack with PostgreSQL, Directus, and a Redis cache for session storage.

version: "3.8"

services:
  db:
    image: postgres:15
    environment:
      POSTGRES_DB: directus
      POSTGRES_USER: directus
      POSTGRES_PASSWORD: directuspwd
    volumes:
      - pgdata:/var/lib/postgresql/data

  redis:
    image: redis:7
    restart: always

  directus:
    image: directus/directus:11
    depends_on:
      - db
      - redis
    ports:
      - "8055:8055"
    environment:
      SECRET: YOUR_RANDOM_SECRET
      ADMIN_EMAIL: admin@example.com
      ADMIN_PASSWORD: SuperSecret123
      DB_CLIENT: pg
      DB_HOST: db
      DB_PORT: 5432
      DB_DATABASE: directus
      DB_USER: directus
      DB_PASSWORD: directuspwd
      CACHE_ENABLED: "true"
      CACHE_STORE: redis
      REDIS: "redis://redis:6379"
    restart: unless-stopped

volumes:
  pgdata:

Deploy this stack with docker compose up -d, and you have a resilient Directus instance ready for production traffic.

Monitoring and Observability

Directus emits structured logs to stdout, which can be collected by tools like Loki or Papertrail. For metrics, enable the /metrics endpoint (via the METRICS_ENABLED environment variable) and scrape it with Prometheus. Typical dashboards track request latency, error rates, and database connection pool usage.

Integrate health checks into your orchestration platform: Directus responds to /server/health with a JSON payload indicating the health of the database connection and cache layer. This helps Kubernetes automatically restart unhealthy pods.
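A readiness probe script only needs to inspect the reported status. A minimal sketch, assuming the payload follows the common health‑check convention where status is one of "ok", "warn", or "error":

```python
def is_healthy(payload):
    """Treat the service as healthy only when it reports status "ok"
    ("warn" and "error" both fail the probe)."""
    return payload.get("status") == "ok"

probe_passed = is_healthy({"status": "ok"})
```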

Internationalization (i18n) Made Simple

Directus handles multilingual content through its translations interface: translated values live in a companion collection (for example, blog_posts_translations) keyed by language, and a special translations field on the parent collection ties them together. Front ends then request the language they need as part of the query, for instance via GraphQL’s translations support or REST deep filters.

Here’s a sketch of adding a translations field to the blog_posts collection via the Fields API (the exact meta options and the companion‑collection setup can vary by project, so verify against the Directus translations documentation):

# Add a translations field to blog_posts
payload = {
    "field": "translations",
    "type": "alias",
    "meta": {"interface": "translations", "special": ["translations"]}
}
requests.post(f"{BASE_URL}/fields/blog_posts", headers=headers, json=payload)

With the companion collection in place, content editors can provide titles in English, Spanish, French, and so on, and front ends can fetch the appropriate language per request.

Backup Strategies

Since Directus does not store content outside the database, backup procedures are identical to those of your chosen DBMS. Schedule nightly dumps (e.g., pg_dump for PostgreSQL) and store them in an off‑site object store. For added safety, also export a schema snapshot (via the /schema/snapshot endpoint or the npx directus schema snapshot CLI command), which captures your collections, fields, and relations; roles, permissions, and other configuration live in the Directus system tables, so the database dump already covers them.

Common Pitfalls and How to Avoid Them

  • Over‑exposing tables: Directus will list every table it can see. Use the UI to hide internal tables (the hidden flag in directus_collections) to keep them out of the admin app, and lock them down with permissions so they stay out of the public API.
  • Schema drift: If you alter a table directly in the database, Directus may not immediately reflect the change in its field metadata. Clear the schema cache or restart Directus after manual migrations.
  • Large payloads: Fetching massive collections without pagination can overwhelm clients. Always use limit and offset (or GraphQL cursors) for paginated results.
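Pagination is easy to wrap in a generator. The sketch below takes any page-fetching callable, so the same loop works against /items/... with requests or against a stub in tests, and stops as soon as a short page signals the end of the collection:

```python
def iter_items(fetch_page, limit=100):
    """Yield items page by page. `fetch_page(limit, offset)` must return
    a list of at most `limit` items (e.g. the "data" array from
    GET /items/<collection>?limit=..&offset=..)."""
    offset = 0
    while True:
        page = fetch_page(limit, offset)
        yield from page
        if len(page) < limit:
            return
        offset += limit

# With requests it might look like (illustrative):
# fetch = lambda limit, offset: requests.get(
#     url, params={"limit": limit, "offset": offset}, headers=headers
# ).json()["data"]
```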

Advanced Feature: Dynamic Views as Collections

Directus can treat SQL views as read‑only collections, which is perfect for reporting scenarios. Create a view that joins orders with customers, then register it in Directus. The API will expose the view just like a regular table, but attempts to write will be blocked.

Example view definition (PostgreSQL):

CREATE VIEW order_summary AS
SELECT
  o.id AS order_id,
  c.name AS customer_name,
  o.total,
  o.created_at
FROM orders o
JOIN customers c ON o.customer_id = c.id;

After the view exists, refresh Directus’s schema and the order_summary collection appears in the UI, ready for analytics dashboards.

Testing Your Directus API

Automated testing ensures that content contracts remain stable. Below is a pytest example that validates the /items/blog_posts endpoint returns a 200 status and contains the expected fields.

import pytest, requests

BASE = "http://localhost:8055"
TOKEN = "YOUR_AUTH_TOKEN"

@pytest.fixture
def auth_headers():
    return {"Authorization": f"Bearer {TOKEN}"}

def test_blog_posts_schema(auth_headers):
    resp = requests.get(f"{BASE}/items/blog_posts", headers=auth_headers, params={"limit": 1})
    assert resp.status_code == 200
    items = resp.json()["data"]
    assert items, "expected at least one blog post"
    for field in ["id", "title", "slug", "content", "published_at"]:
        assert field in items[0]

Integrate this test into your CI pipeline to catch breaking changes before they reach production.

Conclusion

Directus 11 delivers a truly headless CMS experience that respects your existing data architecture. By supporting any SQL database, providing both REST and GraphQL interfaces, and offering a rich extensibility model, it empowers developers to build scalable, secure, and maintainable content APIs.
