MCP Tools: Build Claude AI Plugins with Python
PROGRAMMING LANGUAGES March 7, 2026, 11:30 a.m.

Welcome to the world of MCP (Model Context Protocol) tools, where Python meets Anthropic’s Claude to create powerful, context‑aware extensions. In this guide we’ll walk through the core concepts, set up a clean development environment, and build two end‑to‑end plugins that showcase real‑world utility. By the end, you’ll be comfortable crafting your own Claude‑powered tools, testing them locally, and deploying them securely to production.

Understanding MCP Tools

MCP (Model Context Protocol) is Anthropic’s official way to connect Claude to external capabilities exposed as callable functions. Think of a plugin as a small, self‑contained micro‑service that Claude can invoke during a conversation, passing structured arguments and receiving structured results.

Key components of an MCP plugin include:

  • Manifest: A JSON file describing the plugin’s name, description, authentication, and the OpenAPI‑style schema for each endpoint.
  • Endpoint Handlers: Python functions (often built with FastAPI or Flask) that receive validated payloads, perform the business logic, and return a JSON response.
  • Authentication Layer: Optional API keys, OAuth, or signed JWTs that let your service verify the caller’s identity before the function runs.
  • Testing Harness: Unit tests and integration tests that simulate Claude’s request format to ensure your plugin behaves predictably.

When Claude needs extra information—like the current temperature or a sentiment score—it can request the relevant function, pass arguments, and incorporate the response into its next reply. This back‑and‑forth is seamless to the end user, yet it gives developers a structured way to extend AI capabilities.
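Concretely, this exchange is a pair of structured messages. The shapes below follow the Anthropic Messages API tool‑use format; the tool name and id values are made up for illustration:

```python
# 1. Claude's reply contains a tool_use content block asking for a function call:
assistant_turn = {
    "role": "assistant",
    "content": [
        {"type": "tool_use", "id": "toolu_123", "name": "get_weather",
         "input": {"city": "Paris"}}
    ],
}

# 2. Your code runs the tool and sends the result back as a tool_result block:
tool_turn = {
    "role": "user",
    "content": [
        {"type": "tool_result", "tool_use_id": "toolu_123",
         "content": '{"temperature_celsius": 18.2}'}
    ],
}

print(assistant_turn["content"][0]["name"])  # → get_weather
```

The `tool_use_id` in the result must echo the `id` of the request, which is how Claude matches answers to calls when several tools run in one turn.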

Setting Up Your Development Environment

Before diving into code, make sure you have the following installed:

  1. Python 3.11 or newer (the latest minor release is recommended for type‑checking improvements).
  2. pip for package management.
  3. FastAPI and Uvicorn for the web server.
  4. anthropic SDK (pip install anthropic) to interact with Claude during testing.
  5. pytest for automated testing.

Here’s a quick script to bootstrap a virtual environment and install the required packages:

python -m venv .venv
source .venv/bin/activate  # On Windows use `.venv\Scripts\activate`
pip install --upgrade pip
pip install fastapi uvicorn anthropic pytest

After activation, create a project folder structure that mirrors typical Python packages:

my_claude_plugin/
│
├─ app/
│   ├─ __init__.py
│   ├─ main.py          # FastAPI entry point
│   ├─ routes.py        # Endpoint definitions
│   └─ utils.py         # Helper functions
│
├─ tests/
│   └─ test_routes.py
│
├─ manifest.json        # MCP manifest
└─ requirements.txt

With this scaffold in place, you’re ready to write your first plugin.
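For reference, a minimal requirements.txt matching the install command above might look like the following (versions unpinned here; pin them for reproducible builds). httpx and python-dotenv appear later in this guide, so they are listed too:

```
fastapi
uvicorn
anthropic
httpx
pytest
python-dotenv
```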

Building Your First Claude Plugin: Weather Lookup

Our first example will expose a simple weather lookup endpoint. Claude can ask the plugin for the current temperature in a city, and the plugin will call a public weather API (e.g., OpenWeatherMap) to fetch the data.

Step 1: Craft the Manifest

The manifest tells Claude what the plugin does and how to call it. Save the following JSON as manifest.json in the project root:

{
  "schema_version": "v1",
  "name_for_human": "Weather Buddy",
  "name_for_model": "weather_buddy",
  "description_for_human": "Provides real‑time weather information for any city.",
  "description_for_model": "Fetches current temperature, humidity, and weather description for a given city name.",
  "auth": {
    "type": "none"
  },
  "api": {
    "type": "openapi",
    "url": "http://localhost:8000/openapi.json",
    "has_user_authentication": false
  },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "support@example.com",
  "legal_info_url": "https://example.com/legal"
}

Note that the url field points to the OpenAPI spec FastAPI automatically serves at /openapi.json. This lets Claude understand the request schema without any extra documentation.

Step 2: Implement the Endpoint

Create app/routes.py with the following logic. It validates the incoming payload, calls the external weather service, and returns a concise JSON response.

from fastapi import APIRouter, HTTPException
import httpx
import os

router = APIRouter()

WEATHER_API_KEY = os.getenv("OPENWEATHER_API_KEY")
BASE_URL = "https://api.openweathermap.org/data/2.5/weather"

@router.get("/weather", summary="Get current weather for a city")
async def get_weather(city: str):
    """
    Parameters
    ----------
    city: str
        Name of the city (e.g., "Paris").

    Returns
    -------
    dict
        temperature (°C), humidity (%), description (str)
    """
    if not WEATHER_API_KEY:
        raise HTTPException(status_code=500, detail="Weather API key not configured")

    params = {
        "q": city,
        "appid": WEATHER_API_KEY,
        "units": "metric"
    }

    async with httpx.AsyncClient(timeout=10.0) as client:
        response = await client.get(BASE_URL, params=params)

    if response.status_code == 404:
        raise HTTPException(status_code=404, detail=f"City '{city}' not found")
    if response.status_code != 200:
        raise HTTPException(status_code=502, detail="Upstream weather service error")

    data = response.json()
    result = {
        "city": city,
        "temperature_celsius": data["main"]["temp"],
        "humidity_percent": data["main"]["humidity"],
        "description": data["weather"][0]["description"]
    }
    return result

Step 3: Wire Up FastAPI

In app/main.py, mount the router and expose the OpenAPI schema.

from fastapi import FastAPI
from .routes import router as weather_router

app = FastAPI(
    title="Weather Buddy MCP",
    description="A simple Claude plugin that returns current weather data."
)

app.include_router(weather_router, prefix="/api")

Run the server locally with:

uvicorn app.main:app --reload --port 8000

Visit http://localhost:8000/docs to see the interactive Swagger UI. Claude will later use the same OpenAPI spec to know that the endpoint expects a city query parameter.
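A successful call to /api/weather?city=Paris returns a payload shaped like this (values are illustrative, matching the result dict built in the handler):

```
{
  "city": "Paris",
  "temperature_celsius": 18.2,
  "humidity_percent": 65,
  "description": "scattered clouds"
}
```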

Step 4: Test the Plugin with Claude

Use the Anthropic SDK to simulate a request. We declare the tool inline so Claude knows its name and parameter schema; note that the SDK does not call your endpoint automatically.

import os
from anthropic import Anthropic

client = Anthropic(api_key=os.getenv("ANTHROPIC_API_KEY"))

def ask_claude_for_weather(city: str):
    # Construct a message that hints Claude to call the plugin
    user_msg = f"What’s the weather like in {city}?"
    response = client.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=256,
        messages=[{"role": "user", "content": user_msg}],
        tools=[{
            "name": "get_weather",
            "description": "Fetch current weather for a city",
            "input_schema": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"}
                },
                "required": ["city"]
            }
        }]
    )
    return response

print(ask_claude_for_weather("Tokyo"))

Claude will decide whether to invoke the weather tool. If it does, the response contains a tool_use content block; your code then calls the FastAPI endpoint with the supplied arguments and returns the result to Claude in a tool_result message, and Claude incorporates it into its final answer.
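Because the SDK returns tool calls rather than executing them, your client code needs to pull the tool_use blocks out of the response and dispatch each one. A minimal, hypothetical helper, shown here over plain dicts (the SDK’s response objects expose the same fields as attributes):

```python
def extract_tool_calls(content_blocks):
    """Return (name, input, id) triples for every tool_use block."""
    calls = []
    for block in content_blocks:
        if block.get("type") == "tool_use":
            calls.append((block["name"], block["input"], block["id"]))
    return calls

# Example with a dict-shaped response body:
blocks = [
    {"type": "text", "text": "Let me check."},
    {"type": "tool_use", "id": "toolu_1", "name": "get_weather",
     "input": {"city": "Tokyo"}},
]
print(extract_tool_calls(blocks))
# → [('get_weather', {'city': 'Tokyo'}, 'toolu_1')]
```

Each extracted call maps directly onto an HTTP request to the plugin; the returned id is what you echo back as tool_use_id in the follow-up message.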

Pro tip: During local development, set ANTHROPIC_API_KEY and OPENWEATHER_API_KEY as environment variables in a .env file and load them with python-dotenv. This keeps secrets out of source control.
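python-dotenv’s load_dotenv() does the heavy lifting here; for the curious, its core behaviour can be sketched in a few stdlib lines (a simplified illustration, not a replacement for the library):

```python
import os

def load_env(path=".env"):
    """Load KEY=VALUE pairs from a .env file into os.environ.

    Skips blank lines and comments; never overwrites variables that are
    already set, mirroring python-dotenv's default behaviour.
    """
    try:
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    os.environ.setdefault(key.strip(), value.strip())
    except FileNotFoundError:
        pass  # no .env file is fine; fall back to real environment variables
```

Call it once at startup, before any os.getenv lookups, so ANTHROPIC_API_KEY and OPENWEATHER_API_KEY are in place when the modules that read them are imported.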

Advanced Plugin: Sentiment Analyzer with Claude

Now let’s build a more sophisticated plugin that performs sentiment analysis on arbitrary text. Instead of calling an external API, we’ll leverage Claude itself as a “function‑as‑a‑service”: the plugin forwards the user’s text back to Claude with a system prompt that constrains the output to a strict sentiment schema.

Why Use Claude Inside a Plugin?

Sometimes you need a focused, low‑latency call for a specific sub‑task (e.g., classification) while the main conversation runs on a larger model. By wrapping Claude in a micro‑service, you gain:

  • Clear separation of concerns – the main conversation stays focused, while the plugin handles classification.
  • Reusability – any client (not just Claude) can call the sentiment endpoint.
  • Version control – you can pin a specific Claude model version for the plugin, independent of the conversation model.

Manifest for Sentiment Plugin

Save this as manifest_sentiment.json. Note that auth.type is set to service_http because the plugin requires an internal API key from external callers.

{
  "schema_version": "v1",
  "name_for_human": "Sentiment Analyzer",
  "name_for_model": "sentiment_analyzer",
  "description_for_human": "Detects positive, neutral, or negative sentiment in a piece of text.",
  "description_for_model": "Returns a sentiment label and confidence score for the provided text.",
  "auth": {
    "type": "service_http",
    "authorization_type": "bearer",
    "verification_tokens": {
      "anthropic": "YOUR_ANTHROPIC_VERIFICATION_TOKEN"
    }
  },
  "api": {
    "type": "openapi",
    "url": "http://localhost:8001/openapi.json",
    "has_user_authentication": false
  },
  "logo_url": "https://example.com/sentiment.png",
  "contact_email": "support@example.com",
  "legal_info_url": "https://example.com/legal"
}

Endpoint Implementation

In a new folder sentiment_app/, create main.py and routes.py. The route will accept a POST request with raw text, call Claude, and return a structured sentiment payload.

# sentiment_app/routes.py
from fastapi import APIRouter, HTTPException, Header
import os
from anthropic import Anthropic

router = APIRouter()
client = Anthropic(api_key=os.getenv("ANTHROPIC_API_KEY"))

EXPECTED_TOKEN = os.getenv("PLUGIN_SERVICE_TOKEN")

def _verify_token(authorization: str | None):
    if not authorization or not authorization.startswith("Bearer "):
        raise HTTPException(status_code=401, detail="Missing Bearer token")
    token = authorization.split(" ", 1)[1]
    if token != EXPECTED_TOKEN:
        raise HTTPException(status_code=403, detail="Invalid token")

@router.post("/analyze", summary="Analyze sentiment of a text")
async def analyze_sentiment(
    payload: dict,
    authorization: str | None = Header(None)
):
    _verify_token(authorization)

    text = payload.get("text")
    if not text:
        raise HTTPException(status_code=400, detail="Field 'text' is required")

    # Prompt Claude to output JSON with a strict schema
    system_prompt = (
        "You are a sentiment classifier. Return a JSON object with two keys: "
        "'label' (one of 'positive', 'neutral', 'negative') and "
        "'confidence' (float between 0 and 1). Do not add any extra text."
    )
    try:
        response = client.messages.create(
            model="claude-3-opus-20240229",
            max_tokens=64,
            system=system_prompt,
            messages=[{"role": "user", "content": text}]
        )
        # Claude returns a string; attempt to parse as JSON
        import json
        result = json.loads(response.content[0].text)
        return result
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

# sentiment_app/main.py
from fastapi import FastAPI
from .routes import router as sentiment_router

app = FastAPI(
    title="Sentiment Analyzer MCP",
    description="Wraps Claude to provide deterministic sentiment analysis."
)

app.include_router(sentiment_router, prefix="/api")

Run the service on a different port to avoid clashes:

uvicorn sentiment_app.main:app --reload --port 8001
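Models occasionally wrap their JSON in prose or code fences despite the system prompt. Before the json.loads call in /analyze, a small defensive extractor (a hypothetical helper, not part of any SDK) cheaply improves robustness:

```python
import json
import re

def parse_model_json(text: str) -> dict:
    """Parse a JSON object from model output, tolerating fences and prose."""
    # Fast path: the whole reply is already valid JSON.
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        pass
    # Fallback: grab the outermost {...} span and parse that instead.
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if not match:
        raise ValueError("No JSON object found in model output")
    return json.loads(match.group(0))

print(parse_model_json('Sure! ```json\n{"label": "positive", "confidence": 0.94}\n```'))
# → {'label': 'positive', 'confidence': 0.94}
```

Swapping json.loads(response.content[0].text) for parse_model_json(response.content[0].text) keeps the endpoint working even when the model adds a sentence around its answer.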

Calling the Sentiment Plugin from Claude

Now that the plugin is live, we need to let Claude know how to call it. In the main conversation (which might run on a larger model), declare the tool in the tools list of your request. Claude will see the analyze_sentiment schema and decide whether to invoke it.

def ask_claude_to_classify(text: str):
    response = client.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=256,
        messages=[{"role": "user", "content": f"Classify this sentence: \"{text}\""}],
        tools=[{
            "name": "analyze_sentiment",
            "description": "Classify sentiment of a given text",
            "input_schema": {
                "type": "object",
                "properties": {
                    "text": {"type": "string", "description": "The text to classify"}
                },
                "required": ["text"]
            }
        }]
    )
    return response

print(ask_claude_to_classify("I love the new design!"))

Claude will emit a tool_use payload, your sentiment service validates the bearer token, forwards the text to Claude (the internal call), and returns JSON with label and confidence. Claude then incorporates that structured data into its final answer, e.g., “The sentiment is **positive** with confidence 0.94.”

Pro tip: Keep the internal Claude model version stable (e.g., pin to claude-3-opus-20240229) to guarantee reproducible sentiment scores across deployments.

Real‑World Use Cases & Best Practices

Now that you have two working plugins, let’s explore scenarios where MCP tools shine.

  • Customer Support Automation: Combine a knowledge‑base lookup plugin with a sentiment analyzer to triage tickets. Claude can fetch relevant articles, gauge user frustration, and suggest escalation paths—all without human intervention.
  • Financial Data Enrichment: Build a plugin that pulls live market data, then let Claude generate natural‑language summaries or risk assessments for portfolio managers.
  • Content Moderation Pipelines: Use a profanity‑filter plugin followed by a sentiment plugin to flag potentially harmful user‑generated content before it reaches a public forum.

When scaling plugins to production, keep these engineering considerations in mind:

  1. Rate Limiting & Quotas: Protect third‑party APIs (e.g., weather services) with request throttling; FastAPI’s Depends can integrate slowapi for per‑route rate limits.
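Library choices aside, the core idea behind throttling is simple. Here is a dependency‑free token‑bucket sketch (illustrative only; a production service should prefer a maintained library such as slowapi or an API gateway):

```python
import time

class TokenBucket:
    """Allow bursts up to `capacity` requests, refilling at `rate` tokens/second."""

    def __init__(self, capacity: float, rate: float):
        self.capacity = capacity
        self.rate = rate
        self.tokens = capacity
        self.updated = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(capacity=2, rate=1.0)  # burst of 2, then 1 request/second
print([bucket.allow() for _ in range(3)])  # → [True, True, False]
```

Keyed per client (for instance by IP address in a dict of buckets), a check like this can sit inside a FastAPI dependency and return HTTP 429 when allow() is False.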