AI-powered chatbots are rapidly becoming a core part of modern applications. Whether you're building a support assistant, a learning tool, or a productivity bot, integrating an AI chatbot is now easier than ever. In this guide, we'll use FastAPI to build a simple web-based chatbot powered by OpenRouter's API, an OpenAI-compatible endpoint that offers free models. The same logic carries over to frameworks like Flask or Django.
🔧 Step 1: Install Required Packages
pip install fastapi uvicorn jinja2 aiofiles openai python-dotenv
This installs FastAPI and Uvicorn for the backend, Jinja2 for templating, aiofiles for async file handling, the `openai` client for API integration, and `python-dotenv` for loading the API key from a .env file.
📁 Step 2: Project Structure
.
├── main.py
├── templates/
│   └── chat.html
├── static/
└── .env
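If you prefer to scaffold this layout from a script instead of creating it by hand, a minimal sketch using only the standard library could look like the following (the file and folder names match the tree above; the script name scaffold.py is just a suggestion):

# scaffold.py - creates the project layout shown above
from pathlib import Path

# Create the folders the app expects
for folder in ("templates", "static"):
    Path(folder).mkdir(exist_ok=True)

# Create empty placeholder files to fill in during the next steps
for file in ("main.py", "templates/chat.html", ".env"):
    Path(file).touch(exist_ok=True)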
🔐 Step 3: Store OpenRouter API Key
Create a file named .env in the project root and add your free API key from OpenRouter:
OPENROUTER_API_KEY=your_openrouter_api_key_here
You can get a key from https://openrouter.ai (requires login).
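Before wiring the key into the app, you can confirm that python-dotenv actually picks it up. A quick, throwaway check (the script name check_env.py is just an example):

# check_env.py - sanity-check that the OpenRouter key loads from .env
import os
from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env into the environment
key = os.getenv("OPENROUTER_API_KEY")

if not key:
    raise SystemExit("OPENROUTER_API_KEY is missing - check your .env file")
print("Key loaded, starts with:", key[:8] + "...")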
🧠 Step 4: Create main.py with FastAPI Chat Logic
from fastapi import FastAPI, Request, Form
from fastapi.responses import HTMLResponse
from fastapi.templating import Jinja2Templates
from fastapi.staticfiles import StaticFiles
from openai import OpenAI
from dotenv import load_dotenv
import os

# Load the OpenRouter key from .env and point the OpenAI client at OpenRouter
load_dotenv()
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.getenv("OPENROUTER_API_KEY"),
)

app = FastAPI()
templates = Jinja2Templates(directory="templates")
app.mount("/static", StaticFiles(directory="static"), name="static")

@app.get("/", response_class=HTMLResponse)
def index(request: Request):
    return templates.TemplateResponse("chat.html", {"request": request})

@app.post("/", response_class=HTMLResponse)
async def chat(request: Request, user_input: str = Form(...)):
    response = client.chat.completions.create(
        model="openchat/openchat-3.5",  # You can choose other models too
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_input},
        ],
    )
    reply = response.choices[0].message.content
    return templates.TemplateResponse("chat.html", {
        "request": request,
        "user_input": user_input,
        "reply": reply,
    })
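It can also help to verify the OpenRouter call in isolation before starting the web app. The sketch below assumes your .env key is valid and that the openchat/openchat-3.5 model is available to your account (the script name is arbitrary):

# test_openrouter.py - one-off check that the chat completion call works
import os
from dotenv import load_dotenv
from openai import OpenAI

load_dotenv()
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.getenv("OPENROUTER_API_KEY"),
)

response = client.chat.completions.create(
    model="openchat/openchat-3.5",
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
print(response.choices[0].message.content)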
💬 Step 5: Create the templates/chat.html Template
<!DOCTYPE html>
<html>
<head>
    <title>AI Chatbot</title>
    <style>
        body { font-family: Arial; padding: 2rem; }
        input[type="text"] { width: 70%; padding: 10px; }
        button { padding: 10px 20px; }
        .msg { margin-top: 1rem; }
    </style>
</head>
<body>
    <h2>AI Chatbot using OpenRouter API</h2>
    <form method="post">
        <input type="text" name="user_input" placeholder="Type your message..." required />
        <button type="submit">Send</button>
    </form>
    {% if reply %}
    <div class="msg">
        <p><strong>You:</strong> {{ user_input }}</p>
        <p><strong>AI:</strong> {{ reply }}</p>
    </div>
    {% endif %}
</body>
</html>
🚀 Step 6: Run Your Chatbot
uvicorn main:app --reload
Navigate to http://127.0.0.1:8000 and start chatting with the AI!
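With the server running, you can also exercise the form endpoint from a second terminal. Here is a small sketch using the requests library (not part of the Step 1 install, so add it with pip install requests if needed):

# post_test.py - send a message to the running chatbot and print part of the response
import requests

resp = requests.post(
    "http://127.0.0.1:8000/",
    data={"user_input": "What is FastAPI?"},  # must match the form field name in chat.html
    timeout=60,
)
resp.raise_for_status()
print(resp.text[:500])  # the rendered chat.html, which contains the AI reply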
📌 Notes:
- You can switch to other models, such as mistralai/mixtral or anthropic/claude-3-sonnet, by changing the model name passed to client.chat.completions.create().
- Because OpenRouter exposes an OpenAI-compatible API, the only differences from a standard OpenAI setup are the base_url and the api_key.
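If you plan to switch models often, one option is to read the model name from the environment instead of hard-coding it. A sketch of how the chat completion call in main.py could change, assuming you add an OPENROUTER_MODEL line to .env (the variable name is just a suggestion):

# In main.py: pick the model from .env, falling back to the default used above
MODEL = os.getenv("OPENROUTER_MODEL", "openchat/openchat-3.5")

response = client.chat.completions.create(
    model=MODEL,
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": user_input},
    ],
)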
✅ Conclusion
You've now built a lightweight AI chatbot using FastAPI and OpenRouter. This setup is great for learning and can be extended with features like session history, authentication, or voice input. With OpenRouter's free API keys and FastAPI's speed, you can create smart assistants with minimal setup.
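As one example of those extensions, session history can start as simply as keeping the running conversation in a list inside main.py. This rough sketch is fine for a single local user; a real app would store history per user session:

# In main.py: keep the running conversation so the model sees earlier turns
history = [{"role": "system", "content": "You are a helpful assistant."}]

@app.post("/", response_class=HTMLResponse)
async def chat(request: Request, user_input: str = Form(...)):
    history.append({"role": "user", "content": user_input})
    response = client.chat.completions.create(
        model="openchat/openchat-3.5",
        messages=history,  # send the whole conversation each turn
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return templates.TemplateResponse("chat.html", {
        "request": request,
        "user_input": user_input,
        "reply": reply,
    })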