🧩 LangMiddle
Production Middleware for LangGraph

Supercharge your LangGraph agents with plug‑and‑play memory, context management, and chat persistence

Get Started in 30 Seconds

Install LangMiddle and start building production-ready agents

Terminal
pip install langmiddle

Zero Config Start

Works out of the box with in-memory SQLite; no database setup required

Production Ready

JWT auth, RLS support, type-safe APIs, comprehensive logging

Everything You Need for Production AI Agents

LangMiddle Portal provides a powerful web interface to manage your agent's data

Persistent Chat History

Browse, search, and manage all your AI conversation threads. Multi-backend support: SQLite, PostgreSQL, Supabase, Firebase.
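LangMiddle's `ChatSaver` handles persistence for you; for intuition, here is a minimal sketch of the underlying pattern using only the standard-library `sqlite3` module. The table name, columns, and helper functions below are illustrative, not LangMiddle's actual schema or API.

```python
import sqlite3

# Illustrative schema -- not LangMiddle's internal layout.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE messages (thread_id TEXT, user_id TEXT, role TEXT, content TEXT)"
)

def save_message(thread_id: str, user_id: str, role: str, content: str) -> None:
    """Append one chat message to its thread."""
    conn.execute(
        "INSERT INTO messages VALUES (?, ?, ?, ?)",
        (thread_id, user_id, role, content),
    )
    conn.commit()

def load_thread(thread_id: str) -> list[tuple[str, str]]:
    """Return (role, content) pairs for a thread, in insertion order."""
    rows = conn.execute(
        "SELECT role, content FROM messages WHERE thread_id = ? ORDER BY rowid",
        (thread_id,),
    )
    return rows.fetchall()

save_message("conversation-123", "user-456", "user", "Hello!")
save_message("conversation-123", "user-456", "assistant", "Hi there!")
print(load_thread("conversation-123"))
```

Swapping the connection for PostgreSQL, Supabase, or Firebase is what the multi-backend support abstracts away.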

Semantic Memory

Automatic fact extraction, deduplication, and context injection. Vector-based search with relevance scoring and adaptive formatting.
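Conceptually, vector-based search ranks stored facts by how close their embeddings sit to the query embedding. The sketch below shows the idea with cosine similarity over toy 2-d vectors; the function names and data shapes are illustrative, not LangMiddle's API.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Standard cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rank_facts(query_vec: list[float], facts: list[dict], top_k: int = 2) -> list[str]:
    """Return the top_k stored facts most relevant to the query vector."""
    scored = sorted(
        facts,
        key=lambda f: cosine_similarity(query_vec, f["vec"]),
        reverse=True,
    )
    return [f["text"] for f in scored[:top_k]]

# Toy 2-d embeddings standing in for real model embeddings.
facts = [
    {"text": "User prefers dark mode", "vec": [1.0, 0.0]},
    {"text": "User lives in Berlin",   "vec": [0.0, 1.0]},
    {"text": "User likes dark themes", "vec": [0.9, 0.1]},
]
print(rank_facts([1.0, 0.0], facts, top_k=2))
```

The same similarity score also drives deduplication: two facts above a similarity threshold can be treated as one.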

Production Security

JWT authentication, row-level security (RLS), type-safe APIs with full Pydantic validation, and comprehensive logging.

Simple Setup, Powerful Results

Add middleware to your LangGraph agent and start building

Basic Usage
from langchain.agents import create_agent
from langmiddle.history import ChatSaver, StorageContext

agent = create_agent(
    model="openai:gpt-4o",
    tools=[],
    context_schema=StorageContext,
    middleware=[
        ChatSaver()  # Uses in-memory SQLite by default
    ],
)

# Chat history automatically saved!
agent.invoke(
    input={"messages": [{"role": "user", "content": "Hello!"}]},
    context=StorageContext(
        thread_id="conversation-123",
        user_id="user-456"
    )
)

Manage Your Agent Data

Access your dashboard to view conversations, manage semantic memory, and track your agent's interactions.