Supercharge your LangGraph agents with plug-and-play memory, context management, and chat persistence
Install LangMiddle and start building production-ready agents
pip install langmiddle
Works out of the box with in-memory SQLite - no database setup required
JWT auth, RLS support, type-safe APIs, comprehensive logging
LangMiddle Portal provides a powerful web interface to manage your agent's data
Browse, search, and manage all your AI conversation threads. Multi-backend support: SQLite, PostgreSQL, Supabase, Firebase.
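Conceptually, a chat thread store is just rows keyed by thread and user. The sketch below illustrates the idea with Python's built-in `sqlite3`; the schema and column names are assumptions for illustration, not LangMiddle's actual tables:

```python
import sqlite3

# Illustrative schema only -- LangMiddle's real table layout may differ.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE messages (
        thread_id  TEXT,
        user_id    TEXT,
        role       TEXT,
        content    TEXT,
        created_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")

conn.execute(
    "INSERT INTO messages (thread_id, user_id, role, content) VALUES (?, ?, ?, ?)",
    ("conversation-123", "user-456", "user", "Hello!"),
)

# Browsing a thread is then a simple filtered query.
rows = conn.execute(
    "SELECT role, content FROM messages WHERE thread_id = ?",
    ("conversation-123",),
).fetchall()
print(rows)  # [('user', 'Hello!')]
```

Swapping SQLite for PostgreSQL, Supabase, or Firebase changes the driver, not the shape of the data.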
Automatic fact extraction, deduplication, and context injection. Vector-based search with relevance scoring and adaptive formatting.
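"Vector-based search with relevance scoring" can be sketched with plain cosine similarity: embed stored facts and the current query, then rank facts by how closely their vectors align. This is a conceptual sketch with toy vectors, not LangMiddle's implementation:

```python
import math

def cosine_similarity(a, b):
    """Score two embedding vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy "fact" embeddings; a real system uses model-generated vectors.
facts = {
    "user prefers dark mode": [0.9, 0.1, 0.0],
    "user lives in Berlin":   [0.1, 0.9, 0.2],
}
query = [0.8, 0.2, 0.1]  # embedding of the current conversation turn

# Rank stored facts by relevance; the top ones get injected into context.
ranked = sorted(facts, key=lambda f: cosine_similarity(facts[f], query), reverse=True)
print(ranked[0])  # -> user prefers dark mode
```

Deduplication follows the same idea in reverse: a new fact whose vector is near-identical to a stored one is merged rather than inserted.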
JWT authentication, row-level security (RLS), type-safe APIs with full Pydantic validation, and comprehensive logging.
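"Type-safe APIs with full Pydantic validation" means malformed payloads are rejected with structured errors before they ever reach storage. A minimal sketch of that pattern; the model and field names below are hypothetical, not LangMiddle's actual schema:

```python
from pydantic import BaseModel, ValidationError

class SaveMessageRequest(BaseModel):
    """Hypothetical request model -- fields are illustrative only."""
    thread_id: str
    user_id: str
    content: str

# Valid payloads parse into fully typed objects...
req = SaveMessageRequest(
    thread_id="conversation-123", user_id="user-456", content="Hello!"
)

# ...while incomplete ones raise a structured ValidationError.
try:
    SaveMessageRequest(thread_id="conversation-123", user_id="user-456")
except ValidationError as e:
    print("rejected:", len(e.errors()), "error(s)")  # rejected: 1 error(s)
```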
Add middleware to your LangGraph agent and start building
from langchain.agents import create_agent
from langmiddle.history import ChatSaver, StorageContext

agent = create_agent(
    model="openai:gpt-4o",
    tools=[],
    context_schema=StorageContext,
    middleware=[
        ChatSaver(),  # Uses in-memory SQLite by default
    ],
)

# Chat history automatically saved!
agent.invoke(
    input={"messages": [{"role": "user", "content": "Hello!"}]},
    context=StorageContext(
        thread_id="conversation-123",
        user_id="user-456",
    ),
)