
# A-MEM: Self-evolving memory for coding agents

mcp-name: io.github.DiaaAj/a-mem-mcp
A-MEM is a self-evolving memory system for coding agents. Unlike simple vector stores, A-MEM automatically organizes knowledge into a Zettelkasten-style graph with dynamic relationships. Memories don't just get stored—they evolve and connect over time.
Currently tested with Claude Code. Support for other MCP-compatible agents is planned.
## Quick Start

### Install

```bash
pip install a-mem
```
### Add to Claude Code

```bash
claude mcp add a-mem -s user -- a-mem-mcp \
  -e LLM_BACKEND=openai \
  -e LLM_MODEL=gpt-4o-mini \
  -e OPENAI_API_KEY=sk-...
```
That's it! A session-start hook installs automatically to remind Claude to use memory.

> **Note:** Memory is stored per-project in `./chroma_db`. For global memory across all projects, see Memory Scope.
### Uninstall

```bash
a-mem-uninstall-hook  # Remove hooks first
pip uninstall a-mem
```
## How It Works

```
  t=0          t=1           t=2

   ◉          ◉───◉         ◉───◉
              ◉             │ ╱ │ ╲
                            ◉──┼──◉
                               │
                               ◉
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━▶
        self-evolving memory
```
- Add a memory → A-MEM extracts keywords, context, and tags via LLM
- Find neighbors → Searches for semantically similar existing memories
- Evolve → Decides whether to link, strengthen connections, or update related memories
- Store → Persists to ChromaDB with full metadata and relationships
The result: a knowledge graph that grows smarter over time, not just bigger.
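The loop above can be sketched in miniature. This is a toy illustration only: the real system uses an LLM for extraction and ChromaDB for storage, whereas here "keywords" are plain lowercased words, "similarity" is Jaccard overlap, and every name below is hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Note:
    id: int
    content: str
    keywords: set = field(default_factory=set)
    links: list = field(default_factory=list)

class ToyMemory:
    def __init__(self, link_threshold=0.2):
        self.notes = []
        self.link_threshold = link_threshold

    def add_note(self, content):
        # 1. "Extract" keywords (stand-in for the LLM extraction step)
        note = Note(id=len(self.notes), content=content,
                    keywords=set(content.lower().split()))
        # 2. Find semantically similar neighbors among existing notes
        for other in self.notes:
            overlap = note.keywords & other.keywords
            union = note.keywords | other.keywords
            sim = len(overlap) / len(union) if union else 0.0
            # 3. Evolve: link both notes when similarity crosses the threshold
            if sim >= self.link_threshold:
                note.links.append(other.id)
                other.links.append(note.id)
        # 4. Store the note with its relationships
        self.notes.append(note)
        return note.id

mem = ToyMemory()
a = mem.add_note("auth uses jwt cookies")
b = mem.add_note("jwt cookies validated by middleware")
c = mem.add_note("database uses connection pooling")
```

The key property carries over: adding note `b` retroactively changes note `a` by giving it a new link, which is what separates this from an append-only vector store.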
## Features

**Self-Evolving Memory.** Memories aren't static. When you add new knowledge, A-MEM automatically finds related memories, strengthens their connections, updates their context, and evolves their tags.

**Semantic + Structural Search.** Combines vector similarity with graph traversal: find memories by meaning, then explore their connections.
## MCP Tools

A-MEM exposes six tools to your coding agent:

| Tool | Description |
|---|---|
| `add_memory_note` | Store new knowledge (async, returns immediately) |
| `search_memories` | Semantic search across all memories |
| `search_memories_agentic` | Search + follow graph connections |
| `read_memory_note` | Get full details of a specific memory |
| `update_memory_note` | Modify an existing memory |
| `delete_memory_note` | Remove a memory |
### Example Usage

```python
# The agent calls these automatically, but here's what happens:

# Store a memory (returns a task_id immediately)
add_memory_note(content="Auth uses JWT in httpOnly cookies, validated by AuthMiddleware")

# Search later
search_memories(query="authentication flow", k=5)

# Deep search with connections
search_memories_agentic(query="security", k=5)
```
## Advanced Configuration

### JSON Config

For more control, edit `~/.claude/settings.json` (global) or `.claude/settings.local.json` (project):

```json
{
  "mcpServers": {
    "a-mem": {
      "command": "a-mem-mcp",
      "env": {
        "LLM_BACKEND": "openai",
        "LLM_MODEL": "gpt-4o-mini",
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```
### Environment Variables

| Variable | Description | Default |
|---|---|---|
| `LLM_BACKEND` | `openai`, `ollama`, `sglang`, or `openrouter` | `openai` |
| `LLM_MODEL` | Model name | `gpt-4o-mini` |
| `OPENAI_API_KEY` | OpenAI API key | — |
| `EMBEDDING_MODEL` | Sentence-transformer model | `all-MiniLM-L6-v2` |
| `CHROMA_DB_PATH` | Storage directory | `./chroma_db` |
| `EVO_THRESHOLD` | Evolution trigger threshold | `100` |
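As a sketch of how these variables compose, the lookup-with-default pattern can be mirrored in a few lines. The variable names and defaults come from the table above, but the `load_config` helper itself is hypothetical, not the server's actual code:

```python
import os

# Documented variables and their defaults (from the table above).
DEFAULTS = {
    "LLM_BACKEND": "openai",
    "LLM_MODEL": "gpt-4o-mini",
    "EMBEDDING_MODEL": "all-MiniLM-L6-v2",
    "CHROMA_DB_PATH": "./chroma_db",
    "EVO_THRESHOLD": "100",
}

def load_config(environ=os.environ):
    # Environment values win; anything unset falls back to the default.
    cfg = {key: environ.get(key, default) for key, default in DEFAULTS.items()}
    cfg["EVO_THRESHOLD"] = int(cfg["EVO_THRESHOLD"])  # threshold is numeric
    return cfg

cfg = load_config({"LLM_BACKEND": "ollama"})  # override just one variable
```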
### Memory Scope

- **Project-specific** (default): each project gets isolated memory in `./chroma_db`
- **Global**: share across projects by setting `CHROMA_DB_PATH=~/.local/share/a-mem/chroma_db`
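If you drive A-MEM from Python rather than the CLI, the global path can be exported before anything starts the server. A minimal stdlib sketch, where only the `CHROMA_DB_PATH` name and the path come from this document:

```python
import os

# Expand "~" and export the shared store location for child processes.
# CHROMA_DB_PATH is the documented variable; the rest is standard library.
shared_store = os.path.expanduser("~/.local/share/a-mem/chroma_db")
os.environ["CHROMA_DB_PATH"] = shared_store
```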
## Alternative Backends

### Ollama (local, free)

```bash
claude mcp add a-mem -s user -- a-mem-mcp \
  -e LLM_BACKEND=ollama \
  -e LLM_MODEL=llama2
```

### OpenRouter (100+ models)

```bash
claude mcp add a-mem -s user -- a-mem-mcp \
  -e LLM_BACKEND=openrouter \
  -e LLM_MODEL=anthropic/claude-3.5-sonnet \
  -e OPENROUTER_API_KEY=sk-or-...
```
## Hook Management (Claude Code)

The session-start hook reminds Claude to use memory tools. It installs automatically with Claude Code, but you can manage it manually:

```bash
a-mem-install-hook    # Install/reinstall the hook
a-mem-uninstall-hook  # Remove the hook completely
```
## Python API

Use A-MEM directly in Python (works with any agent or application):

```python
from agentic_memory.memory_system import AgenticMemorySystem

memory = AgenticMemorySystem(
    llm_backend="openai",
    llm_model="gpt-4o-mini",
)

# Add (auto-generates keywords, tags, context)
memory_id = memory.add_note("FastAPI app uses dependency injection for DB sessions")

# Search
results = memory.search("database patterns", k=5)

# Read full details
note = memory.read(memory_id)
print(note.keywords, note.tags, note.links)
```
## Research

A-MEM implements concepts from the paper:

*A-MEM: Agentic Memory for LLM Agents*, Xu et al., 2025. arXiv:2502.12110