MCP Memory Server for Claude Desktop, Cursor & Windsurf
Dakera exposes 83 MCP tools that give your AI assistant persistent memory across every conversation. Self-hosted, encrypted, zero code changes.
How It Works
Run Dakera
Pull the Docker image or download the single Rust binary. Dakera runs on port 3300 (REST) and exposes MCP over stdio.
docker run -d -p 3300:3300 ghcr.io/dakera-ai/dakera:latest
Add to Your MCP Config
Point your Claude Desktop, Cursor, or Windsurf MCP configuration to the Dakera binary. No SDKs required.
{
  "mcpServers": {
    "dakera": {
      "command": "dakera",
      "args": ["mcp"],
      "env": {
        "DAKERA_URL": "http://localhost:3300",
        "DAKERA_API_KEY": "dk-your-key"
      }
    }
  }
}
Your AI Remembers Everything
Your assistant automatically stores and recalls memories. Preferences, decisions, project context — all persisted across sessions with hybrid retrieval (HNSW vector + BM25 full-text + cross-encoder reranking).
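Reciprocal Rank Fusion, which merges the HNSW and BM25 result lists before reranking, is a simple rank-based formula. A minimal sketch of standard RRF (the `k = 60` constant and the memory IDs are illustrative, not Dakera internals):

```python
def rrf_fuse(rankings, k=60):
    # Each document scores sum(1 / (k + rank)) over every ranked list
    # it appears in; a higher fused score means an earlier merged rank.
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

vector_hits = ["m3", "m1", "m7"]  # e.g. HNSW nearest neighbours
bm25_hits = ["m1", "m9", "m3"]    # e.g. BM25 full-text matches
fused = rrf_fuse([vector_hits, bm25_hits])
# "m1" ranks first: it sits near the top of both lists.
```

A cross-encoder reranker would then rescore the top fused candidates against the query.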
Key Capabilities
| Capability | Details |
|---|---|
| MCP Tools | 83 tools: store, recall, search, graph traverse, sessions, namespaces, decay, vectors |
| Hybrid Retrieval | HNSW vector + BM25 full-text via Reciprocal Rank Fusion + cross-encoder reranking |
| On-Device Inference | ONNX runtime: MiniLM, BGE, E5 embedding models — no external API calls |
| Knowledge Graphs | GLiNER entity extraction, 4 edge types, BFS traversal |
| Memory Decay | 6 strategies: exponential, linear, logarithmic, step, periodic, custom |
| Security | AES-256-GCM encryption at rest, scoped API keys, rate limiting |
| Benchmark | 87.6% on LoCoMo (50 sessions, 1540 questions) |
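The decay strategies in the table are standard curves. As an illustration, two of the six can be sketched as follows (the half-life and lifetime parameters are assumptions, not Dakera defaults):

```python
def exponential_decay(importance, age_days, half_life_days=30.0):
    # Importance halves every half_life_days (half-life is illustrative).
    return importance * 0.5 ** (age_days / half_life_days)

def linear_decay(importance, age_days, lifetime_days=90.0):
    # Importance falls linearly to zero at lifetime_days, then stays there.
    return importance * max(0.0, 1.0 - age_days / lifetime_days)

print(exponential_decay(0.9, 30))  # 0.45: one half-life elapsed
print(linear_decay(0.9, 45))       # 0.45: halfway through the lifetime
```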
Supported MCP Clients
- Claude Desktop — native MCP support via claude_desktop_config.json
- Cursor — MCP settings in .cursor/mcp.json
- Windsurf — MCP configuration in workspace settings
- Any MCP-compatible client — standard stdio transport
Example: Store and Recall via MCP
Once configured, your AI assistant uses the MCP tools directly. Behind the scenes, the tools call the Dakera REST API:
# Store a memory via REST API
curl -X POST http://localhost:3300/v1/memory \
  -H "Authorization: Bearer dk-your-key" \
  -H "Content-Type: application/json" \
  -d '{
        "content": "User prefers dark mode and vim keybindings",
        "namespace": "preferences",
        "metadata": {"source": "conversation", "importance": 0.9}
      }'

# Recall relevant memories
curl -X POST http://localhost:3300/v1/memory/recall \
  -H "Authorization: Bearer dk-your-key" \
  -H "Content-Type: application/json" \
  -d '{
        "query": "What editor settings does the user prefer?",
        "namespace": "preferences",
        "top_k": 5
      }'
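The same two requests can be issued from a script. A minimal Python sketch using only the standard library, mirroring the curl calls above (the response shape isn't documented here, so the helpers simply return the parsed JSON):

```python
import json
import urllib.request

DAKERA_URL = "http://localhost:3300"
API_KEY = "dk-your-key"

def _post(path, payload):
    # POST a JSON body to the Dakera REST API with bearer auth.
    req = urllib.request.Request(
        DAKERA_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def store_memory(content, namespace, metadata=None):
    return _post("/v1/memory", {
        "content": content,
        "namespace": namespace,
        "metadata": metadata or {},
    })

def recall(query, namespace, top_k=5):
    return _post("/v1/memory/recall", {
        "query": query,
        "namespace": namespace,
        "top_k": top_k,
    })

# Usage (requires a running Dakera instance):
# store_memory("User prefers dark mode and vim keybindings", "preferences",
#              {"source": "conversation", "importance": 0.9})
# recall("What editor settings does the user prefer?", "preferences")
```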
Frequently Asked Questions
What is an MCP memory server?
An MCP memory server implements the Model Context Protocol to provide persistent memory to AI assistants like Claude Desktop, Cursor, and Windsurf. It stores, recalls, and manages memories across conversations without requiring code changes in your AI client.
How many MCP tools does Dakera provide?
Dakera exposes 83 MCP tools covering memory storage, recall, search, knowledge graphs, sessions, namespaces, decay management, and vector operations. All tools are available in Claude Desktop, Cursor, and Windsurf.
Do I need to write any code to use Dakera?
No. Dakera works as a native MCP server. You add it to your claude_desktop_config.json or Cursor/Windsurf MCP settings, and it immediately provides persistent memory to your AI assistant with zero code changes.
Where does my data live?
Dakera is fully self-hosted. All data stays on your machine or your own server. It runs as a single Rust binary with on-device ONNX inference for embeddings — nothing leaves your environment.
Give Your AI Persistent Memory in 2 Minutes
Run the Docker image, add the MCP config, done. No cloud, no code changes.
Get Started