Every new session, your AI starts from zero. The decisions you made yesterday, the patterns you taught it, the bugs you already solved together - gone. You re-explain. It re-discovers. You lose hours every week to an assistant with no long-term memory.
MAG fixes this. It gives your AI tools persistent memory that survives across sessions, across projects, and across tools. Open Claude on Monday. It already knows what you decided on Friday.
## Quick Start
```sh
curl -fsSL https://raw.githubusercontent.com/George-RD/mag/main/install.sh | sh
```
That's it. The installer auto-detects your AI tools and wires MAG in automatically — no manual JSON editing needed. Open your coding tool and your assistant already has persistent memory.
To store or search memories directly from the CLI (subcommand names below are illustrative; run `mag --help` for the exact interface):

```sh
mag store "The retry logic should use exponential backoff with jitter"
mag search "how should retries back off?"
# → "The retry logic should use exponential backoff with jitter" (score: 0.94)
```
## Why Developers Choose MAG
- Switch from Claude to Cursor. Your context came with you. One memory store works across every MCP-compatible tool on your machine.
- Finds what you stored, even when you search differently than how you saved it. Hybrid retrieval fuses full-text, semantic, and graph search so you don't need exact wording.
- No accounts. No intermediaries. No "free tier." A single Rust binary, a single SQLite file. Install it, use it, own it.
- Your client signed an NDA with you. Not with Mem0's infrastructure. Zero third-party data routing by default. Your memory data never touches servers you don't control.
- Your memory, not your vendor's. Claude, Cursor, and ChatGPT are all building their own memory - but it stays inside their tool. MAG bridges all of them. One memory store, every tool, portable forever.
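The hybrid retrieval mentioned above can be sketched as rank fusion over the three result lists. This is an illustrative toy, not MAG's actual pipeline; the `fuse` function, the `k` constant, and the memory IDs are all invented for the example:

```python
from collections import defaultdict

def fuse(rankings, k=60.0):
    """Reciprocal Rank Fusion: merge ranked lists from several retrievers.

    Each list contributes 1 / (k + rank) per item, so memories that sit
    near the top of several lists accumulate the highest fused score.
    """
    scores = defaultdict(float)
    for ranking in rankings:
        for rank, memory_id in enumerate(ranking):
            scores[memory_id] += 1.0 / (k + rank + 1)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

full_text = ["m2", "m1"]        # exact-word matches
semantic = ["m1", "m3"]         # embedding nearest neighbors
graph = ["m1", "m2", "m3"]      # memories linked to the query's entities
fused = fuse([full_text, semantic, graph])
print(fused[0][0])  # → "m1" — it ranks high in all three lists
```

The practical upshot is the claim in the list above: a query only needs to hit one of the three retrievers well for the right memory to surface.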
## Works With
| Tool | macOS | Linux | Auto-configured |
|---|---|---|---|
| Claude Code | ✅ | ✅ | `mag setup` |
| Claude Desktop | ✅ | ✅ | `mag setup` |
| Cursor | ✅ | ✅ | `mag setup` |
| VS Code + Copilot | ✅ | ✅ | `mag setup` |
| Windsurf | ✅ | ✅ | `mag setup` |
| Cline | ✅ | ✅ | `mag setup` |
| Gemini CLI | ✅ | ✅ | `mag setup` |
| Zed | ✅ | ✅ | Manual |
| Codex (OpenAI) | ✅ | ✅ | `mag setup` |
Any tool that supports MCP can connect to MAG. Windows is untested - report your results.
## Benchmarks
90.1% retrieval accuracy on the LoCoMo memory benchmark. Don't trust our number - reproduce it yourself. AutoMem's published score on the same benchmark is 90.5%. Full methodology, model comparisons, and historical runs are in docs/benchmarks/.
## Your Data, Your Control
Your memory data never touches third-party servers. Not ours, not anyone else's. Same guarantee whether you run MAG on your laptop or deploy it on your own infrastructure.
- Zero third-party data routing (API embedding models are optional, off by default)
- Single SQLite file, portable and inspectable
- Export everything with one command. Open it with any SQLite browser.
- MIT licensed, no tiers, no vendor lock-in
- The binary keeps working whether we maintain this project or not
See SECURITY.md for the full data-flow audit.
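To make the single-SQLite-file claim concrete: any SQLite client can open the store. A toy sketch with Python's stdlib `sqlite3` (the table name and schema here are invented for illustration and are not MAG's actual schema):

```python
import sqlite3

# Stand-in for MAG's database file; the real path and schema may differ.
# The point is that plain SQLite tooling is all you need to inspect it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE memories (id INTEGER PRIMARY KEY, content TEXT)")
conn.execute(
    "INSERT INTO memories (content) VALUES (?)",
    ("Retries use exponential backoff with jitter",),
)
rows = conn.execute("SELECT content FROM memories").fetchall()
print(rows[0][0])  # the memory comes back with an ordinary SQL query
```

The same portability holds for any SQLite browser or `sqlite3` CLI session pointed at the exported file.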
## Deploy Your Way
| Mode | Description |
|---|---|
| Local (default) | Single binary on your machine. Zero config. |
| Daemon | `mag serve --http` for persistent HTTP access. Same binary, same file. |
| Self-hosted | Deploy on your own server or cloud. Same privacy guarantees at scale. |
| MAG Cloud | Coming soon. We run the infrastructure. You own the data. Same guarantees. |
Every mode: zero third-party data access, full data portability, MIT licensed.
## Install
| Method | Command |
|---|---|
| Shell (macOS / Linux) | `curl -fsSL https://raw.githubusercontent.com/George-RD/mag/main/install.sh \| sh` |
| Homebrew | `brew install George-RD/mag/mag` |
| npm | `npm install -g mag-memory` |
| uv | `uv tool install mag-memory` |
| pip | `pip install mag-memory` |
| Cargo | `cargo install mag-memory` |
From source (latest main): `cargo install --git https://github.com/George-RD/mag.git`
Prebuilt binaries: macOS (x64, ARM), Linux (x64, ARM), Windows (x64) on the Releases page.
## Configure Your AI Tools
The installer runs `mag setup` automatically. To reconfigure at any time:

```sh
mag setup
```

This detects installed AI tools, shows their configuration status, and writes the correct MCP config for each one. Use `--non-interactive` for CI or scripted environments.
Let your AI set it up: paste this into any AI assistant and it will handle install and configuration for you:

```
Install and configure MAG by following https://github.com/George-RD/mag/blob/main/docs/SETUP.md
```
MAG runs as an MCP server. Add it to your client's config file:
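For clients that store servers under the common `mcpServers` JSON key (Claude Desktop, Cursor, and others use this layout), an entry along these lines should work; the server name `mag` is arbitrary, and your client's config filename varies by tool:

```json
{
  "mcpServers": {
    "mag": {
      "command": "mag",
      "args": ["serve"]
    }
  }
}
```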
Claude Code: `claude mcp add mag -- mag serve`
npx (no install):
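Assuming the `mag-memory` npm package's binary accepts the same `serve` subcommand (unverified here), an `npx`-based entry would look like:

```json
{
  "mcpServers": {
    "mag": {
      "command": "npx",
      "args": ["-y", "mag-memory", "serve"]
    }
  }
}
```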
Per-tool setup guides: Claude Desktop | Cursor | Claude Code | Windsurf | Cline
## Learn More
- MCP Tools - all 16 tools MAG exposes over MCP
- Architecture - search pipeline, scoring, modules
- Benchmarks - full results, model comparisons, methodology
- Security - data-flow audit, threat model
- What to Store - get the most out of persistent memory
- Setup Guide - install, configure, and best practices
- Per-Tool Guides - detailed per-tool configuration
- Changelog - recent changes
- AGENTS.md - conventions, development commands
## License
MIT