# Codex Memory
[Rust](https://www.rust-lang.org/) · [PostgreSQL](https://www.postgresql.org/) · [License: MIT](https://opensource.org/licenses/MIT)
A high-performance, minimal memory storage service with MCP (Model Context Protocol) interface for Claude Desktop integration. Codex Memory provides reliable, deduplicated storage with automatic chunking for large content.
## Features
### Core Capabilities
- **Reliable Text Storage** - PostgreSQL-backed storage with ACID compliance
- **Content Deduplication** - SHA-256 hash-based automatic deduplication
- **Smart File Chunking** - Automatic chunking with configurable overlap for large files
- **Tag-Based Organization** - Flexible tagging system for categorization
- **Parent-Child Relationships** - Maintains relationships between chunks and source documents
- **MCP Integration** - Native Model Context Protocol support for Claude Desktop
### Technical Features
- **Connection Pooling** - Optimized connection management via a bounded connection pool
- **Async/Await Architecture** - Built on Tokio for high concurrency
- **Comprehensive Error Handling** - Proper Result types throughout
- **UTF-8 Safe Chunking** - Respects character boundaries in all operations
- **Full-Text Search** - PostgreSQL-powered search capabilities
- **Process Reliability** - Singleton process management with health monitoring
- **Graceful Shutdown** - SIGTERM/SIGINT handling with resource cleanup
- **Auto-Recovery** - Wrapper script with restart capabilities and rate limiting
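UTF-8-safe chunking with overlap, as listed above, can be sketched as follows. This is an illustrative helper, not the crate's actual API: it measures chunk size and overlap in characters rather than bytes, so multi-byte codepoints are never split.

```rust
/// Split `text` into chunks of at most `chunk_size` characters, where
/// consecutive chunks share `overlap` characters of trailing context.
/// Operating on `char`s (not bytes) keeps every chunk valid UTF-8.
fn chunk_text(text: &str, chunk_size: usize, overlap: usize) -> Vec<String> {
    assert!(overlap < chunk_size, "overlap must be smaller than chunk size");
    let chars: Vec<char> = text.chars().collect();
    let mut chunks = Vec::new();
    let mut start = 0;
    while start < chars.len() {
        let end = (start + chunk_size).min(chars.len());
        chunks.push(chars[start..end].iter().collect());
        if end == chars.len() {
            break;
        }
        start = end - overlap; // step back so adjacent chunks share context
    }
    chunks
}

fn main() {
    // Accented characters exercise the multi-byte path.
    let chunks = chunk_text("héllo wörld, chunking demo", 10, 3);
    println!("{} chunks", chunks.len());
}
```

The real implementation in `chunking.rs` may differ in details (byte budgets, boundary heuristics), but the invariant is the same: chunk boundaries always land on character boundaries.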
### Works With Codex-Dreams
- **Shared Database** - Uses the same PostgreSQL database for seamless integration
- **Complementary Roles** - Codex Memory for storage, Codex-Dreams for AI analysis
- **Zero Configuration** - Install both applications and they work together automatically
- **Enhanced Workflow** - Store with Codex Memory, analyze with Codex-Dreams
## Architecture
Codex follows a modular architecture focused on simplicity and reliability:
```
┌─────────────────────────────────────────────────────────────────┐
│                    Codex Memory Architecture                    │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  ┌─────────────────┐    ┌─────────────────┐    ┌─────────────┐  │
│  │  CLI Interface  │    │   MCP Server    │    │ PostgreSQL  │  │
│  │                 │    │                 │    │  Database   │  │
│  │ • store         │───►│ • JSON-RPC 2.0  │───►│             │  │
│  │ • get           │    │ • 5 MCP Tools   │    │ • Indexes   │  │
│  │ • stats         │    │ • stdio I/O     │    │ • ACID      │  │
│  │ • setup         │    │                 │    │             │  │
│  └─────────────────┘    └─────────────────┘    └─────────────┘  │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
```
### Companion Application: Codex-Dreams
Codex Memory works seamlessly with [**Codex-Dreams**](https://github.com/Ladvien/codex-dreams), a companion application that adds advanced cognitive processing capabilities:
- **Memory Insights**: Generate intelligent insights from stored memories using LLM analysis
- **Pattern Recognition**: Identify trends and connections across your stored content
- **Memory Consolidation**: Advanced memory tiering and summarization
- **Semantic Analysis**: Deep understanding and categorization of memories
Codex-Dreams reads from the same database as Codex Memory, creating a powerful two-tier system:
- **Codex Memory** (this project): Fast, reliable storage and retrieval
- **Codex-Dreams**: Advanced AI-powered analysis and insights
Get started with both: Install Codex Memory first, then add [Codex-Dreams](https://github.com/Ladvien/codex-dreams) for cognitive features.
## Installation
### Prerequisites
- Rust 1.70 or higher
- PostgreSQL 14 or higher
- Claude Desktop (optional, for MCP integration)
### Quick Start
```bash
# Clone the repository
git clone https://github.com/Ladvien/codex-memory.git
cd codex-memory
# Set up environment
cp .env.example .env
# Edit .env with your database credentials
# Install
cargo install --path . --force
# Setup database (creates database, user, and tables)
codex-memory setup
# Run MCP server for Claude Desktop
codex-memory mcp
```
### Environment Configuration
Create a `.env` file with:
```bash
DATABASE_URL=postgresql://codex_user:codex_pass@localhost:5432/codex_db
RUST_LOG=info # Optional: debug, info, warn, error
```
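The `DATABASE_URL` above decomposes as `scheme://user:password@host:port/dbname`. A quick, purely illustrative shell check that the host and port portion parses as expected:

```shell
DATABASE_URL="postgresql://codex_user:codex_pass@localhost:5432/codex_db"
hostport="${DATABASE_URL#*@}"   # drop everything through the credentials
hostport="${hostport%%/*}"      # drop the trailing database name
echo "$hostport"                # prints localhost:5432
```

If you have `psql` installed, `psql "$DATABASE_URL" -c 'SELECT 1;'` is a direct way to confirm the credentials actually work before running `codex-memory setup`.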
## Usage
### Command Line Interface
```bash
# Store content with metadata
codex-memory store "Your content here" \
--context "Meeting notes" \
--summary "Q4 planning discussion" \
--tags "meeting,planning,q4"
# Retrieve content by ID
codex-memory get <UUID>
# View storage statistics
codex-memory stats
# Run MCP server for Claude Desktop
codex-memory mcp
```
### Companion Workflow with Codex-Dreams
For the full cognitive memory experience, use both applications together:
```bash
# 1. Store memories with Codex Memory (fast, reliable)
codex-memory store "Research findings on neural networks" \
--context "AI Research" \
--summary "Key insights from latest papers" \
--tags "ai,research,neural-networks"
# 2. Generate insights with Codex-Dreams (AI-powered analysis)
codex-dreams generate-insights --time-period week
codex-dreams show-insights --limit 5
codex-dreams search-insights "neural networks" --limit 10
```
> **Workflow Tip**: Use Codex Memory for day-to-day storage and retrieval, then run Codex-Dreams periodically to generate insights and discover patterns across your stored memories.
### MCP Tools (Claude Desktop)
Codex provides 5 MCP tools:
| Tool | Description | Parameters |
|------|-------------|------------|
| `store_memory` | Store text with metadata | content, context, summary, tags |
| `get_memory` | Retrieve by ID | id (UUID) |
| `delete_memory` | Remove by ID | id (UUID) |
| `get_statistics` | Get storage stats | none |
| `store_file` | Chunk and store files | file_path, chunk_size, overlap, tags |
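Over stdio, each tool invocation is a standard JSON-RPC 2.0 request. The envelope below follows the MCP `tools/call` convention and mirrors the CLI example above; the exact `id` and argument values are illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "store_memory",
    "arguments": {
      "content": "Your content here",
      "context": "Meeting notes",
      "summary": "Q4 planning discussion",
      "tags": ["meeting", "planning", "q4"]
    }
  }
}
```

Claude Desktop constructs these requests for you; this shape is only relevant if you drive the server from your own MCP client.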
### Claude Desktop Configuration
Add to your Claude Desktop config:
```json
{
"mcpServers": {
"codex-memory": {
"command": "/path/to/codex-memory",
"args": ["mcp"],
"env": {
"DATABASE_URL": "postgresql://codex_user:codex_pass@localhost:5432/codex_db"
}
}
}
}
```
## API
### Rust API Example
```rust
use codex_memory::{Storage, Config, create_pool};
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create connection pool
    let config = Config::from_env()?;
    let pool = create_pool(&config.database_url).await?;
    let storage = Arc::new(Storage::new(pool));

    // Store content
    let id = storage.store(
        "Content to store",
        "Context information".to_string(),
        "Brief summary".to_string(),
        Some(vec!["tag1".to_string(), "tag2".to_string()]),
    ).await?;

    // Retrieve content
    if let Some(memory) = storage.get(id).await? {
        println!("Retrieved: {}", memory.content);
    }

    Ok(())
}
```
### Database Schema
```sql
CREATE TABLE memories (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    content TEXT NOT NULL,
    content_hash VARCHAR(64) NOT NULL UNIQUE,
    context TEXT NOT NULL,
    summary TEXT NOT NULL,
    metadata JSONB DEFAULT '{}',
    tags TEXT[] DEFAULT '{}',
    chunk_index INTEGER DEFAULT NULL,
    total_chunks INTEGER DEFAULT NULL,
    parent_id UUID DEFAULT NULL,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW()
);
```
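The B-tree and GIN indexes mentioned under Performance might look like the following. Index names here are illustrative; the authoritative definitions live in `migrations/`:

```sql
-- UUID lookups are served by the primary key; these support the other queries.
CREATE INDEX idx_memories_tags ON memories USING GIN (tags);        -- tag filtering
CREATE INDEX idx_memories_parent_id ON memories (parent_id);        -- chunk -> parent document
CREATE INDEX idx_memories_created_at ON memories (created_at DESC); -- recency queries
```

The `UNIQUE` constraint on `content_hash` is what enforces deduplication: storing identical content twice hits the same SHA-256 hash and is rejected at the database level.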
## Testing
```bash
# Run all tests
cargo test
# Run with output
cargo test -- --nocapture
# Run specific test suite
cargo test integration
cargo test unit
cargo test edge_cases
# Run with coverage (requires cargo-tarpaulin)
cargo tarpaulin --out Html
```
## Performance
### Benchmarks
| Operation | Latency | Notes |
|-----------|---------|-------|
| Store (small) | ~5ms | Including deduplication |
| Store (chunked) | ~10ms/chunk | 8KB chunks |
| Retrieve | ~2ms | By UUID |
| Delete | ~3ms | Single operation |
| Statistics | ~15ms | Aggregate query |
### Optimization Features
- Connection pooling (2 minimum, 20 maximum connections)
- Prepared statements
- Index optimization (B-tree and GIN indexes)
- SHA-256 content deduplication
- Async I/O throughout
## Development
### Building from Source
```bash
# Development build
cargo build
# Release build (optimized)
cargo build --release
# Run clippy lints
cargo clippy -- -D warnings
# Format code
cargo fmt
# Security audit
cargo audit
```
### Project Structure
```
codex-memory/
├── src/
│   ├── main.rs          # CLI entry point
│   ├── lib.rs           # Library exports
│   ├── storage.rs       # Core storage logic
│   ├── models.rs        # Data structures
│   ├── database/        # Database operations
│   ├── mcp_server/      # MCP protocol implementation
│   └── chunking.rs      # File chunking logic
├── tests/
│   ├── unit/            # Unit tests
│   ├── integration/     # Integration tests
│   └── edge_cases/      # Edge case tests
└── migrations/          # Database migrations
```
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
1. Fork the repository
2. Create your feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## Related Projects
- [**Codex-Dreams**](https://github.com/Ladvien/codex-dreams) - Companion app for AI-powered memory insights and cognitive processing
- [Claude Desktop](https://claude.ai/desktop) - Anthropic's Claude Desktop application
- [MCP Specification](https://modelcontextprotocol.io) - Model Context Protocol specification
## Support
- **Issues**: [GitHub Issues](https://github.com/Ladvien/codex-memory/issues)
- **Discussions**: [GitHub Discussions](https://github.com/Ladvien/codex-memory/discussions)
- **Documentation**: See [ARCHITECTURE.md](ARCHITECTURE.md) for detailed system design
## Acknowledgments
- Built with Rust and PostgreSQL
- MCP protocol for LLM integration
- Tokio for async runtime
- SQLx for database operations
---
**Pro Tip**: Maximize your memory system by using both applications together. Codex Memory handles fast storage and retrieval, while [Codex-Dreams](https://github.com/Ladvien/codex-dreams) adds intelligent analysis and insights generation.