codex-memory 3.0.0

Codex Memory

Rust PostgreSQL License: GPL-3.0

A high-performance, minimal memory storage service with MCP (Model Context Protocol) interface for Claude Desktop integration. Codex Memory provides reliable, deduplicated storage with automatic chunking for large content.

🚀 Features

Core Capabilities

  • 🗄️ Reliable Text Storage - PostgreSQL-backed storage with ACID compliance
  • 🔒 Content Deduplication - SHA-256 hash-based automatic deduplication
  • 📄 Smart File Chunking - Automatic chunking with configurable overlap for large files
  • 🏷️ Tag-Based Organization - Flexible tagging system for categorization
  • 🔗 Parent-Child Relationships - Maintains relationships between chunks and source documents
  • 🤖 MCP Integration - Native Model Context Protocol support for Claude Desktop
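
Deduplication works by hashing the full content and refusing to store a second copy under the same hash. A std-only sketch of that flow (the service itself uses SHA-256 via an external crate; `DefaultHasher`, `content_hash`, and `store_dedup` here are illustrative stand-ins, not the crate's API):

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashMap;
use std::hash::{Hash, Hasher};

// Hash the content to a fixed-width hex key. The real service uses SHA-256;
// std's DefaultHasher stands in so this sketch compiles with no dependencies.
fn content_hash(content: &str) -> String {
    let mut hasher = DefaultHasher::new();
    content.hash(&mut hasher);
    format!("{:016x}", hasher.finish())
}

/// Store content keyed by its hash. Duplicate content maps to the existing
/// key; the bool reports whether a new entry was created.
fn store_dedup(store: &mut HashMap<String, String>, content: &str) -> (String, bool) {
    let key = content_hash(content);
    let is_new = !store.contains_key(&key);
    store.entry(key.clone()).or_insert_with(|| content.to_string());
    (key, is_new)
}

fn main() {
    let mut store = HashMap::new();
    let (id1, new1) = store_dedup(&mut store, "same note");
    let (id2, new2) = store_dedup(&mut store, "same note");
    println!("{id1} new={new1}, {id2} new={new2}, entries={}", store.len());
}
```

The same idea maps onto the `content_hash UNIQUE` column in the schema below: the database, not application code, is the final arbiter of uniqueness.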

Technical Features

  • Connection Pooling - Optimized connection management (20 connections)
  • Async/Await Architecture - Built on Tokio for high concurrency
  • Comprehensive Error Handling - Proper Result types throughout
  • UTF-8 Safe Chunking - Respects character boundaries in all operations
  • Full-Text Search - PostgreSQL-powered search capabilities
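
The overlap-aware, UTF-8-safe chunking can be sketched in plain Rust. `chunk_text` below is an illustrative stand-in for the crate's chunking logic, not its actual API; it walks `char` boundaries so multi-byte text never splits mid-character:

```rust
/// Split `text` into chunks of at most `chunk_size` characters, repeating the
/// last `overlap` characters of each chunk at the start of the next one.
/// Indexing over a Vec<char> keeps every cut on a UTF-8 character boundary.
fn chunk_text(text: &str, chunk_size: usize, overlap: usize) -> Vec<String> {
    assert!(overlap < chunk_size, "overlap must be smaller than chunk_size");
    let chars: Vec<char> = text.chars().collect();
    let step = chunk_size - overlap;
    let mut chunks = Vec::new();
    let mut start = 0;
    while start < chars.len() {
        let end = (start + chunk_size).min(chars.len());
        chunks.push(chars[start..end].iter().collect());
        if end == chars.len() {
            break;
        }
        start += step;
    }
    chunks
}

fn main() {
    // "héllo wörld" contains multi-byte characters; cuts stay on char boundaries.
    for (i, chunk) in chunk_text("héllo wörld, stored safely", 8, 2).iter().enumerate() {
        println!("chunk {i}: {chunk:?}");
    }
}
```

With `chunk_size` and `overlap` exposed as parameters, this is the same shape as the `store_file` tool's `chunk_size`/`overlap` arguments described below.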

๐Ÿค Works With Codex-Dreams

  • Shared Database - Uses the same PostgreSQL database for seamless integration
  • Complementary Roles - Codex Memory for storage, Codex-Dreams for AI analysis
  • Zero Configuration - Install both applications and they work together automatically
  • Enhanced Workflow - Store with Codex Memory, analyze with Codex-Dreams

📋 Architecture

Codex follows a modular architecture focused on simplicity and reliability:

┌─────────────────────────────────────────────────────────────────┐
│                    Codex Memory Architecture                    │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  ┌─────────────────┐    ┌─────────────────┐    ┌─────────────┐  │
│  │   CLI Interface │    │   MCP Server    │    │  PostgreSQL │  │
│  │                 │    │                 │    │  Database   │  │
│  │  • store        │◄──►│  • JSON-RPC 2.0 │◄──►│             │  │
│  │  • get          │    │  • 5 MCP Tools  │    │  • Indexes  │  │
│  │  • stats        │    │  • stdio I/O    │    │  • ACID     │  │
│  │  • setup        │    │                 │    │             │  │
│  └─────────────────┘    └─────────────────┘    └─────────────┘  │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘

🧠 Companion Application: Codex-Dreams

Codex Memory works seamlessly with Codex-Dreams, a companion application that adds advanced cognitive processing capabilities:

  • Memory Insights: Generate intelligent insights from stored memories using LLM analysis
  • Pattern Recognition: Identify trends and connections across your stored content
  • Memory Consolidation: Advanced memory tiering and summarization
  • Semantic Analysis: Deep understanding and categorization of memories

Codex-Dreams reads from the same database as Codex Memory, creating a powerful two-tier system:

  • Codex Memory (this project): Fast, reliable storage and retrieval
  • Codex-Dreams: Advanced AI-powered analysis and insights

Get started with both: Install Codex Memory first, then add Codex-Dreams for cognitive features.

๐Ÿ› ๏ธ Installation

Prerequisites

  • Rust 1.70 or higher
  • PostgreSQL 14 or higher
  • Claude Desktop (optional, for MCP integration)

Quick Start

# Clone the repository
git clone https://github.com/Ladvien/codex-memory.git
cd codex-memory

# Set up environment
cp .env.example .env
# Edit .env with your database credentials

# Install
cargo install --path . --force

# Setup database (creates database, user, and tables)
codex-memory setup

# Run MCP server for Claude Desktop
codex-memory mcp

Environment Configuration

Create a .env file with:

DATABASE_URL=postgresql://codex_user:codex_pass@localhost:5432/codex_db
RUST_LOG=info  # Optional: debug, info, warn, error

📖 Usage

Command Line Interface

# Store content with metadata
codex-memory store "Your content here" \
  --context "Meeting notes" \
  --summary "Q4 planning discussion" \
  --tags "meeting,planning,q4"

# Retrieve content by ID
codex-memory get <UUID>

# View storage statistics
codex-memory stats

# Run MCP server for Claude Desktop
codex-memory mcp

🔄 Companion Workflow with Codex-Dreams

For the full cognitive memory experience, use both applications together:

# 1. Store memories with Codex Memory (fast, reliable)
codex-memory store "Research findings on neural networks" \
  --context "AI Research" \
  --summary "Key insights from latest papers" \
  --tags "ai,research,neural-networks"

# 2. Generate insights with Codex-Dreams (AI-powered analysis)
codex-dreams generate-insights --time-period week
codex-dreams show-insights --limit 5
codex-dreams search-insights "neural networks" --limit 10

💡 Workflow Tip: Use Codex Memory for day-to-day storage and retrieval, then run Codex-Dreams periodically to generate insights and discover patterns across your stored memories.

MCP Tools (Claude Desktop)

Codex provides 5 MCP tools:

Tool            Description               Parameters
store_memory    Store text with metadata  content, context, summary, tags
get_memory      Retrieve by ID            id (UUID)
delete_memory   Remove by ID              id (UUID)
get_statistics  Get storage stats         none
store_file      Chunk and store files     file_path, chunk_size, overlap, tags
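
Under the hood, Claude Desktop talks to the server over stdio using JSON-RPC 2.0. Assuming the standard MCP `tools/call` method, a `store_memory` invocation might look like this (field values are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "store_memory",
    "arguments": {
      "content": "Q4 planning discussion notes...",
      "context": "Meeting notes",
      "summary": "Q4 planning discussion",
      "tags": ["meeting", "planning", "q4"]
    }
  }
}
```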

Claude Desktop Configuration

Add to your Claude Desktop config:

{
  "mcpServers": {
    "codex-memory": {
      "command": "/path/to/codex-memory",
      "args": ["mcp"],
      "env": {
        "DATABASE_URL": "postgresql://codex_user:codex_pass@localhost:5432/codex_db"
      }
    }
  }
}

๐Ÿ—๏ธ API

Rust API Example

use codex_memory::{Storage, Config, create_pool};
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create connection pool
    let config = Config::from_env()?;
    let pool = create_pool(&config.database_url).await?;
    let storage = Arc::new(Storage::new(pool));
    
    // Store content
    let id = storage.store(
        "Content to store",
        "Context information".to_string(),
        "Brief summary".to_string(),
        Some(vec!["tag1".to_string(), "tag2".to_string()])
    ).await?;
    
    // Retrieve content
    if let Some(memory) = storage.get(id).await? {
        println!("Retrieved: {}", memory.content);
    }
    
    Ok(())
}

Database Schema

CREATE TABLE memories (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    content TEXT NOT NULL,
    content_hash VARCHAR(64) NOT NULL UNIQUE,
    context TEXT NOT NULL,
    summary TEXT NOT NULL,
    metadata JSONB DEFAULT '{}',
    tags TEXT[] DEFAULT '{}',
    chunk_index INTEGER DEFAULT NULL,
    total_chunks INTEGER DEFAULT NULL,
    parent_id UUID DEFAULT NULL,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW()
);
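
The B-tree and GIN indexes mentioned under Performance could be declared along these lines (index names and exact column choices are illustrative; only the PRIMARY KEY and UNIQUE constraints are implied by the schema itself):

```sql
-- UUID lookups are covered by the primary key; these cover the other paths.
CREATE INDEX idx_memories_parent_id  ON memories (parent_id);       -- chunk -> parent joins
CREATE INDEX idx_memories_created_at ON memories (created_at);      -- recency queries
CREATE INDEX idx_memories_tags       ON memories USING GIN (tags);  -- tag containment (@>)
```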

🧪 Testing

# Run all tests
cargo test

# Run with output
cargo test -- --nocapture

# Run specific test suite
cargo test integration
cargo test unit
cargo test edge_cases

# Run with coverage (requires cargo-tarpaulin)
cargo tarpaulin --out Html

🚀 Performance

Benchmarks

Operation        Performance  Notes
Store (small)    ~5ms         Including deduplication
Store (chunked)  ~10ms/chunk  8KB chunks
Retrieve         ~2ms         By UUID
Delete           ~3ms         Single operation
Statistics       ~15ms        Aggregate query

Optimization Features

  • Connection pooling (max 20 connections, min 2)
  • Prepared statements
  • Index optimization (B-tree and GIN indexes)
  • SHA-256 content deduplication
  • Async I/O throughout

🔧 Development

Building from Source

# Development build
cargo build

# Release build (optimized)
cargo build --release

# Run clippy lints
cargo clippy -- -D warnings

# Format code
cargo fmt

# Security audit
cargo audit

Project Structure

codex-memory/
├── src/
│   ├── main.rs           # CLI entry point
│   ├── lib.rs            # Library exports
│   ├── storage.rs        # Core storage logic
│   ├── models.rs         # Data structures
│   ├── database/         # Database operations
│   ├── mcp_server/       # MCP protocol implementation
│   └── chunking.rs       # File chunking logic
├── tests/
│   ├── unit/             # Unit tests
│   ├── integration/      # Integration tests
│   └── edge_cases/       # Edge case tests
└── migrations/           # Database migrations

๐Ÿค Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🔗 Related Projects

📞 Support

🙏 Acknowledgments

  • Built with Rust and PostgreSQL
  • MCP protocol for LLM integration
  • Tokio for async runtime
  • SQLx for database operations

💡 Pro Tip: Maximize your memory system by using both applications together. Codex Memory handles fast storage and retrieval, while Codex-Dreams adds intelligent analysis and insights generation.