# LocalGPT

A local-first AI assistant with persistent memory and continuous-operation capabilities (a reshaping of OpenClaw in Rust).
## Features
- Local-only operation - Runs entirely on your machine
- Persistent memory - Markdown-based knowledge store with FTS search
- Multiple LLM providers - OpenAI, Anthropic, and Ollama support
- Heartbeat runner - Autonomous task execution
- HTTP API - REST endpoints for programmatic access
- Small footprint - ~7MB binary, minimal dependencies
## Installation

```bash
# Build from source
cargo build --release

# Install globally
cargo install --path .
```
## Quick Start

```bash
# Initialize configuration
localgpt init

# Start interactive chat
localgpt chat

# Ask a single question
localgpt ask "What did we discuss yesterday?"

# Run daemon with HTTP server
localgpt daemon --http
```
## Configuration

Configuration is stored at `~/.localgpt/config.toml`:
```toml
[agent]
model = "claude-cli/opus"
context_window = 128000
max_tokens = 8000

[openai]
api_key = "${OPENAI_API_KEY}"

[anthropic]
api_key = "${ANTHROPIC_API_KEY}"

[heartbeat]
enabled = true
interval = "30m"
active_hours = { start = "09:00", end = "22:00" }

[memory]
workspace = "~/.localgpt/workspace"

[server]
enabled = true
port = 31327
host = "127.0.0.1"
```
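The `${OPENAI_API_KEY}` / `${ANTHROPIC_API_KEY}` values indicate that environment variables are expanded when the config is loaded. A minimal sketch of that expansion pattern (illustrative only, not LocalGPT's actual implementation):

```python
import os
import re

def expand_env(value: str) -> str:
    """Replace ${VAR} placeholders with values from the environment."""
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), value)

# Example: expand a config value against the current environment.
os.environ["OPENAI_API_KEY"] = "sk-test"
print(expand_env("${OPENAI_API_KEY}"))  # -> sk-test
```

Unset variables expand to an empty string here; a real loader might instead raise an error so a missing API key is caught at startup.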
## Memory System
LocalGPT uses plain markdown files as the source of truth:
```
~/.localgpt/workspace/
├── MEMORY.md          # Curated long-term knowledge
├── HEARTBEAT.md       # Pending tasks/reminders
└── memory/
    ├── 2024-01-15.md  # Daily append-only logs
    └── ...
```
Memory files are indexed with SQLite FTS5 for fast keyword search.
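The keyword-search layer can be illustrated with SQLite FTS5 directly. This is a standalone sketch, not LocalGPT's actual schema; the table and column names are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# An FTS5 virtual table indexes every column for full-text search.
conn.execute("CREATE VIRTUAL TABLE notes USING fts5(path, body)")
conn.executemany(
    "INSERT INTO notes VALUES (?, ?)",
    [
        ("MEMORY.md", "Prefers Rust for systems work"),
        ("memory/2024-01-15.md", "Discussed heartbeat scheduling"),
    ],
)
# MATCH runs a full-text keyword query; bm25() orders hits by relevance.
rows = conn.execute(
    "SELECT path FROM notes WHERE notes MATCH ? ORDER BY bm25(notes)",
    ("heartbeat",),
).fetchall()
print(rows)  # -> [('memory/2024-01-15.md',)]
```

Because the markdown files remain the source of truth, the FTS index can be dropped and rebuilt from them at any time.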
## CLI Commands

```bash
# Chat
localgpt chat
localgpt ask "question"

# Daemon
localgpt daemon

# Memory
localgpt memory search "query"
localgpt memory stats

# Config
localgpt config show
```
## HTTP API
When the daemon is running, the following endpoints are available:
- `GET /health` - Health check
- `GET /api/status` - Server status
- `POST /api/chat` - Chat with the assistant
- `GET /api/memory/search?q=<query>` - Search memory
- `GET /api/memory/stats` - Memory statistics
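A client calls the search endpoint by URL-encoding its query string. A small sketch using Python's standard library, assuming the default host and port from the configuration above:

```python
from urllib.parse import urlencode

base = "http://127.0.0.1:31327"
url = f"{base}/api/memory/search?{urlencode({'q': 'heartbeat scheduling'})}"
print(url)  # -> http://127.0.0.1:31327/api/memory/search?q=heartbeat+scheduling

# With the daemon running, fetch the results with any HTTP client, e.g.:
#   urllib.request.urlopen(url).read()
```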
## License
Apache-2.0