# Mirror-Log
An append-only event log for capturing thoughts, notes, and data you do not want to lose.
`mirror-log` is local-first, SQLite-backed, and designed to be boring in the best way: easy to inspect, easy to script, and hard to accidentally lose context.
## Quick Start
```bash
# Add one event
mirror-log add "Overhead allocation needs review" --source journal
# Add from file
mirror-log add-file notes.md --source meetings
# Show recent
mirror-log show --last 10
# Search full events
mirror-log search "overhead"
# Search chunked content
mirror-log search "allocation" --chunks
# Ingestion stats (total/unique/duplicates)
mirror-log stats
# Database summary
mirror-log info
# Integrity verification (hash + relational checks)
mirror-log verify
```
## Installation
```bash
# Clone and build
git clone https://github.com/CromboJambo/mirror-log
cd mirror-log
cargo build --release
# Binary location: target/release/mirror-log
# Or install locally
cargo install --path .
```
## Documentation
- **[User Guide](docs/USER_GUIDE.md)** - Comprehensive documentation with examples, advanced features, and best practices
- **[API Documentation](src/lib.rs)** - Library usage for programmatic access
## Core Principles
- **Append-only**: Events are never updated or deleted
- **SQLite is source of truth**: Your data stays local and inspectable
- **No hidden layers**: Direct SQL remains first-class
- **Source-aware logging**: Every event tracks where it came from
## Data Model
### Main Tables
**Events Table**
- `id TEXT PRIMARY KEY` (UUID)
- `timestamp INTEGER NOT NULL` (event time, Unix epoch seconds)
- `source TEXT NOT NULL`
- `content TEXT NOT NULL`
- `meta TEXT NULL`
- `ingested_at INTEGER NOT NULL`
- `content_hash TEXT NULL` (SHA-256 of content, for dedupe analytics)
**Chunks Table**
- Stores chunked slices of event content
- Used by `search --chunks` and large-content workflows
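The crate owns the real schema; the sketch below only illustrates the events table as documented above (column names from this README; the exact constraints and the append/dedupe flow are assumptions). It is written in Python because the log is plain SQLite, readable from any language:

```python
import hashlib
import sqlite3
import time
import uuid

# In-memory stand-in for mirror.db
conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE events (
        id           TEXT PRIMARY KEY,  -- UUID
        timestamp    INTEGER NOT NULL,  -- event time, Unix epoch seconds
        source       TEXT NOT NULL,
        content      TEXT NOT NULL,
        meta         TEXT NULL,
        ingested_at  INTEGER NOT NULL,
        content_hash TEXT NULL          -- SHA-256 of content
    )
    """
)

def append_event(source: str, content: str) -> str:
    """Insert one event; rows are never updated or deleted (append-only)."""
    now = int(time.time())
    event_id = str(uuid.uuid4())
    content_hash = hashlib.sha256(content.encode("utf-8")).hexdigest()
    conn.execute(
        "INSERT INTO events VALUES (?, ?, ?, ?, ?, ?, ?)",
        (event_id, now, source, content, None, now, content_hash),
    )
    return event_id

append_event("journal", "Overhead allocation needs review")
append_event("journal", "Overhead allocation needs review")  # duplicate content

# Same total/unique split that `mirror-log stats` reports
total, unique = conn.execute(
    "SELECT COUNT(*), COUNT(DISTINCT content_hash) FROM events"
).fetchone()
print(total, unique)  # 2 1
```

Duplicate content produces distinct event rows with identical `content_hash` values, which is how ingestion stats can report duplicates without ever mutating the log.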
### Direct SQLite Access
```bash
sqlite3 mirror.db
```

```sql
-- Query recent events
SELECT datetime(timestamp, 'unixepoch'), source, content
FROM events
ORDER BY timestamp DESC
LIMIT 10;

-- Count events by source
SELECT source, COUNT(*)
FROM events
GROUP BY source
ORDER BY COUNT(*) DESC;

-- Ingestion stats
SELECT COUNT(*) AS total,
       COUNT(DISTINCT content_hash) AS unique_events
FROM events;
```
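Because the log is an ordinary SQLite file, scripting it needs no special tooling. A minimal sketch (column names taken from the Data Model above; the `mirror.db` path and the JSON Lines output format are assumptions for illustration) that exports events for downstream tools:

```python
import json
import sqlite3

def export_jsonl(db_path: str):
    """Yield one JSON object per event, oldest first."""
    conn = sqlite3.connect(db_path)
    conn.row_factory = sqlite3.Row  # rows become dict-like
    rows = conn.execute(
        "SELECT id, timestamp, source, content, meta "
        "FROM events ORDER BY timestamp"
    )
    for row in rows:
        yield json.dumps(dict(row))

# Usage:
#   for line in export_jsonl("mirror.db"):
#       print(line)
```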
## Development
```bash
cargo fmt
cargo clippy --all-targets --all-features -- -D warnings
cargo test
```
## License
AGPL-3.0-or-later. See `LICENSE`.