Smart Tree Daemon - System-wide AI Context Service
Runs smart-tree as a persistent background service that any AI can connect to. Provides:
- HTTP API for context queries
- WebSocket for real-time updates
- Foken GPU credit tracking
- HTTP MCP - Full MCP protocol over HTTP (not just stdio!)
- LLM Proxy - Unified interface to multiple AI providers with memory!
- Collaboration Station - Multi-AI real-time collaboration with Hot Tub mode! 🛁
- The Custodian - Watches all operations for suspicious patterns 🧹
- GitHub Auth - OAuth for i1.is/aye.is identity
“The always-on brain for your system!” - Cheet
Architecture
All AI features route through the daemon for persistent memory and unified state. The LLM proxy (OpenAI-compatible at /v1/chat/completions) is integrated directly. Collaboration hub enables humans and AIs to work together in real-time. The Custodian monitors all MCP operations for data exfiltration and supply chain attacks.
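Since the proxy is described as OpenAI-compatible at /v1/chat/completions, a request body could be sketched as follows. This is a minimal illustration only: the model name, host, and exact field set are assumptions carried over from the OpenAI chat API, not confirmed by this crate.

```rust
// Hypothetical sketch of an OpenAI-style chat payload for the daemon's
// /v1/chat/completions endpoint. Field names follow the OpenAI API shape;
// the daemon may accept additional or different fields.
fn chat_request_body(model: &str, prompt: &str) -> String {
    format!(
        r#"{{"model":"{}","messages":[{{"role":"user","content":"{}"}}]}}"#,
        model, prompt
    )
}

fn main() {
    // POST this body (with the daemon auth token) to /v1/chat/completions.
    let body = chat_request_body("example-model", "Summarize this repo");
    println!("POST /v1/chat/completions");
    println!("{}", body);
}
```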
Structs

- CreditTracker - Credit tracker for Foken earnings
- DaemonConfig - Daemon configuration
- DaemonState - Daemon state - the unified AI brain
- DirectoryInfo
- ProjectInfo
- SystemContext - System-wide context
- Transaction
Functions

- load_all_tokens - Load all available valid tokens (for servers to accept any valid local token).
- load_or_create_token - Load or generate the daemon auth token. Creates a new random token on first run and persists it.
- load_token - Load an existing token (for clients). Returns None if no token file exists. Prioritizes the system-level daemon token.
- start_daemon - Start the daemon server.
- token_path - Get the path to the daemon auth token file. Respects the ST_TOKEN_PATH env var (for systemd StateDirectory), falls back to ~/.st/daemon.token.