
Module daemon


Smart Tree Daemon - System-wide AI Context Service

Runs smart-tree as a persistent background service that any AI can connect to. Provides:

  • HTTP API for context queries
  • WebSocket for real-time updates
  • Foken GPU credit tracking
  • HTTP MCP - Full MCP protocol over HTTP (not just stdio!)
  • LLM Proxy - Unified interface to multiple AI providers with memory!
  • Collaboration Station - Multi-AI real-time collaboration with Hot Tub mode! 🛁
  • The Custodian - Watches all operations for suspicious patterns 🧹
  • GitHub Auth - OAuth for i1.is/aye.is identity

“The always-on brain for your system!” - Cheet

§Architecture

All AI features route through the daemon for persistent memory and unified state. The LLM proxy (OpenAI-compatible at /v1/chat/completions) is integrated directly. Collaboration hub enables humans and AIs to work together in real-time. The Custodian monitors all MCP operations for data exfiltration and supply chain attacks.

Structs§

CreditTracker
Credit tracker for Foken earnings
DaemonConfig
Daemon configuration
DaemonState
Daemon state - The unified AI brain
DirectoryInfo
ProjectInfo
SystemContext
System-wide context
Transaction

Functions§

load_all_tokens
Load all available valid tokens (for servers to accept any valid local token).
load_or_create_token
Load or generate the daemon auth token. Creates a new random token on first run and persists it.
load_token
Load existing token (for clients). Returns None if no token file exists. Prioritizes the system-level daemon token.
start_daemon
Start the daemon server
token_path
Get the path to the daemon auth token file. Respects ST_TOKEN_PATH env var (for systemd StateDirectory), falls back to ~/.st/daemon.token