
Module compress


LLM-powered observation compression for Codemem.

Compresses raw tool observations into concise structural summaries using a configured LLM provider (Ollama, OpenAI-compatible, or Anthropic). Falls back to raw content on failure or when not configured.

Configuration (environment variables)

  • CODEMEM_COMPRESS_PROVIDER: ollama | openai | anthropic (default: disabled)
  • CODEMEM_COMPRESS_MODEL: model name (defaults: llama3.2, gpt-4o-mini, claude-haiku-4-5-20251001)
  • CODEMEM_COMPRESS_URL: base URL override (defaults: http://localhost:11434, https://api.openai.com/v1)
  • CODEMEM_API_KEY / OPENAI_API_KEY / ANTHROPIC_API_KEY: API key for cloud providers
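The resolution of these variables can be sketched as follows. This is a minimal illustration, not the module's actual implementation: the function name `resolve_config` is hypothetical, and the Anthropic base URL default shown here is an assumption (the docs above list URL defaults only for Ollama and OpenAI).

```rust
use std::env;

/// Supported backends (mirrors the module's `CompressProvider` enum).
#[derive(Debug, PartialEq)]
enum CompressProvider {
    Ollama,
    OpenAi,
    Anthropic,
}

/// Hypothetical config reader: resolves provider, model, and base URL from an
/// environment lookup, applying the documented per-provider defaults.
/// Returns `None` when compression is not configured (the disabled default).
fn resolve_config(
    get: impl Fn(&str) -> Option<String>,
) -> Option<(CompressProvider, String, String)> {
    let provider = match get("CODEMEM_COMPRESS_PROVIDER")?.as_str() {
        "ollama" => CompressProvider::Ollama,
        "openai" => CompressProvider::OpenAi,
        "anthropic" => CompressProvider::Anthropic,
        _ => return None, // unknown provider: compression stays disabled
    };
    let (default_model, default_url) = match provider {
        CompressProvider::Ollama => ("llama3.2", "http://localhost:11434"),
        CompressProvider::OpenAi => ("gpt-4o-mini", "https://api.openai.com/v1"),
        // Assumed default; the module docs do not state an Anthropic URL default.
        CompressProvider::Anthropic => ("claude-haiku-4-5-20251001", "https://api.anthropic.com"),
    };
    let model = get("CODEMEM_COMPRESS_MODEL").unwrap_or_else(|| default_model.to_string());
    let url = get("CODEMEM_COMPRESS_URL").unwrap_or_else(|| default_url.to_string());
    Some((provider, model, url))
}

fn main() {
    // Read from the real process environment.
    match resolve_config(|k| env::var(k).ok()) {
        Some((provider, model, url)) => println!("{:?} {} {}", provider, model, url),
        None => println!("compression disabled"),
    }
}
```

Taking the lookup as a closure keeps the sketch testable without mutating the process environment.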

Enums

CompressProvider