# phago-llm
LLM integration for Phago semantic intelligence.
## Overview
This crate provides LLM backends for enhanced concept extraction:
- `OllamaBackend` — Local LLM via Ollama (no API key needed)
- `ClaudeBackend` — Anthropic Claude API
- `OpenAiBackend` — OpenAI GPT API
- `MockBackend` — Testing without real LLM calls
## Usage
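The original usage example did not survive extraction. The following is a self-contained sketch of the backend pattern described above; the trait name, method signature, and mock behavior are illustrative assumptions, not this crate's real API (the real backends are async and talk to an LLM over the network):

```rust
// Illustrative sketch only: `LlmBackend` and `extract_concepts` are assumed
// names, not the actual phago-llm API.

/// A backend that extracts concepts from free text.
trait LlmBackend {
    fn extract_concepts(&self, text: &str) -> Result<Vec<String>, String>;
}

/// A stand-in for a mock backend used in tests: instead of calling an LLM,
/// it keeps capitalized words as "concepts".
struct MockBackend;

impl LlmBackend for MockBackend {
    fn extract_concepts(&self, text: &str) -> Result<Vec<String>, String> {
        Ok(text
            .split_whitespace()
            .filter(|w| w.chars().next().is_some_and(char::is_uppercase))
            .map(str::to_string)
            .collect())
    }
}

fn main() {
    let backend = MockBackend;
    let concepts = backend
        .extract_concepts("Ollama and Claude are LLM backends")
        .unwrap();
    println!("{:?}", concepts); // ["Ollama", "Claude", "LLM"]
}
```

A real backend would implement the same trait, so callers can swap `MockBackend` for an Ollama- or API-backed implementation without changing call sites.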
## Features

| Feature | Description |
|---|---|
| `local` | Ollama backend |
| `api` | Claude and OpenAI backends |
| `full` | All backends |
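Selecting a feature in `Cargo.toml` might look like the following; the version number is illustrative, not a real release:

```toml
[dependencies]
# Enable only the local Ollama backend; use "api" or "full" for the others.
phago-llm = { version = "0.1", features = ["local"] }
```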
## Part of Phago
This is a subcrate of `phago`. For most use cases, depend on the main `phago` crate with the `llm` feature instead.
## License
MIT