# pulsehive-openai
OpenAI-compatible LLM provider for PulseHive.
Works with any OpenAI-compatible API — not just OpenAI:
| Provider | Base URL |
|---|---|
| OpenAI | https://api.openai.com/v1 (default) |
| Azure OpenAI | https://{resource}.openai.azure.com/... |
| Ollama | http://localhost:11434/v1 |
| vLLM | http://localhost:8000/v1 |
| LM Studio | http://localhost:1234/v1 |
| Groq | https://api.groq.com/openai/v1 |
| Together | https://api.together.xyz/v1 |
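All of these endpoints expose the same request paths, so switching providers only means swapping the base URL. A minimal sketch of how a configured base URL can be joined with the standard `/chat/completions` path, tolerating a trailing slash (the function name is illustrative, not part of this crate's API):

```rust
/// Join a configured base URL with an endpoint path,
/// normalizing slashes on both sides of the join.
fn endpoint(base_url: &str, path: &str) -> String {
    format!(
        "{}/{}",
        base_url.trim_end_matches('/'),
        path.trim_start_matches('/')
    )
}
```

With this, `endpoint("http://localhost:11434/v1/", "chat/completions")` and `endpoint("https://api.openai.com/v1", "/chat/completions")` both produce a well-formed URL regardless of how the user wrote the base.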
## Usage
```toml
[dependencies]
pulsehive = { version = "1.0", features = ["openai"] }
```
```rust
use pulsehive_openai::OpenAiProvider; // provider type name assumed

// OpenAI
let provider = OpenAiProvider::new("<api-key>");

// Ollama (local)
let provider = OpenAiProvider::new("<api-key>")
    .with_base_url("http://localhost:11434/v1"); // builder method assumed
```
Register with HiveMind:
```rust
let hive = HiveMind::builder()
    .substrate_path("<path>")
    .llm_provider(provider)
    .build()?;
```
## Features
- Chat completions with tool calling support
- SSE streaming responses
- Automatic retry on 429 (rate limit) and 5xx errors
- Configurable base URL for any OpenAI-compatible endpoint
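The retry behavior described above is the usual pattern of exponential backoff gated on the response status. A sketch under assumed constants (base delay, cap, and function names are illustrative, not the crate's actual values):

```rust
use std::time::Duration;

/// Retry only rate limits (429) and server errors (5xx);
/// client errors like 400/401 are returned to the caller immediately.
fn should_retry(status: u16) -> bool {
    status == 429 || (500..600).contains(&status)
}

/// Exponential backoff: 500 ms doubled per attempt, capped at 30 s.
fn retry_delay(attempt: u32) -> Duration {
    let base_ms = 500u64;
    let delay_ms = base_ms.saturating_mul(1u64 << attempt.min(6));
    Duration::from_millis(delay_ms.min(30_000))
}
```

In practice the delay is often jittered to avoid thundering-herd retries against a rate-limited endpoint.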
## Links
## License
AGPL-3.0-only