# vex-llm
LLM provider integrations for the VEX Protocol.
## Supported Providers
- OpenAI - GPT-4, GPT-3.5, etc.
- Ollama - Local LLM inference
- DeepSeek - DeepSeek models
- Mistral - Mistral AI models
- Mock - Testing provider
## Installation
```toml
[dependencies]
vex-llm = "0.1"

# With OpenAI support
vex-llm = { version = "0.1", features = ["openai"] }
```
## Quick Start
A minimal example using the Mock provider (type and method names below are illustrative; see the crate docs for the exact API):

```rust
use vex_llm::{MockProvider, Provider}; // illustrative paths

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // The Mock provider lets you test without network access or API keys.
    let provider = MockProvider::new();
    let response = provider.complete("Hello, world!").await?;
    println!("{}", response);
    Ok(())
}
```
## License
MIT License - see LICENSE for details.