Module providers


LLM provider integrations for Chat System Manager

Supports multiple chat providers:

Local Providers

  • VS Code Copilot Chat (default)
  • Cursor
  • Ollama
  • vLLM
  • Azure AI Foundry (Foundry Local)
  • OpenAI API compatible servers
  • LM Studio
  • LocalAI

Cloud Providers (conversation history import)

  • ChatGPT (OpenAI)
  • Claude (Anthropic)
  • Perplexity
  • DeepSeek
  • Gemini (Google)
  • Qwen (Alibaba)
  • Mistral
  • Cohere
  • Groq
  • Together AI
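A typical flow is to discover installed providers and register them for lookup. The crate's actual `ProviderRegistry` and `discover_all_providers` signatures are not shown on this page, so the toy registry below is an assumption that only illustrates the registration/lookup pattern:

```rust
use std::collections::HashMap;

// Toy stand-in for `ProviderRegistry` (hypothetical API):
// maps a provider key to a human-readable display name.
struct ProviderRegistry {
    providers: HashMap<String, String>,
}

impl ProviderRegistry {
    fn new() -> Self {
        Self { providers: HashMap::new() }
    }

    // Register a provider under a stable key.
    fn register(&mut self, key: &str, display: &str) {
        self.providers.insert(key.to_string(), display.to_string());
    }

    // Look up a provider's display name by key.
    fn get(&self, key: &str) -> Option<&str> {
        self.providers.get(key).map(String::as_str)
    }
}

fn main() {
    let mut registry = ProviderRegistry::new();
    registry.register("copilot", "VS Code Copilot Chat");
    registry.register("ollama", "Ollama");
    println!("{:?}", registry.get("ollama"));
}
```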

Re-exports

pub use cloud::CloudConversation;
pub use cloud::CloudMessage;
pub use cloud::CloudProvider;
pub use cloud::FetchOptions;
pub use config::ProviderType;
pub use config::CsmConfig;
pub use config::ProviderConfig;
pub use discovery::discover_all_providers;
pub use session_format::GenericMessage;
pub use session_format::GenericSession;

Modules

cloud
Cloud-based LLM provider integrations
config
Provider configuration and types
cursor
Cursor IDE chat provider
discovery
Provider discovery utilities
ollama
Ollama provider for local LLM inference
openai_compat
OpenAI-compatible provider support
session_format
Session format conversion utilities

Structs

ProviderRegistry
Registry of available providers

Traits

ChatProvider
Trait for LLM chat providers
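Each backend implements a common trait so the rest of the system can treat providers uniformly. The real `ChatProvider` methods are not listed on this page, so the method names below (`name`, `list_sessions`) are hypothetical stand-ins sketching the shape of such a trait:

```rust
// Illustrative analogue of `ChatProvider`; not the crate's actual trait.
trait ChatProvider {
    /// Stable provider identifier (hypothetical method).
    fn name(&self) -> &str;

    /// Identifiers of chat sessions this provider can export (hypothetical).
    fn list_sessions(&self) -> Vec<String>;
}

// A minimal provider implementation for demonstration.
struct OllamaProvider;

impl ChatProvider for OllamaProvider {
    fn name(&self) -> &str {
        "ollama"
    }

    fn list_sessions(&self) -> Vec<String> {
        // A real implementation would scan Ollama's on-disk session store.
        Vec::new()
    }
}

fn main() {
    // Trait objects let callers hold heterogeneous providers uniformly.
    let providers: Vec<Box<dyn ChatProvider>> = vec![Box::new(OllamaProvider)];
    for p in &providers {
        println!("{}: {} sessions", p.name(), p.list_sessions().len());
    }
}
```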