Cloud-based LLM provider integrations
This module provides clients for fetching conversation histories from cloud-based LLM services such as ChatGPT, Claude, and Perplexity.
Supported Providers
- Microsoft 365 Copilot - Enterprise AI assistant in Office apps
- ChatGPT - OpenAI’s ChatGPT web interface conversations
- Anthropic - Claude conversations
- Perplexity - Perplexity AI conversations
- DeepSeek - DeepSeek chat history
- Qwen - Alibaba Qwen conversations
- Gemini - Google Gemini conversations
- Mistral - Mistral AI conversations
- Cohere - Cohere chat history
- Grok - xAI Grok conversations
- Groq - Groq conversations
- Together - Together AI conversations
- Fireworks - Fireworks AI conversations
Authentication
Most providers require API keys or session tokens. These can be provided via:
- Environment variables (e.g., OPENAI_API_KEY)
- Configuration file (~/.config/csm/config.json)
- Command-line arguments
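A minimal sketch of how that precedence might be resolved, assuming environment variables take priority over the config file. The helper name `resolve_api_key`, the config key name, and the naive string-search "parsing" (standing in for a real JSON parser such as serde_json) are illustrative assumptions, not the crate's actual resolution logic:

```rust
use std::env;
use std::fs;
use std::path::PathBuf;

/// Illustrative only: prefer an environment variable (e.g. OPENAI_API_KEY),
/// then fall back to ~/.config/csm/config.json.
fn resolve_api_key(env_var: &str, config_key: &str) -> Option<String> {
    // 1. Environment variable takes precedence.
    if let Ok(key) = env::var(env_var) {
        if !key.trim().is_empty() {
            return Some(key);
        }
    }

    // 2. Fall back to the JSON config file.
    let home = env::var_os("HOME").map(PathBuf::from)?;
    let contents = fs::read_to_string(home.join(".config/csm/config.json")).ok()?;

    // Naive lookup of `"config_key": "value"`; assumes no escaped quotes.
    let needle = format!("\"{}\"", config_key);
    let after_key = contents.find(&needle)? + needle.len();
    let rest = &contents[after_key..];
    let open = rest.find('"')? + 1;
    let len = rest[open..].find('"')?;
    Some(rest[open..open + len].to_string())
}

fn main() {
    // Hypothetical config key name; adjust to whatever the config file actually uses.
    match resolve_api_key("OPENAI_API_KEY", "openai_api_key") {
        Some(_) => println!("found an API key"),
        None => eprintln!("no API key configured"),
    }
}
```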
Re-exports
pub use anthropic::AnthropicProvider;
pub use chatgpt::ChatGPTProvider;
pub use common::CloudConversation;
pub use common::CloudMessage;
pub use common::CloudProvider;
pub use common::FetchOptions;
pub use deepseek::DeepSeekProvider;
pub use gemini::GeminiProvider;
pub use m365copilot::M365CopilotProvider;
pub use perplexity::PerplexityProvider;
Modules
- anthropic: Anthropic (Claude) cloud provider
- chatgpt: ChatGPT (OpenAI) cloud provider
- common: Common types and traits for cloud providers
- deepseek: DeepSeek cloud provider
- gemini: Google Gemini cloud provider
- m365copilot: Microsoft 365 Copilot cloud provider
- perplexity: Perplexity AI cloud provider
Functions
- fetch_conversations: Fetch all conversations from a cloud provider
- get_cloud_provider: Get a cloud provider by type
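These two functions suggest a simple workflow: resolve a provider, then ask it for conversations. The sketch below assumes a plausible shape for that API (a string provider key, a boxed trait object, an async fetch returning Vec<CloudConversation> with title and messages fields, and tokio/anyhow as dependencies); none of these signatures, nor the `csm::cloud` path, are confirmed by the docs above.

```rust
// Sketch only: the crate/module path (`csm::cloud`), the "chatgpt" key, the
// async signatures, and the `title`/`messages` fields on CloudConversation
// are all assumptions made for illustration.
use csm::cloud::{fetch_conversations, get_cloud_provider, FetchOptions};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Resolve a provider by type; "chatgpt" is a hypothetical key.
    let provider = get_cloud_provider("chatgpt")?;

    // Pull every conversation the account can see, with default fetch options.
    let conversations =
        fetch_conversations(provider.as_ref(), &FetchOptions::default()).await?;

    for convo in &conversations {
        println!("{} ({} messages)", convo.title, convo.messages.len());
    }
    Ok(())
}
```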