# CoderLib
[crates.io](https://crates.io/crates/coderlib)
[docs.rs](https://docs.rs/coderlib)
[License](#license)
[CI](https://github.com/mexyusef/coderlib/actions)
**CoderLib** is a comprehensive Rust library for LLM-powered code generation, analysis, and editing.
## 🚀 Key Features
### 🤖 **Multi-Provider LLM Support**
- **OpenAI** (GPT-3/4 family) with function calling
- **Anthropic** (Claude Sonnet/Haiku/Opus family)
- **Google** (Gemini Flash/Pro family)
- **Azure OpenAI** with enterprise-grade security
- **Local Models** (Ollama, LM Studio, OpenAI-compatible APIs)
- **Custom Providers** with extensible provider system
### 🔧 **Advanced Tool System**
- **File Operations** - Read, write, search, and modify files safely
- **Code Analysis** - Tree-sitter based parsing for 8+ languages
- **Git Integration** - Repository operations and version control
- **Shell Commands** - Secure command execution with validation
- **Project Analysis** - Understand project structure and dependencies
- **Custom Tools** - Extensible plugin architecture
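The tool system centers on a registry of named tools that can be dispatched by name. As an illustration of that pattern only — `Tool`, `ToolRegistry`, and `EchoTool` below are simplified, hypothetical stand-ins, not the coderlib API — here is a minimal std-only sketch:

```rust
use std::collections::HashMap;

// A tool exposes a name and an execute entry point.
trait Tool {
    fn name(&self) -> &str;
    fn execute(&self, input: &str) -> Result<String, String>;
}

// Trivial example tool that returns its input unchanged.
struct EchoTool;
impl Tool for EchoTool {
    fn name(&self) -> &str { "echo" }
    fn execute(&self, input: &str) -> Result<String, String> {
        Ok(input.to_string())
    }
}

// The registry owns boxed tools and dispatches by name.
struct ToolRegistry {
    tools: HashMap<String, Box<dyn Tool>>,
}

impl ToolRegistry {
    fn new() -> Self { Self { tools: HashMap::new() } }

    fn register(&mut self, tool: Box<dyn Tool>) {
        self.tools.insert(tool.name().to_string(), tool);
    }

    fn execute(&self, name: &str, input: &str) -> Result<String, String> {
        self.tools
            .get(name)
            .ok_or_else(|| format!("unknown tool: {name}"))?
            .execute(input)
    }
}

fn main() {
    let mut registry = ToolRegistry::new();
    registry.register(Box::new(EchoTool));
    println!("{}", registry.execute("echo", "hello").unwrap());
}
```

The real `ToolRouter` shown in the Quick Start below follows the same register/dispatch shape, with async execution and JSON arguments.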
### 🛡️ **Enterprise-Ready Security**
- **Permission System** - Fine-grained access control
- **Path Validation** - Prevents directory traversal attacks
- **Command Filtering** - Blocks dangerous operations
- **Rate Limiting** - Configurable request throttling
- **Audit Logging** - Complete operation tracking
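Path validation against directory traversal typically resolves each requested path lexically and rejects anything that escapes an allowed root. A minimal std-only illustration of the idea (a sketch, not coderlib's actual implementation):

```rust
use std::path::{Component, Path, PathBuf};

// Lexically resolve `requested` relative to `root` and reject any
// path that climbs out of the allowed root or is absolute.
fn is_within_root(root: &Path, requested: &Path) -> bool {
    let mut resolved = PathBuf::from(root);
    for component in requested.components() {
        match component {
            Component::Normal(part) => resolved.push(part),
            Component::ParentDir => {
                // Refuse to climb above the allowed root.
                if !resolved.pop() || !resolved.starts_with(root) {
                    return false;
                }
            }
            Component::CurDir => {}
            // Absolute paths in user input are rejected outright.
            Component::RootDir | Component::Prefix(_) => return false,
        }
    }
    resolved.starts_with(root)
}

fn main() {
    let root = Path::new("/home/user/projects");
    assert!(is_within_root(root, Path::new("app/src/main.rs")));
    assert!(!is_within_root(root, Path::new("../../etc/passwd")));
    println!("path checks passed");
}
```

A production implementation would also canonicalize symlinks before checking; the lexical check above only guards against `..` and absolute-path tricks.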
### 📊 **Session & Context Management**
- **Persistent Sessions** - SQLite-backed conversation history
- **Context Awareness** - Intelligent context gathering and management
- **Auto-Summarization** - Automatic conversation summarization
- **Token Management** - Usage tracking and optimization
- **Memory Efficiency** - Smart context window management
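Context-window management generally comes down to keeping the most recent messages that fit within a token budget. A simplified sketch of the idea — the characters-per-token heuristic and all names here are illustrative, not coderlib's API:

```rust
struct Message {
    role: &'static str,
    text: String,
}

// Rough heuristic: ~4 characters per token (assumption for illustration).
fn estimate_tokens(text: &str) -> usize {
    (text.len() + 3) / 4
}

/// Keep the most recent messages that fit within `budget` tokens,
/// returned in chronological order.
fn trim_context(history: &[Message], budget: usize) -> Vec<&Message> {
    let mut kept = Vec::new();
    let mut used = 0;
    for msg in history.iter().rev() {
        let cost = estimate_tokens(&msg.text);
        if used + cost > budget {
            break;
        }
        used += cost;
        kept.push(msg);
    }
    kept.reverse(); // restore chronological order
    kept
}

fn main() {
    let history = vec![
        Message { role: "user", text: "a".repeat(400) },      // ~100 tokens
        Message { role: "assistant", text: "b".repeat(40) },  // ~10 tokens
        Message { role: "user", text: "c".repeat(40) },       // ~10 tokens
    ];
    let kept = trim_context(&history, 30);
    println!("kept {} messages, oldest role: {}", kept.len(), kept[0].role);
}
```

Auto-summarization extends this: instead of dropping the oldest messages, they are condensed into a summary message that stays within the budget.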
### 🔌 **LSP Integration**
- **Language Server Protocol** - Full LSP client and server support
- **Real-time Diagnostics** - Error detection and reporting
- **Code Completion** - IntelliSense and auto-completion
- **Symbol Navigation** - Go-to-definition and references
- **Refactoring Support** - Code transformations and improvements
### 🌐 **MCP Bridge**
- **Model Context Protocol** - Seamless MCP server integration
- **Tool Interoperability** - Connect with external MCP tools
- **Protocol Compliance** - Full MCP specification support
- **Transport Flexibility** - HTTP, WebSocket, and Stdio transports
## 📦 Installation
Add CoderLib to your `Cargo.toml`:
```toml
[dependencies]
coderlib = "0.1.0"
tokio = { version = "1.0", features = ["full"] }
```
### Feature Flags
Enable specific functionality based on your needs:
```toml
[dependencies]
coderlib = { version = "0.1.0", features = ["full"] }
```
Available features:
- `tools` - File operations and code analysis tools (default)
- `lsp` - Language Server Protocol support (default)
- `mcp` - Model Context Protocol integration (default)
- `full` - All features enabled
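If you only need part of the library, the usual Cargo pattern applies — assuming coderlib's feature flags follow standard Cargo default-feature conventions, you can opt out of the defaults and enable features individually:

```toml
[dependencies]
# Disable the default feature set and enable only what you use
coderlib = { version = "0.1.0", default-features = false, features = ["tools"] }
```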
## 🚀 Quick Start
### Basic LLM Integration
```rust
use coderlib::{CoderLib, CoderLibConfig, CodeRequest};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Initialize with default configuration
    let config = CoderLibConfig::default();
    let coder_lib = CoderLib::new(config).await?;

    // Create a session
    let session_id = coder_lib.create_session(Some("My Session".to_string())).await?;

    // Make a request
    let request = CodeRequest {
        session_id,
        content: "Write a hello world function in Rust".to_string(),
        attachments: Vec::new(),
        model: None,
        context: Default::default(),
    };

    // Process the request and stream the response
    let mut response_stream = coder_lib.process_request(request).await?;
    while let Ok(response) = response_stream.recv().await {
        print!("{}", response.content);
        if response.is_complete {
            break;
        }
    }

    Ok(())
}
```
### Tool Usage Example
```rust
use coderlib::tools::{ToolRouter, FileOperationsTool};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut router = ToolRouter::new();

    // Register the file operations tool
    router.register_tool(Box::new(FileOperationsTool::new()));

    // Execute a tool by name with JSON arguments
    let result = router.execute_tool(
        "read_file",
        serde_json::json!({ "path": "src/main.rs" }),
    ).await?;

    println!("File content: {}", result);
    Ok(())
}
```
### LSP Client Example
```rust
use coderlib::lsp::{LspClient, LspConfig};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = LspConfig {
        server_command: "rust-analyzer".to_string(),
        server_args: vec![],
        root_uri: "file:///path/to/project".to_string(),
    };

    let mut client = LspClient::new(config).await?;
    client.initialize().await?;

    // Get diagnostics for a file
    let diagnostics = client.get_diagnostics("src/main.rs").await?;
    println!("Found {} diagnostics", diagnostics.len());
    Ok(())
}
```
## ⚙️ Configuration
CoderLib supports flexible configuration through TOML files or programmatic setup:
### Configuration File (`coderlib.toml`)
```toml
debug = false
log_level = "info"
# OpenAI Provider
[providers.openai]
enabled = true
api_key = "your-api-key"
default_model = "gpt-4"
max_tokens = 4000
timeout = 30
[providers.openai.settings]
base_url = "https://api.openai.com/v1"
# Anthropic Provider
[providers.anthropic]
enabled = true
api_key = "your-anthropic-key"
default_model = "claude-3-5-sonnet-20241022"
max_tokens = 4000
# Local Model Provider
[providers.local]
enabled = true
base_url = "http://localhost:11434" # Ollama default
default_model = "llama3.1:8b"
# Storage Configuration
[storage]
storage_type = "sqlite"
database_path = "coderlib.db"
# Tool Configuration
[tools]
shell_enabled = true
file_operations_enabled = true
max_file_size = 10485760
allowed_extensions = [".rs", ".py", ".js", ".ts", ".md"]
# Permission System
[permissions]
require_confirmation = true
dangerous_commands_blocked = true
allowed_directories = ["/home/user/projects", "/tmp"]
# LSP Configuration
[lsp]
rust_analyzer_path = "rust-analyzer"
typescript_server_path = "typescript-language-server"
python_server_path = "pylsp"
```
### Programmatic Configuration
```rust
use coderlib::{CoderLibConfig, ProviderConfig, ProviderType};

let config = CoderLibConfig {
    debug: false,
    log_level: "info".to_string(),
    providers: vec![
        ProviderConfig {
            provider_type: ProviderType::OpenAI,
            api_key: Some("your-api-key".to_string()),
            base_url: Some("https://api.openai.com/v1".to_string()),
            default_model: "gpt-4".to_string(),
            enabled: true,
            ..Default::default()
        },
    ],
    ..Default::default()
};
```
## 🏗️ Architecture
CoderLib follows a modular, plugin-based architecture designed for flexibility and extensibility:
```
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   Host Editor   │◄──►│    CoderLib     │◄──►│  LLM Provider   │
│   (Edit/IDE)    │    │      Core       │    │ (OpenAI, etc.)  │
└─────────────────┘    └─────────────────┘    └─────────────────┘
         │                      │                      │
         ▼                      ▼                      ▼
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│  Integration    │    │   Tool System   │    │    Storage      │
│     Layer       │    │  (File, Git,    │    │   (SQLite,      │
│   (LSP, MCP)    │    │  Code, Shell)   │    │    Memory)      │
└─────────────────┘    └─────────────────┘    └─────────────────┘
         │                      │                      │
         ▼                      ▼                      ▼
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   Permission    │    │     Session     │    │     Event       │
│     System      │    │   Management    │    │     System      │
└─────────────────┘    └─────────────────┘    └─────────────────┘
```
### Core Components
- **🧠 Agent System** - Intelligent request processing with tool orchestration
- **🔌 Provider Layer** - Unified interface for multiple LLM providers
- **🛠️ Tool Registry** - Extensible tool system for code operations
- **💾 Session Manager** - Persistent conversation history and context
- **🔐 Permission System** - Fine-grained security and access control
- **📡 Event System** - Real-time communication and state management
- **🗄️ Storage Layer** - Pluggable persistence backends
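The event system follows a producer/consumer pattern: components emit typed events on a channel and consumers react to them. A simplified std-only sketch of that pattern — `CoderEvent`, its variants, and `collect_response` are illustrative names, not the real coderlib API:

```rust
use std::sync::mpsc::{self, Receiver};
use std::thread;

// Hypothetical event type; real systems carry richer payloads.
#[derive(Debug)]
enum CoderEvent {
    SessionCreated(String),
    ResponseChunk(String),
    Complete,
}

// Drain events until `Complete`, accumulating streamed response text.
fn collect_response(rx: Receiver<CoderEvent>) -> String {
    let mut chunks = String::new();
    for event in rx {
        match event {
            CoderEvent::SessionCreated(id) => println!("session started: {id}"),
            CoderEvent::ResponseChunk(text) => chunks.push_str(&text),
            CoderEvent::Complete => break,
        }
    }
    chunks
}

fn main() {
    let (tx, rx) = mpsc::channel();
    // A worker thread emits events as it processes a request.
    let producer = thread::spawn(move || {
        tx.send(CoderEvent::SessionCreated("demo".into())).unwrap();
        tx.send(CoderEvent::ResponseChunk("hel".into())).unwrap();
        tx.send(CoderEvent::ResponseChunk("lo".into())).unwrap();
        tx.send(CoderEvent::Complete).unwrap();
    });
    let text = collect_response(rx);
    producer.join().unwrap();
    println!("assembled: {text}");
}
```

CoderLib's actual event system is async (tokio channels rather than `std::sync::mpsc`), but the emit/subscribe shape is the same.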
### Integration Patterns
#### Editor Plugin Integration
```rust
use coderlib::integration::{EditHost, HostEvent, HostCommand};
use coderlib::CoderLib;

struct AIAssistantPlugin {
    coderlib: CoderLib,
}

impl EditHost for AIAssistantPlugin {
    async fn handle_event(&mut self, event: HostEvent) -> Result<Option<HostCommand>> {
        match event {
            HostEvent::KeyPressed(key) if key == "F10" => {
                // Show the AI context menu
                Ok(Some(HostCommand::ShowContextMenu {
                    items: vec![
                        "Explain Code".to_string(),
                        "Refactor".to_string(),
                        "Generate Tests".to_string(),
                        "Fix Issues".to_string(),
                    ],
                }))
            }
            HostEvent::MenuItemSelected(item) => self.handle_ai_request(item).await,
            _ => Ok(None),
        }
    }
}
```
#### MCP Server Integration
```rust
use coderlib::mcp::{McpBridge, McpServer};
use coderlib::{CoderLib, CoderLibConfig};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let coderlib = CoderLib::new(CoderLibConfig::default()).await?;

    // Bridge CoderLib into the MCP protocol
    let bridge = McpBridge::new(coderlib);

    // Start an MCP server backed by the bridge
    let server = McpServer::new(bridge);
    server.listen("127.0.0.1:8080").await?;
    Ok(())
}
```
## 📚 Examples
The `examples/` directory contains comprehensive usage examples:
### Basic Examples
- **`basic_usage.rs`** - Simple interactive AI assistant
- **`provider_test.rs`** - Testing different LLM providers
- **`openai_test.rs`** - OpenAI-specific integration
- **`gemini_provider.rs`** - Google Gemini integration
### Advanced Examples
- **`comprehensive_tools_demo.rs`** - Full tool system showcase
- **`permission_system_demo.rs`** - Security and permissions
- **`auto_summarization_demo.rs`** - Context management
- **`custom_commands_demo.rs`** - Custom command templates
### Integration Examples
- **`edit_integration.rs`** - Editor plugin integration
- **`mcp_bridge_test.rs`** - MCP server integration
- **`lsp_integration_test.rs`** - Language server integration
### Running Examples
```bash
# Basic usage
cargo run --example basic_usage
# Tool system demo
cargo run --example comprehensive_tools_demo
# Permission system
cargo run --example permission_system_demo
# MCP bridge
cargo run --example mcp_bridge_test
```
## 🚀 Production Ready
CoderLib is production-ready with comprehensive features:
### ✅ **Completed Features**
- **Core Architecture** - Stable, modular design
- **Multi-Provider LLM Support** - OpenAI, Anthropic, Google, Local models
- **Advanced Tool System** - File ops, Git, code analysis, shell commands
- **Permission System** - Enterprise-grade security
- **Session Management** - Persistent conversations with auto-summarization
- **LSP Integration** - Full Language Server Protocol support
- **MCP Bridge** - Model Context Protocol compatibility
- **Configuration System** - Flexible TOML and programmatic config
- **Storage Backends** - SQLite with extensible architecture
- **Event System** - Real-time communication and state management
### 🔄 **Continuous Improvements**
- **Performance Optimization** - Ongoing performance enhancements
- **Additional Providers** - New LLM provider integrations
- **Enhanced Tools** - More sophisticated code analysis tools
- **Documentation** - Expanding guides and tutorials
- **Community Features** - Plugin marketplace and extensions
## 🤝 Contributing
We welcome contributions! Here's how to get started:
1. **Fork the repository**
2. **Create a feature branch**: `git checkout -b feature/amazing-feature`
3. **Make your changes** with tests
4. **Run the test suite**: `cargo test`
5. **Submit a pull request**
### Development Setup
```bash
git clone https://github.com/mexyusef/coderlib.git
cd coderlib
cargo build
cargo test
```
See [CONTRIBUTING.md](CONTRIBUTING.md) for detailed guidelines.
## 📄 License
Licensed under the **MIT License** ([LICENSE](LICENSE) or http://opensource.org/licenses/MIT).
## 🙏 Acknowledgments
- **Designed with** safety, performance, and extensibility in mind
- **Community-driven** development model
## 📞 Support
- **Documentation**: [docs.rs/coderlib](https://docs.rs/coderlib)
- **Issues**: [GitHub Issues](https://github.com/mexyusef/coderlib/issues)
- **Discussions**: [GitHub Discussions](https://github.com/mexyusef/coderlib/discussions)
- **Crate**: [crates.io/crates/coderlib](https://crates.io/crates/coderlib)
---
**CoderLib** - A library for coders in the AI world. 🚀