OpenServe
OpenServe is a modern, high-performance, AI-enhanced file server built in Rust. It combines the speed and safety of Rust with intelligent features powered by OpenAI, providing a comprehensive solution for file management, search, and collaboration.
Features
Core Features
- High-Performance File Server: Built with Axum and Tokio for maximum performance
- RESTful API: Complete REST API for file operations
- WebSocket Support: Real-time file change notifications
- Authentication & Authorization: JWT-based auth with role-based access control
- Search Engine: Full-text search powered by Tantivy
- File Upload/Download: Secure file operations with validation
- Directory Management: Complete directory operations
AI-Powered Features
- Intelligent File Classification: Automatic file categorization
- Content Analysis: AI-powered content summarization and tagging
- Smart Search: Semantic search capabilities
- File Organization Suggestions: AI recommendations for file structure
- Chat Interface: Query your files using natural language
Security & Safety
- Path Traversal Protection: Secure file access controls
- Input Validation: Comprehensive request validation
- Rate Limiting: Built-in rate limiting for API endpoints
- CORS Support: Configurable cross-origin resource sharing
- TLS/SSL Support: HTTPS encryption support
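The path traversal protection listed above can be sketched with the standard library alone: walk the requested path's components and reject anything that could escape the serve root. This is a minimal illustration, not OpenServe's actual implementation; the function name `resolve_safe` is ours.

```rust
use std::path::{Component, Path, PathBuf};

/// Join `requested` onto `root`, rejecting any path that would
/// escape the serve root via `..` or an absolute component.
fn resolve_safe(root: &Path, requested: &str) -> Option<PathBuf> {
    let mut resolved = root.to_path_buf();
    for component in Path::new(requested).components() {
        match component {
            Component::Normal(part) => resolved.push(part),
            // Ignore harmless `.` components.
            Component::CurDir => {}
            // Reject `..`, absolute paths, and Windows prefixes outright.
            _ => return None,
        }
    }
    Some(resolved)
}

fn main() {
    let root = Path::new("/app/files");
    // A normal request resolves inside the root.
    assert_eq!(
        resolve_safe(root, "docs/report.pdf"),
        Some(PathBuf::from("/app/files/docs/report.pdf"))
    );
    // Traversal attempts are rejected.
    assert_eq!(resolve_safe(root, "../etc/passwd"), None);
    assert_eq!(resolve_safe(root, "/etc/passwd"), None);
    println!("path checks passed");
}
```

Rejecting suspicious components before touching the filesystem avoids TOCTOU issues that canonicalization-after-open approaches can run into.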
Monitoring & Observability
- Structured Logging: JSON and text logging with tracing
- Metrics Collection: Prometheus metrics integration
- Health Checks: Built-in health monitoring
- Performance Monitoring: Request timing and performance metrics
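Request timing of the kind listed above boils down to wrapping a handler with `std::time::Instant` and recording the elapsed duration. A toy sketch follows; OpenServe's real middleware is not shown here, and `timed` is a name we made up for illustration.

```rust
use std::time::{Duration, Instant};

/// Run `handler`, returning its result plus the elapsed wall-clock time.
fn timed<T>(handler: impl FnOnce() -> T) -> (T, Duration) {
    let start = Instant::now();
    let result = handler();
    (result, start.elapsed())
}

fn main() {
    let (status, latency) = timed(|| {
        // Stand-in for real request handling work.
        std::thread::sleep(Duration::from_millis(5));
        200u16
    });
    assert_eq!(status, 200);
    // Elapsed time is at least the simulated work.
    assert!(latency >= Duration::from_millis(5));
    println!("status={status} latency={:?}", latency);
}
```

In a real server the measured duration would be exported as a Prometheus histogram rather than printed.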
DevOps Ready
- Docker Support: Multi-stage Dockerfile for production
- Docker Compose: Complete stack with Redis, Prometheus, Grafana
- Configuration Management: Environment-based configuration
- Automated Testing: Comprehensive test suite
- CI/CD Ready: GitHub Actions integration
Quick Start
Prerequisites
- Rust 1.75 or later
- Docker & Docker Compose (optional)
- OpenAI API Key (for AI features)
Installation
Option 1: Using Cargo

# Install from crates.io (assumes the crate is published as `openserve`)
cargo install openserve

Option 2: From Source

git clone <repository-url>
cd openserve-rs
cargo build --release

Option 3: Using Docker

# Build and run the full stack (see docker/ for the Dockerfile)
docker compose up -d
Basic Usage

# Start the server
openserve

# With AI features enabled
OPENAI_API_KEY=your-api-key-here OPENSERVE_AI_ENABLED=true openserve

# With custom configuration (flag name is illustrative)
openserve --config config.yml
Configuration
Environment Variables
# Server Configuration
OPENSERVE_HOST=0.0.0.0
OPENSERVE_PORT=8080
OPENSERVE_SERVE_PATH=/app/files
# AI Configuration
OPENAI_API_KEY=your-api-key-here
OPENSERVE_AI_ENABLED=true
# Database Configuration
DATABASE_URL=sqlite:///app/data/openserve.db
REDIS_URL=redis://localhost:6379
# Logging
RUST_LOG=info
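Variables like these are typically read once at startup with sensible fallbacks. A minimal stdlib sketch follows; the variable names match the table above, while the helper `env_or` and the parsing details are illustrative, not OpenServe's actual code.

```rust
use std::env;

/// Read an environment variable, falling back to a default.
fn env_or(key: &str, default: &str) -> String {
    env::var(key).unwrap_or_else(|_| default.to_string())
}

fn main() {
    let host = env_or("OPENSERVE_HOST", "0.0.0.0");
    let port: u16 = env_or("OPENSERVE_PORT", "8080")
        .parse()
        .expect("OPENSERVE_PORT must be a number");
    let ai_enabled = env_or("OPENSERVE_AI_ENABLED", "false") == "true";

    println!("binding {host}:{port}, ai_enabled={ai_enabled}");
}
```

Failing fast on a malformed `OPENSERVE_PORT` at startup beats discovering the problem on the first request.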
Configuration File (config.yml)
server:
  host: "0.0.0.0"
  port: 8080
  serve_path: "./files"
  max_upload_size: 104857600 # 100MB
  enable_tls: false

ai:
  enabled: true
  api_key: "your-openai-api-key"
  model: "gpt-4o-mini"
  max_tokens: 2048
  temperature: 0.7

auth:
  enabled: true
  jwt_secret: "your-secret-key"
  session_timeout: 3600
  allow_registration: false

storage:
  database_url: "sqlite://./data.db"
  redis_url: "redis://localhost:6379"
  cache_size: 1000
  index_path: "./index"

telemetry:
  log_level: "info"
  log_format: "json"
  metrics_enabled: true
  tracing_enabled: false
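In code, a file like this usually maps onto a typed struct with defaults. Below is a hedged sketch covering the `server` section only; the field names mirror the YAML above, but the struct itself is illustrative rather than OpenServe's actual type.

```rust
/// Typed mirror of the `server:` section of config.yml.
#[derive(Debug, Clone)]
struct ServerConfig {
    host: String,
    port: u16,
    serve_path: String,
    max_upload_size: u64,
    enable_tls: bool,
}

impl Default for ServerConfig {
    fn default() -> Self {
        Self {
            host: "0.0.0.0".to_string(),
            port: 8080,
            serve_path: "./files".to_string(),
            max_upload_size: 100 * 1024 * 1024, // 100MB, as in the YAML
            enable_tls: false,
        }
    }
}

fn main() {
    let cfg = ServerConfig::default();
    assert_eq!(cfg.port, 8080);
    assert_eq!(cfg.max_upload_size, 104_857_600);
    println!("{cfg:?}");
}
```

Keeping defaults in one `Default` impl means a missing config file still yields a runnable server.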
API Reference
File Operations
# List directory contents (endpoint paths below are illustrative)
curl http://localhost:8080/api/files/

# Upload file
curl -X POST -F "file=@report.pdf" http://localhost:8080/api/files/docs/

# Download file
curl -O http://localhost:8080/api/files/docs/report.pdf

# Delete file
curl -X DELETE http://localhost:8080/api/files/docs/report.pdf

# Get file metadata
curl http://localhost:8080/api/files/docs/report.pdf?metadata=true
Search Operations
# Search files (endpoint is illustrative)
curl "http://localhost:8080/api/search?q=report&limit=10"

# Semantic search
curl "http://localhost:8080/api/search?q=quarterly+results&semantic=true"

# Get search statistics
curl http://localhost:8080/api/search/stats
AI Operations
# Analyze file content (endpoint and request shape are illustrative)
curl -X POST http://localhost:8080/api/ai/analyze \
  -H "Content-Type: application/json" \
  -d '{
    "path": "docs/report.pdf"
  }'

# Chat with files
curl -X POST http://localhost:8080/api/ai/chat \
  -H "Content-Type: application/json" \
  -d '{
    "message": "Which files mention the Q3 budget?"
  }'
Authentication
# Login (endpoint and request shape are illustrative)
curl -X POST http://localhost:8080/api/auth/login \
  -H "Content-Type: application/json" \
  -d '{
    "username": "admin",
    "password": "your-password"
  }'

# Register (if enabled)
curl -X POST http://localhost:8080/api/auth/register \
  -H "Content-Type: application/json" \
  -d '{
    "username": "newuser",
    "password": "a-strong-password"
  }'
Library Usage
OpenServe can also be used as a library in your Rust projects. The snippet below is a sketch; the exact module paths and types depend on the crate's public API:

// Module paths below are illustrative
use openserve::{Config, Server};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = Config::from_env()?;
    Server::new(config).run().await?;
    Ok(())
}
Testing
# Run all tests
cargo test

# Run with coverage (requires cargo-tarpaulin)
cargo tarpaulin --out Html

# Run benchmarks
cargo bench

# Integration tests (test-target name is illustrative)
cargo test --test integration
Development
Project Structure
openserve-rs/
├── src/
│ ├── ai/ # AI service integration
│ ├── config/ # Configuration management
│ ├── error/ # Error handling
│ ├── handlers/ # HTTP request handlers
│ ├── middleware/ # Custom middleware
│ ├── models/ # Data models
│ ├── services/ # Business logic
│ ├── utils/ # Utility functions
│ ├── lib.rs # Library root
│ └── main.rs # Application entry point
├── tests/ # Integration tests
├── benches/ # Benchmarks
├── docker/ # Docker configuration
└── docs/ # Documentation
Building from Source
# Development build
cargo build

# Release build
cargo build --release

# With all features
cargo build --release --all-features

# Cross-compilation (target must first be installed via rustup)
rustup target add x86_64-unknown-linux-musl
cargo build --release --target x86_64-unknown-linux-musl
Contributing
1. Fork the repository
2. Create a feature branch (git checkout -b feature/amazing-feature)
3. Commit your changes (git commit -m 'Add amazing feature')
4. Push to the branch (git push origin feature/amazing-feature)
5. Open a Pull Request
Deployment
Docker Deployment

# Build and deploy
docker compose up -d --build

# Scale services (service name is illustrative)
docker compose up -d --scale openserve=3

# View logs
docker compose logs -f openserve

Systemd Service

# Install service (unit file path is illustrative)
sudo cp docker/openserve.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable --now openserve
Monitoring
Metrics
- Request count and latency
- File operation statistics
- Search performance metrics
- AI service usage
- System resource usage
Dashboards
- Grafana dashboards included
- Prometheus metrics collection
- Real-time monitoring
- Alerting rules
Health Checks
# Health endpoint (paths are conventional defaults)
curl http://localhost:8080/health

# Metrics endpoint
curl http://localhost:8080/metrics

# Ready endpoint
curl http://localhost:8080/ready
Troubleshooting
Common Issues
Port already in use
# Find process using port
lsof -i :8080

# Kill process
kill -9 <PID>

Permission denied

# Check file permissions
ls -l /app/files

# Fix permissions
chmod -R u+rwX /app/files

AI features not working

# Check API key
echo $OPENAI_API_KEY

# Test API connection
curl https://api.openai.com/v1/models -H "Authorization: Bearer $OPENAI_API_KEY"
Logs
# View logs (when running under systemd)
journalctl -u openserve -f

# Increase log level
RUST_LOG=debug openserve

# Structured logging
RUST_LOG=info,openserve=debug openserve
Performance
OpenServe is designed for high performance:
- Built with async Rust using Tokio
- Efficient file I/O with memory mapping
- Connection pooling and caching
- Optimized search indexing
- Zero-copy operations where possible
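The efficient-I/O point above can be illustrated with buffered streaming from the standard library. Memory mapping itself requires an external crate, so this sketch shows plain buffered copying instead; the helper `copy_buffered` is our own name, not OpenServe API.

```rust
use std::fs::File;
use std::io::{self, BufReader, BufWriter, Write};
use std::path::Path;

/// Copy `src` to `dst` through buffered streams, returning bytes copied.
/// The file is streamed in chunks, never loaded wholesale into memory.
fn copy_buffered(src: &Path, dst: &Path) -> io::Result<u64> {
    let mut reader = BufReader::new(File::open(src)?);
    let mut writer = BufWriter::new(File::create(dst)?);
    let copied = io::copy(&mut reader, &mut writer)?;
    writer.flush()?;
    Ok(copied)
}

fn main() -> io::Result<()> {
    let dir = std::env::temp_dir();
    let src = dir.join("openserve_demo_src.txt");
    let dst = dir.join("openserve_demo_dst.txt");
    std::fs::write(&src, b"hello, openserve")?;

    let copied = copy_buffered(&src, &dst)?;
    assert_eq!(copied, 16);
    println!("copied {copied} bytes");
    Ok(())
}
```

`io::copy` also lets the OS use fast paths (such as `sendfile`-style copies) on some platforms, which is the same spirit as the zero-copy bullet above.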
Security
Security is a top priority:
- Path traversal protection
- Input validation and sanitization
- JWT token authentication
- Rate limiting
- CORS protection
- TLS/SSL support
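Rate limiting as listed here is commonly implemented as a token bucket. The following is a minimal single-threaded sketch, not OpenServe's actual limiter; a real server would share the bucket across tasks behind a lock, keyed per client.

```rust
use std::time::Instant;

/// A minimal token bucket: `capacity` requests may burst,
/// refilled at `refill_per_sec` tokens per second.
struct TokenBucket {
    capacity: f64,
    tokens: f64,
    refill_per_sec: f64,
    last: Instant,
}

impl TokenBucket {
    fn new(capacity: f64, refill_per_sec: f64) -> Self {
        Self { capacity, tokens: capacity, refill_per_sec, last: Instant::now() }
    }

    /// Try to consume one token; returns false when rate-limited.
    fn allow(&mut self) -> bool {
        let now = Instant::now();
        let elapsed = now.duration_since(self.last).as_secs_f64();
        self.last = now;
        // Refill proportionally to elapsed time, capped at capacity.
        self.tokens = (self.tokens + elapsed * self.refill_per_sec).min(self.capacity);
        if self.tokens >= 1.0 {
            self.tokens -= 1.0;
            true
        } else {
            false
        }
    }
}

fn main() {
    // 3-request burst, refilling 1 token per second.
    let mut bucket = TokenBucket::new(3.0, 1.0);
    let allowed: Vec<bool> = (0..5).map(|_| bucket.allow()).collect();
    // The first three pass; the burst is then exhausted.
    assert_eq!(allowed, [true, true, true, false, false]);
    println!("decisions: {allowed:?}");
}
```

A rejected request would map to an HTTP 429 response in the middleware layer.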
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
- Issues: GitHub Issues
- Documentation: docs.rs/openserve
Acknowledgments
- Axum - Web framework
- Tantivy - Search engine
- OpenAI - AI capabilities
- Tokio - Async runtime
- Serde - Serialization
Made with Rust