OpenCrates: AI-Powered Rust Crate Engine
OpenCrates is a comprehensive, AI-driven toolkit designed to revolutionize the Rust development experience. It empowers developers by automating and enhancing various stages of the crate lifecycle, from initial concept to deployment and maintenance. By leveraging cutting-edge AI models, OpenCrates assists in generating, analyzing, optimizing, and testing Rust code, making development faster, more efficient, and more robust.
Table of Contents
- Features
- Why OpenCrates?
- Installation
- Configuration
- Quick Start
- Command-Line Interface (CLI) Usage
- API Endpoints (when using `opencrates serve`)
- AI Integration Details
- Core Concepts & Architecture
- Development
- Troubleshooting
- License
Features
- AI-Driven Crate Generation: Scaffold new Rust projects (libraries, binaries) from natural language descriptions. Includes boilerplate, module structure, initial code, `Cargo.toml`, and `README.md`.
- Smart Code Analysis: Leverage AI to analyze existing Rust codebases. Identifies potential issues, suggests improvements in style, performance, and safety, and helps you understand complex logic.
- Automated Optimization: Receive AI-powered suggestions for performance enhancements, dependency version updates, feature flag usage, and code refactoring for better efficiency.
- Intelligent Testing: Generate unit, integration, and benchmark tests based on code structure, function signatures, and specifications.
- Interactive CLI: A user-friendly command-line interface providing access to all functionalities with clear commands and options.
- Extensible Provider Model: Designed to integrate with various AI providers (currently OpenAI) and search services (crates.io, docs.rs).
- Modular Staged Processing: Crate generation and analysis follow a defined pipeline of stages: Conceptualization, Architecture, Generation, Optimization, and Testing.
- Comprehensive Configuration: Manage API keys, AI model preferences, server settings, database connections, and cache configurations via a TOML file or environment variables.
- Built-in Web Server: Expose OpenCrates functionalities via a RESTful API (using Axum) for integration with other tools or web UIs.
- Health & Metrics: Includes health check endpoints and Prometheus-compatible metrics for monitoring when run as a server.
- Database Integration: Stores crate metadata and other relevant information using SQLite by default, with `sqlx` for database operations. The schema includes `id`, `name`, `description`, `version`, `features`, and timestamps.
- Caching Layer: Multi-level caching (in-memory, optional Redis) to improve performance and reduce redundant AI calls.
- Aider & AIChat CLI Integration: Directly invoke popular CLI tools like Aider (for AI pair programming) and AIChat (for general LLM interactions) with project context.
- Template Engine: Uses Handlebars for flexible and maintainable code and documentation templating.
Why OpenCrates?
OpenCrates aims to be your AI co-pilot for Rust development. It doesn't just write code; it helps you design, understand, improve, and test it. By automating repetitive tasks and providing intelligent insights, OpenCrates lets you focus on the creative and complex aspects of software engineering.
- Boost Productivity: Accelerate project setup and development tasks.
- Improve Code Quality: Get AI-driven feedback on best practices, performance, and potential bugs.
- Learn Faster: Understand new codebases or explore Rust features with AI assistance.
- Streamline Workflows: Integrate AI seamlessly into your existing Rust development process.
Installation
Using install.sh (Recommended for Linux/macOS)
This script will check for dependencies (Rust), install Aider (if missing and Python/pip are available), build OpenCrates from source, and set up a default configuration.
```shell
# Ensure you are in the root of the cloned OpenCrates repository
./install.sh
```
Ensure `~/.cargo/bin` is in your `PATH` to use `opencrates` globally.
Using Docker
A Dockerfile and docker-compose.yml are provided for containerized deployment.
```shell
# Ensure you are in the root of the cloned OpenCrates repository

# Build the Docker image
docker build -t opencrates .

# Run using Docker (example for server mode)
# Ensure OPENAI_API_KEY is set in your environment or a .env file
docker run --rm -p 8080:8080 -e OPENAI_API_KEY="$OPENAI_API_KEY" opencrates serve

# Or use Docker Compose (recommended for services like Postgres/Redis)
# Create a .env file in the root with your OPENAI_API_KEY
# Example .env:
#   OPENAI_API_KEY=sk-yourkeyhere
#   POSTGRES_PASSWORD=mysecretpassword
# (docker-compose.yml uses POSTGRES_PASSWORD from .env if set)
docker compose up -d
```
The docker-compose.yml is set up to use a local SQLite database by default (persisted in ./data/opencrates.db on the host if you adjust the config.toml or OPENCRATES_DATABASE_URL). For PostgreSQL or Redis, uncomment and configure them in docker-compose.yml and config.toml.
Manual Build from Source
- Install Rust: If not already installed, get it from rustup.rs.
- Install Aider (Optional but Recommended for the `aider` command):

  ```shell
  pip install aider-chat
  ```

- Install AIChat (Optional but Recommended for the `aichat` command):

  ```shell
  cargo install aichat
  # Or other methods from the aichat documentation
  ```

- Clone the Repository:
- Build:

  ```shell
  cargo build --release
  ```

  The binary will be at `target/release/opencrates`. You can copy it to a directory in your `PATH` (e.g., `~/.cargo/bin`).
Configuration
OpenCrates can be configured via a TOML file or environment variables. Environment variables override configuration file settings.
Environment Variables
- `OPENAI_API_KEY`: (Required for most AI features) Your OpenAI API key.
- `ANTHROPIC_API_KEY`: (Optional) Your Anthropic API key.
- `OPENCRATES_CONFIG_PATH`: Path to the configuration TOML file (e.g., `/etc/opencrates/config.toml`).
- `OPENCRATES_LOG_LEVEL`: Logging level (e.g., `info`, `debug`, `warn`, `error`). Defaults to `info`.
- `OPENCRATES_LOG_FORMAT`: Logging format (`text` or `json`). Defaults to `json`.
- `OPENCRATES_SERVER_HOST`: Host for the API server. Defaults to `127.0.0.1`.
- `OPENCRATES_SERVER_PORT`: Port for the API server. Defaults to `8080`.
- `OPENCRATES_DATABASE_URL`: Database connection string.
  - SQLite example: `sqlite:./data/opencrates.db` (ensure the `./data` directory exists and is writable if using a relative path)
  - PostgreSQL example: `postgres://user:password@host:port/database`
- `OPENCRATES_REDIS_URL`: Redis connection URL for caching (e.g., `redis://127.0.0.1:6379/0`). Required if Redis caching is desired.
- `OPENCRATES_AI_DEFAULT_MODEL`: Default AI model to use (e.g., `gpt-4-turbo`).
- `OPENCRATES_ENVIRONMENT_NAME`: Environment name (e.g., `development`, `production`).
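To illustrate the precedence rule (environment variables override file settings), here is a minimal sketch in Python. The environment variable names match the list above, but the lookup logic and config keys are purely illustrative, not OpenCrates' actual implementation:

```python
import os

# Settings as if loaded from config.toml (illustrative values)
file_config = {"server_port": 8080, "log_level": "info"}

# Environment variable names and the config keys they override
env_overrides = {
    "OPENCRATES_SERVER_PORT": "server_port",
    "OPENCRATES_LOG_LEVEL": "log_level",
}

def effective_config(environ=os.environ):
    """Apply the 'environment overrides file' precedence rule."""
    merged = dict(file_config)
    for env_name, key in env_overrides.items():
        if env_name in environ:
            merged[key] = environ[env_name]
    return merged

# With OPENCRATES_LOG_LEVEL=debug set, the file's "info" is overridden
print(effective_config({"OPENCRATES_LOG_LEVEL": "debug"}))
```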
Configuration File (opencrates.toml or config.toml)
By default, OpenCrates looks for config.toml in the current directory. You can specify a different path using the --config <PATH> CLI option or the OPENCRATES_CONFIG_PATH environment variable. The opencrates init command creates a default configuration file.
Example config.toml:
```toml
# Server configuration
[server]
host = "127.0.0.1"
port = 8080
workers = 4 # Number of server workers, defaults to number of CPUs

# Database configuration
[database]
url = "sqlite:data/opencrates.db" # Relative to where opencrates is run, or use an absolute path
max_connections = 10

# AI provider configuration (OpenAI example)
[ai]
# openai_api_key can be set via OPENAI_API_KEY environment variable
# openai_api_key = "sk-yourActualOpenAIKey"
default_model = "gpt-4-turbo" # Specify your preferred default model
max_tokens = 4096
temperature = 0.7
# anthropic_api_key = "sk-ant-yourActualAnthropicKey" # If using Anthropic

# Cache configuration
[cache]
# redis_url = "redis://127.0.0.1:6379" # Optional: Uncomment and set for Redis backend
default_ttl = 3600 # Default cache TTL in seconds (for in-memory and when Redis TTL isn't specified)
max_entries = 1000 # Max items for in-memory cache
max_memory = 104857600 # Max memory in bytes for in-memory cache (100MB)

# Logging configuration
[logging]
level = "info" # trace, debug, info, warn, error
format = "json" # or "text"

# Configuration for interacting with external registries like crates.io
[registry]
crates_io_url = "https://crates.io"
cache_ttl = 3600 # TTL for caching registry API responses

# Environment settings
[environment]
name = "development" # "production", "staging", etc.
debug = true # Enables more verbose logging or debug features if true
```
Quick Start
- Install OpenCrates: Follow one of the Installation methods.
- Configure API Key:
  - Set the `OPENAI_API_KEY` environment variable:

    ```shell
    export OPENAI_API_KEY="sk-yourkeyhere"
    ```

  - Or, run `opencrates init .` and edit the `openai_api_key` in the generated `config.toml`.
- Generate your first crate:

  ```shell
  opencrates generate --name my_calculator --description "A simple calculator library"
  ```

  This will create a `my_calculator` directory with the new Rust project.
- Analyze an existing project:

  ```shell
  opencrates analyze --path ./my_calculator
  ```

- Start the API server (optional):

  ```shell
  opencrates serve
  ```

  You can then interact with the API, e.g., `curl http://localhost:8080/health`.
Command-Line Interface (CLI) Usage
The main command is opencrates. Use opencrates --help or opencrates <COMMAND> --help for detailed options.
Global Options:
- `--config <PATH>`: Path to the configuration TOML file.
opencrates init
Initializes a new OpenCrates configuration file.
- If `PATH_TO_CONFIG_FILE` is omitted, it creates `config.toml` in the current directory.
- Example: `opencrates init ./my_opencrates_config.toml`
opencrates generate
Generates a new Rust crate using AI.
- `--name, -n <NAME>`: (Required) Name of the crate (e.g., `image_resizer`).
- `--description, -d <DESC>`: (Required) A clear description of what the crate does.
- `--features <FEATURES>`: Comma-separated list of desired features or core dependencies (e.g., `serde,tokio,clap`).
- `--output-dir <PATH>`: Directory where the new crate folder will be created (default: current directory).
- `--template <TEMPLATE_NAME>`: (Future Enhancement) Specify a custom base template.
opencrates analyze
Analyzes an existing Rust crate for structure, dependencies, metrics, and potential issues.
- `--path, -p <PATH>`: (Required) Path to the root directory of the crate to analyze.
opencrates optimize
Suggests optimizations for an existing crate based on AI analysis.
- `--path <PATH>`: (Required) Path to the crate root.
- `--apply`: (Future Enhancement) Automatically attempt to apply suggested optimizations.
opencrates test
Generates and/or runs tests for a crate.
- `--path <PATH>`: (Required) Path to the crate root.
- `--coverage`: (Future Enhancement) Generate and report test coverage.
- `--generate-only`: (Future Enhancement) Only generate test files, do not run them.
opencrates search
Searches for crates on crates.io.
- `<QUERY>`: The search term (e.g., `"http client"`).
- `--limit <NUM>`: Maximum number of results to display (default: 10).
opencrates chat
Engages in an interactive chat session with an AI model (via OpenAI).
- `"<PROMPT>"`: The initial prompt or question.
- `--model, -m <MODEL_NAME>`: Specify the AI model (e.g., `gpt-4-turbo`). Defaults to `default_model` from config.
- `--context <CONTEXT_STRING>`: (Future Enhancement) Provide additional text context for the chat.
opencrates aider
Integrates with the Aider CLI tool for AI-assisted pair programming in a specified project.
Requires `aider-chat` to be installed and in `PATH`.

- `--path, -p <PROJECT_PATH>`: Path to the Git repository or project (default: current directory).
- `"<INSTRUCTIONS_FOR_AIDER>"`: The coding request (e.g., `"add a new function to lib.rs that takes two numbers and returns their sum, include unit tests."`).
opencrates aichat
Integrates with the AIChat CLI tool for general LLM interactions.
Requires `aichat` to be installed and in `PATH`.

- `"<PROMPT>"`: The prompt for AIChat.
- `--model, -m <MODEL_NAME>`: Specify the model for AIChat (e.g., `openai:gpt-4`, `ollama:llama2`).
opencrates serve
Starts the OpenCrates web server, exposing functionalities via a REST API.
- `--host <HOST>`: Host to bind (default: `127.0.0.1` from config).
- `--port <PORT>`: Port to bind (default: `8080` from config).
API Endpoints (when using opencrates serve)
Base URL: http://<configured_host>:<configured_port> (e.g., http://localhost:8080)
All responses are JSON and follow the `ApiResponse<T>` structure:

`{"success": true/false, "data": T_OR_NULL, "error": "message_OR_NULL", "metadata": {...}}`
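As an illustration, a client can unwrap this envelope before using the payload. A minimal sketch in Python; the envelope shape follows the structure above, while the sample `/health` body is hypothetical:

```python
import json

def unwrap(api_response_text):
    """Parse an ApiResponse<T> envelope; return data or raise on error."""
    envelope = json.loads(api_response_text)
    if not envelope.get("success"):
        raise RuntimeError(envelope.get("error") or "unknown API error")
    return envelope.get("data")

# Hypothetical /health response body, matching the envelope structure
sample = '{"success": true, "data": {"status": "healthy"}, "error": null, "metadata": {}}'
print(unwrap(sample))  # {'status': 'healthy'}
```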
/health
- GET: Basic health check.
  - Response: `ApiResponse<{"status": "healthy", "timestamp": "...", "version": "..."}>`
/health/live
- GET: Liveness probe.
  - Response: `ApiResponse<"alive">`
/health/ready
- GET: Readiness probe, indicating if services are ready.
  - Response: `ApiResponse<{"ready": true, "services": {"database": true, ...}}>`
/api/v1/crates
- POST: Create a new crate based on the provided specification.
  - Request Body: `{"name": "string", "description": "string", "features": ["string"]}`
  - Response: `ApiResponse<CrateContext>` (contains generated crate details and structure)
/api/v1/crates/{crate_path_or_id}
- GET: Retrieve analysis of an existing crate.
  - `{crate_path_or_id}` should be a URL-encoded path to the crate root for now.
  - Response: `ApiResponse<ProjectAnalysis>`
- (PUT, DELETE): Future enhancements for updating/deleting crates from a registry.
/api/v1/generate
- POST: Generate a new crate (same as `/api/v1/crates` POST).
  - Request Body: `CreateCrateRequest` (`{"name": ..., "description": ..., "features": ...}`)
  - Response: `ApiResponse<CrateContext>`
/api/v1/generate/preview
- POST: Get a preview of the file structure and key elements for a crate generation request without full AI processing.
  - Request Body: `CreateCrateRequest`
  - Response: `ApiResponse<serde_json::Value>` (contains preview data like an estimated file list)
/api/v1/analyze
- POST: Analyze an existing crate.
  - Request Body: `{"path": "/url/encoded/path/to/crate/root"}`
  - Response: `ApiResponse<ProjectAnalysis>`
/api/v1/optimize
- POST: Get optimization suggestions for a crate.
  - Request Body: `{"path": "/url/encoded/path/to/crate/root"}`
  - Response: `ApiResponse<String>` (containing optimization suggestions)
/api/v1/ai/chat
- POST: Send a message to the configured AI model for a chat-like interaction.
  - Request Body: `{"message": "string", "context": "optional_string"}`
  - Response: `ApiResponse<String>` (the AI's response message)
/api/v1/ai/complete
- POST: Request a completion from the AI model.
  - Request Body: `{"message": "string", "context": "optional_string"}`
  - Response: `ApiResponse<String>` (the AI's completion)
/metrics
- GET: Exposes application metrics in Prometheus format.
/api/v1/status
- GET: Get detailed system status including overall health, aggregated metrics, and resource usage.
  - Response: `ApiResponse<SystemStatus>`
AI Integration Details
Supported Providers
- OpenAI: Primary integration using models like GPT-4, GPT-4-turbo, and GPT-3.5-turbo. Requires an `OPENAI_API_KEY` to be configured.
Future support for other providers like Anthropic Claude or local LLMs (via Ollama, etc.) is planned.
Using Aider Integration
The opencrates aider command acts as a wrapper to call the aider-chat tool.
- Ensure `aider-chat` is installed (`pip install aider-chat`).
- Navigate to your Git project directory.
- Run `opencrates aider "<Your instructions for Aider>"`. Example: `opencrates aider --path ./my_project "Add a new function to lib.rs that takes two numbers and returns their sum, include unit tests."`
Using AIChat Integration
The opencrates aichat command interfaces with the aichat CLI.
- Ensure `aichat` is installed (e.g., `cargo install aichat`).
- Run `opencrates aichat "<Your prompt>" [--model <aichat_model_specifier>]`. Example: `opencrates aichat "What are the best practices for error handling in Rust async code?" --model openai:gpt-4-turbo`
Core Concepts & Architecture
OpenCrates is built with a modular architecture:
Providers (AI, Search)
- `LLMProviderTrait`: Defines a common interface for interacting with Large Language Models.
- `OpenAIProvider`: Implementation for OpenAI's API.
- `SearchProviderTrait`: Defines an interface for searching external resources.
- `WebSearchProvider`: Implements search for crates.io and docs.rs (currently basic).
Stages (Conceptualization, Architecture, etc.)
The crate generation process is a pipeline of distinct stages, each transforming the crate design:
- `ConceptualizationStage`: Takes a `CrateSpec` (name, description, features) and uses AI to define core concepts, data structures, and algorithms, outputting a `ConceptModel`.
- `ArchitectStage`: Takes a `ConceptModel` and designs the high-level architecture: module structure, key interfaces, and data flow, outputting an `Architecture` model.
- `CrateGenerationStage`: Takes an `Architecture` model, generates code for each module and interface using AI, and renders static files using templates, outputting a `CrateContext` (which includes the file structure).
- `OptimizationStage`: Takes a `CrateContext`, analyzes generated code with AI for performance, memory, and idiomatic improvements, and updates the `CrateContext`.
- `TestingStage`: Takes a `CrateContext` and generates unit, integration, and benchmark test skeletons using AI, adding them to the `CrateContext`.
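Conceptually, the pipeline is just a sequence of stages where each stage's output feeds the next. A simplified sketch in Python; the stage names mirror the list above, but the bodies are purely illustrative, not OpenCrates' actual code:

```python
class Stage:
    """Illustrative base: each stage transforms the evolving crate design."""
    def run(self, design):
        raise NotImplementedError

class ConceptualizationStage(Stage):
    def run(self, design):
        # Real stage calls an LLM; here we just record a placeholder
        design["concepts"] = f"concepts for {design['spec']['name']}"
        return design

class ArchitectStage(Stage):
    def run(self, design):
        # Real stage designs modules/interfaces from the ConceptModel
        design["architecture"] = ["lib.rs", "core", "utils"]
        return design

def run_pipeline(spec, stages):
    design = {"spec": spec}
    for stage in stages:
        design = stage.run(design)  # output of one stage feeds the next
    return design

result = run_pipeline({"name": "my_calculator"},
                      [ConceptualizationStage(), ArchitectStage()])
print(result["architecture"])
```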
Caching
- `CacheManager`: Provides a unified interface for caching.
- `CacheBackendTrait`: Implemented by `MemoryCache` and `RedisCache`.
- In-Memory Cache: Default L1 cache for speed.
- Redis Cache: Optional L2 persistent cache, enabled via the `redis` feature and `redis_url` configuration.
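The multi-level lookup can be sketched as: check the fast in-memory L1 first, fall back to the persistent L2, and promote L2 hits into L1. A minimal illustration in Python, with plain dicts standing in for the real backends (this is not OpenCrates' implementation):

```python
class TwoLevelCache:
    """L1 (fast, in-memory) backed by L2 (slower, persistent)."""
    def __init__(self):
        self.l1 = {}  # stands in for MemoryCache
        self.l2 = {}  # stands in for RedisCache

    def get(self, key):
        if key in self.l1:
            return self.l1[key]
        if key in self.l2:
            self.l1[key] = self.l2[key]  # promote the hit into L1
            return self.l2[key]
        return None

    def put(self, key, value):
        self.l1[key] = value
        self.l2[key] = value  # write through to the persistent layer

cache = TwoLevelCache()
cache.put("ai:prompt-hash", "cached AI response")
cache.l1.clear()  # simulate L1 eviction; L2 still holds the value
print(cache.get("ai:prompt-hash"))
```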
Templates
- `TemplateManager`: Uses Handlebars for rendering files like `Cargo.toml`, `README.md`, `.gitignore`, and basic `lib.rs`/`main.rs` skeletons.
- Templates are stored in the `templates/` directory.
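To illustrate the idea of template-driven file generation, here is a sketch using Python's `string.Template` as a stand-in (OpenCrates' actual templates use Handlebars syntax, and the template body below is hypothetical):

```python
from string import Template

# Stand-in for a templates/Cargo.toml template (real ones use Handlebars)
cargo_toml = Template("""[package]
name = "$name"
version = "$version"
description = "$description"
""")

# Values that would come from the crate specification
rendered = cargo_toml.substitute(
    name="my_calculator", version="0.1.0", description="A simple calculator")
print(rendered)
```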
Database
- Uses `sqlx` for asynchronous, type-safe SQL interactions.
- Defaults to SQLite for local persistence of crate metadata (e.g., name, description, version, features).
- The database URL can be configured to point to PostgreSQL for more robust deployments.
- Schema is created/migrated on startup.
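For illustration only, a schema along the lines described above could look as follows. The column names follow the fields listed in the Features section, but the exact types and constraints are assumptions; this sketch uses Python's built-in `sqlite3`, not OpenCrates' `sqlx` migrations:

```python
import sqlite3

# Hypothetical crate-metadata table matching the fields described above
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE IF NOT EXISTS crates (
        id          INTEGER PRIMARY KEY,
        name        TEXT NOT NULL UNIQUE,
        description TEXT,
        version     TEXT NOT NULL,
        features    TEXT,  -- e.g. a JSON-encoded list
        created_at  TEXT DEFAULT CURRENT_TIMESTAMP,
        updated_at  TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")
conn.execute(
    "INSERT INTO crates (name, description, version, features) VALUES (?, ?, ?, ?)",
    ("my_calculator", "A simple calculator", "0.1.0", '["serde"]'),
)
row = conn.execute("SELECT name, version FROM crates").fetchone()
print(row)
```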
Development
Prerequisites
- Rust (latest stable version recommended via `rustup`).
- Cargo (comes with Rust).
- Git.
- Python 3 and pip (if you want to use and test the `opencrates aider` command locally).
- (Optional) Docker and Docker Compose for containerized development/testing.
- (Optional) `cargo-watch` for automatic recompilation on file changes.
- (Optional) `cargo-audit` for checking security advisories.
Building
```shell
# For a debug build
cargo build

# For a release build (recommended for performance)
cargo build --release
```

The executable will be in `target/debug/opencrates` or `target/release/opencrates`.
Testing (test_all.sh)
A comprehensive test script is provided:

```shell
./scripts/test_all.sh
```

This script typically runs:

- `cargo fmt -- --check` (Code formatting check)
- `cargo clippy --all-targets --all-features -- -D warnings` (Linter)
- `cargo test --all-targets --all-features` (Unit and integration tests)
- `cargo audit` (Security vulnerability check, if `cargo-audit` is installed)
Contributing
We welcome contributions! Please see CONTRIBUTING.md (to be created) for detailed guidelines. Generally:
- Fork the repository.
- Create a feature branch (`git checkout -b my-new-feature`).
- Commit your changes (`git commit -am 'Add some feature'`).
- Ensure all tests and checks pass (`./scripts/test_all.sh`).
- Push to the branch (`git push origin my-new-feature`).
- Open a Pull Request.
Troubleshooting
- `OPENAI_API_KEY` not found/invalid:
  - Ensure the `OPENAI_API_KEY` environment variable is set correctly.
  - Alternatively, make sure it's correctly specified in your `config.toml` under `[ai]` as `openai_api_key = "sk-..."`.
  - Verify your OpenAI account has sufficient credits and the API key is active.
- `aider-chat` or `aichat` command not found:
  - These are external tools. Install them separately if you intend to use the `opencrates aider` or `opencrates aichat` commands.
  - `pip install aider-chat`
  - `cargo install aichat` (or other methods per AIChat's documentation)
  - Ensure their installation locations are in your system's `PATH`.
- Database connection errors:
  - For SQLite (default): Ensure the directory specified in `database.url` (e.g., `sqlite:data/opencrates.db` means a `data` subdirectory) is writable by the user running `opencrates`.
  - For PostgreSQL: Verify the connection string (`postgres://user:pass@host/db`), server status, and network accessibility.
- "Too many open files" error (Linux/macOS):
  - The server might be hitting the system's open file descriptor limit. You might need to increase it using `ulimit -n <new_limit>`.
- Slow AI responses:
  - This can be due to the complexity of the request, the specific AI model chosen, or network latency to the AI provider. Consider using less complex models for quicker, iterative tasks where appropriate.
- Build failures:
  - Ensure you have the latest stable Rust toolchain: `rustup update stable`.
  - Clean your build directory with `cargo clean` and try building again.
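When a PostgreSQL connection string is rejected, decomposing it into its components can help pinpoint the bad part. A quick sketch with Python's standard library, using a hypothetical connection string of the form shown above:

```python
from urllib.parse import urlsplit

# Hypothetical connection string in the postgres://user:pass@host/db form
url = urlsplit("postgres://user:pass@localhost:5432/opencrates")

# Inspect each component separately to spot the misconfigured one
print(url.scheme)                 # driver/protocol
print(url.username, url.password) # credentials
print(url.hostname, url.port)     # server address
print(url.path.lstrip("/"))       # database name
```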
License
This project is licensed under either of:
- Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option. Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.