# ggen-ai

AI-powered code generation for ggen: unified LLM integration built on rust-genai for intelligent template generation, SPARQL query construction, and RDF graph operations.

## 🚀 NEW: v1.0.0 with rust-genai Integration

**Major Update:** Complete migration from custom LLM clients to rust-genai for production-ready, multi-provider AI integration.
## Features
- 🔧 Multi-provider LLM support: OpenAI, Anthropic, Ollama via rust-genai
- 🤖 Intelligent template generation: Natural language to ggen templates
- 🔍 SPARQL query generation: Intent-based query construction from RDF graphs
- 📊 Ontology generation: Domain descriptions to RDF/OWL schemas
- 🔄 Code refactoring: AI-assisted code improvement suggestions
- 🎪 MCP server integration: Model Context Protocol for AI tool integration
- ⚡ Production-ready: Structured error handling, configuration management, and comprehensive testing
## Quick Start

### Installation

```toml
# Add to your Cargo.toml
[dependencies]
ggen-ai = "1.0"
```
### Basic Setup

Set an API key for at least one provider through environment variables, as described in the Configuration section below.
### Basic Usage

```rust
// Illustrative sketch — exact module paths and signatures may differ;
// see the API Reference below.
use ggen_ai::{GenAiClient, LlmConfig, TemplateGenerator};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = GenAiClient::new(LlmConfig::default())?;
    let generator = TemplateGenerator::new(client);

    let template = generator
        .generate_template("REST API controller for user management", vec![])
        .await?;
    println!("{template}");
    Ok(())
}
```
### CLI Usage

```bash
# Subcommand names below are illustrative; run `ggen --help` for the exact CLI.

# Generate template using AI
ggen ai generate "REST API controller for user management"

# Generate SPARQL query from graph
ggen ai sparql --graph schema.ttl "users with email addresses"

# Generate RDF ontology
ggen ai graph "e-commerce domain with products and orders"

# Start MCP server for AI tools
ggen ai server
```
## API Reference

### Core Types

```rust
use ggen_ai::{GenAiClient, LlmConfig};

// LlmConfig   - configuration for all LLM providers (provider, model, credentials)
// GenAiClient - unified client over OpenAI, Anthropic, and Ollama
//
// The crate also exports types for the LLM completion response, streaming
// chunks, and usage statistics; see the crate docs for their exact names.
```
### Template Generation

```rust
use ggen_ai::TemplateGenerator;

// Create generator with LLM client
// (method arguments below are illustrative; check the crate docs)
let generator = TemplateGenerator::new(client);

// Generate REST API controller
let template = generator.generate_rest_controller("user management").await?;

// Generate data model
let template = generator.generate_data_model("blog posts with comments").await?;

// Generate from natural language description
let template = generator
    .generate_template("CLI argument parser in Rust", vec![])
    .await?;
```
### SPARQL Query Generation

```rust
use ggen_ai::SparqlGenerator;
use ggen_core::Graph; // import path is illustrative

// Create generator with LLM client
let generator = SparqlGenerator::new(client);

// Generate query from natural language intent
// (argument shapes are illustrative)
let query = generator
    .generate_query(&graph, "find all users with email addresses")
    .await?;

// Generate query with specific intent
let query = generator.generate_query_with_intent(&graph, &intent).await?;
```
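For a sense of the output, a query generated for an intent like "find all users with email addresses" could look like the following. This is purely illustrative; the actual query depends on the vocabulary of the supplied graph and the model's response:

```sparql
PREFIX foaf: <http://xmlns.com/foaf/0.1/>

SELECT ?user ?email
WHERE {
  ?user a foaf:Person ;
        foaf:mbox ?email .
}
```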
### Ontology Generation

```rust
use ggen_ai::OntologyGenerator;

// Create generator with LLM client
let generator = OntologyGenerator::new(client);

// Generate ontology from domain description
// (argument shapes are illustrative)
let ontology = generator
    .generate_ontology("e-commerce with products, orders, and customers")
    .await?;

// Generate domain-specific ontology
let ontology = generator.generate_domain_ontology(&domain).await?;
```
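As an illustration, an ontology generated for a small e-commerce domain might contain classes and properties along these lines (the shape is illustrative, not the crate's fixed output):

```turtle
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix ex:   <http://example.org/shop#> .

ex:Product a owl:Class .
ex:Order   a owl:Class .

ex:containsProduct a owl:ObjectProperty ;
    rdfs:domain ex:Order ;
    rdfs:range  ex:Product .
```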
### Code Refactoring

```rust
use ggen_ai::RefactorAssistant;

// Create refactoring assistant with LLM client
let assistant = RefactorAssistant::new(client);

// Suggest refactoring improvements
// (argument shapes are illustrative)
let suggestions = assistant.suggest_refactoring(&code, "rust").await?;

// Get detailed suggestions with explanations
for suggestion in suggestions {
    println!("{suggestion:?}");
}
```
## MCP Tools
The ggen-ai MCP server provides the following tools for AI assistant integration:
### ai_generate_template

Generate ggen templates from natural language descriptions.

**Parameters:**

- `description` (string, required): Natural language description
- `examples` (array, optional): Example requirements or context
- `language` (string, optional): Target programming language
- `framework` (string, optional): Target framework
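Over MCP, an assistant invokes the tool with a standard `tools/call` request; the argument values below are illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ai_generate_template",
    "arguments": {
      "description": "REST API controller for user management",
      "language": "rust",
      "framework": "axum"
    }
  }
}
```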
### ai_generate_sparql

Generate SPARQL queries from natural language intent and RDF graphs.

**Parameters:**

- `intent` (string, required): Natural language query description
- `graph` (string, required): RDF graph data in Turtle format
### ai_generate_ontology

Generate RDF/OWL ontologies from domain descriptions.

**Parameters:**

- `domain` (string, required): Domain description
- `requirements` (array, optional): Specific requirements or classes
### ai_refactor_code

Suggest code refactoring improvements using AI analysis.

**Parameters:**

- `code` (string, required): Code to analyze and refactor
- `language` (string, optional): Programming language for context
### ai_explain_graph

Explain RDF graph content in natural language.

**Parameters:**

- `graph` (string, required): RDF graph data in Turtle format
- `focus` (string, optional): Specific aspect to explain
### ai_suggest_delta

Suggest intelligent merge strategies for delta-driven projection.

**Parameters:**

- `baseline` (string, required): Baseline version
- `current` (string, required): Current generated version
- `manual` (string, optional): Manual modifications made
## Configuration

### Environment Variables

Configure LLM providers using environment variables:

```bash
# Variable names follow common provider conventions; check the crate docs
# for the authoritative list.

# OpenAI Configuration
export OPENAI_API_KEY="sk-..."
# Optional custom endpoint
export OPENAI_BASE_URL="https://api.openai.com/v1"

# Anthropic Configuration
export ANTHROPIC_API_KEY="sk-ant-..."

# Ollama Configuration (local models)
export OLLAMA_BASE_URL="http://localhost:11434"

# Global Configuration
export RUST_LOG=ggen_ai=info
```
### Programmatic Configuration

```rust
use ggen_ai::{GenAiClient, LlmConfig, TemplateGenerator};

// Configure for OpenAI
// (field names are illustrative; check the crate docs for the exact struct)
let config = LlmConfig {
    provider: "openai".into(),
    model: "gpt-4o".into(),
    ..Default::default()
};

// Create client
let client = GenAiClient::new(config)?;
let generator = TemplateGenerator::new(client);
```
## Examples

### Complete Template Generation Workflow

```rust
use ggen_ai::{GenAiClient, LlmConfig, TemplateGenerator};

// End-to-end sketch; exact signatures may differ.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = GenAiClient::new(LlmConfig::default())?;
    let generator = TemplateGenerator::new(client);

    let template = generator
        .generate_template("REST API controller for user management", vec![])
        .await?;
    println!("{template}");
    Ok(())
}
```
### SPARQL Query Generation

```rust
use ggen_ai::{GenAiClient, LlmConfig, SparqlGenerator};
use ggen_core::Graph; // import path is illustrative

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = GenAiClient::new(LlmConfig::default())?;
    let generator = SparqlGenerator::new(client);

    // Load an RDF graph, then describe the query in natural language
    // (Graph construction is illustrative)
    let graph = Graph::load("schema.ttl")?;
    let query = generator
        .generate_query(&graph, "find all users with email addresses")
        .await?;
    println!("{query}");
    Ok(())
}
```
## Testing

```bash
# Run unit tests
cargo make test

# Run integration tests (target name is illustrative)
cargo make test-integration

# Run with debug logging
RUST_LOG=ggen_ai=debug cargo make test

# Test specific provider
OPENAI_API_KEY="test-key" cargo make test
```
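For unit tests that must not hit a real provider, one common pattern is to stub the LLM behind a small trait. The `LlmClient` trait and `MockClient` below are an illustrative sketch, not ggen-ai's actual client abstraction:

```rust
// Minimal sketch of stubbing an LLM for deterministic unit tests.
// `LlmClient` and `MockClient` are illustrative names, not crate APIs.

trait LlmClient {
    fn complete(&self, prompt: &str) -> Result<String, String>;
}

/// Test double that returns a canned response instead of calling a provider.
struct MockClient {
    canned: String,
}

impl LlmClient for MockClient {
    fn complete(&self, _prompt: &str) -> Result<String, String> {
        Ok(self.canned.clone())
    }
}

fn main() {
    let client = MockClient { canned: "---\nto: out.rs\n---".to_string() };
    // Tests stay fast, offline, and deterministic.
    let template = client.complete("generate a template").unwrap();
    assert!(template.starts_with("---"));
    println!("mock template ok");
}
```

Generators that accept any `LlmClient` can then be exercised end-to-end in CI without API keys.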
## Contributing

- **Follow core team best practices** - Use `cargo make` commands, no direct `cargo` usage
- **Add comprehensive tests** - Unit, integration, and property tests
- **Update documentation** - Keep README and guides current
- **Use structured error handling** - No `.unwrap()` or `.expect()` in library code
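In practice, the "no `.unwrap()`" guideline means library functions return `Result` and propagate failures with `?`. A minimal sketch — the `GgenAiError` enum here is illustrative, not the crate's actual error type:

```rust
use std::fmt;

// Illustrative error enum; the real crate defines its own error type.
#[derive(Debug)]
enum GgenAiError {
    MissingApiKey(String),
    Provider(String),
}

impl fmt::Display for GgenAiError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            GgenAiError::MissingApiKey(var) => write!(f, "missing API key in {var}"),
            GgenAiError::Provider(msg) => write!(f, "provider error: {msg}"),
        }
    }
}

impl std::error::Error for GgenAiError {}

// Library-style function: maps the failure into a typed error
// instead of panicking with .unwrap()/.expect().
fn api_key_from_env(var: &str) -> Result<String, GgenAiError> {
    std::env::var(var).map_err(|_| GgenAiError::MissingApiKey(var.to_string()))
}

fn main() {
    // Callers decide how to handle the error; the library never panics.
    match api_key_from_env("GGEN_DOC_EXAMPLE_KEY") {
        Ok(key) => println!("got key of length {}", key.len()),
        Err(e) => println!("handled error: {e}"),
    }
}
```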
## Migration from v0.x

**Major Update:** ggen-ai v1.0.0 migrates from custom LLM clients to rust-genai for production-ready multi-provider support.

- **Breaking Changes:** Provider initialization now uses configuration objects
- **New Features:** Environment-based configuration, structured error handling
- **Migration Guide:** See `docs/ggen-ai-migration-guide.md`
## License
MIT