# Secretary 🚀
Secretary is a Rust library that translates natural language into structured data using large language models (LLMs). It provides a simple, type-safe way to extract structured information from unstructured text.
## Features
- 🔍 Schema-Based Extraction: Define your data structure using Rust structs and let LLMs extract matching data
- 🔄 Context-Aware Conversations: Maintain conversation state for multi-turn interactions
- 🧠 Progressive Data Building: Incrementally build complex data structures through conversational interactions
- 📋 Declarative Schema Annotations: Document your schemas with field descriptions that guide the LLM
- 🔌 Extensible LLM Support: Currently supports OpenAI API with more providers planned
## Quick Start

Add Secretary to your project:

```shell
cargo add secretary
```

Or, add it to your `Cargo.toml`:

```toml
[dependencies]
secretary = "0.2.30"
```
## Basic Example

A minimal sketch of the basic flow. The module paths, type names, and derives below are assumptions; see the `examples/` directory for working code.

```rust
use secretary::{DataModel, Task}; // paths are assumptions
use serde::{Deserialize, Serialize};

// Define your output schema
#[derive(Serialize, Deserialize)]
struct SentimentAnalysis {
    sentiment: String,
    confidence: f64,
}

// Implement DataModel to provide instructions for the schema
// (field-level instructions guide the LLM's extraction)
```
## How It Works
- Define Your Schema: Create a Rust struct that represents the data structure you want to extract
- Implement DataModel: Provide instructions for each field using the `DataModel` trait
- Create a Task: Initialize a task with your schema and any additional instructions
- Process Text: Send natural language input to an LLM through the Secretary API
- Get Structured Data: Receive a JSON result that matches your defined schema
## The DataModel Trait
The DataModel trait is essential for guiding the LLM on how to populate your schema:
```rust
use secretary::DataModel; // import path is an assumption; see the crate docs
```
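To illustrate the idea behind the trait, here is a stdlib-only toy (not Secretary's actual API; every name in it is invented for illustration) showing how per-field instructions can be assembled into a system prompt for the LLM:

```rust
// Toy stand-in for the role DataModel plays: each schema type
// describes its fields so a prompt can be generated from them.
trait FieldInstructions {
    fn instructions() -> Vec<(&'static str, &'static str)>;
}

struct SentimentAnalysis;

impl FieldInstructions for SentimentAnalysis {
    fn instructions() -> Vec<(&'static str, &'static str)> {
        vec![
            ("sentiment", "one of: positive, negative, neutral"),
            ("confidence", "a number between 0.0 and 1.0"),
        ]
    }
}

// Assemble a system prompt from the field-level instructions.
fn system_prompt<T: FieldInstructions>() -> String {
    let mut prompt = String::from("Return JSON with these fields:\n");
    for (field, instruction) in T::instructions() {
        prompt.push_str(&format!("- {field}: {instruction}\n"));
    }
    prompt
}

fn main() {
    let prompt = system_prompt::<SentimentAnalysis>();
    assert!(prompt.contains("sentiment"));
    println!("{prompt}");
}
```

Secretary's real `DataModel` trait fills this role for your schema structs; consult the crate documentation for its actual methods.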
## Advanced Features

### Multi-Turn Conversations
Secretary supports contextual, multi-turn conversations that build data progressively:
```rust
// Constructor and method signatures below are assumptions; see
// examples/generate_json_with_context.rs for working code.
use secretary::Task;

// Initialize your schema and task
let mut task = Task::new(/* schema and instructions */);

// First user message
task.push("user", "I want to plan a trip to Japan").unwrap();

// Generate response based on context
let response = llm.generate_json_with_context(&task).unwrap();

// Add response to conversation context
task.push("assistant", &response).unwrap();

// Continue conversation
task.push("user", "I'll be traveling in October").unwrap();
let response2 = llm.generate_json_with_context(&task).unwrap();
```
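Conceptually, the task accumulates an ordered message history that is replayed to the LLM on each call. A stdlib-only toy of that context object (an illustration of the idea, not the crate's API):

```rust
// Toy model of the conversation context a task maintains.
struct Conversation {
    messages: Vec<(String, String)>, // (role, content) pairs, in order
}

impl Conversation {
    fn new() -> Self {
        Conversation { messages: Vec::new() }
    }

    // Append one message to the running context.
    fn push(&mut self, role: &str, content: &str) {
        self.messages.push((role.to_string(), content.to_string()));
    }
}

fn main() {
    let mut ctx = Conversation::new();
    ctx.push("user", "I want to plan a trip to Japan");
    ctx.push("assistant", "{\"destination\": \"Japan\"}");
    ctx.push("user", "I'll be traveling in October");
    assert_eq!(ctx.messages.len(), 3);
    println!("{} messages in context", ctx.messages.len());
}
```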
### Contextual Tasks with Reasoning
For complex data gathering, use ContextualTask to maintain reasoning, notes, and follow-up questions:
```rust
// Constructor and method signatures below are assumptions; see the
// contextual_prompt_* examples for working code.
use secretary::ContextualTask;

// Create a contextual task for complex interactions
let mut task = ContextualTask::new(/* schema and instructions */);

// Process user input
task.push("user", "Help me plan a two-week trip").unwrap();
let response = llm.generate_json_with_context(&task).unwrap();

// The response includes reasoning, notes, and structured data
// ContextualTask automatically manages reasoning and follow-up questions
```
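To make the shape of such a response concrete, here is a hypothetical illustration (all field names are invented for this sketch, not Secretary's actual types):

```rust
// Hypothetical shape of a contextual response: reasoning and notes
// accompany the structured payload.
#[derive(Debug)]
struct ContextualResponse {
    reasoning: String,
    notes: Vec<String>,
    follow_up_questions: Vec<String>,
    data: String, // the structured JSON payload
}

fn main() {
    let response = ContextualResponse {
        reasoning: "The user named a destination but no dates.".to_string(),
        notes: vec!["Destination: Japan".to_string()],
        follow_up_questions: vec!["When are you traveling?".to_string()],
        data: r#"{"destination": "Japan"}"#.to_string(),
    };
    assert_eq!(response.follow_up_questions.len(), 1);
    println!("{response:?}");
}
```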
### Async Processing
Secretary supports async operations for concurrent processing:
```rust
// Sketch only; the async API surface and runtime choice are assumptions.
// See examples/async_generate_json.rs for working code.
use secretary::Task;

#[tokio::main]
async fn main() {
    // Issue several extraction requests concurrently,
    // e.g. with tokio::join! or a JoinSet.
}
```
## Examples
The examples/ directory contains comprehensive examples demonstrating different use cases:
### Basic Usage

- `generate_json.rs` - Simple sentiment analysis with structured output
- `generate_json_with_context.rs` - Multi-turn conversation with context preservation

### Async Processing

- `async_generate_json.rs` - Concurrent processing of multiple requests
- `async_generate_json_with_context.rs` - Async multi-turn conversations

### Contextual Tasks

- `contextual_prompt_basic.rs` - Product analysis with reasoning and notes
- `contextual_prompt_conversation.rs` - Interactive trip planning conversation
- `contextual_prompt_analysis.rs` - Advanced contextual analysis patterns

### Real-World Applications

- `product_review_analysis.rs` - E-commerce review processing
Run any example with:
```shell
# Set environment variables first (see the Environment Setup section below)

# Run an example
cargo run --example generate_json
```
## Documentation
- Getting Started - Complete setup guide
- Examples - Practical code examples
- Project Structure - Architecture overview
- API Documentation - Detailed API reference
## Environment Setup
Secretary requires the following environment variables for OpenAI integration (the variable names below are assumptions; check the crate documentation for the exact names):

```shell
export OPENAI_API_KEY="your-api-key"
export OPENAI_MODEL="gpt-4"  # or gpt-4o, gpt-3.5-turbo, etc.
```
These environment variables are used by the examples and can be referenced in your code as:
```rust
// Type name is an assumption; see the crate docs for the actual client type.
let llm = OpenAiLlm::new().unwrap();
```
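If you need the same configuration elsewhere in your program, standard `std::env` access works; the variable name below is an assumption:

```rust
use std::env;

fn main() {
    // Fall back to a default model when the variable is unset.
    // "OPENAI_MODEL" is an assumed name; see Environment Setup above.
    let model = env::var("OPENAI_MODEL").unwrap_or_else(|_| "gpt-4".to_string());
    println!("using model: {model}");
}
```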
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.