docs.rs failed to build langchainrust-0.2.4
Please check the build logs for more information.
See Builds for ideas on how to fix a failed build, or Metadata for how to configure docs.rs builds.
If you believe this is docs.rs' fault, open an issue.
Visit the last successful build:
langchainrust-0.2.0
langchainrust
A LangChain-inspired Rust framework for building LLM applications. Provides abstractions for agents, chains, memory, RAG pipelines, and tool-calling workflows.
Chinese Documentation
✨ Core Features
| Component | Description |
|---|---|
| LLM | OpenAI-compatible API with streaming output and function calling |
| Agents | ReActAgent (text parsing) + FunctionCallingAgent (native FC) |
| Prompts | PromptTemplate and ChatPromptTemplate |
| Memory | Conversation history management |
| Chains | LLMChain and SequentialChain workflows |
| RAG | Document splitting, vector stores, semantic retrieval |
| Loaders | PDF and CSV document loading |
| Tools | Built-in: calculator, date/time, math, URL fetching |
| Callbacks | Execution tracing, LangSmith integration, logging |
| Tool Calling | bind_tools(), structured output, ToolDefinition, to_tool_definition() |
Key Benefits
- 🚀 Fully Async - Tokio-based async/await support
- 🔒 Type-Safe - leverages Rust's type system for reliability
- 📦 Zero-Cost Abstractions - designed for high performance
- 🎯 Simple API - intuitive, easy-to-use interfaces
- 🔌 Extensible - easy to add custom tools and components
📦 Installation
Add to Cargo.toml:

    [dependencies]
    langchainrust = "0.2.4"
    tokio = { version = "1.0", features = ["full"] }
🚀 Quick Start
Basic Chat

    // Sketch; see examples/hello_llm for the full program. Exact import
    // paths and signatures are illustrative.
    use langchainrust::language_models::openai::OpenAI;
    use langchainrust::schema::Message;

    // async: build a client, send a Message, await the reply, e.g.
    // let reply = llm.invoke(vec![Message::human("Hello!")]).await?;
Prompt Templates

    // Template strings and variable names are illustrative; the constructor,
    // insert, and format calls are the surviving skeleton of the original.
    use langchainrust::prompts::{PromptTemplate, ChatPromptTemplate};
    use langchainrust::schema::Message;
    use std::collections::HashMap;

    // String template
    let template = PromptTemplate::new("Tell me about {topic} in a {style} style.");
    let mut vars = HashMap::new();
    vars.insert("topic".to_string(), "Rust".to_string());
    vars.insert("style".to_string(), "concise".to_string());
    let prompt = template.format(&vars)?;

    // Chat template
    let chat_template = ChatPromptTemplate::new(/* system + user message templates */);
    let mut vars = HashMap::new();
    vars.insert("role".to_string(), "tutor".to_string());
    vars.insert("language".to_string(), "Rust".to_string());
    vars.insert("level".to_string(), "beginner".to_string());
    vars.insert("question".to_string(), "What is ownership?".to_string());
    let messages = chat_template.format(&vars)?;
Agents and Tool Calling
LangChainRust provides two kinds of agents:
| Agent | Mechanism | When to Use |
|---|---|---|
| ReActAgent | Text parsing (regex extraction) | Models without Function Calling support |
| FunctionCallingAgent | Native Function Calling | Models with FC support (recommended) |
Using FunctionCallingAgent (Recommended)

    // Identifiers are illustrative; see examples/agent_with_tools.
    use langchainrust::agents::FunctionCallingAgent;
    use std::sync::Arc;

    let tools: Vec<Arc<dyn Tool>> = vec![/* e.g. Arc::new(Calculator) */];
    // FunctionCallingAgent automatically binds the tools to the LLM
    let agent = FunctionCallingAgent::new(llm, tools);
    let executor = AgentExecutor::new(agent)   // executor type name assumed
        .with_max_iterations(5);
    let result = executor.invoke("your question").await?;
    println!("{result}");
Using ReActAgent (For Models Without FC)

    // Identifiers are illustrative; mirrors the FunctionCallingAgent setup.
    use langchainrust::agents::ReActAgent;
    use std::sync::Arc;

    let tools: Vec<Arc<dyn Tool>> = vec![/* e.g. Arc::new(Calculator) */];
    let agent = ReActAgent::new(llm, tools);
    let executor = AgentExecutor::new(agent)   // executor type name assumed
        .with_max_iterations(5);
    let result = executor.invoke("your question").await?;
    println!("{result}");
How the Two Agents Differ
| Dimension | ReActAgent | FunctionCallingAgent |
|---|---|---|
| Tool invocation | Text parsing (regex) | Native Function Calling |
| Reliability | Depends on prompt format | Type-safe, natively supported by the model |
| Token usage | Higher (format instructions required) | Lower (no format instructions) |
| Supported models | All models | FC-capable models (GPT-4, Claude, Gemini) |
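As a rough illustration of the first row of the comparison, here is a minimal sketch of the text-parsing step a ReAct-style agent performs on an LLM completion. The `Action:` / `Action Input:` markers follow the common ReAct prompt convention and are an assumption; ReActAgent's exact format may differ.

```rust
// Extract the tool name and its argument from a ReAct-formatted completion.
fn parse_react_step(output: &str) -> Option<(String, String)> {
    let mut action = None;
    let mut input = None;
    for line in output.lines() {
        if let Some(rest) = line.strip_prefix("Action:") {
            action = Some(rest.trim().to_string());
        } else if let Some(rest) = line.strip_prefix("Action Input:") {
            input = Some(rest.trim().to_string());
        }
    }
    // Both markers must be present for a valid tool call.
    Some((action?, input?))
}

fn main() {
    let completion = "Thought: I need to add the numbers.\n\
                      Action: calculator\n\
                      Action Input: 2 + 2";
    let (tool, arg) = parse_react_step(completion).expect("no action found");
    assert_eq!(tool, "calculator");
    assert_eq!(arg, "2 + 2");
    println!("call tool `{tool}` with `{arg}`");
}
```

This fragility is exactly why the table recommends native function calling where the model supports it: a malformed completion simply fails to parse.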
Conversation Memory

    // Identifiers are illustrative; see examples/memory_conversation.
    use langchainrust::memory::ChatHistory;  // type name assumed
    use langchainrust::schema::Message;

    let mut history = ChatHistory::new();
    // Add messages
    history.add_message(Message::human("Hi"));
    history.add_message(Message::ai("Hello!"));
    // Retrieve the history
    for msg in &history.messages { /* ... */ }
Chain Pipelines

    // Identifiers are illustrative; see examples/chain_pipeline.
    use langchainrust::chains::{LLMChain, SequentialChain};
    use std::sync::Arc;
    use std::collections::HashMap;
    use serde_json::Value;  // Value type assumed to come from serde_json

    // Single-step chain
    let chain1 = LLMChain::new(llm.clone(), prompt1);
    // Multi-step sequential chain
    let chain2 = LLMChain::new(llm.clone(), prompt2);
    let pipeline = SequentialChain::new()
        .add_chain(chain1)
        .add_chain(chain2);
    let mut inputs = HashMap::new();
    inputs.insert("topic".to_string(), Value::from("Rust"));
    let results = pipeline.invoke(inputs).await?;
RAG (Retrieval-Augmented Generation)

    // Identifiers are illustrative; see examples/rag_demo.
    use langchainrust::retrieval::{TextSplitter, Retriever};  // paths assumed
    use std::sync::Arc;

    // Create documents
    let docs = vec![/* Document values */];
    // Split documents into chunks
    let splitter = TextSplitter::new(/* chunk size, overlap */);
    let chunks = splitter.split_document(&docs[0]);
    // Create a retriever over a vector store + embeddings
    let store = InMemoryVectorStore::new();      // concrete type names assumed
    let embeddings = OpenAIEmbeddings::new();
    let retriever = Retriever::new(Arc::new(store), Arc::new(embeddings));
    // Index documents
    retriever.add_documents(chunks).await?;
    // Retrieve
    let relevant_docs = retriever.retrieve("your query").await?;
Document Loaders
LangChainRust can load documents from multiple formats, including PDF and CSV files.
PDF Loader

    // Type name and constructor are illustrative.
    use langchainrust::loaders::PdfLoader;

    // Load a PDF file
    let pdf_loader = PdfLoader::new("path/to/file.pdf");
    let documents = pdf_loader.load().await?;
    // Each extracted document carries text content plus metadata
    for doc in documents { /* ... */ }
CSV Loader

    // Type name and constructor are illustrative.
    use langchainrust::loaders::CsvLoader;

    // Load a CSV file, using the "description" column as the content
    let csv_loader = CsvLoader::new("path/to/file.csv", "description");
    let documents = csv_loader.load().await?;
    // Each row becomes its own document with matching metadata
    for doc in documents { /* ... */ }
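Conceptually, a CSV loader's row-to-document mapping looks like the sketch below: one document per row, content taken from the chosen column, remaining columns kept as metadata. The `load_csv` helper is hypothetical, and the naive comma splitting is for illustration only; a real loader handles quoting and escaping.

```rust
// Map CSV rows to (content, metadata) pairs, stand-ins for Documents.
fn load_csv(csv: &str, content_col: &str) -> Vec<(String, Vec<(String, String)>)> {
    let mut lines = csv.lines();
    let headers: Vec<&str> = lines.next().unwrap_or("").split(',').collect();
    lines
        .map(|row| {
            let fields: Vec<&str> = row.split(',').collect();
            let mut content = String::new();
            let mut meta = Vec::new();
            for (h, f) in headers.iter().zip(&fields) {
                if *h == content_col {
                    content = f.to_string(); // chosen column becomes the body
                } else {
                    meta.push((h.to_string(), f.to_string())); // rest is metadata
                }
            }
            (content, meta)
        })
        .collect()
}

fn main() {
    let csv = "id,description\n1,first item\n2,second item";
    let docs = load_csv(csv, "description");
    assert_eq!(docs.len(), 2);
    assert_eq!(docs[0].0, "first item");
    assert_eq!(docs[0].1, vec![("id".to_string(), "1".to_string())]);
}
```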
📚 Complete Examples
See the examples/ directory:
Basic
- hello_llm - basic LLM chat
- streaming - streaming output
- prompt_template - prompt templates
- tools - built-in tools
Intermediate
- agent_with_tools - agent tool calling
- memory_conversation - multi-turn conversation memory
- chain_pipeline - chain workflows
Advanced
- rag_demo - full RAG pipeline
- multi_tool_agent - multi-tool agent
- full_pipeline - complete AI application
Run examples:

    # Without an API key (e.g. the built-in tools demo)
    cargo run --example tools
    # With an API key (e.g. LLM chat)
    cargo run --example hello_llm
🧪 Testing

    # Run all tests
    cargo test
    # Run a specific module's tests
    cargo test <module_name>
    # Show test output
    cargo test -- --nocapture
📁 Project Structure

    src/
    ├── core/                  # Core abstractions
    │   ├── language_models/   # Base LLM traits
    │   ├── runnables/         # Runnable trait
    │   └── tools/             # Tool trait + to_tool_definition()
    ├── language_models/       # LLM implementations
    │   └── openai/            # OpenAI client (Function Calling support)
    ├── agents/                # Agent framework
    │   ├── react/             # ReActAgent (text parsing)
    │   └── function_calling/  # FunctionCallingAgent (native FC)
    ├── prompts/               # Prompt templates
    ├── memory/                # Memory management
    ├── chains/                # Chain workflows
    ├── retrieval/             # RAG components
    ├── embeddings/            # Text embeddings
    ├── vector_stores/         # Vector stores
    ├── tools/                 # Built-in tools
    └── schema/                # Data structures
🔧 Configuration
Environment Variables

    export OPENAI_API_KEY="sk-..."   # variable names assumed; see crate docs
    # Optional: custom endpoint
    export OPENAI_BASE_URL="https://your-proxy.example/v1"
OpenAIConfig Options
| Field | Type | Description |
|---|---|---|
| api_key | String | OpenAI API key |
| base_url | String | API endpoint (supports proxies) |
| model | String | Model name (e.g. "gpt-3.5-turbo") |
| streaming | bool | Enable streaming responses |
| temperature | Option<f32> | Sampling temperature (0.0-2.0) |
| max_tokens | Option<usize> | Maximum tokens to generate |
🔐 Security Notes
- Never commit API keys to version control
- Store secrets in environment variables
- Proxies and custom endpoints are supported
📖 Documentation
🤝 Contributing
Contributions are welcome! See CONTRIBUTING.md for details.
📄 License
Apache License, Version 2.0 or MIT License, at your option.
🙏 Acknowledgments
Inspired by LangChain, implemented in Rust.
English Documentation
✨ Features
| Component | Description |
|---|---|
| LLM | OpenAI-compatible API with streaming support |
| Agents | ReActAgent (text parsing) + FunctionCallingAgent (native FC) |
| Prompts | PromptTemplate and ChatPromptTemplate |
| Memory | Conversation history management |
| Chains | LLMChain and SequentialChain workflows |
| RAG | Document splitting, vector stores, semantic retrieval |
| Loaders | PDF and CSV document loading support |
| Tools | Built-in: Calculator, DateTime, Math, URLFetch |
| Tool Calling | bind_tools(), to_tool_definition(), ToolDefinition, structured output |
Key Benefits
- 🚀 Fully Async - Tokio-based async/await support
- 🔒 Type-Safe - Leverage Rust's type system
- 📦 Zero-Cost Abstractions - High-performance design
- 🎯 Simple API - Intuitive interfaces
- 🔌 Extensible - Easy to add custom tools
📦 Installation
Add to Cargo.toml:
    [dependencies]
    langchainrust = "0.2.4"
    tokio = { version = "1.0", features = ["full"] }
🚀 Quick Start
Basic Chat
    // Sketch; see examples/hello_llm for the full program. Exact import
    // paths and signatures are illustrative.
    use langchainrust::language_models::openai::OpenAI;
    use langchainrust::schema::Message;

    // async: build a client, send a Message, await the reply, e.g.
    // let reply = llm.invoke(vec![Message::human("Hello!")]).await?;
Prompt Templates
    // Template strings and variable names are illustrative; the constructor,
    // insert, and format calls are the surviving skeleton of the original.
    use langchainrust::prompts::{PromptTemplate, ChatPromptTemplate};
    use langchainrust::schema::Message;
    use std::collections::HashMap;

    // String template
    let template = PromptTemplate::new("Tell me about {topic} in a {style} style.");
    let mut vars = HashMap::new();
    vars.insert("topic".to_string(), "Rust".to_string());
    vars.insert("style".to_string(), "concise".to_string());
    let prompt = template.format(&vars)?;

    // Chat template
    let chat_template = ChatPromptTemplate::new(/* system + user message templates */);
    let mut vars = HashMap::new();
    vars.insert("role".to_string(), "tutor".to_string());
    vars.insert("language".to_string(), "Rust".to_string());
    vars.insert("level".to_string(), "beginner".to_string());
    vars.insert("question".to_string(), "What is ownership?".to_string());
    let messages = chat_template.format(&vars)?;
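Under the hood, a template's format step is simple variable substitution. The sketch below is self-contained and runnable; the `{name}` placeholder syntax is an assumption about the crate's convention.

```rust
use std::collections::HashMap;

// Replace every `{key}` placeholder in the template with its bound value.
fn format_template(template: &str, vars: &HashMap<&str, &str>) -> String {
    let mut out = template.to_string();
    for (key, value) in vars {
        out = out.replace(&format!("{{{key}}}"), value);
    }
    out
}

fn main() {
    let mut vars = HashMap::new();
    vars.insert("name", "Alice");
    vars.insert("topic", "Rust");
    let prompt = format_template("Hi {name}, tell me about {topic}.", &vars);
    assert_eq!(prompt, "Hi Alice, tell me about Rust.");
    println!("{prompt}");
}
```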
Agent with Tools
LangChainRust provides two types of Agents:
| Agent | Method | Use Case |
|---|---|---|
| ReActAgent | Text parsing (regex) | Models without Function Calling support |
| FunctionCallingAgent | Native Function Calling | Models with FC support (recommended) |
Using FunctionCallingAgent (Recommended)
    // Identifiers are illustrative; see examples/agent_with_tools.
    use langchainrust::agents::FunctionCallingAgent;
    use std::sync::Arc;

    let tools: Vec<Arc<dyn Tool>> = vec![/* e.g. Arc::new(Calculator) */];
    // FunctionCallingAgent automatically binds the tools to the LLM
    let agent = FunctionCallingAgent::new(llm, tools);
    let executor = AgentExecutor::new(agent)   // executor type name assumed
        .with_max_iterations(5);
    let result = executor.invoke("your question").await?;
    println!("{result}");
Using ReActAgent (Legacy Support)
    // Identifiers are illustrative; mirrors the FunctionCallingAgent setup.
    use langchainrust::agents::ReActAgent;
    use std::sync::Arc;

    let tools: Vec<Arc<dyn Tool>> = vec![/* e.g. Arc::new(Calculator) */];
    let agent = ReActAgent::new(llm, tools);
    let executor = AgentExecutor::new(agent)   // executor type name assumed
        .with_max_iterations(5);
    let result = executor.invoke("your question").await?;
    println!("{result}");
Memory
    // Identifiers are illustrative; see examples/memory_conversation.
    use langchainrust::memory::ChatHistory;  // type name assumed
    use langchainrust::schema::Message;

    let mut history = ChatHistory::new();
    // Add messages
    history.add_message(Message::human("Hi"));
    history.add_message(Message::ai("Hello!"));
    // Retrieve messages
    for msg in &history.messages { /* ... */ }
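A conversation memory like the one above can be sketched with plain Rust types. The fixed-window trimming below is one possible retention policy, labeled as an assumption; the crate's memory module may keep the full history or summarize instead.

```rust
// A windowed history: keeps at most `max` messages, dropping the oldest.
struct History {
    max: usize,
    messages: Vec<(String, String)>, // (role, content)
}

impl History {
    fn new(max: usize) -> Self {
        Self { max, messages: Vec::new() }
    }

    fn add_message(&mut self, role: &str, content: &str) {
        self.messages.push((role.to_string(), content.to_string()));
        if self.messages.len() > self.max {
            self.messages.remove(0); // evict the oldest turn
        }
    }
}

fn main() {
    let mut history = History::new(2);
    history.add_message("user", "Hi");
    history.add_message("assistant", "Hello!");
    history.add_message("user", "What's Rust?");
    // Only the two most recent messages survive.
    assert_eq!(history.messages.len(), 2);
    assert_eq!(history.messages[0].1, "Hello!");
}
```

Windowing bounds prompt size at the cost of forgetting early turns, which is why summarizing memories are a common alternative.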
Chain Pipelines
    // Identifiers are illustrative; see examples/chain_pipeline.
    use langchainrust::chains::{LLMChain, SequentialChain};
    use std::sync::Arc;
    use std::collections::HashMap;
    use serde_json::Value;  // Value type assumed to come from serde_json

    // Single chain
    let chain1 = LLMChain::new(llm.clone(), prompt1);
    // Sequential chains
    let chain2 = LLMChain::new(llm.clone(), prompt2);
    let pipeline = SequentialChain::new()
        .add_chain(chain1)
        .add_chain(chain2);
    let mut inputs = HashMap::new();
    inputs.insert("topic".to_string(), Value::from("Rust"));
    let results = pipeline.invoke(inputs).await?;
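The data flow of a sequential chain can be shown without any LLM calls: each step reads a shared variable map and writes its output back under a key that later steps consume. Closures stand in for chains here; this is a sketch of the pattern, not the crate's API.

```rust
use std::collections::HashMap;

// Run each (output_key, step) pair in order, threading the variable map.
fn run_pipeline(
    steps: Vec<(&str, Box<dyn Fn(&HashMap<String, String>) -> String>)>,
    mut vars: HashMap<String, String>,
) -> HashMap<String, String> {
    for (output_key, step) in steps {
        let out = step(&vars);
        vars.insert(output_key.to_string(), out); // visible to the next step
    }
    vars
}

fn main() {
    let steps: Vec<(&str, Box<dyn Fn(&HashMap<String, String>) -> String>)> = vec![
        ("title", Box::new(|v| format!("A Tale of {}", v["topic"]))),
        ("summary", Box::new(|v| format!("Summary of '{}'", v["title"]))),
    ];
    let mut inputs = HashMap::new();
    inputs.insert("topic".to_string(), "Two Crates".to_string());
    let results = run_pipeline(steps, inputs);
    assert_eq!(results["summary"], "Summary of 'A Tale of Two Crates'");
}
```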
RAG Pipeline
    // Identifiers are illustrative; see examples/rag_demo.
    use langchainrust::retrieval::{TextSplitter, Retriever};  // paths assumed
    use std::sync::Arc;

    // Create documents
    let docs = vec![/* Document values */];
    // Split documents
    let splitter = TextSplitter::new(/* chunk size, overlap */);
    let chunks = splitter.split_document(&docs[0]);
    // Create retriever (vector store + embeddings)
    let store = InMemoryVectorStore::new();      // concrete type names assumed
    let embeddings = OpenAIEmbeddings::new();
    let retriever = Retriever::new(Arc::new(store), Arc::new(embeddings));
    // Index documents
    retriever.add_documents(chunks).await?;
    // Search
    let relevant_docs = retriever.retrieve("your query").await?;
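The splitting step of the pipeline can be illustrated with a self-contained sketch. The fixed-size-with-overlap strategy and the sizes used are assumptions; the crate's splitter may cut on sentence or token boundaries instead.

```rust
// Split text into chunks of `chunk_size` chars, each sharing `overlap`
// chars with its predecessor so context isn't lost at chunk edges.
fn split_text(text: &str, chunk_size: usize, overlap: usize) -> Vec<String> {
    assert!(overlap < chunk_size);
    let chars: Vec<char> = text.chars().collect();
    let mut chunks = Vec::new();
    let mut start = 0;
    while start < chars.len() {
        let end = (start + chunk_size).min(chars.len());
        chunks.push(chars[start..end].iter().collect());
        if end == chars.len() {
            break;
        }
        start = end - overlap; // step back so consecutive chunks share context
    }
    chunks
}

fn main() {
    let chunks = split_text("abcdefghij", 4, 1);
    assert_eq!(chunks, vec!["abcd", "defg", "ghij"]);
}
```

Overlap trades a little index size for robustness: a fact straddling a chunk boundary still appears whole in at least one chunk.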
📚 Examples
See examples/ for complete code:
Basic
- hello_llm - Basic LLM chat
- streaming - Streaming output
- prompt_template - Using templates
- tools - Built-in tools
Intermediate
- agent_with_tools - Agent with tool calling
- memory_conversation - Multi-turn conversations
- chain_pipeline - Chain workflows
Advanced
- rag_demo - Full RAG pipeline
- multi_tool_agent - Agent with multiple tools
- full_pipeline - Complete AI application
Run examples:
    # Without an API key (e.g. the built-in tools demo)
    cargo run --example tools
    # With an API key (e.g. LLM chat)
    cargo run --example hello_llm
🧪 Testing
    # Run all tests
    cargo test
    # Run a specific module
    cargo test <module_name>
    # Show test output
    cargo test -- --nocapture
📁 Project Structure
    src/
    ├── core/                  # Core abstractions
    │   ├── language_models/   # Base LLM traits
    │   ├── runnables/         # Runnable trait
    │   └── tools/             # Tool trait + to_tool_definition()
    ├── language_models/       # LLM implementations
    │   └── openai/            # OpenAI client (Function Calling support)
    ├── agents/                # Agent framework
    │   ├── react/             # ReActAgent (text parsing)
    │   └── function_calling/  # FunctionCallingAgent (native FC)
    ├── prompts/               # Prompt templates
    ├── memory/                # Memory management
    ├── chains/                # Chain workflows
    ├── retrieval/             # RAG components
    ├── embeddings/            # Text embeddings
    ├── vector_stores/         # Vector databases
    ├── tools/                 # Built-in tools
    └── schema/                # Data structures
🔧 Configuration
Environment Variables

    export OPENAI_API_KEY="sk-..."   # variable names assumed; see crate docs
    # Optional: custom endpoint
    export OPENAI_BASE_URL="https://your-proxy.example/v1"
OpenAIConfig Options
| Field | Type | Description |
|---|---|---|
| api_key | String | OpenAI API key |
| base_url | String | API endpoint (supports proxies) |
| model | String | Model name (e.g., "gpt-3.5-turbo") |
| streaming | bool | Enable streaming responses |
| temperature | Option<f32> | Sampling temperature (0.0-2.0) |
| max_tokens | Option<usize> | Maximum tokens to generate |
🔐 Security
- Never commit API keys to version control
- Use environment variables for secrets
- Support for proxy/custom endpoints
📖 Documentation
🤝 Contributing
Contributions are welcome! See CONTRIBUTING.md for details.
📄 License
Licensed under either of:
- Apache License, Version 2.0
- MIT License
🙏 Acknowledgments
Inspired by LangChain, implemented in Rust.