
langchainrust


A LangChain-inspired Rust framework for building LLM applications.

What it solves: build agents, RAG pipelines, BM25 keyword search, hybrid retrieval, and LangGraph-style graph workflows, all in pure Rust.


Core Features

| Component | Description |
|-----------|-------------|
| LLM | OpenAI / Ollama compatible, streaming, function calling |
| Agents | ReActAgent + FunctionCallingAgent |
| Memory | Buffer / Window / Summary / SummaryBuffer |
| Chains | LLMChain / SequentialChain / RetrievalQA |
| RAG | Document splitting, vector store, semantic retrieval |
| BM25 | Keyword search, Chinese/English tokenization, AutoMerging |
| Hybrid | BM25 + vector hybrid retrieval, RRF fusion |
| LangGraph | Graph workflows, human-in-the-loop, subgraphs |
| Tools | Calculator / DateTime / Math / URLFetch |
| MongoDB | Persistent storage backend (feature: `mongodb-persistence`) |

Full documentation: Chinese (中文文档) | English


Architecture

```text
┌─────────────────────────────────────────────────────────────┐
│                      langchainrust                          │
├─────────────────────────────────────────────────────────────┤
│  LLM Layer                                                  │
│  ├── OpenAIChat / OllamaChat                                │
│  ├── Function Calling (bind_tools)                          │
│  └── Streaming (stream_chat)                                │
├─────────────────────────────────────────────────────────────┤
│  Agent Layer                                                │
│  ├── ReActAgent / FunctionCallingAgent                      │
│  ├── AgentExecutor                                          │
│  └── LangGraph (StateGraph, Subgraph, Parallel)             │
├─────────────────────────────────────────────────────────────┤
│  Retrieval Layer                                            │
│  ├── RAG (TextSplitter, VectorStore)                        │
│  ├── BM25 (Keyword Search, AutoMerging)                     │
│  ├── Hybrid (BM25 + Vector, RRF Fusion)                     │
│  └── Storage (InMemory, MongoDB)                            │
├─────────────────────────────────────────────────────────────┤
│  Utility Layer                                              │
│  ├── Memory (Buffer, Window, Summary)                       │
│  ├── Chains (LLMChain, SequentialChain)                     │
│  ├── Prompts (PromptTemplate, ChatPromptTemplate)           │
│  ├── Tools (Calculator, DateTime, URLFetch)                 │
│  └── Callbacks (LangSmith, StdOut)                          │
└─────────────────────────────────────────────────────────────┘
```
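The Memory variants in the utility layer differ in how they bound conversation history: Buffer keeps everything, Window keeps only the last k turns, and the Summary variants compress older turns. The windowing idea can be sketched in a few lines of plain Rust (a standalone illustration; the names below are not the crate's actual types):

```rust
// Standalone sketch of window memory: keep only the last `k` turns.
// Illustrative only; this does not use langchainrust's actual Memory types.
#[derive(Debug, Clone, PartialEq)]
struct Turn {
    role: String,
    content: String,
}

struct WindowMemory {
    k: usize,
    turns: Vec<Turn>,
}

impl WindowMemory {
    fn new(k: usize) -> Self {
        Self { k, turns: Vec::new() }
    }

    // Append a turn, then drop the oldest turns beyond the window.
    fn push(&mut self, role: &str, content: &str) {
        self.turns.push(Turn { role: role.into(), content: content.into() });
        if self.turns.len() > self.k {
            let excess = self.turns.len() - self.k;
            self.turns.drain(..excess);
        }
    }

    fn history(&self) -> &[Turn] {
        &self.turns
    }
}

fn main() {
    let mut mem = WindowMemory::new(2);
    mem.push("human", "Hi");
    mem.push("ai", "Hello!");
    mem.push("human", "What is Rust?");
    // Only the last 2 turns survive the window.
    assert_eq!(mem.history().len(), 2);
    assert_eq!(mem.history()[0].content, "Hello!");
    println!("window holds {} turns", mem.history().len());
}
```

The trade-off between the variants is token budget versus recall: a window bounds prompt size strictly, while a summary memory trades extra LLM calls for longer effective context.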

Installation

```toml
[dependencies]
langchainrust = "0.2.10"
tokio = { version = "1.0", features = ["full"] }
```

Optional features (enable as needed):

```toml
langchainrust = { version = "0.2.10", features = ["mongodb-persistence"] }  # MongoDB storage
langchainrust = { version = "0.2.10", features = ["qdrant-integration"] }   # Qdrant vector DB
```

Quick Start

```rust
use langchainrust::{OpenAIChat, OpenAIConfig, BaseChatModel};
use langchainrust::schema::Message;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = OpenAIConfig {
        api_key: std::env::var("OPENAI_API_KEY")?,
        base_url: "https://api.openai.com/v1".to_string(),
        model: "gpt-3.5-turbo".to_string(),
        ..Default::default()
    };

    let llm = OpenAIChat::new(config);

    let response = llm.chat(vec![
        Message::system("You are a helpful assistant."),
        Message::human("What is Rust?"),
    ], None).await?;

    println!("{}", response.content);
    Ok(())
}
```

BM25 Keyword Search

```rust
use langchainrust::{BM25Retriever, Document};

let mut retriever = BM25Retriever::new();

retriever.add_documents_sync(vec![
    Document::new("Rust is a systems programming language"),
    Document::new("Python is a scripting language"),
]);

let results = retriever.search("systems programming", 3);

for result in results {
    println!("Document: {}", result.document.content);
    println!("Score: {}", result.score);
}
```
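Under the hood, BM25 scores a document against each query term by weighting term frequency with an inverse document frequency and a length normalization. The standard Okapi BM25 formula can be sketched standalone (with the conventional k1 = 1.2, b = 0.75; this is the textbook formula, not the crate's internal implementation):

```rust
use std::collections::HashMap;

// Standalone Okapi BM25 scorer over a tiny whitespace-tokenized corpus.
// Illustrative only; langchainrust's internal scoring may differ in details
// such as tokenization (e.g. Chinese segmentation).
fn bm25_scores(corpus: &[&str], query: &str) -> Vec<f64> {
    let k1 = 1.2;
    let b = 0.75;
    let docs: Vec<Vec<&str>> = corpus
        .iter()
        .map(|d| d.split_whitespace().collect())
        .collect();
    let n = docs.len() as f64;
    let avgdl = docs.iter().map(|d| d.len()).sum::<usize>() as f64 / n;

    docs.iter()
        .map(|doc| {
            let dl = doc.len() as f64;
            // Term frequencies within this document.
            let mut tf: HashMap<&str, f64> = HashMap::new();
            for w in doc {
                *tf.entry(*w).or_insert(0.0) += 1.0;
            }
            query
                .split_whitespace()
                .map(|term| {
                    // Document frequency of the term across the corpus.
                    let df = docs.iter().filter(|d| d.contains(&term)).count() as f64;
                    let idf = ((n - df + 0.5) / (df + 0.5) + 1.0).ln();
                    let f = tf.get(term).copied().unwrap_or(0.0);
                    idf * f * (k1 + 1.0) / (f + k1 * (1.0 - b + b * dl / avgdl))
                })
                .sum()
        })
        .collect()
}

fn main() {
    let corpus = [
        "rust is a systems programming language",
        "python is a scripting language",
    ];
    let scores = bm25_scores(&corpus, "systems programming");
    // Only the first document contains the query terms, so it scores higher.
    assert!(scores[0] > scores[1]);
    println!("{:?}", scores);
}
```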

More examples in the Usage Guide (Chinese).
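The Hybrid retriever combines the BM25 ranking with the vector-store ranking. Reciprocal Rank Fusion (RRF), the fusion method named in the feature table, scores each document as the sum of 1/(k + rank) over the lists it appears in, conventionally with k = 60. A standalone sketch of the fusion step (illustrative; the crate's actual hybrid API may differ):

```rust
use std::collections::HashMap;

// Reciprocal Rank Fusion: combine several ranked lists into one.
// Each document scores sum(1 / (k + rank)), ranks starting at 1,
// with the conventional constant k = 60.
fn rrf_fuse(rankings: &[Vec<&str>], k: f64) -> Vec<(String, f64)> {
    let mut scores: HashMap<&str, f64> = HashMap::new();
    for ranking in rankings {
        for (i, doc) in ranking.iter().enumerate() {
            *scores.entry(*doc).or_insert(0.0) += 1.0 / (k + (i + 1) as f64);
        }
    }
    let mut fused: Vec<(String, f64)> = scores
        .into_iter()
        .map(|(d, s)| (d.to_string(), s))
        .collect();
    // Highest fused score first.
    fused.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    fused
}

fn main() {
    // Hypothetical output of a BM25 retriever and a vector retriever.
    let bm25 = vec!["doc_a", "doc_b", "doc_c"];
    let vector = vec!["doc_b", "doc_a", "doc_d"];
    let fused = rrf_fuse(&[bm25, vector], 60.0);
    // doc_a and doc_b appear near the top of both lists,
    // so they outrank doc_c and doc_d.
    assert!(fused[0].0 == "doc_a" || fused[0].0 == "doc_b");
    println!("{:?}", fused);
}
```

RRF needs only ranks, not raw scores, which is why it is a common choice for fusing BM25 scores with cosine similarities that live on incompatible scales.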


Documentation

| Docs | Content |
|------|---------|
| Usage Guide (中文) | Detailed usage for LLM, Agent, Memory, RAG, BM25, Hybrid, and LangGraph (in Chinese) |
| Usage Guide (English) | Detailed usage for all components |
| API Docs | Rust API documentation |

Testing

```bash
cargo test
```

Contributing

Contributions welcome! See CONTRIBUTING.md.


License

MIT or Apache-2.0, at your option.