# MoFA SDK

MoFA (Modular Framework for Agents) SDK: a standard development toolkit for building AI agents with Rust.
## Architecture

```text
mofa-sdk (standard API layer - SDK)
    ↓
    ├── mofa-kernel     (microkernel core)
    ├── mofa-runtime    (runtime)
    ├── mofa-foundation (business logic)
    └── mofa-plugins    (plugin system)
```
## Public Modules

- `kernel`: core abstractions from `mofa-kernel`
- `runtime`: lifecycle and execution runtime
- `agent`: foundation agent building blocks
- `llm`: LLM integration and helpers
- `plugins`: plugin system and adapters
- `workflow`: workflow engine and DSL
- `persistence`: persistence stores and plugins
- `messaging`: message bus and contracts
- `secretary`: secretary agent pattern
- `collaboration`: multi-agent collaboration protocols
- `coordination`: scheduling and coordination utilities
- `prompt`: prompt templates and composition
- `config`: global config facade (kernel/runtime/foundation)
- `skills`: progressive disclosure skills system
- `prelude`: common imports for quick start
## Installation

Add this to your `Cargo.toml`:

```toml
[dependencies]
mofa-sdk = "0.1"
```
### Optional Features

```toml
# With LLM support (OpenAI, Ollama, Azure)
mofa-sdk = { version = "0.1", features = ["openai"] }

# With dora-rs runtime support (distributed dataflow)
mofa-sdk = { version = "0.1", features = ["dora"] }

# With UniFFI bindings (Python, Kotlin, Swift, Java)
mofa-sdk = { version = "0.1", features = ["uniffi"] }

# With PyO3 Python bindings
mofa-sdk = { version = "0.1", features = ["python"] }

# Full features
mofa-sdk = { version = "0.1", features = ["openai", "uniffi", "dora"] }
```
## Quick Start

### Basic Agent

```rust
use mofa_sdk::prelude::*;
use mofa_sdk::runtime::run_agents;
use async_trait::async_trait;

// NOTE: illustrative sketch. The trait and method names below are
// assumptions; check the crate docs for the exact `Agent` trait.
struct EchoAgent;

#[async_trait]
impl Agent for EchoAgent {
    async fn handle(&self, input: AgentInput) -> AgentOutput {
        AgentOutput::text(format!("echo: {}", input.text()))
    }
}

#[tokio::main]
async fn main() {
    // Register the agent and hand control to the runtime.
    run_agents(vec![Box::new(EchoAgent)]).await;
}
```
### Batch Execution

```rust
use mofa_sdk::agent::AgentInput;
use mofa_sdk::runtime::run_agents;

// NOTE: illustrative sketch; exact paths and signatures may differ
// from the released API.
#[tokio::main]
async fn main() {
    // Prepare a batch of inputs and hand them to the runtime.
    let inputs: Vec<AgentInput> = (1..=3)
        .map(|i| AgentInput::text(format!("task {}", i)))
        .collect();
    run_agents(inputs).await;
}
```
### LLM Agent (Recommended)

Use the built-in `LLMAgent` for quick LLM interactions:

```rust
use mofa_sdk::llm::LLMAgentBuilder;

// NOTE: illustrative sketch. Method names follow the builder table
// below; the chained-call style is an assumption.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let agent = LLMAgentBuilder::new()
        .set_id("my-agent")
        .set_name("Rust Agent")
        .set_system_prompt("You are a helpful assistant.")
        .set_openai_provider(
            std::env::var("OPENAI_API_KEY")?,
            std::env::var("OPENAI_BASE_URL").ok(),
            std::env::var("OPENAI_MODEL").unwrap_or_else(|_| "gpt-3.5-turbo".into()),
        )
        .build()?;

    // Simple Q&A (no context retention)
    let response = agent.ask("What is Rust?").await?;
    println!("{}", response);

    // Multi-turn chat (with context retention)
    let _r1 = agent.chat("My name is Alice.").await?;
    let r2 = agent.chat("What's my name?").await?; // Remembers context
    println!("{}", r2);
    Ok(())
}
```
### Load Agent from Configuration File

```rust
// NOTE: illustrative sketch; the module path of `agent_from_config`
// is an assumption.
use mofa_sdk::llm::agent_from_config;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load an LLM agent from agent.yml (format documented below).
    let agent = agent_from_config("agent.yml").await?;
    let response = agent.ask("Hello!").await?;
    println!("{}", response);
    Ok(())
}
```
### With Dora Runtime

```rust
use mofa_sdk::prelude::*;

// NOTE: illustrative sketch; requires building with the `dora` feature.
#[tokio::main]
async fn main() {
    // Check at runtime whether dora-rs support was compiled in.
    if mofa_sdk::is_dora_available() {
        // Run agents as nodes in a distributed dora dataflow here.
    }
}
```
## Cross-Language Bindings (UniFFI)

MoFA provides cross-language bindings via UniFFI for Python, Kotlin, Swift, Java, and Go.
### Building with UniFFI

```bash
# Build with UniFFI and OpenAI features
cargo build --release --features "uniffi,openai"
```
### Generating Bindings

Use the provided script:

```bash
# Generate all bindings (Python, Kotlin, Swift, Java)
# Generate specific language
```

For Go bindings (requires a separate tool):

```bash
# Install uniffi-bindgen-go first
# Generate Go bindings
```
### Python Quick Start

```python
import os

# NOTE: illustrative sketch; the module path is an assumption based on
# the crate name. Function names follow the tables below.
from mofa_sdk import new_llm_agent_builder

# Create an agent using the builder pattern
builder = new_llm_agent_builder()
builder.set_id("my-agent")
builder.set_name("Python Agent")
builder.set_system_prompt("You are a helpful assistant.")
builder.set_openai_provider(
    os.environ["OPENAI_API_KEY"],
    os.environ.get("OPENAI_BASE_URL"),
    os.environ.get("OPENAI_MODEL", "gpt-3.5-turbo"),
)
agent = builder.build()

# Simple Q&A
response = agent.ask("What is Python?")

# Multi-turn chat
r1 = agent.chat("My name is Carol.")
r2 = agent.chat("What's my name?")  # Remembers context

# Get history
history = agent.get_history()
```
### Java Quick Start

```java
// NOTE: illustrative sketch; the package and entry-point class are
// assumptions based on the Kotlin bindings.
import org.mofa.*;
import java.util.List;

// Create an agent using the builder pattern
LLMAgentBuilder builder = UniFFI.newLlmAgentBuilder();
builder.setId("my-agent");
builder.setName("Java Agent");
builder.setSystemPrompt("You are a helpful assistant.");
builder.setOpenaiProvider(
    System.getenv("OPENAI_API_KEY"),
    System.getenv("OPENAI_BASE_URL"),
    System.getenv("OPENAI_MODEL"));
LLMAgent agent = builder.build();

// Simple Q&A
String response = agent.ask("What is Java?");
System.out.println(response);

// Multi-turn chat
String r1 = agent.chat("My name is Bob.");
String r2 = agent.chat("What's my name?");

// Get history
List history = agent.getHistory();
agent.clearHistory();
```
### Go Quick Start

```go
package main

import (
	"fmt"
	"os"

	mofa "mofa-sdk/bindings/go"
)

// NOTE: illustrative sketch; exported method names and signatures are
// assumptions based on the other bindings.
func main() {
	// Create an agent using the builder pattern
	builder := mofa.NewLlmAgentBuilder()
	builder.SetId("my-agent")
	builder.SetName("Go Agent")
	builder.SetSystemPrompt("You are a helpful assistant.")
	builder.SetOpenaiProvider(
		os.Getenv("OPENAI_API_KEY"),
		os.Getenv("OPENAI_BASE_URL"),
		os.Getenv("OPENAI_MODEL"),
	)
	agent := builder.Build()

	// Simple Q&A
	response := agent.Ask("What is Go?")
	fmt.Println(response)
}
```
### Kotlin Quick Start

```kotlin
import org.mofa.*

// Create an agent using the builder pattern
val builder = UniFFI.newLlmAgentBuilder()
builder.setId("my-agent")
builder.setName("Kotlin Agent")
builder.setSystemPrompt("You are a helpful assistant.")
builder.setOpenaiProvider(
    apiKey = System.getenv("OPENAI_API_KEY"),
    baseUrl = System.getenv("OPENAI_BASE_URL"),
    model = System.getenv("OPENAI_MODEL") ?: "gpt-3.5-turbo"
)
val agent = builder.build()

// Simple Q&A
val response = agent.ask("What is Kotlin?")
println(response)

// Multi-turn chat
val r1 = agent.chat("My name is Diana.")
val r2 = agent.chat("What's my name?")
```
### Swift Quick Start

```swift
import MoFA

// Create an agent using the builder pattern
let builder = try UniFFI.newLlmAgentBuilder()
try builder.setId("my-agent")
try builder.setName("Swift Agent")
try builder.setSystemPrompt("You are a helpful assistant.")
try builder.setOpenaiProvider(
    apiKey: ProcessInfo.processInfo.environment["OPENAI_API_KEY"]!,
    baseUrl: ProcessInfo.processInfo.environment["OPENAI_BASE_URL"],
    model: ProcessInfo.processInfo.environment["OPENAI_MODEL"]
)
let agent = try builder.build()

// Simple Q&A
let response = try agent.ask(question: "What is Swift?")
print(response)

// Multi-turn chat
let r1 = try agent.chat(message: "My name is Eve.")
let r2 = try agent.chat(message: "What's my name?")
```
## Available Functions (All Languages)

| Function | Description |
|---|---|
| `get_version()` | Get SDK version string |
| `is_dora_available()` | Check if Dora runtime support is enabled |
| `new_llm_agent_builder()` | Create a new `LLMAgentBuilder` instance |
### LLMAgentBuilder Methods

| Method | Description |
|---|---|
| `set_id(id)` | Set agent ID |
| `set_name(name)` | Set agent name |
| `set_system_prompt(prompt)` | Set system prompt |
| `set_temperature(temp)` | Set temperature (0.0-1.0) |
| `set_max_tokens(tokens)` | Set max tokens for response |
| `set_session_id(id)` | Set session ID |
| `set_user_id(id)` | Set user ID |
| `set_tenant_id(id)` | Set tenant ID |
| `set_context_window_size(size)` | Set context window size in rounds |
| `set_openai_provider(key, url, model)` | Configure OpenAI provider |
| `build()` | Build the `LLMAgent` instance |
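The context window set by `set_context_window_size` is measured in rounds, where one round is a user message plus the assistant's reply. The sketch below illustrates round-based truncation in plain Rust; it is an assumption for illustration, not the SDK's internal implementation:

```rust
// Illustrative only: models "context window size in rounds" with a
// simple (user, assistant) pair per round.
#[derive(Clone, Debug)]
struct Round {
    user: String,
    assistant: String,
}

/// Keep only the most recent `window` rounds of a conversation.
fn truncate_rounds(history: &[Round], window: usize) -> Vec<Round> {
    let start = history.len().saturating_sub(window);
    history[start..].to_vec()
}

fn main() {
    let history: Vec<Round> = (1..=5)
        .map(|i| Round {
            user: format!("question {}", i),
            assistant: format!("answer {}", i),
        })
        .collect();

    // With a window of 2 rounds, only the last two rounds stay in context.
    let ctx = truncate_rounds(&history, 2);
    assert_eq!(ctx.len(), 2);
    assert_eq!(ctx[0].user, "question 4");
    println!("{:?}", ctx);
}
```

A smaller window trades recall of earlier turns for a smaller prompt sent to the LLM on each `chat` call.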
### LLMAgent Methods

| Method | Description |
|---|---|
| `agent_id()` | Get agent ID |
| `name()` | Get agent name |
| `ask(question)` | Simple Q&A (no context retention) |
| `chat(message)` | Multi-turn chat (with context retention) |
| `clear_history()` | Clear conversation history |
| `get_history()` | Get conversation history |
## Features

| Feature | Description |
|---|---|
| `openai` | Enable LLM support (OpenAI, Ollama, Azure, Compatible) |
| `dora` | Enable dora-rs runtime for distributed dataflow |
| `uniffi` | Enable UniFFI bindings for cross-language support |
| `python` | Enable PyO3 Python native bindings |
## Configuration File Format (agent.yml)

```yaml
agent:
  id: "my-agent-001"
  name: "My LLM Agent"
  description: "A helpful assistant"
  llm:
    provider: openai  # openai, ollama, azure, compatible
    model: gpt-4o
    api_key: ${OPENAI_API_KEY}  # Environment variable reference
    temperature: 0.7
    max_tokens: 4096
    system_prompt: |
      You are a helpful AI assistant.
```
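The `${OPENAI_API_KEY}` reference above is resolved from the environment when the file is loaded. The sketch below shows one common way such `${VAR}` substitution works; it is an assumption for illustration, not the SDK's actual config loader:

```rust
// Illustrative only: expand ${NAME} references using environment
// variables, leaving unterminated references untouched.
use std::env;

fn expand_env(input: &str) -> String {
    let mut out = String::new();
    let mut rest = input;
    while let Some(start) = rest.find("${") {
        out.push_str(&rest[..start]);
        match rest[start + 2..].find('}') {
            Some(end) => {
                let name = &rest[start + 2..start + 2 + end];
                // Unset variables expand to an empty string here.
                out.push_str(&env::var(name).unwrap_or_default());
                rest = &rest[start + 2 + end + 1..];
            }
            None => {
                // No closing brace: keep the remaining text as-is.
                out.push_str(&rest[start..]);
                rest = "";
            }
        }
    }
    out.push_str(rest);
    out
}

fn main() {
    env::set_var("OPENAI_API_KEY", "sk-test");
    assert_eq!(expand_env("api_key: ${OPENAI_API_KEY}"), "api_key: sk-test");
    println!("{}", expand_env("api_key: ${OPENAI_API_KEY}"));
}
```

Keeping secrets like `api_key` out of the YAML file itself and injecting them through the environment is the usual reason for this indirection.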
## Documentation

## License

Licensed under either the Apache License, Version 2.0 or the MIT license, at your option.