Hey! đź‘‹ This is the Rust client for the Universal Tool Calling Protocol (UTCP).
Basically, I wanted a way to discover and call tools across a whole bunch of different protocols—HTTP, CLI, WebSocket, gRPC, MCP, you name it—without having to write custom glue code for every single one. This library gives you a single, unified API to handle all of that.
It's heavily inspired by the go-utcp project, but built from the ground up for Rust. 🦀
Why use this?
- One API for everything: You don't care if a tool is a local Python script, a remote gRPC service, or an MCP server. You just ask for the tool by name, and `rs-utcp` handles the transport.
- Config-driven: You can load your tool providers from a JSON file. This is huge because it means you can add or change endpoints without recompiling your app.
- Codemode: This is the really cool part. 🚀 It includes a scripting environment (powered by Rhai) that lets you orchestrate complex workflows. You can even hook up an LLM to generate these scripts on the fly.
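To give a feel for the config-driven part: a providers file might look something like this (the field names here are my guess at the schema, so check the crate's config docs for the real shape):

```json
{
  "providers": [
    {
      "name": "weather",
      "provider_type": "http",
      "url": "http://localhost:8080/utcp"
    }
  ]
}
```

Point the client at a file like this and it discovers the tools each provider exposes.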
Quick Start
First, add it to your project:
(Or clone it locally if you're hacking on it).
Try the demo
I've included a bunch of examples to get you started. The easiest way to see it in action is the basic usage demo in the examples/ folder, which spins up a mock HTTP provider and shows you how to call a tool.
Minimal Setup
Here's roughly what it looks like in your code (the type and method names below are illustrative, so check the crate docs and examples/ for the real API):

```rust
use rs_utcp::UtcpClient; // illustrative path; the real module layout may differ
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load tool providers from a JSON config file.
    let client = Arc::new(UtcpClient::new("providers.json").await?);

    // Call a tool by name; the client routes over the right transport.
    let result = client
        .call_tool("provider.tool_name", serde_json::json!({}))
        .await?;
    println!("{result:?}");
    Ok(())
}
```
Supported Transports
We support a lot of protocols out of the box. Some are more mature than others, but here's the list:
- HTTP (UTCP manifest or OpenAPI spec auto-converted on discovery)
- MCP (Model Context Protocol - supports both stdio and SSE!)
- WebSocket
- gRPC
- CLI (Run local binaries as tools)
- GraphQL
- TCP / UDP
- SSE (Server-Sent Events)
- WebRTC (Peer-to-peer data channels with signaling)
Check out the examples/ folder for a working server/client demo of almost every transport.
Codemode & Orchestration
If you want to get fancy, you can use "Codemode". It allows you to execute Rhai scripts that have access to your registered tools.
We also provide a CodemodeOrchestrator which acts as a bridge between an LLM and your tools. It follows a 4-step flow:
- Decide: Asks the LLM if any tools are needed for the user prompt.
- Select: Asks the LLM to pick the relevant tools from the registry.
- Generate: Asks the LLM to write a Rhai script using those tools.
- Execute: Runs the script safely within the Codemode sandbox.
```rhai
// Inside a Rhai script generated by the Orchestrator
// (the tool name and arguments below are illustrative)
let result = call_tool("provider.tool_name", #{ arg: "value" });
print(result);
```
You can play with this by running the evaluator demo from the examples/ folder.
And if you want full LLM-in-the-loop orchestration, there's a Gemini-backed example. Set `GEMINI_API_KEY=your_key_here` in your environment before running it.
By default it targets gemini-pro; override with `GEMINI_MODEL` if you prefer.
Status
- HTTP: Solid and feature-complete.
- MCP: Working well (stdio & SSE).
- WebRTC: Fully implemented with signaling and streaming support.
- Others: Mostly functional skeletons. They work for the happy path, but might need some hardening.
If you find a bug or want to add a new transport, PRs are super welcome!
Development
- Format: `cargo fmt`
- Check: `cargo check --examples`
- Test: `cargo test`
License
MIT (or whichever license you prefer, just update this).