mcp-rust-server
High-performance Model Context Protocol (MCP) server for code analysis, security scanning, and project insights—written in Rust 🦀.
Related Project
This MCP server exposes the capabilities of the syncable-cli tool to AI agents. While syncable-cli is a standalone CLI tool for interacting with Syncable workspaces, this server acts as a bridge, allowing AI agents and other clients to access those CLI features programmatically via the Model Context Protocol (MCP). The two projects are closely related and complement each other.
Table of Contents
- Features
- Installation
- Usage
- Documentation
- Contributing
- Roadmap & Upcoming Features
- License
- Acknowledgments
Features
- Fast & Scalable: Built with async Rust on the Tokio runtime
- Multi-Protocol: Supports both stdio and SSE (Server-Sent Events) transports
- Security Scanning: Static analysis and vulnerability detection
- Extensible: Easily add new MCP handlers and custom tools
- Production-Ready: Optimized release profile, structured logging, and CI integration
Installation
rust-mcp-server-syncable-cli is published on crates.io and requires a recent Rust toolchain (1.70+ recommended). It runs as an MCP server for AI agents: frameworks such as LangGraph (or similar) can connect to it for code scanning.
CLI Binaries
Install the server binaries from crates.io:
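Assuming the standard cargo install flow for the crate named above:

```shell
# Build and install the crate's binaries from crates.io
cargo install rust-mcp-server-syncable-cli
```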
This installs two binaries into your Cargo bin directory (usually ~/.cargo/bin):
- mcp-stdio — stdin/stdout-based MCP server
- mcp-sse — HTTP/SSE-based MCP server
Add to PATH
If you see a warning like:

```
warning: be sure to add `/Users/yourname/.cargo/bin` to your PATH to be able to run the installed binaries
```
Add the following to your shell profile:
For zsh (default on recent macOS):
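For example, assuming the default Cargo home at ~/.cargo:

```shell
# Append Cargo's bin directory to PATH for zsh and reload the profile
echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc
```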
For bash:
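Again assuming the default Cargo home:

```shell
# Append Cargo's bin directory to PATH for bash and reload the profile
echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.bashrc
source ~/.bashrc
```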
Verify installation:
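One way to verify is to check that both binaries resolve on your PATH:

```shell
# Each command prints the binary's full path if it is installed correctly
command -v mcp-stdio
command -v mcp-sse
```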
Python Client Example
You can connect to the MCP server from Python using the mcp client library or similar.
Below is a sketch using mcp.client.stdio (from the MCP Python SDK) to launch and communicate with the Rust MCP server via stdio. Tool arguments are omitted for brevity; inspect the schemas returned by list_tools() for each tool's parameters:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch mcp-stdio as a subprocess and speak MCP over its stdin/stdout
    server_params = StdioServerParameters(command="mcp-stdio")

    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List available tools
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Call the 'about_info' tool
            about = await session.call_tool("about_info", arguments={})
            print(about.content)

            # Call the 'analysis_scan' tool
            analysis = await session.call_tool("analysis_scan", arguments={})

            # Call the 'security_scan' tool
            security = await session.call_tool("security_scan", arguments={})

            # Call the 'dependency_scan' tool
            deps = await session.call_tool("dependency_scan", arguments={})


asyncio.run(main())
```
Requirements:
- Install the Python MCP client
- Make sure mcp-stdio is in your PATH as described above
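The client used in the example above is the MCP Python SDK, published on PyPI as mcp:

```shell
# Install the MCP Python client library
pip install mcp
```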
Using HTTP/SSE Mode
If you prefer to use HTTP/SSE, start the server with:
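Using the mcp-sse binary installed earlier (it serves on port 8000, per the library notes below):

```shell
# Start the HTTP/SSE-based MCP server
mcp-sse
```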
Then, in Python, you can send HTTP POST requests to http://localhost:8000/mcp using requests or aiohttp.
Example (a sketch; it assumes the endpoint accepts JSON-RPC 2.0 payloads via plain HTTP POST):

```python
import requests

# JSON-RPC 2.0 request asking the server for its tool list
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
    "params": {},
}

response = requests.post("http://localhost:8000/mcp", json=payload)
print(response.json())
```
Library
Add to your project's Cargo.toml:

```toml
[dependencies]
rust-mcp-server-syncable-cli = "0.1.4"
```
Usage
CLI Binaries
```shell
# Run the stdio-based server
mcp-stdio

# Run the SSE-based server
mcp-sse
```
By default, both servers will:
- Read framed MCP requests (JSON-RPC) from the chosen transport
- Dispatch to your registered handlers
- Write framed MCP responses
Library
A minimal sketch (the function names come from the notes below; the crate path is derived from the package name, and the exact return type of start_stdio() is an assumption):

```rust
use rust_mcp_server_syncable_cli::start_stdio;

#[tokio::main]
async fn main() {
    // Initializes logging, registers tools, and serves MCP over stdin/stdout.
    // Assumes start_stdio() returns a Result; check docs.rs for the real signature.
    start_stdio().await.expect("MCP stdio server failed");
}
```
- start_stdio() initializes logging, registers tools, and listens on stdin/stdout.
- start_sse() spins up an HTTP server at http://0.0.0.0:8000/mcp and streams MCP responses.
Documentation
Full API documentation is generated on docs.rs: https://docs.rs/rust-mcp-server-syncable-cli
Contributing
Contributions are welcome! Please:
- Fork the repo
- Create a feature branch (git checkout -b feat/your-feature)
- Commit your changes (git commit -m "Add feature")
- Push to your fork (git push origin feat/your-feature)
- Open a pull request
Run tests and lint before submitting:
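Typical checks for a Rust project (match these against the repository's actual CI configuration):

```shell
# Formatting, lints, and the test suite
cargo fmt --all -- --check
cargo clippy --all-targets -- -D warnings
cargo test
```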
Roadmap & Upcoming Features
LangGraph Integration (Coming Soon 🚧)
We are planning to add first-class support for the LangGraph framework. This will include:
- REST API Interface: Exposing a standard RESTful API (in addition to the current MCP stdio and SSE transports), making it easy to connect LangGraph and other agent frameworks without requiring a custom Python client.
- Plug-and-Play LangGraph Support: Example workflows and documentation for integrating this MCP server as a tool node in LangGraph pipelines.
- OpenAPI/Swagger Documentation: To make it easy to explore and test the REST endpoints.
Stay tuned! If you are interested in this feature or want to contribute, please open an issue or discussion on GitHub.
License
Licensed under the MIT License. See the LICENSE file for details.
Acknowledgments
- Built on top of the rust-mcp-sdk
- Inspired by the Syncable CLI MCP Server
- Thanks to the Rust community and all contributors