# opencodesearch
opencodesearch is an asynchronous Rust code search system with a Model Context Protocol (MCP) server.
It indexes large repositories into vector + keyword backends and serves search results through MCP tools.
## Features

- Fully async runtime (`tokio`)
- 4 isolated processes:
  - orchestrator (state machine + supervision)
  - background ingestor
  - MCP server process
  - git watchdog process
- Required crates integrated and used in runtime code: `opencodesearch-parser`, `qdrant-client`, `ollama-rs`, `rmcp`
- Hybrid retrieval:
  - semantic search (Qdrant vectors)
  - keyword search (Quickwit HTTP + local shadow fallback)
- MCP server compatible with MCP clients using streamable HTTP and stdio transports
## Architecture

State machine in orchestrator:

- `SPINUP`: load `config.json`
- `NORMAL`: run ingestor + mcp + watchdog
- `UPDATE`: keep watchdog; stop ingestor + mcp during the update window
- `CLOSING`: stop all children gracefully
Update flow:

- watchdog tracks git commits since last sync
- when the threshold (`commit_threshold`) is reached:
  - send `UPDATE_START` to orchestrator
  - pull + compute changed/deleted files
  - remove stale docs
  - reindex changed files
  - send `UPDATE_END`
## Requirements

- Rust stable toolchain
- Docker + Docker Compose
- Local network access to:
  - Ollama (`11434`)
  - Qdrant (`6333` HTTP, `6334` gRPC)
  - Quickwit (`7280`)
## Configuration

`config.json` schema:

Important:

- `qdrant.server_url` should target the gRPC endpoint port (`6334`) for `qdrant-client`.
- `quickwit.quickwit_url` should target HTTP (`7280`).
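A minimal `config.json` sketch consistent with the notes above. Only `commit_threshold`, `qdrant.server_url`, `quickwit.quickwit_url`, the ports, and the embedding model name appear elsewhere in this README; the remaining field names are illustrative assumptions.

```json
{
  "repo_path": "/path/to/repo",
  "commit_threshold": 20,
  "ollama": {
    "url": "http://localhost:11434",
    "model": "qwen3-embedding:0.6b"
  },
  "qdrant": {
    "server_url": "http://localhost:6334"
  },
  "quickwit": {
    "quickwit_url": "http://localhost:7280"
  }
}
```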
## Start Backend Services

Run all local dependencies:

Check containers:
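Assuming the repository ships a `docker-compose.yml` (the file name is an assumption) defining the Ollama, Qdrant, and Quickwit services, the usual sequence is:

```shell
# Start Ollama, Qdrant, and Quickwit in the background
docker compose up -d

# Check the running containers and their published ports
docker compose ps
```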
## Running the System
1) Orchestrator mode (recommended)
Starts and supervises all child processes.
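The exact subcommand is not shown in this README; by analogy with the `mcp` and `mcp-stdio` subcommands documented below, an `orchestrator` subcommand is assumed here:

```shell
# Subcommand name assumed; `mcp` / `mcp-stdio` are the ones documented in this README
cargo run -- orchestrator --config config.json
```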
2) Individual process modes

You can run each process directly for debugging:

- Ingestor
- MCP server
- MCP server over stdio (for local MCP clients)
- Watchdog (requires the orchestrator IPC env, e.g. `OPENCODESEARCH_IPC_SOCKET=/tmp/opencodesearch.sock`)
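The per-process invocations can be sketched as follows. `mcp` and `mcp-stdio` are documented later in this README; the `ingestor` and `watchdog` subcommand names are assumptions:

```shell
# Ingestor (subcommand name assumed)
cargo run -- ingestor --config config.json

# MCP server over streamable HTTP
cargo run -- mcp --config config.json

# MCP server over stdio (for local MCP clients)
cargo run -- mcp-stdio --config config.json

# Watchdog; needs the orchestrator IPC socket (subcommand name assumed)
OPENCODESEARCH_IPC_SOCKET=/tmp/opencodesearch.sock cargo run -- watchdog --config config.json
```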
## MCP Server Usage

The MCP server supports:

- streamable HTTP via `cargo run -- mcp --config config.json`
- stdio via `cargo run -- mcp-stdio --config config.json`

Implemented MCP tool: `search_code`

- input:
  - `query: string`
  - `limit?: number` (default 8, max 50)
- output (structured JSON): array of objects with `snippet`, `path`, `start_line`, `end_line`, `score`, `source`
Example tool input
Result shape
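Hedged sketches of the two examples above, matching the input schema and output fields just listed (all values are illustrative):

```json
{ "query": "function that loads the app config", "limit": 8 }
```

```json
[
  {
    "snippet": "pub fn load(path: &Path) -> Result<AppConfig> { ... }",
    "path": "src/config.rs",
    "start_line": 12,
    "end_line": 30,
    "score": 0.87,
    "source": "semantic"
  }
]
```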
## Using With MCP Clients

This server supports both:

- streamable HTTP (`cargo run -- mcp --config config.json`)
- local stdio (`cargo run -- mcp-stdio --config config.json`)
### OpenAI Codex

Codex supports both stdio and streamable HTTP MCP servers.
Stdio (CLI):
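A sketch of the stdio registration, assuming Codex's `codex mcp add` subcommand (the server name is arbitrary):

```shell
# Register the stdio server with Codex; everything after `--` is the launch command
codex mcp add opencodesearch -- cargo run -- mcp-stdio --config config.json
```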
Remote HTTP (`~/.codex/config.toml` or `.codex/config.toml`); the server name in the table header is arbitrary:

```toml
[mcp_servers.opencodesearch]
url = "http://localhost:9443/"
```
Then verify:
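Verification can be sketched with the same assumed `codex mcp` subcommand:

```shell
# List configured MCP servers and confirm opencodesearch appears
codex mcp list
```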
### OpenCode

OpenCode config uses the `mcp` section in `opencode.json` (or `opencode.jsonc`).
Remote HTTP:
Local stdio:
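Both registrations can be sketched in `opencode.json`; the `type`/`url`/`command` field names follow OpenCode's MCP config format, and the server names are arbitrary (verify against the OpenCode docs linked below):

```json
{
  "mcp": {
    "opencodesearch-http": {
      "type": "remote",
      "url": "http://localhost:9443/"
    },
    "opencodesearch-stdio": {
      "type": "local",
      "command": ["cargo", "run", "--", "mcp-stdio", "--config", "config.json"]
    }
  }
}
```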
### Claude Code

Claude Code supports HTTP, SSE, and stdio MCP transports.
Remote HTTP:
Local stdio:
Then verify:
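The three steps above can be sketched with the `claude mcp` CLI (server names are arbitrary):

```shell
# Remote HTTP transport
claude mcp add --transport http opencodesearch http://localhost:9443/

# Local stdio transport; everything after `--` is the launch command
claude mcp add opencodesearch-stdio -- cargo run -- mcp-stdio --config config.json

# Verify the registrations
claude mcp list
```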
## TLS / HTTPS Notes

- Default local config uses `http://localhost:9443`.
- For `https://...`, provide a certificate trusted by your MCP client.
- TLS cert and key defaults: `certs/localhost-cert.pem`, `certs/localhost-key.pem`.
- Override the TLS file paths with `OPENCODESEARCH_TLS_CERT_PATH` and `OPENCODESEARCH_TLS_KEY_PATH`.
- For Codex specifically, you can provide a custom CA bundle with `CODEX_CA_CERTIFICATE`.
References:
- Codex MCP docs: https://developers.openai.com/codex/mcp
- OpenCode MCP docs: https://opencode.ai/docs/mcp-servers/
- Claude Code MCP docs: https://code.claude.com/docs/en/mcp
## Quick curl test

Use the included script:

```shell
./test_mcp_curl.sh
```

Optional:

```shell
MCP_URL=https://localhost:9443/ MCP_INSECURE=1 ./test_mcp_curl.sh
```
The script performs the required MCP HTTP handshake steps:

- `initialize`
- extract `mcp-session-id` from the response headers
- send `notifications/initialized` with the same `mcp-session-id`
- call `tools/call` for `search_code`
### Manual curl sequence

Initialize and capture session id:

Send the initialized notification (set `SESSION_ID` to the captured value):

Call the MCP tool:
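The three steps above can be sketched with plain curl. The header name, handshake order, and JSON-RPC methods follow the MCP streamable HTTP transport; the payload values (client info, query) are illustrative:

```shell
MCP_URL=http://localhost:9443/

# 1) initialize, capturing the mcp-session-id response header
SESSION_ID=$(curl -s -D - -o /dev/null "$MCP_URL" \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"curl","version":"0"}}}' \
  | awk 'tolower($1)=="mcp-session-id:"{print $2}' | tr -d '\r')

# 2) initialized notification with the same session id
curl -s "$MCP_URL" \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -H "mcp-session-id: $SESSION_ID" \
  -d '{"jsonrpc":"2.0","method":"notifications/initialized"}'

# 3) call the search_code tool
curl -s "$MCP_URL" \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -H "mcp-session-id: $SESSION_ID" \
  -d '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"search_code","arguments":{"query":"parse config","limit":8}}}'
```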
## Rust API Documentation

The crate exposes reusable modules for embedding, indexing, MCP serving, and process control.

### Modules

- `config`: parse typed app config (`AppConfig`)
- `chunking`: parse/split source files into chunks (`chunk_file`)
- `indexing`: indexing runtime (`IndexingRuntime`)
- `qdrant_store`: vector storage + semantic query (`QdrantStore`)
- `quickwit`: keyword storage/query (`QuickwitStore`)
- `mcp`: MCP server type (`OpenCodeSearchMcpServer`)
- `watchdog`: git update monitor (`WatchdogProcess`)
- `orchestrator`: multi-process supervisor (`Orchestrator`)
### Minimal Rust indexing example

A sketch of driving the indexing runtime; the module paths follow the module list above, while the loader, constructor, and method names are assumptions (see the crate's API docs for exact signatures):

```rust
use opencodesearch::config::AppConfig;
use opencodesearch::indexing::IndexingRuntime;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = AppConfig::load("config.json")?;       // loader name assumed
    let runtime = IndexingRuntime::new(&config).await?; // constructor assumed
    runtime.run_full_index().await?;                    // method name assumed
    Ok(())
}
```
### Minimal Rust semantic search example

A sketch of querying through the runtime; the result field names mirror the MCP tool output listed earlier, and all method names are assumptions:

```rust
use opencodesearch::config::AppConfig;
use opencodesearch::indexing::IndexingRuntime;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = AppConfig::load("config.json")?;       // loader name assumed
    let runtime = IndexingRuntime::new(&config).await?; // constructor assumed
    // Semantic query; method name and result shape assumed
    let hits = runtime.semantic_search("parse config file", 8).await?;
    for hit in hits {
        println!("{}:{}-{} score={}", hit.path, hit.start_line, hit.end_line, hit.score);
    }
    Ok(())
}
```
### Minimal Rust MCP server embedding

A sketch of embedding the MCP server over the indexing runtime; constructor and serve method names are assumptions:

```rust
use opencodesearch::config::AppConfig;
use opencodesearch::indexing::IndexingRuntime;
use opencodesearch::mcp::OpenCodeSearchMcpServer;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = AppConfig::load("config.json")?;        // loader name assumed
    let runtime = IndexingRuntime::new(&config).await?;  // constructor assumed
    let server = OpenCodeSearchMcpServer::new(runtime);  // constructor assumed
    server.serve_http(&config).await?;                   // serve method assumed
    Ok(())
}
```
## Testing
### Standard tests
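The standard suite runs with the usual Cargo command:

```shell
cargo test
```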
### Live container integration tests

Requires running Docker services and local git:
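Since this README refers to these as ignored integration tests, they presumably carry `#[ignore]` and run with the standard ignored-tests flag:

```shell
# Runs only tests marked #[ignore]; needs Ollama, Qdrant, and Quickwit up
cargo test -- --ignored
```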
Current ignored integration tests validate:
- Ollama connectivity
- Quickwit + Qdrant connectivity
- full indexing flow on generated Python project
- retrieval through MCP search path with non-exact query phrasing
- 100-commit refactor scenario for watchdog threshold behavior
## Troubleshooting

- Quickwit health endpoint: use `http://localhost:7280/health/livez`
- If embeddings fail, confirm Ollama model availability: `qwen3-embedding:0.6b`
- The Qdrant client requires the gRPC port (`6334`) in config
- If integration tests fail on a startup race, rerun after a short container warmup