opencode_webresearch 0.1.0

Rust library for automated web research workflows via OpenCode and MCP servers.

It is designed to:

  • Probe a configured OpenCode server.
  • Start a managed server when one is not available.
  • Run a research prompt in a fresh session.
  • Auto-approve permission requests to keep tool execution unblocked.
  • Collect streamed assistant output.
  • Validate and write the final result to response.json.
  • Retry from the beginning when failures or empty responses occur.

Installation

[dependencies]
opencode_webresearch = "0.1.0"

Required Runtime Dependencies

  • Unix-like OS (Linux/macOS), matching opencode-sdk support.
  • opencode binary available in PATH if managed server startup is needed.
  • Configured OpenCode/MCP environment for tools like searxng and webfetch.

API

ResearchRequest

{
  "prompt": "who is brosnan yuen? use MCP server searxng to search. save answer as response.json",
  "opencode_server_hostname": "0.0.0.0",
  "opencode_server_port": "7777",
  "llm_provider": "llama.cpp",
  "llm_mode_name": "gemma-4-26B-A4B-it-UD-Q4_K_M.gguf",
  "tools": ["searxng", "webfetch"],
  "output_directory": "/home/brosnan/opencode_webresearch/output"
}

Notes:

  • opencode_server_port accepts either a number (7777) or a string ("7777").
  • working_directory, timeout_secs, max_attempts, and other operational controls are also available.
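The dual number/string acceptance for the port can be modeled with a small enum. The type below is an illustrative sketch of that idea, not the crate's actual definition:

```rust
// Illustrative sketch only: the crate's real port type may differ.
#[derive(Debug, Clone, PartialEq)]
enum Port {
    Num(u16),
    Str(String),
}

impl Port {
    /// Resolve either representation to a numeric port.
    fn as_u16(&self) -> Option<u16> {
        match self {
            Port::Num(n) => Some(*n),
            Port::Str(s) => s.parse().ok(),
        }
    }
}

fn main() {
    assert_eq!(Port::Num(7777).as_u16(), Some(7777));
    assert_eq!(Port::Str("7777".to_string()).as_u16(), Some(7777));
    println!("both forms resolve to 7777");
}
```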

run_research

use opencode_webresearch::{run_research, ResearchRequest};
use std::path::PathBuf;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let request = ResearchRequest {
        prompt: "Research Raspberry Pi RP2040 and summarize key hardware facts.".to_string(),
        opencode_server_hostname: "0.0.0.0".to_string(),
        opencode_server_port: 7777,
        llm_provider: "llama.cpp".to_string(),
        llm_mode_name: "gemma-4-26B-A4B-it-UD-Q4_K_M.gguf".to_string(),
        tools: vec!["searxng".to_string(), "webfetch".to_string()],
        output_directory: PathBuf::from("/home/brosnan/opencode_webresearch/output"),
        working_directory: Some(PathBuf::from("/home/brosnan/opencode_webresearch")),
        timeout_secs: 300,
        server_startup_timeout_ms: 15_000,
        max_attempts: 3,
        shutdown_server_when_done: true,
    };

    let response = run_research(request).await?;
    println!("Wrote {}", response.response_path.display());
    Ok(())
}

Output Format

The library writes response.json in the provided output directory.

It includes:

  • Original prompt.
  • Assistant answer text.
  • Session ID.
  • Attempt number.
  • Provider/model metadata.
  • Requested tool names.
  • Message count.
  • Generation timestamp.
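Based on the fields listed above, a response.json might look roughly like the following. The exact key names here are assumptions for illustration, not the crate's guaranteed schema:

```json
{
  "prompt": "Research Raspberry Pi RP2040 and summarize key hardware facts.",
  "answer": "...",
  "session_id": "ses_abc123",
  "attempt": 1,
  "provider": "llama.cpp",
  "model": "gemma-4-26B-A4B-it-UD-Q4_K_M.gguf",
  "tools": ["searxng", "webfetch"],
  "message_count": 4,
  "generated_at": "2025-01-01T00:00:00Z"
}
```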

Retry Behavior

Each attempt executes the full sequence:

  1. Check server health.
  2. Start server if needed.
  3. Create session.
  4. Send prompt.
  5. Stream response and handle permissions.
  6. Save validated response.json.
  7. Delete session.

If any step fails, or if the answer is empty, the workflow retries from step 1 until max_attempts is exhausted.
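The retry-from-the-top behavior can be sketched generically in std-only Rust. This is a simplified illustration of the attempt loop, where `run_attempt` stands in for the full health-check, session, and save sequence; it is not the library's internal code:

```rust
// Sketch of the retry-from-the-top pattern described above.
// `run_attempt` stands in for the full attempt sequence (steps 1-7).
fn run_with_retries<F>(max_attempts: u32, mut run_attempt: F) -> Result<String, String>
where
    F: FnMut(u32) -> Result<String, String>,
{
    let mut last_err = String::from("no attempts made");
    for attempt in 1..=max_attempts {
        match run_attempt(attempt) {
            // An empty answer counts as a failure and triggers another attempt.
            Ok(answer) if !answer.is_empty() => return Ok(answer),
            Ok(_) => last_err = format!("attempt {attempt}: empty answer"),
            Err(e) => last_err = format!("attempt {attempt}: {e}"),
        }
    }
    Err(last_err)
}

fn main() {
    // Simulated workflow: fails twice, then produces an answer on attempt 3.
    let result = run_with_retries(3, |attempt| {
        if attempt < 3 {
            Err("server not ready".to_string())
        } else {
            Ok("final answer".to_string())
        }
    });
    assert_eq!(result, Ok("final answer".to_string()));
    println!("succeeded after retries");
}
```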

Testing

Run unit and integration test targets:

cargo test

Integration tests that require a real OpenCode server and network access are present but marked #[ignore]:

  • Research "brosnan yuen" and write response.json.
  • Research RP2040 and verify datasheet PDF downloads in /home/brosnan/opencode_webresearch/output.

Run ignored integration tests explicitly:

cargo test --test integration_webresearch -- --ignored