Crate cqs

Local semantic search for code using ML embeddings. Find functions by what they do, not just their names.

§Features

  • Semantic search: E5-base-v2 embeddings (769 dimensions: 768 from the model plus 1 sentiment dimension)
  • Notes with sentiment: Unified memory system for AI collaborators
  • Multi-language: Rust, Python, TypeScript, JavaScript, Go, C, Java
  • GPU acceleration: CUDA/TensorRT with CPU fallback
  • MCP integration: Works with Claude Code and other AI assistants

§Quick Start

use cqs::{Embedder, Parser, Store};

// Initialize components
let parser = Parser::new()?;
let embedder = Embedder::new()?;
let store = Store::open(std::path::Path::new(".cq/index.db"))?;

// Parse and embed a file
let chunks = parser.parse_file(std::path::Path::new("src/main.rs"))?;
let embeddings = embedder.embed_documents(
    &chunks.iter().map(|c| c.content.as_str()).collect::<Vec<_>>()
)?;

// Search for similar code
let query_embedding = embedder.embed_query("parse configuration file")?;
let results = store.search(&query_embedding, 5, 0.3)?;
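
The snippets above use the ? operator, so in a standalone program they belong inside a function that returns a Result. A minimal sketch of the search half, assuming the crate's errors convert into Box<dyn std::error::Error> and that search returns a Vec of matches:

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let embedder = cqs::Embedder::new()?;
    let store = cqs::Store::open(std::path::Path::new(".cq/index.db"))?;

    // Same search call as in the Quick Start above.
    let query_embedding = embedder.embed_query("parse configuration file")?;
    let results = store.search(&query_embedding, 5, 0.3)?;
    println!("found {} matching chunks", results.len());
    Ok(())
}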

§MCP Server

Start the MCP server for AI assistant integration:

use std::path::PathBuf;

// Stdio transport (for Claude Code)
cqs::serve_stdio(PathBuf::from("."), false)?;  // false = CPU, true = GPU

// HTTP transport with GPU embedding (None = no auth)
// Note: serve_http blocks the current thread
cqs::serve_http(".", "127.0.0.1", 3000, None, true)?;
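
Since serve_http blocks, one way to keep the calling thread free is to spawn it on a dedicated thread. A rough sketch under that assumption (it also assumes the returned error type is Send and, as above, an enclosing function that returns a compatible Result):

use std::thread;

// Run the blocking HTTP server off the current thread.
let http = thread::spawn(|| {
    cqs::serve_http(".", "127.0.0.1", 3000, None, true)
});

// ... other work here ...

// Propagate any server error once the thread finishes.
http.join().expect("HTTP server thread panicked")?;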

Re-exports§

pub use embedder::Embedder;
pub use embedder::Embedding;
pub use hnsw::HnswIndex;
pub use index::IndexResult;
pub use index::VectorIndex;
pub use mcp::serve_http;
pub use mcp::serve_stdio;
pub use note::parse_notes;
pub use parser::Chunk;
pub use parser::Parser;
pub use store::ModelInfo;
pub use store::SearchFilter;
pub use store::Store;

Modules§

config
Configuration file support for cqs
embedder
Embedding generation with ort + tokenizers
hnsw
HNSW (Hierarchical Navigable Small World) index for fast vector search
index
Vector index trait for nearest neighbor search
language
Language registry for code parsing
mcp
MCP (Model Context Protocol) server implementation
note
Note parsing and types
parser
Code parsing with tree-sitter
reference
Reference index support for multi-index search
store
SQLite storage for chunks and embeddings (sqlx async with sync wrappers)

Structs§

DiffResult
Result of a semantic diff
GatherOptions
Options for gather operation
ProjectRegistry
Global registry of indexed cqs projects

Enums§

GatherDirection
Direction of call graph expansion
NlTemplate
Template variants for NL description generation.
Pattern
Known structural patterns

Constants§

EMBEDDING_DIM
Embedding dimension: 768 from E5-base-v2 model + 1 sentiment dimension. Single source of truth — all modules import this constant.
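
A small sanity-check sketch, assuming EMBEDDING_DIM is a usize and that raw vectors are stored as f32:

// 769 = 768 model dimensions + 1 sentiment dimension.
assert_eq!(cqs::EMBEDDING_DIM, 768 + 1);

// Pre-allocate a zeroed buffer of the right width (f32 storage is an assumption).
let buffer = vec![0.0f32; cqs::EMBEDDING_DIM];
assert_eq!(buffer.len(), cqs::EMBEDDING_DIM);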

Functions§

enumerate_files
Enumerate files to index in a project directory.
gather
Gather relevant code chunks for a query
generate_nl_description
Generate natural language description from chunk metadata.
generate_nl_with_template
Generate NL description using a specific template variant.
index_notes
Index notes into the database (embed and store)
normalize_for_fts
Normalize code text for FTS5 indexing.
search_across_projects
Search across all registered projects
semantic_diff
Run a semantic diff between two stores
strip_unc_prefix
No-op on non-Windows platforms