Crate syntaqlite

Fast, accurate SQL tooling for SQLite and its dialects.

This crate provides parsing, formatting, and analysis for SQL, built on SQLite’s own tokenizer and parser rules. Four design principles guide the library:

  • Reliability — uses SQLite’s own parser rules; formatting is round-trip safe and validation mirrors real engine behaviour.
  • Speed — all core types (Formatter, Analyzer, Catalog) are designed for reuse across many inputs without re-allocation.
  • Portability — the core formatting and analysis engine has no runtime dependencies beyond the standard library; optional features (lsp, serde) pull in additional crates.
  • Flexibility — supports multiple database dialects that extend SQLite’s syntax with their own tokens and rules.

§Parsing

Use Parser to parse SQL source text into a typed AST:

use syntaqlite::parse::ParseErrorKind;
use syntaqlite::{Parser, ParseOutcome};

let parser = Parser::new();
let mut session = parser.parse("SELECT 1; SELECT 2");
loop {
    match session.next() {
        ParseOutcome::Ok(stmt) => println!("{:?}", stmt.root()),
        ParseOutcome::Err(e) => {
            eprintln!("parse error: {}", e.message());
            if e.kind() == ParseErrorKind::Fatal { break; }
        }
        ParseOutcome::Done => break,
    }
}

See the parse module for additional types like IncrementalParseSession, ParserConfig, and ParserToken.

§Analysis

Use Analyzer to check SQL against a known schema. The analyzer produces an Analysis containing structured Diagnostic values with byte-offset spans and “did you mean?” suggestions.

use syntaqlite::analysis::CatalogLayer;
use syntaqlite::{AnalysisContext, Catalog, Analyzer, sqlite_dialect};

let mut analyzer = Analyzer::new();
let mut catalog = Catalog::new(sqlite_dialect());

// Register a table so the analyzer can resolve column references.
catalog.layer_mut(CatalogLayer::Database)
    .insert_table("users", Some(vec!["id".into(), "name".into()]), false);

let mut ctx = AnalysisContext::new(&mut catalog);
let model = analyzer.analyze("SELECT id, name FROM users", &mut ctx);

// All names resolve — no diagnostics.
assert!(!model.has_diagnostics());

For richer output, use DiagnosticRenderer to produce rustc-style error messages with source context and underlines.
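As a hedged sketch of how rendering might fit together (the exact DiagnosticRenderer API is not shown on this page; the `analysis::DiagnosticRenderer` path, the `diagnostics()` accessor on the analysis result, and a `render` method taking the diagnostic plus the original source are all assumptions to be checked against the type's own docs):

```rust
use syntaqlite::analysis::{CatalogLayer, DiagnosticRenderer}; // renderer path is an assumption
use syntaqlite::{AnalysisContext, Analyzer, Catalog, sqlite_dialect};

let mut analyzer = Analyzer::new();
let mut catalog = Catalog::new(sqlite_dialect());
catalog.layer_mut(CatalogLayer::Database)
    .insert_table("users", Some(vec!["id".into(), "name".into()]), false);

// "usrs" is misspelled, so analysis should report an unresolved table,
// ideally with a "did you mean `users`?" suggestion.
let sql = "SELECT id FROM usrs";
let mut ctx = AnalysisContext::new(&mut catalog);
let model = analyzer.analyze(sql, &mut ctx);

// Hypothetical rendering loop: `diagnostics()` and `render(diag, source)`
// are assumed names, not confirmed API.
let renderer = DiagnosticRenderer::new();
for diag in model.diagnostics() {
    eprintln!("{}", renderer.render(diag, sql));
}
```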

§Formatting

Use Formatter to pretty-print SQL with consistent style. The formatter parses each statement, runs a bytecode interpreter over the AST, and renders the result with a Wadler-style pretty-printer.

use syntaqlite::{FormatConfig, Formatter, KeywordCase};

let mut fmt = Formatter::with_config(
    &FormatConfig::default()
        .with_keyword_case(KeywordCase::Lower)
        .with_line_width(60),
);

let output = fmt.format("select id,name from users where active=1").unwrap();
assert!(output.starts_with("select"));
assert!(output.contains("from"));

See FormatConfig for all available options (line width, indent width, keyword casing, semicolons).
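As a sketch of a fuller configuration (the `with_indent_width` and `with_trailing_semicolons` builder names are hypothetical, extrapolated from the `with_keyword_case`/`with_line_width` methods shown above; check FormatConfig's docs for the real option names):

```rust
use syntaqlite::{FormatConfig, Formatter, KeywordCase};

// Hypothetical builder calls beyond keyword case and line width:
// indent width and trailing-semicolon behaviour are the other two
// option groups this page mentions.
let config = FormatConfig::default()
    .with_keyword_case(KeywordCase::Upper)
    .with_line_width(80)
    .with_indent_width(2)          // assumed name
    .with_trailing_semicolons(true); // assumed name

let mut fmt = Formatter::with_config(&config);
let sql = fmt.format("select id , name from users").unwrap();
assert!(sql.starts_with("SELECT"));
```

Because Formatter is designed for reuse, one configured instance can format many inputs without rebuilding its internal state.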

§Features

  • sqlite (default): enables the built-in SQLite dialect, Dialect, and re-exports Parser, Tokenizer, and typed AST nodes.
  • fmt (default): enables Formatter, FormatConfig, and KeywordCase.
  • validation (default): enables Analyzer, Catalog, Diagnostic, and related types.
  • lsp: enables LspServer and lsp::LspHost for editor integration.
  • experimental-embedded: enables embedded SQL extraction from Python and TypeScript/JavaScript source files.
  • serde: adds Serialize/Deserialize impls for diagnostics and AST nodes.
  • serde-json: adds JSON convenience helpers (catalog from JSON, AST dump).
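A minimal Cargo.toml sketch for two common feature selections (the version number is a placeholder):

```toml
[dependencies]
# Default features (sqlite, fmt, validation) plus LSP support:
syntaqlite = { version = "0.1", features = ["lsp"] }

# Or: core parsing only, opting out of the fmt/validation defaults:
# syntaqlite = { version = "0.1", default-features = false, features = ["sqlite"] }
```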

§Choosing an API

  • Use Parser and Tokenizer for parsing and tokenizing SQL.
  • Use Analyzer + Catalog when you need to validate SQL against a database schema (table/column/function resolution).
  • Use Formatter when you need to pretty-print or normalize SQL text.
  • Use LspServer (requires the lsp feature) to embed a full Language Server Protocol implementation in an editor or tool.
  • Use typed when building reusable code over known generated dialects.
  • Use any when dialect choice happens at runtime or crosses FFI/plugin boundaries.

Modules§

analysis
Analysis and validation.
any
Type-erased (dialect-agnostic) parser and tokenizer types.
fmt
SQL formatter.
nodes
Generated typed AST nodes for the built-in SQLite dialect.
parse
Tokenizer, parser, and related types for SQLite SQL.
source
Typed source coordinates — offsets, lengths, ranges, and source-text wrappers. Every position-like value the parser emits is modeled by a newtype in this module so coordinate-system confusion is caught at compile time rather than surfacing as a runtime bug.
typed
Typed (dialect-parameterized) parser and tokenizer infrastructure.
util
Cross-cutting utilities for dialect configuration, compatibility, and rendering.

Structs§

AnalysisConfig
Configuration for analysis.
AnalysisContext
Per-call context passed to Analyzer::analyze. Bundles the catalog (mutated in place as DDL accumulates and imports are recorded), the analysis config, and an optional module resolver.
Analyzer
Stateless analysis engine.
Catalog
Layered semantic catalog describing a database schema.
CheckConfig
Per-category check levels.
Diagnostic
A diagnostic produced by analysis, with location, message, severity, optional help, and an optional macro expansion traceback.
Dialect
The typed SQLite dialect handle.
FormatConfig
Configuration for the SQL formatter.
Formatter
High-level SQL formatter that pretty-prints SQL source text.
Parser
High-level entry point for parsing SQLite SQL into typed AST statements.

Enums§

CheckLevel
Severity level for a check category.
ParseOutcome
Tri-state parse result for statement-oriented parser APIs.

Functions§

sqlite_dialect
Returns the built-in SQLite dialect handle.