Crate syntaqlite


Fast, accurate SQL tooling for SQLite and its dialects.

This crate provides parsing, formatting, and semantic validation for SQL, built on SQLite’s own tokenizer and grammar rules. Four design principles guide the library:

  • Reliability — uses SQLite’s own grammar rules; formatting is round-trip safe and validation mirrors real engine behaviour.
  • Speed — all core types (Formatter, SemanticAnalyzer, Catalog) are designed for reuse across many inputs without re-allocation.
  • Portability — the core formatting and validation engine has no runtime dependencies beyond the standard library; optional features (lsp, serde) pull in additional crates.
  • Flexibility — supports multiple database dialects that extend SQLite’s grammar with their own tokens and rules.

§Parsing

Use Parser to parse SQL source text into a typed AST:

use syntaqlite::parse::ParseErrorKind;
use syntaqlite::{Parser, ParseOutcome};

let parser = Parser::new();
let mut session = parser.parse("SELECT 1; SELECT 2");
loop {
    match session.next() {
        ParseOutcome::Ok(stmt) => println!("{:?}", stmt.root()),
        ParseOutcome::Err(e) => {
            eprintln!("parse error: {}", e.message());
            if e.kind() == ParseErrorKind::Fatal { break; }
        }
        ParseOutcome::Done => break,
    }
}

See the parse module for additional types like IncrementalParseSession, ParserConfig, and ParserToken.

§Validation

Use SemanticAnalyzer to check SQL against a known schema. The analyzer produces a SemanticModel containing structured Diagnostic values with byte-offset spans and “did you mean?” suggestions.

use syntaqlite::semantic::CatalogLayer;
use syntaqlite::{
    SemanticAnalyzer, Catalog, ValidationConfig, sqlite_dialect,
};

let mut analyzer = SemanticAnalyzer::new();
let mut catalog = Catalog::new(sqlite_dialect());

// Register a table so the analyzer can resolve column references.
catalog.layer_mut(CatalogLayer::Database)
    .insert_table("users", Some(vec!["id".into(), "name".into()]), false);

let config = ValidationConfig::default();
let model = analyzer.analyze("SELECT id, name FROM users", &catalog, &config);

// All names resolve — no diagnostics.
assert!(model.diagnostics().is_empty());

For richer output, use DiagnosticRenderer to produce rustc-style error messages with source context and underlines.

§Formatting

Use Formatter to pretty-print SQL with consistent style. The formatter parses each statement, runs a bytecode interpreter over the AST, and renders the result with a Wadler-style pretty-printer.

use syntaqlite::{FormatConfig, Formatter, KeywordCase};

let mut fmt = Formatter::with_config(
    &FormatConfig::default()
        .with_keyword_case(KeywordCase::Lower)
        .with_line_width(60),
);

let output = fmt.format("select id,name from users where active=1").unwrap();
assert!(output.starts_with("select"));
assert!(output.contains("from"));

See FormatConfig for all available options (line width, indent width, keyword casing, semicolons).

§Features

  • sqlite (default): enables the built-in SQLite grammar, Dialect, and re-exports Parser, Tokenizer, and typed AST nodes.
  • fmt (default): enables Formatter, FormatConfig, and KeywordCase.
  • validation (default): enables SemanticAnalyzer, Catalog, Diagnostic, and related types.
  • lsp: enables LspServer and lsp::LspHost for editor integration.
  • experimental-embedded: enables embedded SQL extraction from Python and TypeScript/JavaScript source files.
  • serde: adds Serialize/Deserialize impls for diagnostics and AST nodes.
  • serde-json: adds JSON convenience helpers (catalog from JSON, AST dump).

§Choosing an API

  • Use Parser and Tokenizer for parsing and tokenizing SQL.
  • Use SemanticAnalyzer + Catalog when you need to validate SQL against a database schema (table/column/function resolution).
  • Use Formatter when you need to pretty-print or normalize SQL text.
  • Use LspServer (requires the lsp feature) to embed a full Language Server Protocol implementation in an editor or tool.
  • Use typed when the grammar is known at compile time and you want reusable, strongly-typed code over generated grammars.
  • Use any when the grammar is chosen at runtime or crosses FFI/plugin boundaries.

Modules§

any
Type-erased (grammar-agnostic) parser and tokenizer types.
fmt
SQL formatter.
nodes
Generated typed AST nodes for the built-in SQLite grammar.
parse
Tokenizer, parser, and related types for SQLite SQL.
semantic
Semantic analysis and validation.
typed
Typed (grammar-parameterized) parser and tokenizer infrastructure.
util
Cross-cutting utilities for grammar configuration, compatibility, and rendering.

Structs§

Catalog
Layered semantic catalog describing a database schema.
CheckConfig
Per-category check levels.
Diagnostic
A diagnostic produced by parsing or semantic analysis.
Dialect
The typed SQLite dialect handle.
FormatConfig
Configuration for the SQL formatter.
Formatter
High-level SQL formatter that pretty-prints SQL source text.
Parser
High-level entry point for parsing SQLite SQL into typed AST statements.
SemanticAnalyzer
Long-lived semantic analysis engine.
ValidationConfig
Configuration for semantic validation.

Enums§

CheckLevel
Severity level for a check category.
ParseOutcome
Tri-state parse result for statement-oriented parser APIs.

Functions§

sqlite_dialect
Returns the built-in SQLite dialect handle.