sqlite3_tokenizer 0.1.0

Tokenizes SQL strings as SQLite would
This crate provides Tokenizer, which iterates over tokens in a SQL string as SQLite would.

Example

extern crate sqlite3_tokenizer;

use sqlite3_tokenizer::Tokenizer;

fn main() {
    // Iterate over every token (including whitespace) in the SQL string.
    for token in Tokenizer::new("SELECT * FROM t") {
        println!("Token of kind {:?} is written {:?}", token.kind, token.text);
    }
}

which outputs:

Token of kind Select is written "SELECT"
Token of kind Space is written " "
Token of kind Star is written "*"
Token of kind Space is written " "
Token of kind From is written "FROM"
Token of kind Space is written " "
Token of kind Id is written "t"
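
Because each token exposes its kind and its text, as shown above, it is easy to keep only the meaningful lexemes and drop the whitespace. The sketch below is not taken from the crate's documentation; it assumes the `text` field behaves as in the example above (a string-like value that derefs to `str`), so check the crate's API docs for the exact types.

extern crate sqlite3_tokenizer;

use sqlite3_tokenizer::Tokenizer;

fn main() {
    // Collect the text of every non-whitespace token into a Vec.
    let mut words = Vec::new();
    for token in Tokenizer::new("SELECT name FROM people WHERE age > 21") {
        if !token.text.trim().is_empty() {
            words.push(token.text.to_string());
        }
    }
    // Roughly: ["SELECT", "name", "FROM", "people", "WHERE", "age", ">", "21"]
    println!("{:?}", words);
}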