Trait sqlite3_parser::lexer::Splitter
pub trait Splitter: Sized {
    type Error: ScanError;
    type TokenType;

    fn split<'input>(
        &mut self,
        data: &'input [u8],
        eof: bool,
    ) -> Result<(Option<(&'input [u8], Self::TokenType)>, usize), Self::Error>;
}
Split function used to tokenize the input
Required Associated Types
type Error: ScanError
type TokenType
Required Methods
fn split<'input>(
    &mut self,
    data: &'input [u8],
    eof: bool,
) -> Result<(Option<(&'input [u8], Self::TokenType)>, usize), Self::Error>
The arguments are an initial substring of the remaining unprocessed data and a flag, eof, that reports whether the underlying reader has no more data to give.
If split returns an error, scanning stops and that error is returned to the caller.
The function is never called with an empty data slice except at EOF. If eof is true, however, data may be non-empty and, as always, holds unprocessed text.
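To make the contract concrete, here is a minimal sketch of a custom implementor: a hypothetical WordSplitter that yields whitespace-separated words. It assumes that the returned usize is the number of input bytes to consume (the API mirrors Go's bufio.SplitFunc), and it reuses sqlite3_parser::lexer::sql::Error only to satisfy the ScanError bound, since this splitter never fails.

use sqlite3_parser::lexer::Splitter;
use sqlite3_parser::lexer::sql::Error;

/// Hypothetical splitter that yields whitespace-separated words and ignores token types.
struct WordSplitter;

impl Splitter for WordSplitter {
    // Reused only to satisfy the ScanError bound; this splitter never returns Err.
    type Error = Error;
    type TokenType = ();

    fn split<'input>(
        &mut self,
        data: &'input [u8],
        eof: bool,
    ) -> Result<(Option<(&'input [u8], ())>, usize), Error> {
        // Skip leading whitespace.
        let start = data
            .iter()
            .position(|b| !b.is_ascii_whitespace())
            .unwrap_or(data.len());
        // A complete word ends at the next whitespace byte.
        if let Some(len) = data[start..].iter().position(|b| b.is_ascii_whitespace()) {
            return Ok((Some((&data[start..start + len], ())), start + len));
        }
        // At EOF, a trailing non-terminated word is still a token.
        if eof && start < data.len() {
            return Ok((Some((&data[start..], ())), data.len()));
        }
        // Otherwise consume the whitespace seen so far and ask for more data
        // (or, at EOF with nothing left, report that scanning is done).
        Ok((None, start))
    }
}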
Implementors
impl Splitter for Tokenizer
use sqlite3_parser::lexer::sql::Tokenizer;
use sqlite3_parser::lexer::Scanner;

let tokenizer = Tokenizer::new();
let input = "PRAGMA parser_trace=ON;".as_bytes();
let mut s = Scanner::new(input, tokenizer);
// First token: the PRAGMA keyword.
let (token1, _) = s.scan().unwrap().unwrap();
// Second token (the pragma name); its value is not needed here.
s.scan().unwrap().unwrap();
assert!(b"PRAGMA".eq_ignore_ascii_case(token1));