pub enum Token<'a> {
    Comment(&'a str),
    Newline,
    Import,
    From,
    As,
    FunctionName(&'a str),
    Identifier(&'a str),
    Label(&'a str),
    Path(&'a str),
    LeftBrace,
    RightBrace,
    LeftPar,
    RightPar,
    Comma,
    Colon,
    LeftAngle,
    RightAngle,
    Semi,
    Equals,
    Integer(i64),
    Float(f64),
    Character(char),
    True,
    False,
}
Implementations

impl<'a> Token<'a>
pub fn matches_against(&self, pattern: Token<'a>) -> bool
pub fn assume_comment(self) -> &'a str
pub fn assume_function_name(self) -> &'a str
pub fn assume_identifier(self) -> &'a str
pub fn assume_identifier_like(self) -> &'a str
pub fn assume_label(self) -> &'a str
pub fn assume_path(self) -> &'a str
pub fn assume_integer(self) -> i64
pub fn assume_float(self) -> f64
pub fn assume_character(self) -> char
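The listing above does not show method bodies, but the naming suggests a common pattern: `matches_against` compares variants while ignoring payloads, and each `assume_*` unwraps one variant's payload, panicking on anything else. A hypothetical, self-contained sketch of that pattern on a reduced token enum (the real implementations may differ):

```rust
use std::mem::discriminant;

// Reduced stand-in for the Token enum above, for illustration only.
#[derive(Debug, Clone, PartialEq)]
pub enum Token<'a> {
    Identifier(&'a str),
    Integer(i64),
    Comma,
}

impl<'a> Token<'a> {
    /// True when `self` is the same variant as `pattern`, ignoring the
    /// payload: any Identifier matches any other Identifier.
    pub fn matches_against(&self, pattern: Token<'a>) -> bool {
        discriminant(self) == discriminant(&pattern)
    }

    /// Unwrap the identifier payload; panics on any other variant.
    pub fn assume_identifier(self) -> &'a str {
        match self {
            Token::Identifier(s) => s,
            other => panic!("expected Identifier, got {:?}", other),
        }
    }
}

fn main() {
    let tok = Token::Identifier("x");
    assert!(tok.matches_against(Token::Identifier("")));
    assert!(!tok.matches_against(Token::Comma));
    assert_eq!(tok.assume_identifier(), "x");
    println!("ok");
}
```

This style is useful in a parser: `matches_against` drives "expect token of kind K" checks, after which the corresponding `assume_*` extracts the payload without re-matching.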
Trait Implementations
impl<'s> Logos<'s> for Token<'s>

type Error = ()
Error type returned by the lexer. This can be set using #[logos(error = MyError)]. Defaults to () if not set.

type Extras = ()
Extras for the particular lexer. This can be set using #[logos(extras = MyExtras)] and accessed inside callbacks.
type Source = str
Source type this token can be lexed from. This will default to str, unless one of the defined patterns explicitly uses non-Unicode byte values or byte slices, in which case that implementation will use [u8].

fn lex(lex: &mut Lexer<'s, Self>)
The heart of Logos. Called by the Lexer. The implementation for this function is generated by the logos-derive crate.

Auto Trait Implementations
impl<'a> Freeze for Token<'a>
impl<'a> RefUnwindSafe for Token<'a>
impl<'a> Send for Token<'a>
impl<'a> Sync for Token<'a>
impl<'a> Unpin for Token<'a>
impl<'a> UnwindSafe for Token<'a>
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value.