
Struct TermParser 

Source
pub struct TermParser<I>
where I: TryNextWithContext<Arena, Item = u8, Error: Display + 'static>,
{ /* private fields */ }
Expand description

Prolog-like term parser with operator precedence and associativity handling.

The TermParser drives the parsing of Prolog-style terms using the parlex SLR(1) core library. It builds upon the TermTokenParser for tokenization and produces Term values stored in an Arena for efficient allocation.

Operator definitions are resolved dynamically through an OperDefs table, allowing user-defined or default operators to control how expressions are grouped and nested according to their fixity, precedence, and associativity.
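
For instance, with the default operator definitions, * binds tighter than +, which in turn binds tighter than =. The following sketch is not taken from the crate documentation; it assumes the same Arena, IterInput, and TermParser API as the §Example section below:

// Default operator table: the sentence groups as a = (1 + (2 * 3)).
let mut arena = Arena::try_with_default_opers().unwrap();
let input = IterInput::from("a = 1 + 2 * 3".bytes());
let mut parser = TermParser::try_new(input).unwrap();
let terms = parser.try_collect_with_context(&mut arena).unwrap();
// A single sentence, hence a single parsed term.
assert_eq!(terms.len(), 1);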

§Input / Output

§End Tokens and Multiple Sentences

The underlying lexer emits an explicit End token at the end of each parsing unit (end of “sentence” or expression). The parser uses this token to finalize and emit one result. If the input contains multiple independent sentences, you will receive multiple results, one per End token, and None only after all input is consumed.
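
As a streaming sketch (using the same types as in the §Example section below), each call to try_next_with_context yields one parsed sentence, and None is returned only once the whole input is exhausted:

let mut arena = Arena::try_with_default_opers().unwrap();
let input = IterInput::from("a = 1 .\nb = 2 .\n42".bytes());
let mut parser = TermParser::try_new(input).unwrap();
let mut count = 0;
// One result per completed sentence; None only after all input is consumed.
while let Some(_term) = parser.try_next_with_context(&mut arena).unwrap() {
    count += 1;
}
assert_eq!(count, 3);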

§Empty Statements

The term grammar also accepts an empty statement, which is returned as a term with Value::None. This occurs, for example, when the last statement in the input is terminated by a full stop (.) but is followed by no further expression. In that case:

  1. The parser first emits the result for the preceding completed term.
  2. It then emits an additional result representing the empty term (Value::None).
  3. Finally, it returns None, indicating the end of the input stream.

This design allows the parser to fully reflect the structure of the input.
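
A sketch of this case (again assuming the types from the §Example section): an input whose final statement is terminated by a full stop yields one extra, empty result:

let mut arena = Arena::try_with_default_opers().unwrap();
// The input ends with `.` and no further expression.
let input = IterInput::from("hello = 1 .".bytes());
let mut parser = TermParser::try_new(input).unwrap();
let vs = parser.try_collect_with_context(&mut arena).unwrap();
// Two results: the completed term `hello = 1`, then the empty term (Value::None).
assert_eq!(vs.len(), 2);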

§Errors

All failures are surfaced through a composed [ParserError] that wraps a [LexerError] and covers:

  • I::Error — errors from the input source,
  • [TermParserError] — lexical/semantic errors (e.g., UTF-8, integer parsing, symbol-table issues).
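
As an illustrative sketch (assuming the grammar rejects the unbalanced list below; the exact error variant is not shown), failures surface as Err values from the parsing methods:

let mut arena = Arena::try_with_default_opers().unwrap();
// The list bracket is never closed, so parsing should fail before the end.
let input = IterInput::from("foo = [1, 2 .".bytes());
let mut parser = TermParser::try_new(input).unwrap();
assert!(parser.try_collect_with_context(&mut arena).is_err());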

§Example

// Arena pre-loaded with the default operator definitions.
let mut arena = Arena::try_with_default_opers().unwrap();
// Four sentences: three terminated by `.` plus a trailing bare term.
let input = IterInput::from("hello = 1 .\n foo =\n [5, 3, 2].\n (world, hello, 10).\n\n1000".bytes());
let mut parser = TermParser::try_new(input).unwrap();
// Parse every sentence, interning the resulting terms in the arena.
let vs = parser.try_collect_with_context(&mut arena).unwrap();
assert_eq!(vs.len(), 4);

Implementations§

Source§

impl<I> TermParser<I>
where I: TryNextWithContext<Arena, Item = u8, Error: Display + 'static>,

Source

pub fn try_new(input: I) -> Result<Self, ParlexError>

Creates a new TermParser over the given input stream.

§Parameters
  • input: A fused iterator over bytes to be parsed.

No Arena is taken here; operator definitions come from the Arena (for example, one created with Arena::try_with_default_opers) that is later passed as the parsing context to try_next_with_context or try_collect_with_context.
§Returns

A fully initialized TermParser ready to parse Prolog-like terms.

§Errors

Returns an error if the lexer context cannot be initialized or if the generated parser tables fail to load.
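
These errors can be propagated with ? instead of unwrap; the helper below is hypothetical (not part of the crate) and only sketches the pattern:

// Hypothetical helper: parse every sentence in `src`, propagating parser errors.
fn parse_all(src: &str, arena: &mut Arena) -> Result<Vec<Term>, ParlexError> {
    let input = IterInput::from(src.bytes());
    let mut parser = TermParser::try_new(input)?;
    parser.try_collect_with_context(arena)
}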

Trait Implementations§

Source§

impl<I> TryNextWithContext<Arena, (LexerStats, ParserStats)> for TermParser<I>
where I: TryNextWithContext<Arena, Item = u8, Error: Display + 'static>,

Source§

type Item = Term

Items produced by this parser (parsed Terms).

Source§

type Error = ParlexError

Unified error type.

Source§

fn try_next_with_context(&mut self, context: &mut Arena) -> Result<Option<Term>, ParlexError>

Advances the parser and returns the next parsed Term, or None at end of input.

The provided context (an Arena) may be mutated by rule actions (for example, to intern terms). This method is fallible; both input and lexical errors are converted into Self::Error.

§End of Input

When the lexer reaches the end of the input stream, it will typically emit a final TokenID::End token before returning None.

This explicit End token is expected by the Parlex parser to signal successful termination of a complete parsing unit. Consumers should treat this token as a logical end-of-sentence or end-of-expression marker, depending on the grammar.

If the input contains multiple independent sentences or expressions, the lexer may emit multiple End tokens—one after each completed unit. In such cases, the parser can restart or resume parsing after each End to produce multiple parse results from a single input stream.

Once all input has been consumed, the lexer returns None.

Source§

fn stats(&self) -> (LexerStats, ParserStats)

Source§

fn try_collect_with_context(&mut self, context: &mut C) -> Result<Vec<Self::Item>, Self::Error>

Collects all remaining items into a Vec, using the given context. Read more

Auto Trait Implementations§

§

impl<I> Freeze for TermParser<I>
where I: Freeze,

§

impl<I> RefUnwindSafe for TermParser<I>
where I: RefUnwindSafe,

§

impl<I> Send for TermParser<I>
where I: Send,

§

impl<I> Sync for TermParser<I>
where I: Sync,

§

impl<I> Unpin for TermParser<I>
where I: Unpin,

§

impl<I> UnwindSafe for TermParser<I>
where I: UnwindSafe,

Blanket Implementations§

Source§

impl<T> Any for T
where T: 'static + ?Sized,

Source§

fn type_id(&self) -> TypeId

Gets the TypeId of self. Read more
Source§

impl<T> Borrow<T> for T
where T: ?Sized,

Source§

fn borrow(&self) -> &T

Immutably borrows from an owned value. Read more
Source§

impl<T> BorrowMut<T> for T
where T: ?Sized,

Source§

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value. Read more
Source§

impl<T> From<T> for T

Source§

fn from(t: T) -> T

Returns the argument unchanged.

Source§

impl<T, U> Into<U> for T
where U: From<T>,

Source§

fn into(self) -> U

Calls U::from(self).

That is, this conversion is whatever the implementation of From<T> for U chooses to do.

Source§

impl<T, U> TryFrom<U> for T
where U: Into<T>,

Source§

type Error = Infallible

The type returned in the event of a conversion error.
Source§

fn try_from(value: U) -> Result<T, <T as TryFrom<U>>::Error>

Performs the conversion.
Source§

impl<T, U> TryInto<U> for T
where U: TryFrom<T>,

Source§

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.
Source§

fn try_into(self) -> Result<U, <U as TryFrom<T>>::Error>

Performs the conversion.
Source§

impl<V, T> VZip<V> for T
where V: MultiLane<T>,

Source§

fn vzip(self) -> V