Transform pipeline infrastructure
This module provides a composable, type-safe transformation system that replaces the old rigid pipeline architecture. Any transform can be chained with another if their types are compatible, enabling modular and reusable processing stages.
§Architecture Overview
The transform system consists of three core concepts:
§1. The Runnable Trait
The fundamental interface for all transformation stages. Any type implementing
Runnable<I, O> can transform input of type I to output of type O:
```rust
pub trait Runnable<I, O> {
    fn run(&self, input: I) -> Result<O, TransformError>;
}
```

This trait is implemented by individual processing stages (tokenization, parsing, etc.).
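As a sketch of what an implementation might look like (the trait and error type are re-declared locally so the example compiles on its own, and `Tokenize` here is a hypothetical whitespace-splitting stage, not one from this crate):

```rust
// Local stand-ins so the example is self-contained; in real code these
// come from the crate's transforms module.
#[derive(Debug)]
struct TransformError(String);

trait Runnable<I, O> {
    fn run(&self, input: I) -> Result<O, TransformError>;
}

// Hypothetical stage: splits input on whitespace.
struct Tokenize;

impl Runnable<String, Vec<String>> for Tokenize {
    fn run(&self, input: String) -> Result<Vec<String>, TransformError> {
        Ok(input.split_whitespace().map(str::to_string).collect())
    }
}

fn main() {
    let tokens = Tokenize.run("hello world".to_string()).unwrap();
    assert_eq!(tokens, vec!["hello", "world"]);
}
```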
§2. The Transform<I, O> Type
A wrapper that enables composition. Any Runnable can be converted to a Transform,
which provides the .then() method for type-safe chaining:
```rust
let pipeline = Transform::from_fn(|x| Ok(x))
    .then(Tokenize) // String → Vec<Token>
    .then(Parse);   // Vec<Token> → Ast
// Result: Transform<String, Ast>
```

The compiler enforces that output types match input types at each stage.
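One plausible way such a `.then()` can be built is by boxing the composed function. This self-contained sketch mirrors the module's names but is illustrative only, and chains closures rather than `Runnable` stages for brevity:

```rust
#[derive(Debug)]
struct TransformError(String);

// Sketch: a Transform wraps a boxed fallible function.
struct Transform<I, O> {
    f: Box<dyn Fn(I) -> Result<O, TransformError>>,
}

impl<I: 'static, O: 'static> Transform<I, O> {
    fn from_fn(f: impl Fn(I) -> Result<O, TransformError> + 'static) -> Self {
        Transform { f: Box::new(f) }
    }

    fn run(&self, input: I) -> Result<O, TransformError> {
        (self.f)(input)
    }

    // Chaining: this stage's output type O must equal the next stage's
    // input type, which the compiler checks for us.
    fn then<N: 'static>(
        self,
        next: impl Fn(O) -> Result<N, TransformError> + 'static,
    ) -> Transform<I, N> {
        Transform::from_fn(move |input| next((self.f)(input)?))
    }
}

fn main() {
    let pipeline = Transform::from_fn(|s: String| Ok(s))
        .then(|s: String| Ok(s.len())) // String → usize
        .then(|n: usize| Ok(n * 2));   // usize → usize
    assert_eq!(pipeline.run("abcd".to_string()).unwrap(), 8);
}
```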
§3. Static Lazy Transforms
Common pipelines are pre-built as static references using once_cell::sync::Lazy.
Each static is built once on first access and shared everywhere it is used:
```rust
pub static LEXING: Lazy<Transform<String, TokenStream>> = Lazy::new(|| {
    Transform::from_fn(Ok)
        .then(CoreTokenization::new())
        .then(SemanticIndentation::new())
});
```

See the standard module for all pre-built transforms.
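To see the lazy-static pattern in isolation: the module uses `once_cell::sync::Lazy`, and `std::sync::LazyLock` (stable since Rust 1.80) behaves the same way for this purpose. The pipeline below is a stand-in closure, not the real `LEXING` transform:

```rust
use std::sync::LazyLock;

// Stand-in "pipeline": a boxed function built once, on first access,
// then reused by every caller.
static UPPERCASE: LazyLock<Box<dyn Fn(&str) -> String + Send + Sync>> =
    LazyLock::new(|| Box::new(|s: &str| s.to_uppercase()));

fn main() {
    assert_eq!((*UPPERCASE)("session"), "SESSION");
    assert_eq!((*UPPERCASE)("content"), "CONTENT");
}
```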
§Usage Patterns
§Direct Transform Usage
For programmatic access to specific stages:
```rust
use lex_parser::lex::transforms::standard::LEXING;

let tokens = LEXING.run("Session:\n Content\n".to_string())?;
```

§With DocumentLoader
For most use cases, use DocumentLoader, which provides convenient shortcuts:
```rust
use lex_parser::lex::loader::DocumentLoader;

let loader = DocumentLoader::from_string("Hello\n");
let doc = loader.parse()?;       // Full AST
let tokens = loader.tokenize()?; // Lexed tokens
```

§Custom Pipelines
Build custom processing chains for specialized needs:
```rust
use lex_parser::lex::transforms::{Transform, standard::CORE_TOKENIZATION};

let custom = CORE_TOKENIZATION
    .then(MyCustomStage::new())
    .then(AnotherStage::new());
let result = custom.run(source)?;
```

§Module Organization
- stages: Individual transformation stages (tokenization, indentation, parsing)
- standard: Pre-built transform combinations for common use cases
§Design Benefits
- Type Safety: Compiler verifies pipeline stage compatibility
- Composability: Mix and match stages to create custom pipelines
- Reusability: Share transforms across CLI, tests, and library code
- Clarity: Explicit stage boundaries with clear input/output types
- Testability: Test individual stages in isolation
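As an illustration of the testability point, a single stage can be exercised with plain inputs, no pipeline required. The trait and `TrimLines` stage below are local stand-ins (in the real crate you would import `Runnable` and an actual stage):

```rust
#[derive(Debug)]
struct TransformError(String);

trait Runnable<I, O> {
    fn run(&self, input: I) -> Result<O, TransformError>;
}

// Hypothetical stage: strips trailing whitespace from each line.
struct TrimLines;

impl Runnable<String, String> for TrimLines {
    fn run(&self, input: String) -> Result<String, TransformError> {
        Ok(input
            .lines()
            .map(str::trim_end)
            .collect::<Vec<_>>()
            .join("\n"))
    }
}

fn main() {
    // The stage is tested in isolation, without building a pipeline.
    let out = TrimLines.run("a  \nb\t\n".to_string()).unwrap();
    assert_eq!(out, "a\nb");
}
```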
Modules§
- stages
- Individual transformation stages (tokenization, indentation, parsing)
- standard
- Pre-built transform combinations for common use cases
Structs§
- Transform
- A composable transformation pipeline
Enums§
- TransformError
- Error that can occur during transformation
Traits§
- Runnable
- Trait for anything that can transform an input to an output