kodept-parse 0.1.4

License: Apache-2.0
Owner: ITesserakt

Dependencies

  • derive_more ^0.99 (normal)
  • enum-iterator ^1.4.1 (normal)
  • extend ^1.2 (normal)
  • kodept-core ^0.2 (normal)
  • nom ^7.1 (normal)
  • nom-supreme ^0.8.0 (normal)
  • nonempty-collections ^0.1 (normal)
  • size-of ^0.1 (normal)
  • thiserror ^1.0.44 (normal)
  • test-case ^3.2.1 (dev)

0% of the crate is documented.

kodept_parse 0.1.4


List of all items

Structs

  • parser::error::TokenVerificationError
  • token_match::TokenMatch
  • token_stream::TokenStream
  • token_stream::TokenStreamIndices
  • token_stream::TokenStreamIterator
  • tokenizer::Tokenizer

Enums

  • lexer::enums::BitOperator
  • lexer::enums::ComparisonOperator
  • lexer::enums::Identifier
  • lexer::enums::Ignore
  • lexer::enums::Keyword
  • lexer::enums::Literal
  • lexer::enums::LogicOperator
  • lexer::enums::MathOperator
  • lexer::enums::Operator
  • lexer::enums::Symbol
  • lexer::enums::Token
  • tokenizer::TokenizeError

Traits

  • lexer::traits::ToRepresentation

Macros

  • function
  • match_any_token
  • match_token

Functions

  • lexer::exact_literal_token
  • lexer::soft_literal_token
  • parser::file::grammar

Type Aliases

  • ParseError
  • ParseResult
  • TokenizationError
  • TokenizationResult

Constants

  • lexer::LOWER_ALPHABET
  • lexer::UPPER_ALPHABET