token 1.0.0-rc1

A simple string tokenizer (and sentence splitter). Note: if you would like to use the name for something more appropriate, please send me a mail at jaln at itu dot dk.

Token


This is a small package containing a simple string tokenizer for the Rust programming language. The package also contains a simple sentence-splitting iterator.

(The sentence splitter might be moved once I find out where I want it.)
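To illustrate what separator-based tokenizing means here, this is a minimal standalone sketch using only the standard library (it does not use this crate's API): split the input on any of a set of separator characters and discard the empty pieces left between consecutive separators.

```rust
// Standalone sketch of separator-based tokenizing (std only).
// `tokenize` is a hypothetical helper for illustration, not part of this crate.
fn tokenize<'a>(source: &'a str, separators: &[char]) -> Vec<&'a str> {
    source
        .split(|c: char| separators.contains(&c))   // split on any separator
        .filter(|tok| !tok.is_empty())              // drop empty pieces
        .collect()
}

fn main() {
    let separators = [' ', '\n', '\t', '\r'];
    let tokens = tokenize("    Hello world \n  How do you do", &separators);
    assert_eq!(tokens, ["Hello", "world", "How", "do", "you", "do"]);
    println!("{:?}", tokens);
}
```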

Documentation

machtan.github.io/token-rs/token

Building

Add the following to your Cargo.toml file:

[dependencies.token]
git = "https://github.com/machtan/token-rs"

Examples

extern crate token;

fn main() {
    let separators = vec![' ', '\n', '\t', '\r'];
    let source: &str = "    Hello world \n  How do you do\t-Finally I hope";

    // The crate is named `token`, so the tokenizer is reached as
    // `token::Tokenizer` (the original example wrote `tokenizer::`).
    let tokenizer = token::Tokenizer::new(source.as_bytes(), separators);
    println!("Tokenizing...");
    for token in tokenizer {
        println!("- Got token: {}", token.unwrap());
    }
    println!("Done!");
}

License

MIT (do what you want with it)