docs.rs failed to build token-1.0.0-rc1
Please check the build logs for more information.
See Builds for ideas on how to fix a failed build, or Metadata for how to configure docs.rs builds.
If you believe this is docs.rs' fault, open an issue.
Token
This is a small package containing a simple string tokenizer for the Rust programming language. The package also contains a simple sentence-splitting iterator.
(The sentence splitter may be moved once I figure out where it belongs.)
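To illustrate what a sentence-splitting iterator does, here is a standalone sketch using only the standard library. The `split_sentences` helper and its terminator set are my own illustration, not the crate's actual API:

```rust
// Hypothetical sketch of sentence splitting: accumulate characters and
// cut a sentence whenever a terminator ('.', '!' or '?') is seen.
// This does not use the token crate; names here are illustrative.
fn split_sentences(text: &str) -> Vec<String> {
    let mut sentences = Vec::new();
    let mut current = String::new();
    for ch in text.chars() {
        current.push(ch);
        if ch == '.' || ch == '!' || ch == '?' {
            let trimmed = current.trim();
            if !trimmed.is_empty() {
                sentences.push(trimmed.to_string());
            }
            current.clear();
        }
    }
    // Keep any trailing text that lacks a terminator.
    let trimmed = current.trim();
    if !trimmed.is_empty() {
        sentences.push(trimmed.to_string());
    }
    sentences
}

fn main() {
    for s in split_sentences("Hello there. How are you? Fine!") {
        println!("{}", s);
    }
}
```

A real splitter would also need to handle abbreviations and decimal points; this sketch only shows the iteration idea.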
Documentation
machtan.github.io/token-rs/token
Building
Add the following to your Cargo.toml file:

[dependencies.token]
git = "https://github.com/machtan/token-rs"
Examples
extern crate token;

fn main() {
    // The separator list and printed messages below are reconstructed;
    // the original macro arguments were lost when the page was extracted.
    let separators = vec![' ', '\n', '\t'];
    let source: &str = "    Hello world \n  How do you do\t-Finely I hope";
    let mut tokenizer = token::Tokenizer::new(source, separators);
    println!("Tokenizing...");
    for token in tokenizer {
        println!("- {}", token);
    }
}
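The same separator-based tokenization can be sketched with only the standard library. This standalone example (the `tokenize` helper is my own, not part of the crate) shows the behavior the tokenizer is meant to provide:

```rust
// Split a string on any of the given separator characters and drop
// the empty pieces produced by runs of separators.
fn tokenize<'a>(source: &'a str, separators: &[char]) -> Vec<&'a str> {
    source
        .split(|c: char| separators.contains(&c))
        .filter(|t| !t.is_empty())
        .collect()
}

fn main() {
    let separators = [' ', '\n', '\t'];
    let source = "    Hello world \n  How do you do\t-Finely I hope";
    for token in tokenize(source, &separators) {
        println!("- {}", token);
    }
}
```

Returning borrowed `&str` slices avoids allocating a new string per token, which is also why a real tokenizer is usually exposed as an iterator.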
License
MIT (do what you want with it)