Parsing and applying robots.txt files.
§Examples
```rust
use roboto::Robots;

let robots = r#"
User-agent: *
Disallow: /
"#.parse::<Robots>().unwrap();

assert!(!robots.is_allowed(&"googlebot".parse().unwrap(), "/"));
assert!(robots.is_allowed(&"googlebot".parse().unwrap(), "/robots.txt"));
assert!(!robots.is_allowed(&"googlebot".parse().unwrap(), "/foo/bar"));
```
Modules§
- error
- Error types for the robots.txt parser and evaluator
Structs§
- Directive
- A directive in a robots.txt file, which associates a path with a directive type.
- DirectivePath - A path directive in a robots.txt file.
- RobotAgent - A set of User-Agents and associated directives.
- Robots
- A robots.txt file.
- UserAgent - A User-Agent string.
Enums§
- DirectiveType - A directive type in a robots.txt file.