Spider
A multithreaded web crawler written in Rust.
Dependencies
On Linux
- OpenSSL 1.0.1, 1.0.2, 1.1.0, or 1.1.1
Example
This is a basic blocking example crawling a web page. Add spider to your Cargo.toml:

```toml
[dependencies]
spider = "1.7.18"
```

And then the code:

```rust
extern crate spider;

use spider::website::Website;

fn main() {
    let mut website: Website = Website::new("https://choosealicense.com");
    website.crawl();
}
```
You can use the `Configuration` object to configure your crawler:

```rust
// ..
let mut website: Website = Website::new("https://choosealicense.com");
website.configuration.blacklist_url.push("https://choosealicense.com/licenses/".to_string());
website.configuration.respect_robots_txt = true;
website.configuration.delay = 2000; // Defaults to 250 ms
website.configuration.concurrency = 10; // Defaults to number of CPUs available * 4
website.configuration.user_agent = "myapp/version"; // Defaults to spider/x.y.z, where x.y.z is the library version
website.on_link_find_callback = |s| { println!("link target: {}", s); s }; // Callback to run on each link find
website.crawl();
```
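The `on_link_find_callback` hook follows a common Rust pattern: a plain `fn` pointer stored on a struct, called for each discovered link, receiving the URL and returning it. The sketch below shows that pattern standalone; `Crawler`, `visit`, and `log_link` are hypothetical stand-ins for illustration, not spider's real types.

```rust
// Standalone sketch of the callback pattern behind on_link_find_callback.
// `Crawler` is a hypothetical stand-in, not spider's actual Website type.
struct Crawler {
    on_link_find_callback: fn(String) -> String,
}

impl Crawler {
    // Run the callback on every discovered link, collecting the results.
    fn visit(&self, links: Vec<String>) -> Vec<String> {
        links.into_iter().map(self.on_link_find_callback).collect()
    }
}

// Example callback: log each link, then pass it through unchanged.
fn log_link(url: String) -> String {
    println!("link target: {}", url);
    url
}

fn main() {
    let crawler = Crawler { on_link_find_callback: log_link };
    let visited = crawler.visit(vec!["https://example.com/a".to_string()]);
    assert_eq!(visited.len(), 1);
}
```

Because the callback returns the URL, it can also rewrite links (for example, normalizing trailing slashes) before the crawler processes them.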
Regex Blacklisting
There is an optional "regex" feature that can be enabled:

```toml
[dependencies]
spider = { version = "1.7.18", features = ["regex"] }
```

```rust
extern crate spider;

use spider::website::Website;

fn main() {
    let mut website: Website = Website::new("https://choosealicense.com");
    website.configuration.blacklist_url.push("/licenses/".to_string());
    website.crawl();
}
```
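The idea behind blacklisting is a filter step: before visiting a URL, check it against every blacklist entry and skip it on a match. The sketch below shows that step standalone; substring matching stands in for real regular-expression matching so the example needs no external crates (`is_blacklisted` is a hypothetical helper, not spider's API).

```rust
// Standalone sketch of blacklist filtering. With spider's "regex" feature the
// entries would be matched as regular expressions; plain substring matching
// stands in here so the sketch has no dependencies.
fn is_blacklisted(url: &str, blacklist: &[String]) -> bool {
    blacklist.iter().any(|pattern| url.contains(pattern.as_str()))
}

fn main() {
    let blacklist = vec!["/licenses/".to_string()];
    assert!(is_blacklisted("https://choosealicense.com/licenses/mit/", &blacklist));
    assert!(!is_blacklisted("https://choosealicense.com/about/", &blacklist));
    println!("blacklist filtering works");
}
```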