# Spider
Multithreaded web crawler written in Rust.
## Dependencies
On Debian or other DEB-based distributions:
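The exact packages depend on the TLS backend; the commands below assume the HTTP client links against OpenSSL, which is the usual native dependency for Rust crawlers:

```bash
apt install openssl libssl-dev
```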
On Fedora and other RPM-based distributions:
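```bash
dnf install openssl-devel
```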
## Usage
Add this dependency to your `Cargo.toml` file.
```toml
[dependencies]
spider = "1.3.1"
```
Then you'll be able to use the library. Here is a simple example:
```rust
extern crate spider;

use spider::website::Website;
```
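A minimal crawl could look like the sketch below; the start URL is a placeholder, and the `get_pages()`/`get_url()` accessors are assumed names for reading back the fetched pages:

```rust
fn main() {
    // Crawl everything reachable from the start URL (placeholder).
    let mut website: Website = Website::new("https://choosealicense.com");
    website.crawl();

    // List every page that was fetched (accessor names assumed).
    for page in website.get_pages() {
        println!("- {}", page.get_url());
    }
}
```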
You can use the `Configuration` object to configure your crawler:
```rust
// ..
let mut website: Website = Website::new("https://choosealicense.com"); // example start URL
website.configuration.blacklist_url.push("https://choosealicense.com/licenses/".to_string()); // example URL to skip
website.configuration.respect_robots_txt = true;
website.configuration.verbose = true; // Defaults to false
website.configuration.delay = 2000; // Defaults to 250 ms
website.configuration.concurrency = 10; // Defaults to 4
website.configuration.user_agent = "myapp/version"; // Defaults to spider/x.y.z, where x.y.z is the library version
website.on_link_find_callback = |link| { println!("link found: {}", link); link }; // Callback to run on each link find

website.crawl();
```
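A higher `delay` makes the crawler more polite to the target server at the cost of crawl speed; combined with `respect_robots_txt`, it keeps the crawler well-behaved on sites that throttle bots.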
You can find a working example in `example.rs` and run it with `cargo run --example example`.
## TODO
- [x] multi-threaded system
- [x] respect robots.txt file
- [x] add configuration object for polite delay, etc.
- [x] add polite delay
- [ ] parse command line arguments
## Contribute
I am open to any contribution. Just fork & commit on another branch.