# lolchive

A local liminal page archiver.

Doesn't work on Windows yet.
lolchive saves webpages to your computer under the path you specify, so `google.com/path/to/this` is stored with the folder layout:

```
google.com/
|_/path
  |_/to
    |_/this
      |_/date
        |/css
        |/images
        |/js
        |_index.html
```
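As a rough sketch of that mapping (not part of lolchive's API — `archive_path`, the base directory, and the date string below are all illustrative assumptions):

```rust
// Illustrative sketch only: how a URL could map to the folder layout above.
// `archive_path` is a hypothetical helper, not a lolchive function.
fn archive_path(base: &str, url: &str, date: &str) -> String {
    // strip the scheme so "https://google.com/path/to/this"
    // becomes "google.com/path/to/this"
    let trimmed = url
        .trim_start_matches("https://")
        .trim_start_matches("http://");
    // nest the dated snapshot (holding index.html and assets) under that path
    format!("{base}/{trimmed}/{date}/index.html")
}

fn main() {
    let p = archive_path(
        "/home/me/archives",
        "https://google.com/path/to/this",
        "2024-01-01",
    );
    println!("{p}");
}
```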
## Usage

The Fantoccini archiver uses fantoccini, which for these purposes drives geckodriver; the basic archiver just uses reqwest.
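The Fantoccini-based types expect geckodriver to already be listening at the connection string you pass in. One way to start it, assuming geckodriver is installed and on your `PATH`:

```shell
# start geckodriver on the port used by the examples below
geckodriver --port 4444
```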
## FantocciniArchiver

```rust
use lolchive::FantocciniArchiver;
use dirs::home_dir;

let url = "https://www.merriam-webster.com/dictionary/fantoccini";
// connection string for the running geckodriver instance
let connection_string = "http://localhost:4444";
// set up an absolute path to where you want the archive stored
let home_dir = home_dir().expect("no home directory found");
let new_dir = format!("{}/archives", home_dir.display());
// create the archiver
let archiver = FantocciniArchiver::new(connection_string).await.unwrap();
// archive the page
let path = archiver.create_archive(url, &new_dir).await;
// path to the archive is returned
println!("{:?}", path);
// close the archiver
let _ = archiver.close().await;
```
## BasicArchiver

The basic archiver just uses reqwest.

```rust
use lolchive::BasicArchiver;
use dirs::home_dir;

let url = "https://www.rust-lang.org/";
let home_dir = home_dir().expect("no home directory found");
let new_dir = format!("{}/archives", home_dir.display());
println!("{}", new_dir);
let path = BasicArchiver::create_archive(url, &new_dir).await;
println!("{:?}", path);
```
## Crawler

### FantocciniCrawler

The Fantoccini crawler uses fantoccini and geckodriver.

```rust
use lolchive::FantocciniCrawler;
use dirs::home_dir;

let url = "https://en.wikipedia.org/wiki/Rust_(programming_language)";
let connection_string = "http://localhost:4444";
let home_dir = home_dir().expect("no home directory found");
let new_dir = format!("{}/crawls", home_dir.display());
let fcrawler = FantocciniCrawler::new(connection_string).await.unwrap();
let paths = fcrawler.save_crawl(url, &new_dir).await.unwrap();
let _ = fcrawler.close().await;
println!("{:?}", paths);
assert!(!paths.is_empty());
```
### BasicCrawler

The basic crawler uses reqwest.

```rust
use lolchive::BasicCrawler;
use dirs::home_dir;

let url = "https://www.rust-lang.org/";
let home_dir = home_dir().expect("no home directory found");
let new_dir = format!("{}/crawls", home_dir.display());
let paths = BasicCrawler::save_crawl(url, &new_dir).await.unwrap();
println!("{:?}", paths);
assert!(!paths.is_empty());
```