pub struct Scraper<State = WithoutServers> { /* private fields */ }
A scraper for CVMFS servers.
This struct provides a builder interface for scraping CVMFS servers. It has three states: WithoutServers, WithServers, and ValidatedAndReady. Create a scraper with the new() method, then add servers with the with_servers() method.
Transitions:
- new(): creates a Scraper in the WithoutServers state.
- with_servers(): WithoutServers -> WithServers.
- validate(): WithServers -> ValidatedAndReady.
Notes:
- You may only add servers in the WithoutServers state.
- You may only validate the scraper in the WithServers state.
- You may only scrape the servers in the ValidatedAndReady state.
- Once the scraper is in the ValidatedAndReady state, its configuration can no longer be changed (see the sketch below).
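Because each state is a distinct type, the compiler enforces this ordering: calling a method in the wrong state is a compile error rather than a runtime failure. A minimal sketch, assuming a servers: Vec<Server> built as in the example below:

let scraper = Scraper::new();                // Scraper<WithoutServers>
// scraper.scrape().await;                   // does not compile: scrape() exists only on Scraper<ValidatedAndReady>
let scraper = scraper.with_servers(servers); // Scraper<WithServers>
let scraper = scraper.validate()?;           // Scraper<ValidatedAndReady>
let results = scraper.scrape().await;        // Vec<ScrapedServer>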
Example
use cvmfs_server_scraper::{Scraper, ScraperCommon, Hostname, Server, ServerType, ServerBackendType};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let servers = vec![
        Server::new(
            ServerType::Stratum1,
            ServerBackendType::CVMFS,
            Hostname::try_from("azure-us-east-s1.eessi.science").unwrap(),
        ),
        Server::new(
            ServerType::Stratum1,
            ServerBackendType::AutoDetect,
            Hostname::try_from("aws-eu-central-s1.eessi.science").unwrap(),
        ),
    ];

    let scraper = Scraper::new()
        .forced_repositories(vec!["repo1", "repo2"])
        .with_servers(servers)
        .ignored_repositories(vec!["repo3", "repo4"])
        .geoapi_servers(vec!["cvmfs-stratum-one.cern.ch", "cvmfs-stratum-one.ihep.ac.cn"])?;

    let server_results = scraper.validate()?.scrape().await;
    Ok(())
}
Implementations
impl Scraper<WithoutServers>
pub fn new() -> Self
Create a new Scraper.
This method creates a new Scraper in the WithoutServers state, with no servers added. To add servers, use the with_servers() method.
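A minimal sketch:

// The scraper starts in the WithoutServers state.
let scraper = Scraper::new();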
pub fn with_servers(self, servers: Vec<Server>) -> Scraper<WithServers>
Add a list of servers to the scraper.
This method transitions the scraper to the WithServers state; no further servers can be added after calling it.
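A minimal sketch; the hostname here is a placeholder, not a real endpoint:

let servers = vec![Server::new(
    ServerType::Stratum1,
    ServerBackendType::AutoDetect,
    Hostname::try_from("stratum1.example.org").unwrap(),
)];

// Consumes the Scraper<WithoutServers> and returns a Scraper<WithServers>.
let scraper = Scraper::new().with_servers(servers);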
impl Scraper<WithServers>
pub fn validate(self) -> Result<Scraper<ValidatedAndReady>, ScrapeError>
Validate the scraper and transition to the ValidatedAndReady state.
This method performs some basic pre-flight checks to ensure that the scraper is correctly configured. If the checks pass, the scraper transitions to the ValidatedAndReady state, and you may no longer add servers or repositories.
The checks performed are:
- If any servers use the S3 backend, the forced repositories list cannot be empty.
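If validation fails, you get the error back instead of a ready scraper. A rough sketch of handling it, assuming scraper is already in the WithServers state and that ScrapeError implements Display (which follows if it implements std::error::Error, as its use with ? in the example above suggests):

// An S3-backed server combined with an empty forced repositories list
// would make validate() return an error here.
match scraper.validate() {
    Ok(ready) => {
        // Only a Scraper<ValidatedAndReady> exposes scrape().
        let _results = ready.scrape().await;
    }
    Err(err) => eprintln!("scraper is misconfigured: {err}"),
}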
impl Scraper<ValidatedAndReady>
pub async fn scrape(&self) -> Vec<ScrapedServer>
Scrape the servers.
This method scrapes the configured servers and returns a list of ScrapedServer objects containing the results. Each entry is either a PopulatedServer or a FailedServer, depending on whether the scrape succeeded for that server.
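A rough sketch of consuming the results, assuming ScrapedServer implements Debug (the concrete fields of PopulatedServer and FailedServer are not shown here):

// `scraper` is a Scraper<ValidatedAndReady> obtained from validate().
let results = scraper.scrape().await;
for server in &results {
    // Each entry is either a PopulatedServer (successful scrape) or a
    // FailedServer (the scrape failed for that host).
    println!("{:?}", server);
}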