# Reliable HTTP Downloads over Unreliable Networks
A Rust HTTP client library that aims to "do the right thing by default",
especially with respect to error handling.
WARNING: this library has not been thoroughly tested yet.
It is mainly designed for downloading large files, since that is the case
where robustness against spurious network failures matters most.
## Features
* Automatic resumption of interrupted downloads via the `Range` header.
* Exponential backoff between retries.
* Stalled-download timeouts: downloads are canceled if they stop making progress.
* Detection of mid-air collisions (files being modified while they are downloading) via the `ETag` and `Last-Modified` headers.
* Automatic *restart* when needed (e.g. if the document is modified mid-download, or if the server does not support `Range` requests).
* Informative logging via the `tracing` crate.
* Progress reporting via the `Progress` struct.
* Redirect handling.
* Flexible configuration.
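To illustrate how the resume/restart and backoff features above fit together, here is a minimal sketch of the underlying decision logic. This is not this library's actual API; all names, the backoff base and cap values, and the `Recovery` type are illustrative assumptions.

```rust
use std::time::Duration;

/// Delay before retry `attempt` (0-based): base * 2^attempt, capped.
/// The 500 ms base and 30 s cap are illustrative, not the library's defaults.
fn backoff_delay(attempt: u32) -> Duration {
    let base_ms: u64 = 500;
    let cap_ms: u64 = 30_000;
    let ms = base_ms.saturating_mul(1u64 << attempt.min(16));
    Duration::from_millis(ms.min(cap_ms))
}

/// What to do after a transient failure, given what the server told us.
#[derive(Debug, PartialEq)]
enum Recovery {
    /// Send `Range: bytes=<offset>-` and append to the partial file.
    Resume { offset: u64 },
    /// Discard the partial file and download from the beginning.
    Restart,
}

/// Resume only if the server supports byte ranges and the validator
/// (`ETag` or `Last-Modified`) still matches what we started with;
/// otherwise the partial bytes may belong to a different document.
fn recovery_plan(
    bytes_received: u64,
    accepts_ranges: bool,
    validator_before: Option<&str>,
    validator_now: Option<&str>,
) -> Recovery {
    let unchanged = match (validator_before, validator_now) {
        (Some(a), Some(b)) => a == b,
        _ => false, // no validator: cannot prove the document is unchanged
    };
    if accepts_ranges && unchanged && bytes_received > 0 {
        Recovery::Resume { offset: bytes_received }
    } else {
        Recovery::Restart
    }
}

fn main() {
    assert_eq!(backoff_delay(0), Duration::from_millis(500));
    assert_eq!(backoff_delay(3), Duration::from_millis(4_000));
    assert_eq!(backoff_delay(20), Duration::from_millis(30_000)); // capped
    assert_eq!(
        recovery_plan(1024, true, Some("\"abc\""), Some("\"abc\"")),
        Recovery::Resume { offset: 1024 }
    );
    // Modified mid-download: the ETag changed, so restart from scratch.
    assert_eq!(
        recovery_plan(1024, true, Some("\"abc\""), Some("\"xyz\"")),
        Recovery::Restart
    );
}
```

The key design point is that a `Range` resume is only safe when a validator proves the partial bytes still belong to the same representation; when in doubt, restarting is the conservative default.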
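The stalled-download timeout can be understood as a simple deadline that is pushed forward each time bytes arrive. The helper below is a hypothetical sketch of that technique, not part of this library's public API.

```rust
use std::time::{Duration, Instant};

/// Tracks whether a download is still making progress.
/// Hypothetical helper for illustration only.
struct StallDetector {
    last_progress: Instant,
    timeout: Duration,
}

impl StallDetector {
    fn new(timeout: Duration) -> Self {
        Self { last_progress: Instant::now(), timeout }
    }

    /// Call whenever a chunk of bytes arrives.
    fn on_progress(&mut self) {
        self.last_progress = Instant::now();
    }

    /// True if no bytes have arrived within the timeout window,
    /// in which case the caller should cancel (and later retry).
    fn is_stalled(&self) -> bool {
        self.last_progress.elapsed() > self.timeout
    }
}

fn main() {
    let mut d = StallDetector::new(Duration::from_millis(50));
    assert!(!d.is_stalled());
    std::thread::sleep(Duration::from_millis(80));
    assert!(d.is_stalled());
    d.on_progress(); // fresh bytes reset the deadline
    assert!(!d.is_stalled());
}
```

Unlike a whole-request timeout, this cancels only downloads that have genuinely stopped, so a slow but steady transfer of a large file is never killed mid-stream.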