Enum aquatic_http_protocol::request::Request
pub enum Request {
    Announce(AnnounceRequest),
    Scrape(ScrapeRequest),
}
Variants

Announce(AnnounceRequest)
Scrape(ScrapeRequest)
Implementations

impl Request
pub fn from_bytes(bytes: &[u8]) -> Result<Self, RequestParseError>

Parse a Request from raw HTTP request bytes.
pub fn from_http_get_path(path: &str) -> Result<Self>

Parse a Request from an HTTP GET path (/announce?info_hash=...).
Existing serde URL-decoding crates were insufficient, so a custom parser was written: serde_urlencoded doesn't support multiple values for the same key, and serde_qs pulls in many dependencies. Both would also require preprocessing to handle the binary format used for info_hash and peer_id.
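To illustrate the multiple-values-per-key issue, the sketch below shows a minimal query-string splitter that preserves repeated keys (as a scrape request with several info_hash values requires), which a map-based deserializer like serde_urlencoded cannot represent. `query_pairs` is a hypothetical helper for illustration, not the crate's actual parser.

```rust
/// Split a raw query string into key/value pairs, keeping duplicates.
/// Values are left percent-encoded; decoding happens in a later step.
fn query_pairs(query: &str) -> Vec<(&str, &str)> {
    query
        .split('&')
        .filter_map(|pair| pair.split_once('='))
        .collect()
}

fn main() {
    // Both info_hash entries survive, unlike with a map-based decoder.
    let pairs = query_pairs("info_hash=aaaa&info_hash=bbbb&peer_id=cccc");
    println!("{:?}", pairs);
}
```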
The info hashes and peer IDs received are URL-encoded byte by byte, e.g., %fa for the byte 0xfa. However, the query string must be parsed as a UTF-8 string, in which such non-ASCII bytes are invalid on their own. Therefore, each of these bytes must be converted to its equivalent multi-byte UTF-8 encoding before parsing.