# URL Cleaner
The CLI interface for URL Cleaner.
## Default cleaner

See [../default_cleaner.md](../default_cleaner.md) for details about the included default cleaner.
## Performance
On a mostly stock Lenovo ThinkPad T460s (Intel i5-6300U (4) @ 3.000GHz) running Kubuntu 24.10 (kernel 6.14.0) with "not much" going on (Firefox, Steam, etc. are closed), hyperfine gives me the following benchmark:
Last updated 2025-05-06. All numbers are in milliseconds.
For reasons not yet known to me, everything from an Intel i5-8500 (6) @ 4.100GHz to an AMD Ryzen 9 7950X3D (32) @ 5.759GHz seems to max out at between 110 and 140ms per 100k (not a typo) of the Amazon URL, despite the second CPU being significantly more powerful.
## Parsing output
Note: JSON output is supported.
Unless a `Debug` variant is used, the following should always be true:
- Input URLs are a list of URLs starting with the URLs provided as command line arguments, followed by each line of STDIN.
- The nth line of STDOUT corresponds to the nth input URL.
- If the nth line of STDOUT is empty, then something about reading/parsing/cleaning the URL failed.
- The nth non-empty line of STDERR corresponds to the nth empty line of STDOUT.
- Currently, empty STDERR lines are not printed when a URL succeeds. While printing them would make parsing the output easier, it would cause visual clutter on terminals. Although this will likely never change by default, parsers should be sure to follow rule 4 strictly in case this is added as an option.
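Under these guarantees, the line-oriented output can be paired back up with its inputs. The following is a minimal sketch, not part of URL Cleaner itself; the function name and sample data are illustrative:

```python
def pair_output(stdout_lines, stderr_lines):
    """Pair each STDOUT line with its input URL's result.

    A non-empty STDOUT line is a cleaned URL; an empty one means that
    input failed, and its message is the next non-empty STDERR line
    (rule 4 above).
    """
    errors = iter(stderr_lines)
    results = []
    for line in stdout_lines:
        if line:
            results.append(("ok", line))
        else:
            results.append(("err", next(errors)))
    return results

# Example: the second of three input URLs failed to parse.
stdout = ["https://example.com/page", "", "https://example.org/"]
stderr = ["invalid URL: not-a-url"]
print(pair_output(stdout, stderr))
```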
## JSON output
The `--json`/`-j` flag can be used to have URL Cleaner output JSON instead of lines.
The format looks like this, but minified:
The surrounding `{"Ok": {...}}` is to let URL Cleaner Site return `{"Err": {...}}` on invalid input.
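As an illustration of that wrapping, a successful response might look like the sketch below. Only the outer `Ok`/`Err` distinction is described above; the inner field names here are hypothetical:

```json
{
  "Ok": {
    "urls": [
      {"Ok": "https://example.com/page"},
      {"Err": "invalid URL: not-a-url"}
    ]
  }
}
```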
## Exit code
Currently, the exit code is determined by the following rules:
- If no cleanings work and none fail, returns 0. This only applies if no URLs are provided.
- If no cleanings work and some fail, returns 1.
- If some cleanings work and none fail, returns 0.
- If some cleanings work and some fail, returns 2. This only applies if multiple URLs are provided.
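The four rules above reduce to a small function of the success/failure counts. This is a sketch for parsers; the name `exit_code` is illustrative and not part of URL Cleaner:

```python
def exit_code(succeeded: int, failed: int) -> int:
    """Map counts of successful/failed cleanings to the documented exit code."""
    if failed == 0:
        return 0   # nothing failed, including the no-URLs case
    if succeeded == 0:
        return 1   # every cleaning failed
    return 2       # mixed results (requires multiple URLs)

print(exit_code(0, 0), exit_code(0, 3), exit_code(5, 0), exit_code(5, 3))
```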