tersify 0.5.0

Universal LLM context compressor — pipe anything, get token-optimized output
## What this PR does

<!-- One paragraph describing the change -->

## Checklist

- [ ] `cargo test` passes
- [ ] `cargo clippy -- -D warnings` passes
- [ ] `cargo fmt --check` passes
- [ ] New behaviour is covered by tests
- [ ] CHANGELOG.md updated under `## [Unreleased]`
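The four command checks above can be run locally in one pass before opening a PR. A minimal sketch, assuming a standard Cargo workspace with `rustfmt` and `clippy` components installed:

```shell
# Run the checklist commands in order; && stops at the first failure.
# fmt first (fastest), then clippy, then the full test suite.
cargo fmt --check && cargo clippy -- -D warnings && cargo test
```

Putting the cheapest check first fails fast on formatting issues before the slower clippy and test runs.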

## Benchmark impact

<!-- Run `cargo run -- bench` and paste the output if compression ratios changed -->