oha (おはよう)
oha is a tiny program that sends some load to a web application and shows a realtime TUI, inspired by rakyll/hey.
This program is written in Rust, powered by tokio, with a beautiful TUI built on ratatui.

Installation
This program is built on stable Rust; both make and cmake are required to install it via cargo.
cargo install oha
You can optionally build oha against native-tls instead of rustls:
cargo install --no-default-features --features native-tls oha
You can enable VSOCK support by enabling the vsock feature:
cargo install --features vsock oha
You can enable experimental HTTP/3 support by enabling the http3 feature. This uses the h3 library by the developers of Hyper.
It will remain experimental as long as h3 is experimental, and it currently requires rustls for TLS.
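Since the feature is named http3, installing with it enabled should follow the same pattern as the other optional features (a sketch; check cargo's feature resolution if it conflicts with your TLS choice):

```shell
cargo install --features http3 oha
```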
Download pre-built binary
You can download pre-built binaries from the Release page for each version, and from the Publish workflow and Publish PGO workflow for each commit.
On Arch Linux
pacman -S oha
On macOS (Homebrew)
brew install oha
On Windows (winget)
winget install hatoo.oha
On Debian (Azlux's repository)
echo "deb [signed-by=/usr/share/keyrings/azlux-archive-keyring.gpg] http://packages.azlux.fr/debian/ stable main" | sudo tee /etc/apt/sources.list.d/azlux.list
sudo wget -O /usr/share/keyrings/azlux-archive-keyring.gpg https://azlux.fr/repo.gpg
sudo apt update
sudo apt install oha
X-CMD (Linux, macOS, Windows WSL/GitBash)
You can install oha with x-cmd.
Containerized
You can also build a container image that includes oha.
Then you can use oha directly through the container.
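A minimal sketch with Docker, assuming a Dockerfile at the repository root and a hypothetical image tag example/oha:

```shell
# Build an image containing oha (Dockerfile location is an assumption)
docker build . -t example/oha

# Arguments after the image name are passed straight to oha
docker run -it example/oha http://example.com
```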
Profile-Guided Optimization (PGO)
You can build oha with PGO. The resulting binary will be available at target/[target-triple]/pgo/oha.
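The output path target/[target-triple]/pgo/oha matches the layout produced by the cargo-pgo tool, so a plausible sketch of the workflow (an assumption; check the cargo-pgo documentation for specifics) is:

```shell
# Assumed workflow based on the cargo-pgo tool
rustup component add llvm-tools-preview
cargo install cargo-pgo
cargo pgo instrument build   # build an instrumented binary
# ...run the instrumented oha against a representative workload...
cargo pgo optimize build     # rebuild using the gathered profiles
```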
Platform
- Linux - Tested on Ubuntu 18.04 gnome-terminal
- Windows 10 - Tested on Windows PowerShell
- macOS - Tested on iTerm2
Usage
The -q option works differently from rakyll/hey: it sets the overall queries per second rather than a rate per worker.
<URL>  Target URL
Performance
oha uses a faster implementation when --no-tui is set and neither -q nor --burst-delay is set, because it can avoid the overhead of gathering data in realtime.
Output
By default oha outputs a text summary of the results.
oha prints a JSON summary when the --output-format json option is set.
The schema of the JSON output is defined in schema.json.
When --output-format csv is used, the result of each request is printed as a line of comma-separated values.
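For example, both formats can be redirected to files (the local server URL is a placeholder; --no-tui is used here to keep stdout clean, which is an assumption about typical usage):

```shell
oha --no-tui --output-format json http://127.0.0.1:3000 > result.json
oha --no-tui --output-format csv  http://127.0.0.1:3000 > result.csv
```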
Tips
Stress test under more realistic conditions
oha's default options are inherited from rakyll/hey, but you may need to change them to stress test under more realistic conditions.
I suggest running oha with the following options:
- --disable-keepalive: In the real world, a user doesn't query the same URL over a single Keep-Alive connection, so you may want to run without Keep-Alive.
- --latency-correction: You can avoid the Coordinated Omission Problem by using --latency-correction.
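Putting the suggestions together might look like this (the target URL, the -z duration flag, and the -q value are assumptions for illustration; latency correction is only meaningful with a request rate set):

```shell
# Run for 30s without Keep-Alive, at 100 qps overall, with latency correction
oha -z 30s -q 100 --disable-keepalive --latency-correction http://example.com
```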
Burst feature
You can use --burst-delay along with the --burst-rate option to introduce a delay between bursts of a defined number of requests.
For example, with a burst delay of 2 seconds and a burst rate of 4, four requests are processed every 2 seconds, so a total of 10 requests completes after 6 seconds.
NOTE: If you don't set the --burst-rate option, it defaults to 1.
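A sketch matching that scenario (the -n request-count flag and the target URL are assumptions for illustration):

```shell
# 4 requests every 2 seconds; 10 requests total complete after 6 seconds
oha -n 10 --burst-delay 2s --burst-rate 4 http://example.com
```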
Dynamic URL feature
You can use the --rand-regex-url option to generate a random URL for each connection.
Each URL is generated by the rand_regex crate, but the regex dot (.) is disabled, since it's not useful for this purpose and it would be very inconvenient for dots in URLs to be interpreted as the regex dot.
Optionally, you can set the --max-repeat option to limit the maximum repeat count for each regex; e.g. http://127.0.0.1/[a-z]* with --max-repeat 4 generates URLs as if the pattern were http://127.0.0.1/[a-z]{0,4}.
Currently, dynamic scheme, host, and port do not work well with keep-alive.
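Using the example pattern above, an invocation might look like this (the exact argument shape is an assumption; check oha --help for whether the URL positional is interpreted as the regex):

```shell
# Generates URLs such as http://127.0.0.1/abc, with the path capped at 4 characters
oha --rand-regex-url --max-repeat 4 'http://127.0.0.1/[a-z]*'
```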
URLs from file feature
You can use --urls-from-file to read the target URLs from a file. Each line of the file must contain one valid URL, as in the example below.
http://domain.tld/foo/bar
http://domain.tld/assets/vendors-node_modules_highlight_js_lib_index_js-node_modules_tanstack_react-query_build_modern-3fdf40-591fb51c8a6e.js
http://domain.tld/images/test.png
http://domain.tld/foo/bar?q=test
http://domain.tld/foo
Such a file can, for example, be created from an access log to generate a more realistic load distribution across the different pages of a server.
When this type of URL specification is used, each request goes to a random URL from the file.
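For instance (the file name and the -z duration flag are assumptions for illustration):

```shell
# Each request picks a random URL from urls.txt
oha -z 30s --urls-from-file ./urls.txt
```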
Contribution
Feel free to help us!
Here are some areas that need improvement:
- Write tests
- Improve the TUI design
  - Show more information?
- Improve speed
  - I'm new to tokio; I think there is some room to optimize query scheduling.