- macro-less and type-safe APIs for declarative, ergonomic code
- runtime-flexible: `tokio`, `smol`, `nio`, `glommio`, `monoio`, `compio`, and `worker` (Cloudflare Workers), `lambda` (AWS Lambda)
- good performance, no-network testing, well-structured middlewares, Server-Sent Events, WebSocket, highly integrated OpenAPI document generation, ...
## Quick Start

- Add Ohkami to your dependencies:
- Write your first code with Ohkami (see `examples/quick_start`):
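A minimal sketch of such a first program; the handler names, the port, and the `ohkami::claw` module path are illustrative assumptions:

```rust
use ohkami::prelude::*;
use ohkami::claw::{status, Path};

async fn health_check() -> status::NoContent {
    status::NoContent
}

async fn hello(Path(name): Path<String>) -> String {
    format!("Hello, {name}!")
}

#[tokio::main]
async fn main() {
    Ohkami::new((
        "/healthz".GET(health_check),
        "/hello/:name".GET(hello),
    )).howl("localhost:3000").await
}
```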
- Run it (`cargo run`) and check the behavior with curl or your browser.
## Core APIs

### Ohkami

`Ohkami` is the main entry point of an Ohkami application: a collection of routes and fangs. It provides the `.howl()` / `.howls()` methods to run the application.
`Ohkami::new( /* routes and fangs */ ).howl( /* address */ ).await;`
`.howls()` (`tls` feature only) is used to run Ohkami with TLS (HTTPS) support on top of the rustls ecosystem.

`howl(s)` supports graceful shutdown by Ctrl-C (SIGINT) on native runtimes.
### Route

`Route` is the core trait to define Ohkami's routing:

- `.GET()`, `.POST()`, `.PUT()`, `.PATCH()`, `.DELETE()`, `.OPTIONS()` to define API endpoints
- `.By({another Ohkami})` to nest `Ohkami`s
- `.Mount({directory path})` to serve a static directory (pre-compressed files with `gzip`, `deflate`, `br`, `zstd` are supported)
Here `GET`, `POST`, etc. take a handler function:
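Roughly, a handler is any `async fn` whose arguments implement `FromRequest` (claws, params, ...) and whose return type implements `IntoResponse`, for example:

```rust
// arguments: types implementing `FromRequest` (claws, params, ...)
// return:    a type implementing `IntoResponse`
async fn hello_handler() -> &'static str {
    "Hello, handler!"
}
```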
On native runtimes, the whole handler must be `Send + Sync + 'static` and the returned future must be `Send + 'static`.
### fangs
Ohkami's request handling system is called fang; all handlers and middlewares are built on it.
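The rough shape of the `Fang` / `FangProc` traits (simplified for description; the exact signatures in the crate may differ slightly):

```rust
/* simplified for description */
use ohkami::{Request, Response};

pub trait Fang<Inner: FangProc> {
    type Proc: FangProc;
    fn chain(&self, inner: Inner) -> Self::Proc;
}

pub trait FangProc {
    async fn bite<'b>(&'b self, req: &'b mut Request) -> Response;
}
```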
built-ins:

- `BasicAuth`, `Cors`, `Csrf`, `Jwt` (authentication / security)
- `Context` (request context)
- `Enamel` (security headers; experimental)
- `Timeout` (handling timeout; native runtimes only)
- `openapi::Tag` (tag for OpenAPI document generation; `openapi` feature only)
Ohkami provides the `FangAction` utility trait to implement the `Fang` trait easily:
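Roughly (simplified for description; the default bodies and lifetimes shown are assumptions):

```rust
/* simplified for description */
use ohkami::{Request, Response};

pub trait FangAction: Clone + Send + Sync + 'static {
    /// runs before the handler; return `Err(response)` to short-circuit
    async fn fore<'b>(&'b self, req: &'b mut Request) -> Result<(), Response> {
        Ok(())
    }

    /// runs after the handler, with mutable access to the response
    async fn back<'b>(&'b self, res: &'b mut Response) {}
}
```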
Additionally, you can apply fangs either as global fangs to an Ohkami or as local fangs to a specific handler (described below).
### claws

Ohkami provides the *claw* API: handler parts designed for declaratively extracting request data and constructing response data:
- `content`: typed content {extracted from request / for response} of a specific format
  - built-ins: `Json<T>`, `Text<T>`, `Html<T>`, `UrlEncoded<T>`, `Multipart<T>`
- `param`: typed parameters extracted from the request
  - built-ins: `Path<P>`, `Query<T>`
- `header`: types for a specific header extracted from the request
  - built-ins: types for standard request headers
- `status`: types for a response with a specific status code
  - built-ins: types for standard response status codes
(Here `T` means a type that implements `serde::Deserialize` for requests and `serde::Serialize` for responses, and `P` means a type that implements `FromParam`, or a tuple of such types.)
The number of path parameters extracted by `Path` is automatically asserted to be equal to or less than the number of path parameters contained in the route path when the handler is registered for routing:
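For example (the `ohkami::claw` module path is an assumption), a handler for a route like `/users/:id/posts/:post_id` can extract both params with a tuple `Path`, together with a typed `Query`:

```rust
use ohkami::claw::{Path, Query};
use serde::Deserialize;

#[derive(Deserialize)]
struct Pagination {
    page: Option<usize>,
}

// `Path<(u64, u64)>` extracts the two path params of the route;
// extracting more params than the route path contains is rejected
// when the handler is registered
async fn user_post(
    Path((user_id, post_id)): Path<(u64, u64)>,
    Query(Pagination { page }): Query<Pagination>,
) -> String {
    format!("user {user_id}, post {post_id}, page {page:?}")
}
```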
## Feature flags

### "rt_tokio", "rt_smol", "rt_nio", "rt_glommio", "rt_monoio", "rt_compio": native async runtimes
"rt_worker" : Cloudflare Workers
- worker v0.7.*
Ohkami has first-class support for Cloudflare Workers:

- `#[worker]` macro to define a Worker
- `#[bindings]`, `ws::SessionMap` helpers
- better `DurableObject`
- no `Send` / `Sync` bound required for handlers or fangs
- `worker_openapi.js` script to generate an OpenAPI document from a `#[worker]` fn
Ohkami also maintains a useful project template. Generate a project from it, and `<project dir>` will have `wrangler.jsonc`, `package.json` and a Rust library crate.
`#[ohkami::worker] async? fn({bindings}?) -> Ohkami` is the Worker definition.

Local dev by `npm run dev` and deploy by `npm run deploy`!
See

- `worker*` templates in the template repository
- `worker*` samples in the samples directory
- `#[worker]`'s documentation comment in the macro definitions

for working examples and detailed usage of `#[worker]` (and/or `openapi`).
"rt_lambda" : AWS Lambda
- lambda_runtime v1.0.* with
tokio
Both Function URLs and API Gateway are supported; WebSocket is not supported.

`cargo lambda` will be a good partner. Let's run:
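A minimal sketch of such an entry point (the route and handler are illustrative):

```rust
use ohkami::prelude::*;

async fn hello() -> &'static str {
    "Hello from AWS Lambda!"
}

#[tokio::main]
async fn main() -> Result<(), lambda_runtime::Error> {
    let your_ohkami = Ohkami::new((
        "/hello".GET(hello),
    ));

    // run the Ohkami as the Lambda function
    lambda_runtime::run(your_ohkami).await
}
```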
`lambda_runtime::run(your_ohkami)` makes `your_ohkami` run on a Lambda Function.
Local dev by `cargo lambda watch` and deploy by `cargo lambda deploy`.
See

- the template's README
- the Cargo Lambda documentation

for details.
"sse" : Server-Sent Events
Ohkami responds with HTTP/1.1 Transfer-Encoding: chunked.
Use some reverse proxy to do with HTTP/2,3.
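A sketch of an SSE handler returning `DataStream`; the exact `DataStream` constructor shown here is an assumption, and the handler, route, and port are illustrative:

```rust
use ohkami::prelude::*;
use ohkami::sse::DataStream;
use tokio::time::{sleep, Duration};

async fn sse_handler() -> DataStream {
    // assumption: `DataStream::new` takes a closure that receives
    // a sender to push event data through
    DataStream::new(|mut s| async move {
        s.send("starting streaming...");
        for i in 1..=5 {
            sleep(Duration::from_secs(1)).await;
            s.send(format!("MESSAGE #{i}"));
        }
        s.send("streaming finished!");
    })
}

#[tokio::main]
async fn main() {
    Ohkami::new((
        "/sse".GET(sse_handler),
    )).howl("localhost:3020").await
}
```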
"ws" : WebSocket
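A sketch of an echo handler; the `WebSocketContext` / `WebSocket` / `Message` names and the `upgrade` API are assumptions based on typical usage:

```rust
use ohkami::prelude::*;
use ohkami::ws::{WebSocketContext, WebSocket, Message};

async fn echo_text(ctx: WebSocketContext<'_>) -> WebSocket {
    // assumption: `upgrade` takes an async closure that drives the connection
    ctx.upgrade(|mut conn| async move {
        while let Ok(Some(Message::Text(text))) = conn.recv().await {
            conn.send(text).await.expect("failed to send message");
        }
    })
}

#[tokio::main]
async fn main() {
    Ohkami::new((
        "/ws".GET(echo_text),
    )).howl("localhost:3030").await
}
```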
- On "rt_worker", both normal (stateless) WebSocket and WebSocket on Durable Objects are available!
- On "rt_lambda", WebSocket is currently not supported.
"openapi" : OpenAPI document generation
"openapi" provides highly integrated OpenAPI support.
This enables macro-less, as consistent as possible OpenAPI document generation, where most of the consistency between document and behavior is automatically assured by Ohkami's internal work.
All you have to do is:

- derive `openapi::Schema` for all your schema structs
- make your `Ohkami` call `.generate(openapi::OpenAPI { ... })`

to generate a consistent OpenAPI document. You don't need to take care of writing accurate methods, paths, parameters, contents, ... for this OpenAPI feature; all of that is done by Ohkami.
Of course, you can flexibly

- customize schemas by manual implementation of the `Schema` trait
- customize descriptions or other parts by the `#[operation]` attribute and the `openapi_*` hooks of `FromRequest`, `IntoResponse`, `Fang(Action)`
- put tags for grouping operations by the `openapi::Tag` fang
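A sketch putting this together; the schema/handler names, the `ohkami::claw` path, and the `openapi::OpenAPI` / `openapi::Server` fields are illustrative assumptions:

```rust
use ohkami::prelude::*;
use ohkami::claw::{status, Json};
use ohkami::openapi;
use serde::{Deserialize, Serialize};

// Derive `Schema` trait to generate
// the schema of this struct in OpenAPI document.
// `#[openapi(component)]` to define it as component
// in OpenAPI document.
#[derive(Deserialize, openapi::Schema)]
#[openapi(component)]
struct CreateUser {
    name: String,
}

#[derive(Serialize, openapi::Schema)]
#[openapi(component)]
struct User {
    id: u64,
    name: String,
}

// (optionally) Set operationId, summary,
// or override descriptions by `operation` attribute.
/// This doc comment is used for the
/// `description` field of OpenAPI document
async fn create_user(
    Json(req): Json<CreateUser>,
) -> status::Created<Json<User>> {
    status::Created(Json(User { id: 42, name: req.name }))
}

#[tokio::main]
async fn main() {
    let o = Ohkami::new((
        "/users".POST(create_user),
    ));

    // generate `openapi.json` for the routes registered above
    o.generate(openapi::OpenAPI {
        title: "Users API",
        version: "0.1.0",
        servers: &[openapi::Server::at("localhost:5000")],
    });

    o.howl("localhost:5000").await
}
```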
- Currently, only JSON is supported as the document format.
- When binary size matters, you should prepare a feature flag activating `ohkami/openapi` in your package, and put all your code around `openapi` behind that feature via `#[cfg(feature = ...)]` or `#[cfg_attr(feature = ...)]`.
- On `rt_worker`, `.generate` is not available because an `Ohkami` can't access your local filesystem from a `wasm32` binary on Miniflare. So Ohkami provides a CLI tool to generate the document from a `#[ohkami::worker] Ohkami` with the `openapi` feature.
"tls"
HTTPS support up on rustls ecosystem.
- Call
howls( ashttpstohttp,wsstows) instead ofhowlto run with TLS. - You must prepare your own certificate and private key files.
- Currently, only HTTP/1.1 over TLS is supported.
Example :
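A sketch of the server side; the certificate paths and handler are illustrative, and the exact `howls` signature (address plus `rustls::ServerConfig`) is an assumption:

```rust
use ohkami::prelude::*;
use rustls::ServerConfig;
use rustls_pemfile::{certs, private_key};
use std::fs::File;
use std::io::BufReader;

async fn hello() -> &'static str {
    "Hello, secure Ohkami!"
}

#[tokio::main]
async fn main() {
    // make `ring` the process-default crypto provider
    rustls::crypto::ring::default_provider().install_default()
        .expect("failed to install the default crypto provider");

    // load certificate chain and private key (paths are illustrative)
    let cert_chain = certs(&mut BufReader::new(File::open("./localhost.pem").unwrap()))
        .collect::<Result<Vec<_>, _>>().unwrap();
    let key = private_key(&mut BufReader::new(File::open("./localhost-key.pem").unwrap()))
        .unwrap()
        .expect("no private key found");

    let tls_config = ServerConfig::builder()
        .with_no_client_auth()
        .with_single_cert(cert_chain, key)
        .expect("failed to build rustls::ServerConfig");

    Ohkami::new((
        "/".GET(hello),
    )).howls("0.0.0.0:8443", tls_config).await
}
```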
For localhost testing with a browser (or curl without `--insecure`), mkcert is highly recommended.
"nightly" : nightly-only functionalities
- try response
- internal performance optimizations
## Snippets

### Typed content
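A sketch with `Json` for both the request and response bodies (struct and handler names are illustrative, and the `ohkami::claw` path is an assumption):

```rust
use ohkami::prelude::*;
use ohkami::claw::{status, Json};
use serde::{Deserialize, Serialize};

/* Deserialize for request */
#[derive(Deserialize)]
struct CreateUserRequest {
    name: String,
}

/* Serialize for response */
#[derive(Serialize)]
struct User {
    id: u64,
    name: String,
}

async fn create_user(
    Json(req): Json<CreateUserRequest>,
) -> status::Created<Json<User>> {
    status::Created(Json(User { id: 1, name: req.name }))
}
```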
### Typed params
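A sketch of `Path` and `Query` params with routing (the query struct and the port are illustrative):

```rust
use ohkami::prelude::*;
use ohkami::claw::{Path, Query};
use serde::Deserialize;

async fn get_user(Path(id): Path<u64>) -> String {
    format!("requested user id: {id}")
}

// e.g. "/search?q=Rust&limit=10"
#[derive(Deserialize)]
struct SearchQuery {
    q: String,
    limit: Option<usize>,
}

async fn search(
    Query(SearchQuery { q, limit }): Query<SearchQuery>,
) -> String {
    format!("searching `{q}` (limit: {limit:?})")
}

#[tokio::main]
async fn main() {
    Ohkami::new((
        "/users/:id".GET(get_user),
        "/search".GET(search),
    )).howl("localhost:5000").await
}
```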
### Middlewares
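A sketch of a logging middleware via `FangAction`; the `fore` / `back` hook signatures are the ones sketched in the Core APIs section, and the fang name and messages are illustrative:

```rust
use ohkami::prelude::*;

#[derive(Clone)]
struct GreetingFang;

/* utility trait; automatically impl `Fang` trait */
impl FangAction for GreetingFang {
    async fn fore<'a>(&'a self, req: &'a mut Request) -> Result<(), Response> {
        println!("Welcome, request!: {req:?}");
        Ok(())
    }
    async fn back<'a>(&'a self, res: &'a mut Response) {
        println!("Bye, response!: {res:?}");
    }
}

async fn hello() -> &'static str {
    "Hello, fangs!"
}

#[tokio::main]
async fn main() {
    Ohkami::new((
        GreetingFang, // <-- register the fang together with the routes
        "/".GET(hello),
    )).howl("localhost:6666").await
}
```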
### Database connection management with the `Context` fang
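A sketch of sharing a `sqlx` pool via the `Context` fang; the `ohkami::fang::Context` path, `Context::new`, and the `Context<'_, T>` extraction pattern are assumptions:

```rust
use ohkami::prelude::*;
use ohkami::claw::status;
use ohkami::fang::Context;
use sqlx::postgres::{PgPool, PgPoolOptions};

async fn health_check(
    Context(pool): Context<'_, PgPool>,
) -> status::NoContent {
    // the pool registered in `main` is available in any handler
    sqlx::query("SELECT 1").execute(pool).await
        .expect("database health check failed");
    status::NoContent
}

#[tokio::main]
async fn main() {
    let pool = PgPoolOptions::new()
        .max_connections(5)
        .connect("postgres://user:password@localhost/db").await
        .expect("failed to connect to the database");

    Ohkami::new((
        Context::new(pool), // <-- register the pool as request context
        "/healthz".GET(health_check),
    )).howl("localhost:5050").await
}
```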
### Typed errors
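A sketch of a handler-specific error type implementing `IntoResponse`; the `Response::InternalServerError()` constructor and the claw / `Context` paths are assumptions:

```rust
use ohkami::prelude::*;
use ohkami::claw::{Path, Json};
use ohkami::fang::Context;
use serde::Serialize;
use sqlx::PgPool;

enum MyError {
    Sqlx(sqlx::Error),
}
impl IntoResponse for MyError {
    fn into_response(self) -> Response {
        match self {
            Self::Sqlx(_) => Response::InternalServerError(),
        }
    }
}

#[derive(Serialize, sqlx::FromRow)]
struct User {
    id: i64,
    name: String,
}

async fn get_user(
    Path(id): Path<i64>,
    Context(pool): Context<'_, PgPool>,
) -> Result<Json<User>, MyError> {
    let user = sqlx::query_as::<_, User>("SELECT id, name FROM users WHERE id = $1")
        .bind(id)
        .fetch_one(pool).await
        .map_err(MyError::Sqlx)?;
    Ok(Json(user))
}
```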
`thiserror` may improve such error conversion:
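A sketch continuing the example above: with `#[derive(thiserror::Error)]` and `#[from]` on the `Sqlx` variant, the explicit `map_err` becomes a plain `?`:

```rust
#[derive(thiserror::Error, Debug)]
enum MyError {
    #[error("database error")]
    Sqlx(#[from] sqlx::Error),
}
// `impl IntoResponse for MyError` stays the same as above

async fn get_user_name(
    Path(id): Path<i64>,
    Context(pool): Context<'_, PgPool>,
) -> Result<String, MyError> {
    let name: String = sqlx::query_scalar("SELECT name FROM users WHERE id = $1")
        .bind(id)
        .fetch_one(pool)
        // .await
        // .map_err(MyError::Sqlx)?;
        .await?;
    Ok(name)
}
```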
### Static directory serving
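A sketch using `.Mount` from the `Route` list above (the directory path is illustrative):

```rust
use ohkami::prelude::*;

#[tokio::main]
async fn main() {
    Ohkami::new((
        "/".Mount("./dist"), // serve pre-built static files under ./dist
    )).howl("0.0.0.0:3030").await
}
```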
### File upload

The `Multipart` built-in claw and the `File` helper:
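A sketch of a multipart form handler; the `File` helper's location and fields (`filename`, `mimetype`, `content`) are assumptions:

```rust
use ohkami::claw::{Multipart, File}; // assumed location of the `File` helper
use serde::Deserialize;

#[derive(Deserialize)]
struct FormData<'req> {
    #[serde(borrow)]
    pics: Vec<File<'req>>,
}

async fn post_submit(
    Multipart(form): Multipart<FormData<'_>>,
) -> &'static str {
    for file in form.pics {
        // assumed fields on `File`
        println!("got `{}` ({}, {} bytes)", file.filename, file.mimetype, file.content.len());
    }
    "Thanks for your submission!"
}
```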
### Pack of Ohkamis
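A sketch of nesting Ohkamis with `.By` (handlers are stubbed, and names / the port are illustrative):

```rust
use ohkami::prelude::*;
use ohkami::claw::{status, Json};
use serde::Serialize;

#[derive(Serialize)]
struct User {
    name: String,
}

async fn list_users() -> Json<Vec<User>> {
    Json(vec![User { name: "Alice".into() }])
}

async fn create_user() -> status::Created<Json<User>> {
    status::Created(Json(User { name: "Bob".into() }))
}

async fn health_check() -> status::NoContent {
    status::NoContent
}

#[tokio::main]
async fn main() {
    // a sub-Ohkami for the users resource
    let users_ohkami = Ohkami::new((
        "/".GET(list_users).POST(create_user),
    ));

    Ohkami::new((
        "/healthz".GET(health_check),
        "/api/users".By(users_ohkami), // <-- nest the sub-Ohkami here
    )).howl("localhost:5000").await
}
```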
### Testing
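A sketch of no-network testing; the `.test()` / `TestRequest` / `oneshot` / `Status` API shown here is an assumption:

```rust
use ohkami::prelude::*;
use ohkami::testing::*; // <--

async fn hello() -> &'static str {
    "Hello, world!"
}

fn hello_ohkami() -> Ohkami {
    Ohkami::new((
        "/hello".GET(hello),
    ))
}

#[cfg(test)]
#[tokio::test]
async fn test_hello_ohkami() {
    let t = hello_ohkami().test();

    let res = t.oneshot(TestRequest::GET("/")).await;
    assert_eq!(res.status(), Status::NotFound);

    let res = t.oneshot(TestRequest::GET("/hello")).await;
    assert_eq!(res.text(), Some("Hello, world!"));
}
```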
### DI by generics
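A sketch of dependency injection through generics: handlers and the app factory are generic over a repository trait, so tests can inject a mock implementation. The trait, the error type, and the `Context` usage here are illustrative assumptions:

```rust
use ohkami::prelude::*;
use ohkami::fang::Context; // assumed path of the `Context` fang

//////////////////////////////////////////////////////////////////////
/// errors

enum AppError {
    Repository(String),
}
impl IntoResponse for AppError {
    fn into_response(self) -> Response {
        match self {
            Self::Repository(_) => Response::InternalServerError(),
        }
    }
}

//////////////////////////////////////////////////////////////////////
/// repository

trait UserRepository: Send + Sync + 'static {
    fn list_names(&self) -> Result<Vec<String>, AppError>;
}

struct InMemoryUserRepository(Vec<String>);
impl UserRepository for InMemoryUserRepository {
    fn list_names(&self) -> Result<Vec<String>, AppError> {
        Ok(self.0.clone())
    }
}

//////////////////////////////////////////////////////////////////////
/// routes

async fn list_users<R: UserRepository>(
    Context(repo): Context<'_, R>,
) -> Result<String, AppError> {
    Ok(repo.list_names()?.join(", "))
}

fn users_ohkami<R: UserRepository>(repo: R) -> Ohkami {
    Ohkami::new((
        Context::new(repo), // inject the repository; tests can pass a mock here
        "/users".GET(list_users::<R>),
    ))
}

//////////////////////////////////////////////////////////////////////
/// entry point

#[tokio::main]
async fn main() {
    users_ohkami(InMemoryUserRepository(vec!["Alice".into(), "Bob".into()]))
        .howl("localhost:4040").await
}
```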
## Supported protocols
- HTTP/1.1
- HTTP/2
- HTTP/3
- HTTPS
- Server-Sent Events
- WebSocket
## MSRV (Minimum Supported Rust Version)
Latest stable
## License

Ohkami is licensed under the MIT License (LICENSE or https://opensource.org/licenses/MIT).