tide-rhai 0.0.2

A scripting component for tide.
Documentation

Overview

This component provides the ability to run rhai scripts to process HTTP requests in tide. Currently it only supports modifying the messages, but additional features such as an HTTP client are being considered.

Install

$ cargo add tide-rhai

Example

Create a tide server that points to a directory containing rhai scripts.

use tide_rhai::RhaiDir;

#[async_std::main]
async fn main() -> Result<(), std::io::Error> {
    tide::log::start();
    let mut app = tide::new();
    // Handle any matching GET request by running the corresponding
    // rhai script from the current directory.
    app.at("/*")
        .get(RhaiDir::new("/*", "./").unwrap());
    app.listen("127.0.0.1:8080").await?;
    Ok(())
}

The first parameter of new is the URL prefix and should match the path passed to at. The second is the directory containing the rhai scripts.
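For example, to serve scripts from a subdirectory under a narrower route, the same prefix string is passed to both at and new. This is a sketch only, assuming the same RhaiDir API as above; the /scripts path and directory name are illustrative, not part of this crate:

```rust
// Hypothetical: serve rhai scripts from ./scripts/ under the /scripts/ prefix,
// so GET /scripts/foo.rhai runs ./scripts/foo.rhai.
app.at("/scripts/*")
    .get(RhaiDir::new("/scripts/*", "./scripts/").unwrap());
```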

Create a rhai script called headers.rhai that reads a header and returns it in a JSON message. Note that the file doesn't have to use the .rhai extension, but VS Code has support for that file extension.

let obj = #{};
obj.message = "Is this acceptable?" + ctx.headers["accept"];
obj

Here we are using the headers property of the context object. If this were a POST, the ctx object would also contain a data property with the JSON that was sent to the server.
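As a sketch of what that might look like, a script could echo a field from a POSTed JSON body back to the caller. This assumes the route is also registered for POST (the example server above only registers .get), and the name field is purely illustrative:

```rhai
// Hypothetical greet.rhai: expects a POST body like {"name": "world"}.
let obj = #{};
obj.message = "Hello, " + ctx.data.name;
obj
```

Under those assumptions, POSTing {"name": "world"} would return {"message":"Hello, world"}.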

When you now browse to http://localhost:8080/headers.rhai you should see the following:

{"message":"Is this acceptable?text/html,application/xhtml+xml,application/xml;q=0.9,
image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9"}

This example can also be run by cloning this repository and running:

$ cargo run --example samples

Then browse to:

http://localhost:8080/helloworld.rhai
http://localhost:8080/headers.rhai
http://localhost:8080/fetch.rhai

Todo List

  • Logging - Integrate with the tide logging system
  • HTTP Client - A fetch-like API
  • Benchmarks - A set of scripts to catch performance regressions
  • Observability - Support for BPF/DTrace probes, and maybe flamegraphs or statemaps
  • Module System - Rhai supports modules but doesn't have a module system