Crate openai_flows


OpenAI integration for Flows.network

Quick Start

To get started, let’s write a tiny flow function.

use openai_flows::{
    chat::ChatOptions,
    OpenAIFlows,
};
use lambda_flows::{request_received, send_response};
use serde_json::Value;
use std::collections::HashMap;

#[no_mangle]
#[tokio::main(flavor = "current_thread")]
pub async fn run() {
    request_received(handler).await;
}

async fn handler(_qry: HashMap<String, Value>, body: Vec<u8>) {
    let co = ChatOptions::default();
    let of = OpenAIFlows::new();

    let r = match of.chat_completion(
        "any_conversation_id",
        String::from_utf8_lossy(&body).into_owned().as_str(),
        &co,
    )
    .await
    {
        Ok(c) => c.choice,
        Err(e) => e,
    };
     
    send_response(
        200,
        vec![(
            String::from("content-type"),
            String::from("text/plain; charset=UTF-8"),
        )],
        r.as_bytes().to_vec(),
    );
}

When a Lambda request is received, the handler chats via [chat_completion] and then sends the reply back as the response.

Real World Example

HackerNews Alert uses the openai_flows chat_completion function to summarize a news article.

// In the omitted code blocks, the program has obtained the text of a news item.
// Now we feed that text to the `get_summary_truncated()` function to get a summary.
// The text of a news item can exceed the maximum token limit of the language model,
// so it is truncated to 11,000 space-delimited words here; when tokenized,
// the total token count stays within the 16k context limit of the model used.
use openai_flows::{
    chat::{ChatModel, ChatOptions},
    OpenAIFlows,
};

async fn get_summary_truncated(inp: &str) -> anyhow::Result<String> {
    let mut openai = OpenAIFlows::new();
    openai.set_retry_times(3);

    let news_body = inp
        .split_ascii_whitespace()
        .take(11000)
        .collect::<Vec<&str>>()
        .join(" ");

    let chat_id = "news-summary-N".to_string();
    let system = "You're a news editor AI.";

    let co = ChatOptions {
        model: ChatModel::GPT35Turbo16K,
        restart: true,
        system_prompt: Some(system),
        ..ChatOptions::default()
    };

    let question = format!("Make a concise summary within 100 words on this: {news_body}.");

    match openai.chat_completion(&chat_id, &question, &co).await {
        Ok(r) => Ok(r.choice),
        Err(e) => Err(anyhow::Error::msg(e.to_string())),
    }
}
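The word-level truncation step above uses only the standard library, so it can be isolated and tested on its own. A minimal sketch (the function name `truncate_words` is ours, not part of the crate); capping the word count is a rough token-budget guard, since each word maps to one or more tokens:

```rust
// Keep at most `max_words` space-delimited words from the input,
// mirroring the truncation inside `get_summary_truncated` above.
fn truncate_words(inp: &str, max_words: usize) -> String {
    inp.split_ascii_whitespace()
        .take(max_words)
        .collect::<Vec<&str>>()
        .join(" ")
}

fn main() {
    let text = "one two three four five";
    // Truncates when the input is longer than the cap...
    assert_eq!(truncate_words(text, 3), "one two three");
    // ...and passes shorter input through unchanged (whitespace normalized).
    assert_eq!(truncate_words(text, 10), text);
}
```

Note that `split_ascii_whitespace` also collapses runs of whitespace, so the rejoined text is normalized to single spaces.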

Modules

Structs

  • OpenAIFlows — the main struct for setting the basic configuration for the OpenAI interface.

Enums