Crate openai_flows
OpenAI integration for Flows.network
Quick Start
To get started, let’s write a very tiny flow function.
use openai_flows::{chat_completion, ChatModel, ChatOptions, FlowsAccount};
use lambda_flows::{request_received, send_response};
use serde_json::Value;
use std::collections::HashMap;

#[no_mangle]
#[tokio::main(flavor = "current_thread")]
pub async fn run() {
    request_received(handler).await;
}

async fn handler(_qry: HashMap<String, Value>, body: Vec<u8>) {
    let co = ChatOptions {
        model: ChatModel::GPT35Turbo,
        restart: false,
        system_prompt: None,
        retry_times: 2,
    };

    let r = match chat_completion(
        FlowsAccount::Default,
        "any_conversation_id",
        String::from_utf8_lossy(&body).into_owned().as_str(),
        &co,
    )
    .await
    {
        Ok(c) => c.choice,
        Err(e) => e,
    };

    send_response(
        200,
        vec![(
            String::from("content-type"),
            String::from("text/plain; charset=UTF-8"),
        )],
        r.as_bytes().to_vec(),
    );
}
When a Lambda request is received, the handler calls chat_completion with the request body and then sends the reply back as the response.
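The ChatOptions in the example leave the conversation behavior at its defaults. Below is a minimal sketch of a customized configuration; the field names and types follow the Quick Start example, and the Option wrapper accepting a string slice for system_prompt is an assumption based on the None default shown above.

    use openai_flows::{ChatModel, ChatOptions};

    // A customized configuration: pin the model, start a fresh conversation,
    // and steer the assistant with a system prompt.
    // Assumption: system_prompt accepts Some(&str), mirroring the None default
    // used in the Quick Start example.
    let co = ChatOptions {
        model: ChatModel::GPT35Turbo,
        restart: true,      // drop any cached conversation history and start over
        system_prompt: Some("You are a helpful assistant that replies in one short paragraph."),
        retry_times: 2,     // retry the OpenAI call up to twice on failure
    };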
Structs
- Struct for setting the chat options.
- Response struct for the chat completion.
- Request struct for the completion.
- Request struct for the embeddings.
- Request struct for the image creation.
Enums
- Models for Chat
- The input type for the embeddings.
- The account name you provide to the Flows.network platform, which is tied to your OpenAI API key.
Functions
- Create a chat completion for the provided sentence. It uses OpenAI’s GPT-3.5 model to hold a conversation and keeps the conversation history for 10 minutes, so a new conversation is started if you haven’t called this method for 10 minutes (see the sketch after this list).
- Create a completion for the provided prompt and parameters.
- Create embeddings from the provided input.
- Create an image for the provided prompt and parameters.
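Because chat_completion keys the cached history to the conversation id, calling it again with the same id within 10 minutes continues the same conversation. The sketch below illustrates that with two turns; the call signature and the .choice field mirror the Quick Start example, while the conversation id, prompts, and the use of println! are placeholder assumptions.

    use openai_flows::{chat_completion, ChatModel, ChatOptions, FlowsAccount};

    async fn two_turn_chat() {
        let co = ChatOptions {
            model: ChatModel::GPT35Turbo,
            restart: false,       // keep the cached history between calls
            system_prompt: None,
            retry_times: 2,
        };

        // First turn: starts a new conversation under "demo_conversation".
        if let Ok(first) = chat_completion(
            FlowsAccount::Default,
            "demo_conversation",
            "My name is Ada.",
            &co,
        )
        .await
        {
            println!("{}", first.choice);
        }

        // Second turn within 10 minutes: reuses the same conversation id,
        // so the model can refer back to the earlier turn.
        if let Ok(second) = chat_completion(
            FlowsAccount::Default,
            "demo_conversation",
            "What is my name?",
            &co,
        )
        .await
        {
            println!("{}", second.choice);
        }
    }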