Function query_openai 

pub async fn query_openai(
    prompt: &str,
    pre_prompt: &str,
    data_model: &DataModel,
    root: &str,
    model: &str,
    multiple: bool,
    api_key: Option<String>,
) -> Result<Value, Box<dyn Error>>

Queries the OpenAI API with the given prompt and pre-prompt, constraining the response to a JSON schema generated from the specified data model and root name.

§Arguments

  • prompt - The main prompt to send to the OpenAI API.
  • pre_prompt - An additional pre-prompt to provide context or setup for the main prompt.
  • data_model - The data model used to generate the JSON schema for the response format.
  • root - The root name for the JSON schema.
  • model - The OpenAI model to use for the chat completion.
  • multiple - Whether to extract multiple objects from the response.
  • api_key - Optional API key for OpenAI. If None, the key is read from an environment variable.

§Returns

A Result containing a serde_json::Value with the parsed JSON response from the OpenAI API, or an error if the operation fails.

§Errors

This function will return an error if:

  • The JSON schema cannot be generated from the data model
  • The OpenAI API key is not provided and not found in environment variables
  • The API request fails
  • The response cannot be parsed as valid JSON
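The key-resolution behavior described above (an explicit argument takes precedence, with an environment variable as fallback, and an error when neither is set) can be sketched as follows. This is a minimal illustration, not the crate's actual implementation; the variable name OPENAI_API_KEY and the helper resolve_api_key are assumptions for the example.

```rust
use std::env;
use std::error::Error;

/// Resolve the API key: prefer the explicit argument, otherwise fall back to
/// an environment variable (assumed here to be `OPENAI_API_KEY`).
fn resolve_api_key(api_key: Option<String>) -> Result<String, Box<dyn Error>> {
    api_key
        .or_else(|| env::var("OPENAI_API_KEY").ok())
        .ok_or_else(|| "OpenAI API key not provided and not found in environment".into())
}

fn main() {
    // An explicit key wins over the environment.
    env::set_var("OPENAI_API_KEY", "env-key");
    assert_eq!(resolve_api_key(Some("arg-key".into())).unwrap(), "arg-key");

    // With no explicit key, the environment variable is consulted.
    assert_eq!(resolve_api_key(None).unwrap(), "env-key");

    // With neither source available, an error is returned.
    env::remove_var("OPENAI_API_KEY");
    assert!(resolve_api_key(None).is_err());
}
```

If resolution fails, query_openai surfaces this as one of the error cases listed above, so callers only need to match on the returned Result.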