Crate cortexai_fastly

§Rust AI Agents - Fastly Compute

Run AI agents on Fastly Compute (formerly Compute@Edge).

This crate provides a Fastly-native agent implementation built on:

  • cortexai-llm-client for request/response logic
  • Fastly SDK for HTTP requests

§Usage

use cortexai_fastly::{FastlyAgent, FastlyAgentConfig};
use fastly::{Request, Response};

#[fastly::main]
fn main(_req: Request) -> Result<Response, fastly::Error> {
    // Build the agent configuration from a provider name, API key, and model.
    let config = FastlyAgentConfig::new(
        "openai",
        std::env::var("OPENAI_API_KEY").unwrap(),
        "gpt-4o-mini",
    );

    // "llm-backend" must match a backend declared on the Fastly service;
    // Compute services can only send HTTP requests to declared backends.
    let mut agent = FastlyAgent::new(config, "llm-backend");

    // Send a single chat message and return the model's reply as the body.
    let response = agent.chat("Hello!")?;

    Ok(Response::from_body(response.content))
}

§Structs

FastlyAgent
AI Agent for Fastly Compute
FastlyAgentConfig
Configuration for the Fastly agent
StreamingResponse
Streaming response iterator

§Enums

FastlyAgentError
Errors that can occur in the Fastly agent

§Functions

handle_chat_request
Create a simple chat completion handler for Fastly

§Type Aliases

Result