Crate lmql

§lmql-rs

A typesafe, high-level LLM API for Rust, inspired by the Python library of the same name.

§Features

  • Multiple backend support, including Anthropic, OpenAI, and OpenRouter
  • Async and Stream support, with cancellation so a bad response can be cut off early instead of wasting tokens (see the sketch after this list)
  • Tools, with a type-safe interface (a JsonSchema sketch follows the Usage example)
  • Macros for a prompt DSL in the style of the LMQL Python library
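
A minimal sketch of the cancellation feature, using only the API shown in the Usage example below. The assumption (not confirmed on this page) is that dropping the token stream is what aborts the in-flight request; the prompt and the 50-chunk cutoff are purely illustrative.

use futures::StreamExt;
use lmql::{PromptOptions, Chunk, Message, LLM};

#[tokio::main]
async fn main() {
    let claude = lmql::llms::anthropic::Claude::new_from_env(
        lmql::llms::anthropic::ClaudeModel::Claude_3_5_Haiku_20241022,
    );
    let mut stream = claude
        .prompt(
            &[Message::User("Write a very long story.".into())],
            &PromptOptions::default(),
        )
        .unwrap();

    // Take at most 50 chunks, then stop polling.
    let mut seen = 0;
    while let Some(chunk) = stream.next().await {
        if let Ok(Chunk::Token(t)) = chunk {
            print!("{t}");
        }
        seen += 1;
        if seen >= 50 {
            break;
        }
    }
    // Dropping the stream ends the request, so no further tokens are
    // generated or billed (assumption: cancellation happens on drop).
    drop(stream);
}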

§Usage

use futures::StreamExt;
use lmql::{PromptOptions, Chunk, Message, LLM};

#[tokio::main]
async fn main() {
    let claude = lmql::llms::anthropic::Claude::new_from_env(
        lmql::llms::anthropic::ClaudeModel::Claude_3_5_Haiku_20241022,
    );
    let mut stream = claude
        .prompt(
            &[Message::User("Please provide a poem about the moon.".into())],
            &PromptOptions::default(),
        )
        .unwrap();

    // Print each token as it arrives
    while let Some(t) = stream.next().await {
        if let Ok(Chunk::Token(t)) = t {
            print!("{}", t)
        } else {
            panic!("Unexpected chunk: {t:#?}")
        }
    }

    // Or use `lmql::TokenStreamExt` to collect the streamed tokens into a single chunk
    let mut stream = claude
        .prompt(
            &[Message::User("What is bitcoin?".into())],
            &PromptOptions::default(),
        )
        .unwrap();

    use lmql::TokenStreamExt;
    let response = stream.all_tokens().await.unwrap();
    assert_eq!(response.len(), 1);
    let Chunk::Token(t) = &response[0] else {
        panic!("Expected only text in response")
    };
    println!("{t}");
}
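
The type-safe tool interface pairs with the JsonSchema trait and derive macro listed below: a tool's parameters are a Rust type that can be described as a JSON Schema document. Here is a minimal sketch of the derive under that assumption; the struct and its fields are invented for illustration, and constructing the Tool itself is not shown.

use lmql::JsonSchema;

// Hypothetical parameters for a weather-lookup tool. Deriving
// `JsonSchema` lets this type be described as a JSON Schema document,
// which is what a type-safe tool definition needs.
#[derive(JsonSchema)]
struct WeatherParams {
    city: String,
    include_forecast: bool,
}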

Re-exports§

pub use serde;
pub use serde_json;

Modules§

llms
The supported LLMs.

Structs§

PromptOptions
SerializedJson
Some serde_json::Value that has been serialized to a string.
Tool
A tool accessible to an LLM.
ToolCallChunk
ToolParameter
ToolParameters
The parameters of a tool available to an LLM.

Enums§

Chunk
Message
PromptError
ReasoningEffort
The effort to put into reasoning. Ignored by non-reasoning models. For non-OpenAI models, this corresponds to the maximum number of tokens to use for reasoning.
SseError
TokenError
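
The Chunk, PromptError, and TokenError types are what a consumer handles when draining a stream. As a sketch of non-panicking chunk handling (in contrast to the panic! in the Usage example), assuming the stream yields Result<Chunk, TokenError>, which this page does not confirm:

use futures::StreamExt;
use lmql::{Chunk, TokenError};

// Drain a token stream without panicking on unexpected chunks.
async fn drain<S>(mut stream: S)
where
    S: futures::Stream<Item = Result<Chunk, TokenError>> + Unpin,
{
    while let Some(chunk) = stream.next().await {
        match chunk {
            // Plain text: print it as it arrives.
            Ok(Chunk::Token(t)) => print!("{t}"),
            // Anything else (e.g. a tool call): log it and continue.
            Ok(other) => eprintln!("non-text chunk: {other:#?}"),
            // A stream error: report it and stop consuming.
            Err(e) => {
                eprintln!("stream error: {e:#?}");
                break;
            }
        }
    }
}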

Constants§

DEFAULT_MAX_TOKENS
DEFAULT_TEMPERATURE

Traits§

JsonSchema
A type which can be described as a JSON Schema document.
LLM
Some hook into an LLM, which can be used to generate text.
TokenStreamExt
Utility methods for token stream sources.

Derive Macros§

JsonSchema