# prompty
Prompty is a Rust runtime for the .prompty file format — a markdown-based asset class for LLM prompts. YAML frontmatter defines model configuration, connection details, tools, and schemas. The markdown body becomes templated instructions.
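A minimal `.prompty` file might look like the sketch below. The exact frontmatter schema comes from the Prompty format, and the field values here (model name, connection type) are illustrative; role markers in the body (`system:`, `user:`) are what `prepare()` parses into messages.

```markdown
---
name: Basic Chat
description: A minimal example prompt
model:
  api: chat
  configuration:
    type: openai
    name: gpt-4o
---
system:
You are a helpful assistant.

user:
{{question}}
```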
## Quick Start
```rust
use prompty::{load, prepare, register_defaults, run, turn};

// Register built-in renderers and parsers
register_defaults();

// Load a .prompty file into a typed agent
let agent = load("chat.prompty")?;

// Render template + parse role markers → Vec<Message>
let messages = prepare(&agent, &inputs).await?;

// Execute LLM call + process response
let result = run(&agent, messages).await?;

// Or do it all in one shot with agent loop support
let result = turn(&agent, &inputs).await?;
```
## Pipeline
The runtime provides 5 public functions:

| Function | Description |
|---|---|
| `load()` | Parse a .prompty file → typed `Prompty` agent |
| `prepare()` | Render template + parse role markers → `Vec<Message>` |
| `run()` | Execute LLM call + process response (takes messages) |
| `invoke_from_path()` | One-shot: load → prepare → execute → process |
| `turn()` | Conversation round with optional tool-calling agent loop |
## Providers
LLM providers are separate crates — register them before calling pipeline functions:
- `prompty-openai` — OpenAI API
- `prompty-foundry` — Azure OpenAI / Foundry
- `prompty-anthropic` — Anthropic Claude
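Assuming the crates are published under the names listed above, pulling in the runtime plus one provider might look like this in `Cargo.toml` (version numbers illustrative):

```toml
[dependencies]
prompty = "0.1"
prompty-openai = "0.1"
```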
```rust
// Register the OpenAI provider before calling pipeline functions
prompty_openai::register();
```
## Features

- `otel` — Enables OpenTelemetry tracing backend
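Cargo features are opt-in, so enabling the tracing backend would look something like this in a downstream `Cargo.toml` (version illustrative):

```toml
[dependencies]
prompty = { version = "0.1", features = ["otel"] }
```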
## License
MIT — see LICENSE for details.