Abstractions that tie a prompt to a concrete model and a typed response.
The artificial framework purposely keeps the public surface small. A developer usually needs only two traits to go from “some string fragments” to “ready-to-send payload”:
- IntoPrompt – turns any value into a list of chat messages.
- PromptTemplate – adds metadata such as the target model and the expected JSON response schema.
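The exact definitions live in artificial_core::template; inferred from the usage example below, they plausibly look roughly like this (a sketch only, the real traits may carry extra bounds or items):

use artificial_core::model::Model;

/// Sketch – inferred from the example below, not the verbatim definition.
pub trait IntoPrompt {
    /// The chat-message type this prompt produces.
    type Message;

    /// Consume the value and return the ordered chat messages.
    fn into_prompt(self) -> Vec<Self::Message>;
}

/// Sketch – the real trait may additionally require bounds such as
/// DeserializeOwned + JsonSchema on Output.
pub trait PromptTemplate {
    /// The strongly typed response the provider deserialises into.
    type Output;

    /// The concrete model this prompt targets.
    const MODEL: Model;
}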
Provider back-ends (e.g. artificial-openai) accept any P that implements both traits. Thanks to Rust's type system the compiler guarantees at compile time that
- the message type produced by the prompt matches what the back-end expects,
- the JSON returned by the provider can be deserialised into P::Output.
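A back-end can encode both guarantees directly in its trait bounds. The signature below is a hypothetical illustration of that mechanism, not the real artificial-openai API, and the choice of GenericMessage as the pinned message type is an assumption:

use artificial_core::generic::GenericMessage;
use artificial_core::template::{IntoPrompt, PromptTemplate};
use serde::de::DeserializeOwned;

/// Hypothetical provider entry point – illustrates the bounds, not the real API.
/// Pinning `Message = GenericMessage` is what rejects prompts whose message type
/// this back-end cannot send; `DeserializeOwned` is what lets the provider parse
/// the JSON reply into `P::Output`.
pub async fn complete<P>(prompt: P) -> Result<P::Output, Box<dyn std::error::Error>>
where
    P: PromptTemplate + IntoPrompt<Message = GenericMessage>,
    P::Output: DeserializeOwned,
{
    let _messages = prompt.into_prompt();
    // ... serialise `_messages`, call the model named by `P::MODEL`,
    // and deserialise the JSON response into `P::Output`.
    unimplemented!()
}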
use artificial_core::template::{IntoPrompt, PromptTemplate};
use artificial_core::generic::{GenericMessage, GenericRole};
use artificial_core::model::{Model, OpenAiModel};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, JsonSchema)]
#[serde(deny_unknown_fields)]
struct Hello { greeting: String }

struct HelloPrompt;

impl IntoPrompt for HelloPrompt {
    type Message = GenericMessage;

    fn into_prompt(self) -> Vec<Self::Message> {
        vec![GenericMessage::new("Say hello!".into(), GenericRole::User)]
    }
}

impl PromptTemplate for HelloPrompt {
    type Output = Hello;
    const MODEL: Model = Model::OpenAi(OpenAiModel::Gpt4oMini);
}
See examples/openai_hello_world.rs for a fully working program.
Traits§
- IntoPrompt - Converts a value into a series of chat messages.
- PromptTemplate - High-level description of a prompt.