ask_llm 2.2.2

Make a request to whatever LLM is best these days, without hardcoding the model or provider.
use ask_llm::{Client, Model};

const PARAGRAPHS: &str = "\
The rapid advancement of artificial intelligence has transformed numerous industries, from healthcare \
to finance. Machine learning models can now diagnose diseases with remarkable accuracy, predict market \
trends, and even generate creative content that rivals human output. Yet these capabilities come with \
significant ethical considerations that society must address.

Perhaps the most pressing concern is the displacement of human workers. As AI systems become more \
capable, many traditional jobs face automation. However, history suggests that technological revolutions \
ultimately create more opportunities than they destroy, provided that education systems adapt to prepare \
workers for the new landscape of employment.";

#[tokio::main]
async fn main() {
	v_utils::clientside!();

	// Pick the model tier from the first CLI argument (Cheap, Translate, Fast, Medium, Slow);
	// fall back to Translate when no argument is given.
	let model: Model = std::env::args()
		.nth(1)
		.map(|s| s.parse().expect("valid model: Cheap, Translate, Fast, Medium, Slow"))
		.unwrap_or(Model::Translate);

	// The client resolves the provider itself; only the model tier is chosen here.
	let response = Client::default()
		.model(model)
		.ask(format!(
			"Translate the following English text to German. Output ONLY the translation, nothing else.\n\n{PARAGRAPHS}"
		))
		.await
		.unwrap();

	println!("=== Original ===\n{PARAGRAPHS}\n");
	println!("=== German Translation ===\n{}", response.text);
	println!("\n{response}");
}