pub async fn openai_chat_with_model(
    prompt: &str,
    model: &str,
) -> Result<String, OpenAIError>
Sends a prompt to the chat endpoint using the specified model (e.g., "gpt-3.5-turbo") and returns the assistant's reply as a `String`, or an `OpenAIError` on failure.
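A minimal usage sketch. The crate path `my_openai` and the Tokio runtime are assumptions for illustration; substitute the actual module path that exports `openai_chat_with_model`, and ensure the underlying client is configured with a valid API key (typically via the `OPENAI_API_KEY` environment variable).

```rust
// Hypothetical import path; replace `my_openai` with the real crate/module.
use my_openai::{openai_chat_with_model, OpenAIError};

#[tokio::main]
async fn main() -> Result<(), OpenAIError> {
    // Send a prompt to the chosen model and await the assistant's reply.
    let reply = openai_chat_with_model(
        "Summarize the Rust ownership model in one sentence.",
        "gpt-3.5-turbo",
    )
    .await?;

    println!("{reply}");
    Ok(())
}
```

Because the function is `async`, it must be awaited inside an async runtime; the `?` operator propagates any `OpenAIError` (for example, network or authentication failures) to the caller.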