pub fn llm_request_impl(
lua: &Lua,
args: (String, Option<Table>),
) -> Result<Table>
Executes an LLM chat request. Called from a capability-gated context.
§Arguments (from Lua)
- `prompt` - User message text
- `opts` - Optional table:
  - `provider` - "ollama" (default), "openai", "anthropic"
  - `base_url` - Provider base URL (default per provider)
  - `model` - Model name (default per provider)
  - `api_key` - API key (falls back to env var)
  - `system_prompt` - System prompt text
  - `session_id` - Session ID for multi-turn (nil = new session)
  - `temperature` - Sampling temperature
  - `max_tokens` - Max completion tokens
  - `timeout` - Request timeout in seconds (default: 120)
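The option defaults described above can be sketched as a plain Rust struct. This is a hypothetical model of the documented contract, not the actual implementation (field names such as `timeout_secs` are assumptions):

```rust
// Sketch of the opts table as a Rust struct (hypothetical; the fields and
// defaults mirror the documented contract, not the real code).
#[derive(Debug, Clone)]
struct LlmOpts {
    provider: String,           // "ollama" (default), "openai", "anthropic"
    base_url: Option<String>,   // default chosen per provider
    model: Option<String>,      // default chosen per provider
    api_key: Option<String>,    // falls back to an env var when None
    system_prompt: Option<String>,
    session_id: Option<String>, // None = start a new session
    temperature: Option<f64>,
    max_tokens: Option<u32>,
    timeout_secs: u64,          // default: 120
}

impl Default for LlmOpts {
    fn default() -> Self {
        Self {
            provider: "ollama".to_string(),
            base_url: None,
            model: None,
            api_key: None,
            system_prompt: None,
            session_id: None,
            temperature: None,
            max_tokens: None,
            timeout_secs: 120,
        }
    }
}

fn main() {
    // Every option except the prompt itself has a usable default.
    let opts = LlmOpts::default();
    println!("{} {}", opts.provider, opts.timeout_secs);
}
```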
§Returns (Lua table)
- `ok` - boolean
- `content` - Response text (when ok=true)
- `model` - Model name from response
- `session_id` - Session ID (new or existing)
- `error` - Error message (when ok=false)
- `error_kind` - Error classification
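The actual return value is a Lua table keyed by `ok`, but the two shapes it can take can be sketched as a Rust enum (hypothetical names, mirroring the fields listed above rather than the real return type):

```rust
// Sketch of the two result shapes (hypothetical; the real function returns
// a Lua table with an `ok` boolean discriminating the two cases).
#[derive(Debug)]
enum LlmResult {
    Ok {
        content: String,    // response text
        model: String,      // model name from the response
        session_id: String, // new or existing session
    },
    Err {
        error: String,      // error message
        error_kind: String, // error classification
    },
}

fn describe(res: &LlmResult) -> String {
    match res {
        LlmResult::Ok { content, model, .. } => format!("[{model}] {content}"),
        LlmResult::Err { error, error_kind } => format!("{error_kind}: {error}"),
    }
}

fn main() {
    let err = LlmResult::Err {
        error: "connection refused".into(),
        error_kind: "network".into(),
    };
    println!("{}", describe(&err)); // network: connection refused
}
```

Callers should branch on `ok` first; `content` is only present on success, and `error`/`error_kind` only on failure.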