# Unofficial API client for Deepseek

Sign up for an account at https://platform.deepseek.com/sign_in to get your API key.
Add the crate to your `Cargo.toml`. It is distributed from its Git repository, so use the `git` key (not `path`):

```toml
[dependencies]
deepseek-api-client = { git = "https://github.com/acscoder/deepseek-api-client.git" }
```
## Get started

Load your API key from an environment variable (or any other secret store):

```rust
use deepseek_api_client::*;

// The variable name is your choice; set it to the key from your Deepseek account.
let api_key = std::env::var("DEEPSEEK_API_KEY").expect("DEEPSEEK_API_KEY is not set");
```
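If you prefer not to panic when the key is absent, the lookup can be wrapped in a small helper that also rejects empty values. A minimal std-only sketch (the variable name `DEEPSEEK_API_KEY` is just an example, nothing mandated by the crate):

```rust
use std::env;

/// Read an API key from the environment, treating an empty value as missing.
fn load_api_key(var_name: &str) -> Option<String> {
    env::var(var_name).ok().filter(|k| !k.is_empty())
}

fn main() {
    // "DEEPSEEK_API_KEY" is an example name; use whatever you exported.
    let api_key = load_api_key("DEEPSEEK_API_KEY")
        .unwrap_or_else(|| "<missing>".to_string());
    println!("key present: {}", api_key != "<missing>");
}
```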
## Call a synchronous function

1. Get an LLM function from `chat_completion_sync`:

```rust
// The argument list was reconstructed from context; the factory captures your API key.
let mut llm_completion = chat_completion_sync(api_key.clone());
```

2. You now have a function, `llm_completion`, that takes a vector of `Message` and returns a `Result` holding a `Response`. Unwrap the result, then take the text of the first choice with `get_response_text`:

```rust
// `Message` fields and the `get_response_text` arguments follow the usual
// chat-API shape; check the crate's docs if your version differs.
let messages = vec![Message {
    role: "user".to_string(),
    content: "Hello!".to_string(),
}];
let res = llm_completion(messages);
let res_text = get_response_text(&res.unwrap(), 0);
dbg!(res_text);
```
3. Do the same with `code_completion_sync` for code generation with the `deepseek-coder` model, and with `llm_function_call_sync` for function calling.
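The synchronous helpers above all follow one pattern: a factory captures the API key and hands back a reusable function. The crate's internals are not shown in this README, so the following is only a sketch of that closure-returning pattern with stand-in types (`Message`, `Response`, the echo logic, and `get_first_text` are illustrative, not the crate's real API):

```rust
// Stand-in types mirroring the README's flow; not the crate's real definitions.
#[derive(Debug, Clone)]
struct Message {
    role: String,
    content: String,
}

#[derive(Debug)]
struct Response {
    choices: Vec<String>,
}

/// Build a reusable synchronous "LLM function" that captures the API key,
/// the way `chat_completion_sync` hands back `llm_completion`.
fn chat_completion_sync_sketch(
    api_key: String,
) -> impl FnMut(Vec<Message>) -> Result<Response, String> {
    move |messages: Vec<Message>| {
        if api_key.is_empty() {
            return Err("missing API key".to_string());
        }
        // A real implementation would POST to the Deepseek endpoint here;
        // this sketch just echoes the last message's content.
        let last = messages.last().map(|m| m.content.clone()).unwrap_or_default();
        Ok(Response { choices: vec![format!("echo: {last}")] })
    }
}

/// Analogue of `get_response_text`: pull one choice's text out of a `Response`.
fn get_first_text(res: &Response) -> &str {
    res.choices.first().map(String::as_str).unwrap_or("")
}

fn main() {
    let mut llm = chat_completion_sync_sketch("sk-demo".to_string());
    let messages = vec![Message { role: "user".into(), content: "hello".into() }];
    let res = llm(messages).unwrap();
    println!("{}", get_first_text(&res)); // prints "echo: hello"
}
```

Returning `impl FnMut(...)` keeps the key out of every call site: you build the function once and pass only messages afterwards.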
## Call an asynchronous function

- Get an LLM function from `chat_completion`:

```rust
let mut llm_completion = chat_completion(api_key.clone());
```

- It is the same as `chat_completion_sync`, but asynchronous, so you call it with `.await`; these examples use the tokio crate as the async runtime:

```rust
let rt = tokio::runtime::Runtime::new().unwrap();
```

- Pass in a vector of `Message` and get back a `Result`, then take the text of the first choice with `get_response_text`:

```rust
let messages = vec![Message {
    role: "user".to_string(),
    content: "Hello!".to_string(),
}];
let res = llm_completion(messages);
let r = rt.block_on(res);
dbg!(get_response_text(&r.unwrap(), 0));
```
- Do the same with `code_completion` for code generation with the `deepseek-coder` model, and with `llm_function_call` for function calling.
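The async variants return ordinary `Future`s, so any executor can drive them; the snippets above use tokio's `Runtime::block_on`. As a dependency-free illustration of what `block_on` does, here is a minimal executor built on thread parking, polling a stand-in async completion (a teaching sketch, not a replacement for tokio):

```rust
use std::future::Future;
use std::pin::pin;
use std::sync::Arc;
use std::task::{Context, Poll, Wake, Waker};
use std::thread::{self, Thread};

/// Waker that unparks the blocked thread when the future can make progress.
struct ThreadWaker(Thread);

impl Wake for ThreadWaker {
    fn wake(self: Arc<Self>) {
        self.0.unpark();
    }
}

/// Minimal stand-in for `tokio::runtime::Runtime::block_on`:
/// poll the future, parking the current thread until it is woken.
fn block_on<F: Future>(fut: F) -> F::Output {
    let mut fut = pin!(fut);
    let waker = Waker::from(Arc::new(ThreadWaker(thread::current())));
    let mut cx = Context::from_waker(&waker);
    loop {
        match fut.as_mut().poll(&mut cx) {
            Poll::Ready(out) => return out,
            Poll::Pending => thread::park(),
        }
    }
}

// An async "completion" standing in for the future returned by the crate's
// async `llm_completion`; it resolves immediately with some text.
async fn fake_completion(prompt: &str) -> String {
    format!("reply to: {prompt}")
}

fn main() {
    let out = block_on(fake_completion("hello"));
    println!("{out}"); // prints "reply to: hello"
}
```

In real code, prefer tokio (or another production executor): it schedules many futures at once and integrates with async I/O, which this sketch does not.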
## Call an asynchronous stream function

- Get an LLM function from `chat_completion_stream`:

```rust
let mut llm_completion = chat_completion_stream(api_key.clone());
```

- This async function takes a vector of `Message` and gives back a stream of `Response` chunks:

```rust
let rt = tokio::runtime::Runtime::new().unwrap();
let messages = vec![Message {
    role: "user".to_string(),
    content: "Hello!".to_string(),
}];
let response_result = llm_completion(messages);
let _ = rt.block_on(response_result);
```
- Do the same with `code_completion_stream` for code generation with the `deepseek-coder` model, and with `llm_function_call_stream` for function calling.
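Streaming hands you the reply in chunks rather than as one `Response`, and the caller typically appends each chunk's text as it arrives. A dependency-free sketch of that consumption loop, using an iterator of chunks in place of the crate's async stream (the `StreamChunk` type and its `delta` field are invented for illustration):

```rust
/// Stand-in for one streamed chunk of a chat response.
struct StreamChunk {
    delta: String,
}

/// Drain chunks as they arrive, appending each delta to the full reply,
/// the way a caller would consume `chat_completion_stream`'s output.
fn collect_stream<I: IntoIterator<Item = StreamChunk>>(chunks: I) -> String {
    let mut full = String::new();
    for chunk in chunks {
        // A real client could also print `chunk.delta` incrementally here,
        // giving the familiar token-by-token display.
        full.push_str(&chunk.delta);
    }
    full
}

fn main() {
    let chunks = vec![
        StreamChunk { delta: "Hel".to_string() },
        StreamChunk { delta: "lo, ".to_string() },
        StreamChunk { delta: "world".to_string() },
    ];
    println!("{}", collect_stream(chunks)); // prints "Hello, world"
}
```

With the real crate the chunks arrive asynchronously, so the loop body runs inside the tokio runtime, but the accumulate-as-you-go shape is the same.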