blackman_client/models/semantic_cache_stats.rs
/*
 * Blackman AI API
 *
 * A transparent AI API proxy that optimizes token usage to reduce costs. ## Authentication Blackman AI supports two authentication methods: ### 1. API Key (Recommended for integrations) Use the API key created from your dashboard: ```bash curl -X POST https://app.useblackman.ai/v1/completions \ -H "Authorization: Bearer sk_your_api_key_here" \ -H "Content-Type: application/json" \ -d '{"provider": "OpenAI", "model": "gpt-4", "messages": [{"role": "user", "content": "Hello!"}]}' ``` ### 2. JWT Token (For web UI) Obtain a JWT token by logging in: ```bash curl -X POST https://app.useblackman.ai/v1/auth/login \ -H "Content-Type: application/json" \ -d '{"email": "user@example.com", "password": "yourpassword"}' ``` Then use the token: ```bash curl -X POST https://app.useblackman.ai/v1/completions \ -H "Authorization: Bearer your_jwt_token" \ -H "Content-Type: application/json" \ -d '{...}' ``` ### Provider API Keys (Optional) You can optionally provide your own LLM provider API key via the `X-Provider-Api-Key` header, or store it in your account settings. ## Client SDKs Auto-generated SDKs are available for 10 languages: - **TypeScript**: [View Docs](https://github.com/blackman-ai/typescript-sdk) - **Python**: [View Docs](https://github.com/blackman-ai/python-sdk) - **Go**: [View Docs](https://github.com/blackman-ai/go-sdk) - **Java**: [View Docs](https://github.com/blackman-ai/java-sdk) - **Ruby**: [View Docs](https://github.com/blackman-ai/ruby-sdk) - **PHP**: [View Docs](https://github.com/blackman-ai/php-sdk) - **C#**: [View Docs](https://github.com/blackman-ai/csharp-sdk) - **Rust**: [View Docs](https://github.com/blackman-ai/rust-sdk) - **Swift**: [View Docs](https://github.com/blackman-ai/swift-sdk) - **Kotlin**: [View Docs](https://github.com/blackman-ai/kotlin-sdk) All SDKs are generated from this OpenAPI spec using [openapi-generator](https://openapi-generator.tech). ## Quick Start ```python # Python example with API key import blackman_client from blackman_client import CompletionRequest configuration = blackman_client.Configuration( host="http://localhost:8080", access_token="sk_your_api_key_here" # Your Blackman API key ) with blackman_client.ApiClient(configuration) as api_client: api = blackman_client.CompletionsApi(api_client) response = api.completions( CompletionRequest( provider="OpenAI", model="gpt-4o", messages=[{"role": "user", "content": "Hello!"}] ) ) ```
 *
 * The version of the OpenAPI document: 0.1.0
 *
 * Generated by: https://openapi-generator.tech
 */

use crate::models;
use serde::{Deserialize, Serialize};

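/// Aggregate statistics reported for the semantic cache.
///
/// The example below is an illustrative sketch (not generator output) of deserializing
/// a stats payload with `serde_json`; the JSON keys match the `serde(rename = "...")`
/// attributes on the struct, while the crate path is assumed from this SDK's layout, so
/// the block is marked `ignore` rather than run as a doctest.
///
/// ```ignore
/// use blackman_client::models::SemanticCacheStats;
///
/// let payload = r#"{
///     "avg_similarity_score": 0.92,
///     "cost_saved": 1.37,
///     "hit_rate": 64.0,
///     "tokens_saved": 15000,
///     "total_hits": 128,
///     "total_misses": 72
/// }"#;
///
/// let stats: SemanticCacheStats = serde_json::from_str(payload).expect("valid stats JSON");
/// assert_eq!(stats.total_hits, 128);
/// ```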
#[derive(Clone, Default, Debug, PartialEq, Serialize, Deserialize)]
pub struct SemanticCacheStats {
    /// Average similarity score for cache hits
    #[serde(rename = "avg_similarity_score")]
    pub avg_similarity_score: f64,
    /// Total cost saved from cache hits
    #[serde(rename = "cost_saved")]
    pub cost_saved: f64,
    /// Cache hit rate as a percentage
    #[serde(rename = "hit_rate")]
    pub hit_rate: f64,
    /// Total tokens saved from cache hits
    #[serde(rename = "tokens_saved")]
    pub tokens_saved: i64,
    /// Total number of cache hits
    #[serde(rename = "total_hits")]
    pub total_hits: i64,
    /// Total number of cache misses
    #[serde(rename = "total_misses")]
    pub total_misses: i64,
}
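
impl SemanticCacheStats {
    /// Illustrative helper, not part of the generated model: the hit/miss counters suggest
    /// the server derives `hit_rate` as roughly `total_hits / (total_hits + total_misses) * 100`.
    /// This recomputes that assumed percentage locally so it can be compared against the
    /// reported `hit_rate` field; treat the formula as an assumption, not documented behaviour.
    pub fn derived_hit_rate(&self) -> f64 {
        let total = self.total_hits + self.total_misses;
        if total == 0 {
            // Avoid dividing by zero before the cache has served any requests.
            0.0
        } else {
            (self.total_hits as f64 / total as f64) * 100.0
        }
    }
}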

impl SemanticCacheStats {
    pub fn new(avg_similarity_score: f64, cost_saved: f64, hit_rate: f64, tokens_saved: i64, total_hits: i64, total_misses: i64) -> SemanticCacheStats {
        SemanticCacheStats {
            avg_similarity_score,
            cost_saved,
            hit_rate,
            tokens_saved,
            total_hits,
            total_misses,
        }
    }
}
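
// A minimal round-trip sketch, not part of the generated output: it assumes `serde_json`
// is available, as it is for openapi-generator's reqwest-based Rust clients, and checks
// that the `rename` attributes above produce the expected snake_case JSON keys.
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn semantic_cache_stats_serde_round_trip() {
        // Values here are arbitrary sample numbers, not real API output.
        let stats = SemanticCacheStats::new(0.92, 1.37, 64.0, 15_000, 128, 72);

        let json = serde_json::to_string(&stats).expect("serialization should succeed");
        assert!(json.contains("\"hit_rate\""));
        assert!(json.contains("\"tokens_saved\""));

        let parsed: SemanticCacheStats =
            serde_json::from_str(&json).expect("deserialization should succeed");
        assert_eq!(stats, parsed);
    }
}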