//! Anthropic Claude Messages API client implementation.
//!
//! This module implements the official Anthropic Messages API with full support for:
//! - Extended thinking with configurable token budgets
//! - Prompt caching with 5-minute and 1-hour TTL
//! - Vision (images via base64/URL)
//! - Documents (PDFs and plain text)
//! - Server tools (web search, bash, code execution, text editor)
//! - Client tools with parallel execution
//! - Server-Sent Events (SSE) streaming
//!
//! # API Documentation
//!
//! Official API reference: <https://docs.anthropic.com/en/api/messages>
//!
//! # Features
//!
//! ## Extended Thinking
//!
//! Claude can show its step-by-step reasoning process before answering:
//!
//! ```ignore
//! // Adaptive thinking for Opus 4.6+ (recommended)
//! let config = AnthropicConfig {
//!     thinking: Some(ThinkingConfig::adaptive()),
//!     ..Default::default()
//! };
//!
//! // Legacy fixed-budget thinking for older models
//! let config = AnthropicConfig {
//!     thinking: Some(ThinkingConfig::enabled(10000)),
//!     ..Default::default()
//! };
//! ```
//!
//! ## Prompt Caching
//!
//! Enable Anthropic's automatic prompt caching helper to cache the request
//! prefix through the last cacheable block:
//!
//! ```ignore
//! let config = AnthropicConfig {
//!     caching: Some(CachingConfig {
//!         enabled: true,
//!         ttl: CacheTTL::FiveMinutes,
//!     }),
//!     ..Default::default()
//! };
//! ```
//!
//! Appam maps this to Anthropic's top-level `cache_control` request field on
//! the direct Anthropic and Azure Anthropic transports. On AWS Bedrock,
//! Appam instead injects block-level `cache_control` checkpoints, because
//! Bedrock's Anthropic InvokeModel request shape expects explicit cache
//! checkpoints on supported fields rather than Anthropic's top-level helper.
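//!
//! The same helper can request the longer cache lifetime. A minimal sketch,
//! assuming the `CacheTTL` enum also exposes a `OneHour` variant for the
//! 1-hour TTL mentioned above (the variant name is an assumption, not
//! confirmed by this module):
//!
//! ```ignore
//! let config = AnthropicConfig {
//!     caching: Some(CachingConfig {
//!         enabled: true,
//!         // Assumed variant name for the 1-hour TTL.
//!         ttl: CacheTTL::OneHour,
//!     }),
//!     ..Default::default()
//! };
//! ```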
//!
//! ## Vision & Documents
//!
//! Process images and PDFs as part of the conversation:
//!
//! ```ignore
//! use appam::llm::unified::{UnifiedMessage, UnifiedRole, UnifiedContentBlock, ImageSource};
//!
//! let msg = UnifiedMessage {
//!     role: UnifiedRole::User,
//!     content: vec![
//!         UnifiedContentBlock::Image {
//!             source: ImageSource::Url { url: "https://example.com/image.jpg".to_string() },
//!             detail: Some("high".to_string()),
//!         },
//!         UnifiedContentBlock::Text { text: "Describe this image.".to_string() },
//!     ],
//!     id: None,
//!     timestamp: None,
//! };
//! ```
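//!
//! PDFs can be attached the same way. A minimal sketch, assuming a
//! `UnifiedContentBlock::Document` variant with a base64 `DocumentSource`
//! (both names are illustrative assumptions, not confirmed by this module):
//!
//! ```ignore
//! let msg = UnifiedMessage {
//!     role: UnifiedRole::User,
//!     content: vec![
//!         // Hypothetical document block; the variant and source names
//!         // here are assumptions for illustration.
//!         UnifiedContentBlock::Document {
//!             source: DocumentSource::Base64 {
//!                 media_type: "application/pdf".to_string(),
//!                 data: pdf_base64,
//!             },
//!         },
//!         UnifiedContentBlock::Text { text: "Summarize this PDF.".to_string() },
//!     ],
//!     id: None,
//!     timestamp: None,
//! };
//! ```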
// Re-exports
pub use AnthropicClient;
pub use ;
pub use ;
pub use RateLimiter;
pub use StreamEvent;
pub use ;