pub fn context_window_for_model(_model: &str) -> u64
Returns the input context window size for a given model, in tokens. Used as the denominator for the chatui context-usage bar and anywhere else the client needs to know how much prompt the model will accept.
All models default to 200k tokens, matching Anthropic's claude-code
behavior. Models that support a larger window (Opus 4.6+, Sonnet 4)
require explicit opt-in via the context-1m-2025-08-07 beta header.
Without the header, the model operates in 200k mode, which tends to
give better inference quality, since context rot at long contexts
is real.
Use model_supports_1m to check whether a model can opt into 1M;
the actual decision happens at request time based on the user’s
context_window setting.
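The implementation is not shown on this page, but the behavior described above can be sketched as follows. This is a minimal illustration, not the real code: the 200k constant comes from the description, while the substring checks in `model_supports_1m` are assumptions about model-ID naming made for the example.

```rust
/// Sketch: every model reports a 200k-token input window by default,
/// matching Anthropic's claude-code behavior. The 1M window is never
/// returned here; opting in happens at request time via the
/// context-1m-2025-08-07 beta header.
pub fn context_window_for_model(_model: &str) -> u64 {
    200_000
}

/// Sketch of the helper mentioned above: true when the model is
/// eligible to opt into the 1M window. The substring matches below
/// are illustrative assumptions about model-ID naming, not the
/// actual eligibility logic.
pub fn model_supports_1m(model: &str) -> bool {
    model.contains("sonnet-4") || model.contains("opus-4.6")
}

fn main() {
    // The default window is used as the context-usage bar denominator.
    assert_eq!(context_window_for_model("claude-sonnet-4"), 200_000);
    // Eligibility is checked separately from the returned window size.
    assert!(model_supports_1m("claude-sonnet-4"));
    println!("default window: {}", context_window_for_model("claude-sonnet-4"));
}
```

Note that `model_supports_1m` only reports eligibility; even for an eligible model, the window stays at 200k unless the user's context_window setting triggers the beta header on the request.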