pub struct CompactResponseRequest {
pub model: String,
pub input: Option<InputParam>,
pub previous_response_id: Option<String>,
pub instructions: Option<String>,
pub prompt_cache_key: Option<String>,
}
Available on crate feature response-types only.
Request to compact a conversation.
Fields
model: String
Model ID used to generate the response, like gpt-5 or o3. OpenAI offers a wide range of models with different capabilities, performance characteristics, and price points. Refer to the model guide to browse and compare available models.
input: Option<InputParam>
Text, image, or file inputs to the model, used to generate a response.
previous_response_id: Option<String>
The unique ID of the previous response to the model. Use this to create multi-turn conversations. Learn more about conversation state.
Cannot be used in conjunction with conversation.
instructions: Option<String>
A system (or developer) message inserted into the model’s context.
When used along with previous_response_id, the instructions from a previous response will not be carried over to the next response. This makes it simple to swap out system (or developer) messages in new responses.
prompt_cache_key: Option<String>
A key to use when reading from or writing to the prompt cache.
Trait Implementations
impl Clone for CompactResponseRequest
fn clone(&self) -> CompactResponseRequest
Returns a copy of the value.
fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.