# AIP009 — cache-unfriendly-structure
**Category:** efficiency **Severity:** warning
## What
Detects prompt structures whose prefixes change between requests, underutilizing prompt caching or causing unnecessary cache invalidation.
## Why it matters
Prompt caching (in Claude and other LLMs) speeds up subsequent requests that share an identical cached prefix. A cache hit requires an exact prefix match, so a structure that varies unpredictably from turn to turn invalidates the cache early in the prompt, wasting cache-write tokens and slowing multi-turn conversations.
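The exact-prefix-match behavior can be illustrated with a toy model (a sketch, not the provider's actual cache implementation): only tokens up to the first difference between the previous and current prompt are reusable.

```python
def cached_prefix_tokens(previous: list[str], current: list[str]) -> int:
    """Length of the shared prefix between two tokenized prompts.

    Toy model of prefix caching: only tokens up to the first
    difference can be served from cache; everything after must
    be reprocessed.
    """
    n = 0
    for a, b in zip(previous, current):
        if a != b:
            break
        n += 1
    return n

# Stable structure: turn 2 appends to turn 1, so all of turn 1
# stays cached.
turn1 = ["SYS", "U1", "A1"]
turn2_appended = ["SYS", "U1", "A1", "U2"]

# Unstable structure: re-interleaving separators changes early
# tokens, so almost nothing is reusable.
turn2_reshuffled = ["SYS", "---", "U1", "A1", "---", "U2"]

print(cached_prefix_tokens(turn1, turn2_appended))    # 3
print(cached_prefix_tokens(turn1, turn2_reshuffled))  # 1
```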
## Example
```
System: {system_prompt}
---
User: {user_message}
Assistant: {response}
---
User: {another_message}
```
Each new exchange inserts another `---` separator into the middle of the prompt, so the serialized prefix differs from the previous request and every turn misses the cache.
## Fix
Stabilize the prefix structure:
```
System: {system_prompt}
---
{conversation_turns_here}
```
Keep the system prompt byte-for-byte identical across requests, and store the conversation history in a stable, append-only format so each turn extends the cached prefix instead of invalidating it.
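A minimal sketch of an append-only request builder, assuming the Anthropic Messages API shape where the `system` parameter accepts content blocks and a `cache_control` breakpoint marks the static prefix (the helper name and model id are illustrative):

```python
def build_request(system_prompt: str, history: list[dict], user_message: str) -> dict:
    """Build a cache-friendly request payload (hypothetical helper).

    The system prompt lives in its own block, kept byte-for-byte
    identical across turns, with a cache breakpoint after it.
    Conversation turns are only ever appended, so the cached
    prefix keeps growing instead of being invalidated.
    """
    return {
        "model": "claude-example-model",  # placeholder model id
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": system_prompt,
                # Cache everything up to and including this block.
                "cache_control": {"type": "ephemeral"},
            }
        ],
        "messages": history + [{"role": "user", "content": user_message}],
    }

history: list[dict] = []
req = build_request("You are a support bot.", history, "Hi!")
# After the model replies, append both turns; never rewrite or
# re-interleave earlier entries.
history += [
    {"role": "user", "content": "Hi!"},
    {"role": "assistant", "content": "Hello!"},
]
```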
## See also
- [Prompt caching guide](https://docs.anthropic.com)