davila7/claude-code-templates/prompt-caching
Caching strategies for LLM prompts, including Anthropic prompt caching, response caching, and CAG (Cache-Augmented Generation). Use when: prompt caching, cache prompt, response cache, cag, cache augmented.
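Of the strategies listed, response caching is the simplest to illustrate: identical (model, prompt) requests are served from a local store instead of re-calling the LLM. The sketch below is a hypothetical minimal implementation, not code from this template; the class and function names are illustrative, and a real deployment would add eviction and shared storage (e.g. Redis).

```python
import hashlib
import time

class ResponseCache:
    """In-memory response cache keyed by a hash of (model, prompt).

    Minimal sketch: production use would add eviction and persistence.
    """

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (timestamp, response)

    @staticmethod
    def _key(model, prompt):
        # Hash model + prompt so identical requests map to one entry.
        return hashlib.sha256(f"{model}\x00{prompt}".encode()).hexdigest()

    def get(self, model, prompt):
        entry = self._store.get(self._key(model, prompt))
        if entry is None:
            return None
        ts, response = entry
        if time.time() - ts > self.ttl:
            return None  # expired entry, treat as a miss
        return response

    def put(self, model, prompt, response):
        self._store[self._key(model, prompt)] = (time.time(), response)

def cached_completion(cache, model, prompt, call_llm):
    """Return a cached response when available; otherwise call the LLM."""
    hit = cache.get(model, prompt)
    if hit is not None:
        return hit, True   # served from cache
    response = call_llm(model, prompt)
    cache.put(model, prompt, response)
    return response, False

# Demo with a stub standing in for a real LLM call.
cache = ResponseCache()
fake_llm = lambda model, prompt: f"echo:{prompt}"
first, hit1 = cached_completion(cache, "claude-sonnet", "hello", fake_llm)
second, hit2 = cached_completion(cache, "claude-sonnet", "hello", fake_llm)
```

Anthropic's native prompt caching works differently: instead of caching the response, the API caches a long, stable prompt prefix server-side (marked with `cache_control` blocks), so only the varying suffix is reprocessed on each call.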
Risk Score: 0 out of 100
Popularity: 19,944 stars, 1,854 forks
Updated: Feb 9, 2026