
OpenCode plugin for Morph - FastApply, WarpGrep codebase search, FlashCompact


fix: stop compact from invalidating prompt cache on every LLM call

The compact hook was rewriting message[0] on every messages.transform
call (which fires once per LLM API call), busting Anthropic's prompt
prefix cache each time — roughly a 10x cost increase and higher TTFT.
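A minimal sketch of the cache-busting pattern (hypothetical names, not the plugin's actual code): Anthropic's prompt cache matches on an exact byte prefix, so any transform that rebuilds message[0] with fresh bytes — e.g. a `Date.now()`-based ID — misses the cache on every call, while reusing a frozen block keeps the prefix byte-stable:

```typescript
type Message = { id: string; content: string };

// Buggy pattern: a new ID per call means message[0]'s bytes differ on
// every transform, so the cached prompt prefix never matches.
function buggyTransform(messages: Message[], summary: string): Message[] {
  const head: Message = { id: `compact-${Date.now()}`, content: summary };
  return [head, ...messages.slice(1)];
}

// Fixed pattern: reuse the exact same frozen block object, so the
// serialized prefix is byte-identical between calls and the cache hits.
function frozenTransform(messages: Message[], frozen: Message): Message[] {
  return [frozen, ...messages.slice(1)];
}
```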

Changes:
- Replace sliding-window compaction with "compact once, freeze, discard
  on re-compact" — frozen block is byte-stable between compactions
- Replace fixed 100k char threshold with dynamic 70% of model context
  window (captured from chat.params hook)
- Preserve per-message structure using result.messages[] instead of
  collapsing into single blob
- Use deterministic IDs (no Date.now()) for cache stability
- Remove chunk-based cache, SHA-256 hashing, and related helpers
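The compact-once/freeze/discard policy above can be sketched as follows (an illustrative TypeScript sketch — the state shape, helper names, and the chars-per-token heuristic are assumptions, not the plugin's actual code; the real plugin reads the context window from the chat.params hook):

```typescript
type Msg = { id: string; content: string };

interface CompactState {
  frozen: Msg[] | null; // byte-stable between compactions
  tail: Msg[];          // messages appended since the last compaction
}

const COMPACT_RATIO = 0.7; // compact at 70% of the model context window

// Rough char budget from a token count (~4 chars/token heuristic).
function charBudget(contextWindowTokens: number): number {
  return contextWindowTokens * 4 * COMPACT_RATIO;
}

function maybeCompact(
  state: CompactState,
  contextWindowTokens: number,
  summarize: (msgs: Msg[]) => Msg[],
): CompactState {
  const all = [...(state.frozen ?? []), ...state.tail];
  const size = all.reduce((n, m) => n + m.content.length, 0);
  // Below threshold: return the same state object, so message bytes —
  // and therefore the prompt prefix cache — are untouched.
  if (size <= charBudget(contextWindowTokens)) return state;
  // Over threshold: discard the old frozen block and re-compact
  // everything, keeping per-message structure and deterministic IDs
  // (index-based, never Date.now()) for cache stability.
  const frozen = summarize(all).map((m, i) => ({ ...m, id: `compact-${i}` }));
  return { frozen, tail: [] };
}
```

Between compactions `maybeCompact` is a pure no-op, which is what keeps the frozen block byte-stable across LLM calls.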

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
DhruvBhatia0 committed
c7edb6af1c4db4a46ba9685b6e88bbb9f5b44fb1
Parent: 7e2d014