fix(backend/copilot): make system prompt fully static for cross-user prompt caching (#12790)
### Why / What / How

**Why:** Anthropic prompt caching keys on exact system prompt content. Two sources of per-session dynamic data were leaking into the system prompt, making it unique per session/user and causing a full 28K-token cache write (~$0.10 on Sonnet) on *every* first message for *every* session, instead of once globally per model.

**What:**

1. `get_sdk_supplement` was embedding the session-specific working directory (`/tmp/copilot-<uuid>`) in the system prompt text. Every session has a different UUID, making every session's system prompt unique and blocking cross-session cache hits.
2. Graphiti `warm_ctx` (user-personalised memory facts fetched on the first turn) was appended directly to the system prompt, making it unique per user per query.

**How:**

- `get_sdk_supplement` now uses the constant placeholder `/tmp/copilot-<session-id>` in the supplement text and memoizes the result. The actual `cwd` is still passed to `ClaudeAgentOptions.cwd`, so the CLI subprocess uses the correct session directory.
- `warm_ctx` is now injected into the first user message as a trusted `<memory_context>` block (prepended before `inject_user_context` runs), following the same pattern already used for business understanding. It is persisted to the DB and replayed correctly on `--resume`.
- `sanitize_user_supplied_context` now also strips user-supplied `<memory_context>` tags, preventing context spoofing via the new tag.

After this change the system prompt is byte-for-byte identical across all users and sessions for a given model.

### Changes 🏗️

- `backend/copilot/prompting.py`: `get_sdk_supplement` ignores `cwd` and uses a constant working-directory placeholder; the result is memoized in `_LOCAL_STORAGE_SUPPLEMENT`.
- `backend/copilot/sdk/service.py`: `warm_ctx` is saved to a local variable instead of appended to `system_prompt`; on the first turn it is prepended to `current_message` as a `<memory_context>` block before `inject_user_context` is called.
- `backend/copilot/service.py`: `sanitize_user_supplied_context` is extended to strip `<memory_context>` blocks alongside `<user_context>`.

### Checklist 📋

#### For code changes:

- [x] I have clearly listed my changes in the PR description
- [x] I have made a test plan
- [x] I have tested my changes according to the test plan:
  - [x] `poetry run pytest backend/copilot/prompting_test.py backend/copilot/prompt_cache_test.py` — all passed

#### For configuration changes:

- [x] `.env.default` is updated or already compatible with my changes
- [x] `docker-compose.yml` is updated or already compatible with my changes
- [x] I have included a list of my configuration changes in the PR description (under **Changes**)

---------

Co-authored-by: Zamil Majdy <zamilmajdy@gmail.com>
Zamil Majdy committed
c9fa6bcd629bf0b5e817a09e7441cd73890a5fde
Parent: c955b39
Committed by GitHub <noreply@github.com>
on 4/15/2026, 1:40:24 PM