[codex] feat(vscode): persist prompt logs #12019
yzlu0917 wants to merge 1 commit into continuedev:main
Conversation
Summary
- Write `core.llmLogger` output to `~/.continue/logs/prompt.log`
- Hook into `VsCodeExtension` activation and dispose the write stream when the extension unloads

Why
The binary entrypoint already persists prompt logs to disk, but the VS Code extension runs core in-process and never wired `LLMLogFormatter` to a file stream. That meant users running Continue only through the VS Code extension had no file-based `prompt.log` output for debugging model inputs and outputs.

Validation
- `npm test -- src/extension/setupPromptLogging.vitest.ts` in `extensions/vscode`
- `git diff --check`
- `npm run tsc:check` in `extensions/vscode`, but it still fails on existing unrelated repo issues (`core/llm/llms/OpenRouter.ts` export mismatch and missing generated `src/.buildTimestamp`)

Closes #11669
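The wiring described above can be sketched roughly as follows. Note that `ILLMLogger`'s listener API and `LLMLogFormatter`'s constructor here are assumptions for illustration, not the actual signatures in `continuedev/continue`:

```typescript
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Hypothetical stand-ins for the real core types; the actual ILLMLogger and
// LLMLogFormatter APIs in continuedev/continue may differ.
interface ILLMLogger {
  onLogItem(listener: (line: string) => void): void;
}

class LLMLogFormatter {
  // Forwards each formatted log item to the provided sink.
  constructor(logger: ILLMLogger, sink: (chunk: string) => void) {
    logger.onLogItem((line) => sink(line + "\n"));
  }
}

// Opens ~/.continue/logs/prompt.log for appending, pipes formatter output
// into it, and returns a disposable that closes the stream on unload.
export function setupPromptLogging(llmLogger: ILLMLogger): { dispose(): void } {
  const logDir = path.join(os.homedir(), ".continue", "logs");
  fs.mkdirSync(logDir, { recursive: true });
  const stream = fs.createWriteStream(path.join(logDir, "prompt.log"), {
    flags: "a",
  });
  new LLMLogFormatter(llmLogger, (chunk) => stream.write(chunk));
  return { dispose: () => stream.end() };
}
```

In `VsCodeExtension` activation, the returned disposable would typically be pushed onto the extension context's subscriptions so VS Code disposes it on unload.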
Summary by cubic
Persist prompt logs in the VS Code extension by writing `core.llmLogger` output to `~/.continue/logs/prompt.log`. Matches Linear 11669 to give extension users the same file-based prompt debugging as the binary.

- Added `setupPromptLogging` to pipe `llmLogger` through `LLMLogFormatter` to `~/.continue/logs/prompt.log`.
- Hooked it into `VsCodeExtension` activation and dispose the stream on unload; includes a test for stream creation and cleanup.

Written for commit bc461a1. Summary will update on new commits.
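The stream creation and cleanup behavior mentioned above can be exercised with a minimal standalone check. `openPromptLog` is a hypothetical stand-in for the PR's `setupPromptLogging`, using a temp directory instead of `~/.continue`:

```typescript
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Hypothetical stand-in: opens an append stream to a prompt.log path and
// returns a disposable, as the PR description implies setupPromptLogging does.
function openPromptLog(logPath: string): { dispose(): void } {
  fs.mkdirSync(path.dirname(logPath), { recursive: true });
  const stream = fs.createWriteStream(logPath, { flags: "a" });
  return { dispose: () => stream.end() };
}

const tmp = fs.mkdtempSync(path.join(os.tmpdir(), "continue-test-"));
const logPath = path.join(tmp, "logs", "prompt.log");
const handle = openPromptLog(logPath);

// Stream creation: the logs directory exists immediately (mkdirSync is sync).
console.log(fs.existsSync(path.dirname(logPath))); // true

// Cleanup: disposing ends the stream without throwing.
handle.dispose();
```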