fix(openai-adapter): add reasoning_content and handle max_completion_… #10478
Description
This PR addresses a specific compatibility issue between the Continue OpenAI adapter and DeepSeek Reasoner (R1) models.
The Problem
DeepSeek Reasoner responses include a `reasoning_content` field that the adapter did not handle, and the adapter sent `max_tokens`, a parameter the DeepSeek API rejects with a 400 error.
The Solution
The adapter now handles the missing `reasoning_content` field and maps `max_tokens` to `max_completion_tokens` for DeepSeek models.
Fixes #10475
AI Code Review
@continue-review
Checklist
Screen recording or screenshot
Tests
- `modifyChatBody` correctly injects `reasoning_content` and maps token parameters specifically for DeepSeek.
- `npm run build` passed successfully in `packages/openai-adapters`.

Summary by cubic
Fixes DeepSeek Reasoner (R1) compatibility in the OpenAI adapter by adding missing reasoning_content and mapping max_tokens to max_completion_tokens to prevent 400 errors.
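A minimal sketch of the kind of request rewrite this PR describes. The type names and the exact shape of `modifyChatBody` here are illustrative assumptions, not the actual Continue adapter API; the two behaviors shown (renaming `max_tokens` to `max_completion_tokens` and ensuring assistant messages carry a `reasoning_content` field) are taken from the PR summary above.

```typescript
// Hypothetical types; the real Continue adapter types will differ.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
  reasoning_content?: string; // field DeepSeek Reasoner (R1) emits
}

interface ChatBody {
  model: string;
  messages: ChatMessage[];
  max_tokens?: number;
  max_completion_tokens?: number;
}

function modifyChatBody(body: ChatBody): ChatBody {
  // Only rewrite requests aimed at DeepSeek Reasoner models.
  if (!body.model.startsWith("deepseek-reasoner")) return body;

  // DeepSeek rejects `max_tokens` with a 400 error; map it to
  // `max_completion_tokens` instead.
  if (body.max_tokens !== undefined && body.max_completion_tokens === undefined) {
    body.max_completion_tokens = body.max_tokens;
    delete body.max_tokens;
  }

  // Ensure assistant messages carry a `reasoning_content` field, which the
  // PR describes injecting for DeepSeek compatibility.
  for (const msg of body.messages) {
    if (msg.role === "assistant" && msg.reasoning_content === undefined) {
      msg.reasoning_content = "";
    }
  }
  return body;
}
```

For example, a body containing `max_tokens: 100` would leave this function with `max_completion_tokens: 100` and no `max_tokens` key, which is the mapping the PR says prevents the 400 errors.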
Written for commit 62b4e77. Summary will update on new commits.