
Add codex cli support#195

Open
albanx wants to merge 2 commits into ericc-ch:master from albanx:alban/addcodexsupport

Conversation


@albanx albanx commented Feb 14, 2026

The Codex CLI requires the /responses endpoint. Based on the documentation below, this PR adds Codex support by following the pattern of the existing endpoints. Verified and working.

Docs:

https://developers.openai.com/codex/config-advanced
openai/codex#7782
openai/codex#4278
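
One way to picture the endpoint addition is as a translation layer in front of the existing Chat Completions handling. A minimal sketch, assuming the proxy maps Responses API requests onto the Chat Completions shape it already serves (the types and mapping below are illustrative, not the PR's actual code):

```typescript
// Hypothetical sketch: translate a Responses API request into the
// Chat Completions shape the proxy already handles. Field names follow
// OpenAI's Responses API; the mapping itself is an assumption about how
// such support could work, not the PR's implementation.

interface ResponsesRequest {
  model: string;
  input: string | Array<{ role: string; content: string }>;
  stream?: boolean;
}

interface ChatCompletionsRequest {
  model: string;
  messages: Array<{ role: string; content: string }>;
  stream: boolean;
}

function toChatCompletions(req: ResponsesRequest): ChatCompletionsRequest {
  // A bare string input becomes a single user message; a structured
  // input list is passed through as-is.
  const messages =
    typeof req.input === "string"
      ? [{ role: "user", content: req.input }]
      : req.input.map(({ role, content }) => ({ role, content }));
  return { model: req.model, messages, stream: req.stream ?? false };
}
```

With this shape, the /v1/responses route only needs to translate the request, call the existing upstream path, and re-frame the reply.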

Sample Codex config in C:\Users\username\.codex\config.toml:

```toml
model = "gpt-5.3-codex"
model_reasoning_effort = "medium"
model_provider = "copilot"

[notice.model_migrations]
"gpt-5.2-codex" = "gpt-5.3-codex"

[windows]
sandbox = "unelevated"

[model_providers.copilot]
name = "copilot"
base_url = "http://localhost:4141/v1"
wire_api = "responses"
```

Steps to use it:

  1. Configure the Codex config as above
  2. Launch the proxy as usual: `npx copilot-api@latest start`
  3. Launch Codex and start using it (if prompted, choose custom API usage)
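
Once the proxy is listening, the request Codex issues in step 3 can be pictured as follows. A sketch under stated assumptions: the endpoint and port come from `base_url` in the config above, while the payload shape follows OpenAI's Responses API and is illustrative rather than captured traffic:

```typescript
// Hypothetical sketch of the HTTP request Codex sends to the proxy.
// base_url matches [model_providers.copilot] in config.toml; the body
// fields follow OpenAI's Responses API and are an assumption here.

const baseUrl = "http://localhost:4141/v1";

const request = {
  // wire_api = "responses" makes Codex target this route
  url: `${baseUrl}/responses`,
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "gpt-5.3-codex",
    input: "Say hello",
    stream: true,
  }),
};
```

A 200 response with a stream of events from this route indicates the proxy is handling Responses API traffic.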

@caozhiyuan
Contributor

caozhiyuan commented Feb 14, 2026

@albanx This PR is a duplicate; please use https://github.com/caozhiyuan/copilot-api/tree/all instead. That branch supports gpt-5.3-codex in Claude Code, OpenCode, and Codex.


kubaeror added a commit to kubaeror/copilot-api that referenced this pull request Mar 29, 2026
- Add /api/event_logging/batch endpoint returning 200 (PR ericc-ch#165)
- Add /v1/responses endpoint for OpenAI Codex CLI support (PR ericc-ch#195)
- Create response service with streaming support

Co-authored-by: Zevan770 <Zevan770@users.noreply.github.com>
Co-authored-by: albanx <albanx@users.noreply.github.com>
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
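
The "response service with streaming support" mentioned in the commit above can be sketched as a re-framing step: Responses API clients expect server-sent events such as `response.output_text.delta`. The event names follow OpenAI's Responses API; the framing helper below is an illustration under that assumption, not the referenced service's code:

```typescript
// Hypothetical sketch: re-frame a stream of text chunks as
// Responses-API-style server-sent events. Event names follow OpenAI's
// Responses API docs; the helper itself is illustrative.

function sseEvent(event: string, data: object): string {
  // SSE framing: an event line, a data line, and a blank-line terminator
  return `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
}

function toResponsesStream(chunks: string[]): string[] {
  const events = chunks.map((delta) =>
    sseEvent("response.output_text.delta", { delta }),
  );
  // Signal the end of the stream
  events.push(sseEvent("response.completed", {}));
  return events;
}
```

Each upstream completion chunk maps to one delta event, followed by a terminal `response.completed` event.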
