Codex Models
Source URL: https://developers.openai.com/codex/models
Recommended models

gpt-5.3-codex
Most capable agentic coding model to date, combining frontier coding performance with stronger reasoning and professional knowledge capabilities.
- Model ID: gpt-5.3-codex

gpt-5.3-codex-spark
Text-only research preview model optimized for near-instant, real-time coding iteration. Available to ChatGPT Pro users.
- Model ID: gpt-5.3-codex-spark

gpt-5.2-codex
Advanced coding model for real-world engineering. Succeeded by GPT-5.3-Codex.
- Model ID: gpt-5.2-codex
For most coding tasks in Codex, start with gpt-5.3-codex. It is available for ChatGPT-authenticated Codex sessions in the Codex app, CLI, IDE extension, and Codex Cloud. API access for GPT-5.3-Codex is coming soon. The gpt-5.3-codex-spark model is available in research preview for ChatGPT Pro subscribers.
Alternative models

gpt-5.2
Our best general agentic model for tasks across industries and domains.
- Model ID: gpt-5.2

gpt-5.1-codex-max
Optimized for long-horizon, agentic coding tasks in Codex.
- Model ID: gpt-5.1-codex-max

gpt-5.1
Great for coding and agentic tasks across domains. Succeeded by GPT-5.2.
- Model ID: gpt-5.1

gpt-5.1-codex
Optimized for long-running, agentic coding tasks in Codex. Succeeded by GPT-5.1-Codex-Max.
- Model ID: gpt-5.1-codex

gpt-5-codex
Version of GPT-5 tuned for long-running, agentic coding tasks. Succeeded by GPT-5.1-Codex.
- Model ID: gpt-5-codex
gpt-5-codex-mini
Smaller, more cost-effective version of GPT-5-Codex. Succeeded by GPT-5.1-Codex-Mini.
- Model ID: gpt-5-codex-mini

gpt-5
Reasoning model for coding and agentic tasks across domains. Succeeded by GPT-5.1.
- Model ID: gpt-5
Other models
Codex works best with the models listed above.
You can also point Codex at any model and provider that supports either the Chat Completions or Responses API to fit your specific use case.
Support for the Chat Completions API is deprecated and will be removed in a future release of Codex.
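As a rough illustration, a custom provider can be declared alongside your model choice in config.toml. This is a hedged sketch, not a definitive reference: the `model_providers` table and its keys (`name`, `base_url`, `env_key`, `wire_api`), the provider ID `my-provider`, and the URL are assumptions here; check the config reference for your Codex CLI version before relying on them.

```toml
# config.toml — sketch of pointing Codex at a third-party,
# OpenAI-compatible provider (keys below are assumptions).
model = "my-model"
model_provider = "my-provider"

[model_providers.my-provider]
name = "My Provider"
# Hypothetical base URL of an OpenAI-compatible endpoint
base_url = "https://api.example.com/v1"
# Environment variable that holds the provider's API key
env_key = "MY_PROVIDER_API_KEY"
# "responses" targets the Responses API; Chat Completions
# support ("chat") is deprecated per the note above
wire_api = "responses"
```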
Configuring models
Configure your default local model
The Codex CLI and IDE extension use the same config.toml configuration file. To specify a model, add a model entry to your configuration file. If you don’t specify a model, the Codex app, CLI, or IDE extension defaults to a recommended model.
model = "gpt-5.2"
Choosing a different local model temporarily
In the Codex CLI, you can use the /model command during an active thread to change the model. In the IDE extension, use the model selector below the input box to choose your model.
To start a new Codex CLI thread with a specific model, or to specify the model for codex exec, use the --model/-m flag:
codex -m gpt-5.3-codex
Choosing your model for cloud tasks
Currently, you can’t change the default model for Codex cloud tasks.