Hello Joel,
Thank you for reaching out. It looks like you’re encountering two separate issues: one related to connecting Azure AI Foundry models to Codex in VS Code, and another regarding verification of your Microsoft Foundry Services (MfS) subscription.
1. Resolving the "unexpected status 404" Error in Codex for VS Code
A 404 Not Found error typically indicates that Codex is unable to locate your Azure AI Foundry endpoint. In most cases, this is caused by an incorrect endpoint URL or configuration.
Verify Your Codex Configuration
Please ensure your ~/.codex/config.toml file is configured as follows (replace the placeholders with your own deployment and resource names):
model = "YOUR_DEPLOYMENT_NAME"
model_provider = "azure"

[model_providers.azure]
name = "Azure"
base_url = "https://YOUR_RESOURCE_NAME.openai.azure.com/openai/v1"
env_key = "AZURE_OPENAI_API_KEY"
wire_api = "responses"
Important Validation Points
The base_url must include:
- Your Azure OpenAI resource name
- The .openai.azure.com domain
- The /openai/v1 suffix
For example:
https://myresource.openai.azure.com/openai/v1
A missing /openai/v1 path is one of the most common causes of this exact 404 error.
Additionally, please confirm that:
- model exactly matches your Azure deployment name (not the underlying base model name)
- model_provider = "azure" is specified
- wire_api = "responses" is present
- env_key contains the environment variable name, not the API key itself
2. Set the Required Environment Variables
Before launching VS Code, export the API key from the same terminal session.
macOS / Linux / WSL
export AZURE_OPENAI_API_KEY="<your-api-key>"
export OPENAI_API_KEY="<your-api-key>"
Setting both variables is recommended, as the Codex CLI typically uses AZURE_OPENAI_API_KEY, while the VS Code extension may reference OPENAI_API_KEY.
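To confirm which of those variables a given session actually exposes, a one-off check like the following can help. This is just a sketch; the variable names are the two mentioned above.

```python
# Sketch: report which expected API-key variables are visible to this process.
import os


def missing_api_keys(env=os.environ) -> list[str]:
    """Return the names of the expected key variables that are not set."""
    expected = ["AZURE_OPENAI_API_KEY", "OPENAI_API_KEY"]
    return [name for name in expected if not env.get(name)]


if __name__ == "__main__":
    missing = missing_api_keys()
    if missing:
        print("Not set:", ", ".join(missing))
    else:
        print("Both key variables are set.")
```

Run it from the same terminal you intend to launch VS Code from; any variable it reports as "Not set" will also be invisible to the extension.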
Windows + WSL
If you are using WSL, please also define the same environment variables on the Windows host, as the VS Code extension may read variables from the Windows environment rather than the WSL session.
3. Launch VS Code from the Same Terminal
Always launch VS Code using:
code .
from the same terminal where you exported the environment variables.
Launching VS Code from the Start menu, desktop shortcut, or dock may prevent the extension from inheriting those variables.
4. Validate Codex from the Command Line
Before testing in VS Code, verify that the Codex CLI itself is working:
codex --version
Note that codex --version only confirms the CLI is installed. To validate the Azure configuration itself, run a simple prompt from that same terminal; if the CLI responds successfully, your Azure configuration is correct and the remaining issue is likely isolated to the VS Code extension environment.
5. Confirm Model Deployment
Please ensure that you have deployed a Codex-compatible model in Azure AI Foundry, such as gpt-5-codex.
Also verify that the value specified in model is the exact deployment name of that model.
Please refer to the following resources:
Troubleshooting Codex CLI & VS Code errors: https://learn.microsoft.com/azure/foundry/openai/how-to/codex?tabs=npm#troubleshooting
Using Codex in VS Code: https://learn.microsoft.com/azure/foundry/openai/how-to/codex#use-codex-in-visual-studio-code
Azure AI Foundry model-not-found guidance: https://learn.microsoft.com/azure/ai-foundry/reference/region-support
Azure Cost Management + Billing overview: https://learn.microsoft.com/azure/cost-management-billing/costs
I hope this helps. Do let me know if you have any further queries.
Thank you!