
Use Codex

Codex integration guide - call AI models through the Routin API platform

Overview

This guide explains how to install and configure OpenAI Codex so that it calls AI models through the Routin API platform. Codex is OpenAI's AI coding agent; it runs in the terminal or as a VS Code extension and works across multiple languages and environments.

Install Codex

npm install -g @openai/codex@latest
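
To confirm the CLI installed correctly and is on your PATH, you can print its version (assuming your installed build supports the standard --version flag):

codex --version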

Configure Codex

Step 1: Open the config file

Config locations by OS:

Windows

C:\Users\<your user>\.codex\config.toml

macOS / Linux

~/.codex/config.toml

Tip: If the file is missing, install the VS Code Codex extension and open settings through the extension UI.
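
On macOS/Linux you can also check from the terminal whether the file already exists before editing it:

ls ~/.codex/config.toml

If this reports "No such file or directory", create the file manually as described in the FAQ below.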

Step 2: Edit the config

Open config.toml and add:

model = "gpt-5.2"
model_provider = "meteor-api"
disable_response_storage = true
approval_policy = "never"                  # options: "untrusted" | "on-failure" | "on-request" | "never"
sandbox_mode = "danger-full-access"

[model_providers.meteor-api]
name = "Routin API"
base_url = "https://api.routin.ai/v1"
env_key = "OPENAI_API_KEY"
wire_api = "responses"

Notes:

  • model: the model to request from the provider (here gpt-5.2)
  • model_provider: the provider entry to use; must match the [model_providers.*] table name below
  • disable_response_storage: tells Codex not to rely on server-side storage of responses
  • approval_policy: controls when Codex asks for approval before running commands; "never" runs everything without prompting
  • sandbox_mode: "danger-full-access" disables sandboxing and gives Codex full file-system and network access, so use it with caution

Important: model, model_provider, and the other top-level keys must appear before the first [section] header. In TOML, any key written after a header such as [model_providers.meteor-api] belongs to that section rather than the top level.
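
If the order is reversed, the top-level keys are silently absorbed into the provider table. A minimal sketch of what not to do, using the same names as above:

# Wrong: these keys end up inside [model_providers.meteor-api], not at the top level
[model_providers.meteor-api]
name = "Routin API"
model = "gpt-5.2"
model_provider = "meteor-api"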

Advanced example: reasoning summaries and higher reasoning effort

model = "gpt-5.2"
model_provider = "meteor-ai"
disable_response_storage = true
approval_policy = "never"                  # options: "untrusted" | "on-failure" | "on-request" | "never"
sandbox_mode = "danger-full-access"

rmcp_client = true
model_reasoning_effort = "xhigh"

# Reasoning summary: auto | concise | detailed | none (default: auto)
model_reasoning_summary = "detailed"

# Text verbosity for GPT-5 family (Responses API): low | medium | high (default: medium)
model_verbosity = "high"

# Force-enable reasoning summaries for the current model (default: false)
model_supports_reasoning_summaries = true

[mcp_servers.claude]
command = "claude"
# Optional: arguments passed to the command above
args = ["mcp", "serve"]

[model_providers.meteor-ai]
name = "meteor-ai"
base_url = "https://api.routin.ai/v1"
env_key = "METEOR_AI_API_KEY"
wire_api = "responses"
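
The [mcp_servers.claude] block assumes the claude CLI (Claude Code) is already installed. On macOS/Linux, a quick way to confirm it is on your PATH before enabling the server:

which claude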

Step 3: Configure the API key

Store the API key in a system environment variable rather than writing it into local files.

Windows (PowerShell)

setx OPENAI_API_KEY "sk-your-api-key-here"

macOS / Linux

export OPENAI_API_KEY="sk-your-api-key-here"

Replace sk-your-api-key-here with your Routin API key. On Windows, setx only affects new sessions, so open a fresh terminal afterwards. On macOS/Linux, export only lasts for the current session; add the line to ~/.zshrc or ~/.bashrc to make it persistent.
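
For example, to persist the key in zsh and confirm a new shell can see it (use ~/.bashrc instead if you use bash):

echo 'export OPENAI_API_KEY="sk-your-api-key-here"' >> ~/.zshrc
source ~/.zshrc
echo $OPENAI_API_KEY

On Windows, open a new PowerShell window and run echo $env:OPENAI_API_KEY to confirm the variable was set.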


Verify installation

After configuring, open a terminal and run:

codex

If everything is correct, Codex starts in interactive mode and is ready to use.
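
For a quick end-to-end check without the interactive UI, you can also run Codex non-interactively (assuming the exec subcommand is available in your CLI version):

codex exec "Say hello"

If the command returns a model response, the Routin API connection and your API key are working.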


FAQ

Config file missing

If you cannot find the config file, create it manually:

Windows (Command Prompt)

# Create config directory
mkdir %USERPROFILE%\.codex

# Create config file
notepad %USERPROFILE%\.codex\config.toml

macOS / Linux

# Create config directory
mkdir -p ~/.codex

# Create config file
nano ~/.codex/config.toml
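
After creating the file, paste in the configuration from Step 2 and save it.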

How do I get an API key?

See the “Get API Key” section in the Claude Code guide.

