Codex is OpenAI’s open-source CLI coding agent. Route your Codex traffic through Edgee to gain full observability over every coding session and access all LLM providers through a single endpoint.

Quickstart with the Edgee CLI

The fastest way to get started — no config files to edit:
# Install the Edgee CLI
curl -fsSL https://install.edgee.ai | bash

# Authenticate and configure Codex automatically
edgee init

# Launch Codex through Edgee
edgee launch codex
That’s it. See the CLI quickstart for more details.

Manual Configuration

To use Edgee with Codex manually, update your Codex configuration file at ~/.codex/config.toml with the following:
model_provider = "edgee"

[model_providers.edgee]
name = "EDGEE"
base_url = "https://api.edgee.ai/v1"
http_headers = { "x-edgee-api-key" = "<YOUR_EDGEE_API_KEY>" }
wire_api = "responses"
Replace <YOUR_EDGEE_API_KEY> with your actual Edgee API key from the Edgee Console.

Usage

Once configured, Codex will automatically route all requests through Edgee. You can use Codex normally:
# Start an interactive session
codex

# Ask a question directly
codex "How do I implement a binary search tree in Python?"

# Run in full-auto mode
codex --approval-mode full-auto "Refactor this module to use async/await"

Benefits of Using Codex with Edgee

Unified Infrastructure

Access all LLM providers through Edgee while using Codex’s developer-friendly CLI.

Cost Control

Leverage Edgee’s cost tracking and intelligent routing to optimize your spending.

Reliability

Combine Codex’s agent capabilities with Edgee’s automatic failover and load balancing.

Observability

Monitor your Codex sessions with Edgee’s built-in observability features.