AI SDK Provider for OpenAI Codex CLI

1.1.0 · active · verified Wed Apr 22

The `ai-sdk-provider-codex-cli` package is a community-maintained provider for Vercel's AI SDK v6. It gives the SDK access to OpenAI's Codex CLI and its GPT-5 class models (e.g., `gpt-5.1`, `gpt-5.2`, `gpt-5.3-codex`, `gpt-5.2-codex-max`) using an existing ChatGPT Plus/Pro subscription. The library is at version 1.1.0 and actively developed; recent releases focus on AI SDK v6 compatibility and new features such as the `createCodexAppServer` persistent JSON-RPC client.

The provider offers two modes: `codexExec`, which spawns a new Codex process per call, and `createCodexAppServer`, which keeps a long-lived process with session controls such as `injectMessage` and `interrupt`. It is Node-only, works with the `generateText`, `streamText`, and `generateObject` functions, and authenticates via `codex login` or an `OPENAI_API_KEY`. Its key differentiator is leveraging the local Codex CLI for direct, granular control over model execution and sandboxing, offering an alternative to direct API integrations for users who already have a ChatGPT subscription.

Install
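Both the provider and the Codex CLI itself need to be installed. The commands below are a sketch assembled from the setup notes in the quickstart comments (the global CLI install and the `codex login` step come from there); the peer install of `ai` is an assumption for a typical AI SDK project.

```shell
# Install the provider (and the AI SDK peer dependency) in your project
npm install ai ai-sdk-provider-codex-cli

# Install the Codex CLI globally and authenticate with your ChatGPT account
npm install -g @openai/codex
codex login   # or set OPENAI_API_KEY in the environment instead
```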

Imports
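The entry points used in this document can be imported as follows: `codexExec` and `createCodexAppServer` come from the provider, while the generation functions come from the AI SDK itself. (This fragment assumes both packages are installed as shown under Install.)

```typescript
// Provider entry points: per-call process mode and persistent app-server mode
import { codexExec, createCodexAppServer } from 'ai-sdk-provider-codex-cli';

// AI SDK generation functions supported by this provider
import { generateText, streamText, generateObject } from 'ai';
```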

Quickstart

Demonstrates how to use the `codexExec` provider with `generateText` to get a simple text response from a GPT-5 class model, showing basic configuration and token usage.

import { generateText } from 'ai';
import { codexExec } from 'ai-sdk-provider-codex-cli';

async function main() {
  // Ensure Codex CLI is installed and authenticated: `npm i -g @openai/codex && codex login`
  // You can also pass OPENAI_API_KEY as an environment variable.

  const model = codexExec('gpt-5.3-codex', {
    // allowNpx: true, // Set to true to allow npx usage if codex is not globally installed
    // skipGitRepoCheck: true, // Optional: skip git repository checks for local scripts
    approvalMode: 'on-failure', // Automatically approve successful executions, prompt on failure
    sandboxMode: 'workspace-write', // Allow Codex to write to the current workspace
    codexPath: process.env.CODEX_CLI_PATH, // Optional: custom path to the codex executable (falls back to PATH lookup when unset)
  });

  const { text, usage } = await generateText({
    model,
    messages: [
      { role: 'user', content: 'Reply with a single, common greeting word.' }
    ],
  });

  console.log('Generated Text:', text);
  console.log('Token Usage:', usage);
}

main().catch(console.error);
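For the persistent mode mentioned in the overview, a hedged sketch follows. Only the names `createCodexAppServer`, `injectMessage`, and `interrupt` appear in the package description; the options object, the `model()` accessor, and the `close()` method are assumptions made for illustration, so check the package's own documentation before relying on them.

```typescript
import { streamText } from 'ai';
import { createCodexAppServer } from 'ai-sdk-provider-codex-cli';

async function main() {
  // Start one long-lived Codex process (JSON-RPC client) instead of
  // spawning a new process per call as codexExec does.
  const server = createCodexAppServer({ allowNpx: true }); // options assumed

  // NOTE: the model accessor below is an assumed shape, not a verified API.
  const model = server.model('gpt-5.3-codex');

  const { textStream } = streamText({
    model,
    messages: [{ role: 'user', content: 'Summarize this repository.' }],
  });

  for await (const chunk of textStream) {
    process.stdout.write(chunk);
  }

  // Session controls named in the package description (signatures assumed):
  //   server.injectMessage(...) — push a message into the running session
  //   server.interrupt(...)     — cancel in-flight work
  await server.close(); // assumed cleanup method
}

main().catch(console.error);
```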
