Sunpeak AI App Framework and Testing
Sunpeak (current version 0.20.7) is an AI application framework, testing framework, and inspector for building and debugging Model Context Protocol (MCP) apps, such as those for ChatGPT and Claude. It takes a convention-over-configuration approach to wiring MCP servers, handling protocol messages, managing resource HTML bundles, and setting up dev environments with hot module replacement (HMR). Its key differentiators are a testing framework that simulates host runtimes and provides simulation fixtures for reproducible tests across hosts, themes, and data states without external accounts or API credits, and a visual inspector for debugging. Releases are frequent and stay within the 0.x range, focusing on bug fixes, performance improvements, and feature enhancements, which indicates active development (and an API that is still subject to breaking changes).
Common errors
- ReferenceError: require is not defined in ES module scope
  - Cause: Using CommonJS `require()` syntax in a modern Sunpeak project, which is ESM-first.
  - Fix: Convert `require()` statements to ES module `import` syntax. Ensure your project's `package.json` has `"type": "module"`, or configure your build tools for ESM.
- TypeError: mcp.renderTool is not a function
  - Cause: Using the deprecated `mcp.renderTool` method after the v0.20.1 breaking change, which moved rendering capabilities to the `inspector` fixture.
  - Fix: Replace `mcp.renderTool(...)` with `inspector.renderTool(...)` in your test files. Destructure the `inspector` fixture from the test parameters (e.g., `async ({ page, inspector }) => { ... }`).
- Error: Cannot find module 'sunpeak/test/live' or 'sunpeak/test/live/config'
  - Cause: Importing from old testing module paths that were deprecated and moved in v0.19.1.
  - Fix: Update import paths: `sunpeak/test/live` is now `sunpeak/test` (for fixtures like `mcp`), and `sunpeak/test/live/config` is now `sunpeak/test/config` (for Playwright's `defineConfig`).
- Error: Missing peer dependency "react"
  - Cause: The `react` or `react-dom` peer dependency is not installed, or its version is incompatible with Sunpeak.
  - Fix: Install `react` and `react-dom` (versions ^18.0.0 || ^19.0.0): `npm install react react-dom` or `yarn add react react-dom`. Verify the installed versions satisfy the peer range.
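The CommonJS-to-ESM fix above is mechanical. A minimal, Sunpeak-independent sketch using only Node's built-in `node:path` module to illustrate the syntax change:

```javascript
// Before (CommonJS) — throws "ReferenceError: require is not defined"
// when the file is treated as an ES module:
// const path = require('node:path');

// After (ESM) — works once package.json declares "type": "module":
import path from 'node:path';

const entry = path.join('src', 'index.ts');
```

The same conversion applies to any `require('sunpeak/...')` call: switch to `import`, and make sure the whole project (or at least the file, via an `.mjs` extension) is treated as ESM.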
Warnings
- breaking (v0.20.1) The test fixture API was redesigned, splitting the single `mcp` fixture into two: `mcp` (MCP protocol methods) and `inspector` (Sunpeak inspector rendering methods such as `renderTool` and `host`). Code calling `mcp.renderTool` now fails.
- breaking (v0.19.1) The testing framework's import paths and default fixtures changed: `sunpeak/test` became the primary module for the `mcp` fixture, replacing the older `live` imports, and `defineConfig()` moved from `sunpeak/test/live/config` to `sunpeak/test/config`.
- gotcha Unit tests were updated for ESM compatibility in v0.20.7, which might cause issues with older CommonJS-style test setups or specific module resolvers.
- gotcha While direct `vitest run` or `playwright test` commands still function, the recommended way to run tests is via `npx sunpeak test`. Bypassing the CLI might lead to missing environment setups, configurations, or features managed by the `sunpeak` CLI.
- gotcha Sunpeak has several peer dependencies, including `react`, `react-dom`, and the `@ai-sdk/*` packages. Incorrect versions or missing peer dependencies will lead to runtime errors or build failures.
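The fixture split and path moves above imply a mechanical migration in existing test files. A sketch of that migration, assuming only the fixture names and module paths documented here (not a verified API surface):

```typescript
// Before v0.19.1 / v0.20.1 (both lines now fail):
// import { test, expect } from 'sunpeak/test/live';
// test('renders', async ({ page, mcp }) => {
//   await mcp.renderTool({ name: 'my-tool', params: {} });
// });

// Current: import from 'sunpeak/test', and destructure `inspector`
// for rendering while keeping `mcp` for MCP protocol methods.
import { test, expect } from 'sunpeak/test';

test('renders', async ({ page, inspector }) => {
  await inspector.renderTool({ name: 'my-tool', params: {} });
  await expect(page.locator('body')).toBeVisible();
});
```
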
Install
- npm install sunpeak
- yarn add sunpeak
- pnpm add sunpeak
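Because `react`, `react-dom`, and the `@ai-sdk/*` packages are peer dependencies (see Warnings), an install that avoids the missing-peer error might look like the following; package names beyond `sunpeak` come from the peer list above, and exact versions depend on your project:

```shell
npm install sunpeak react react-dom
npm ls react react-dom   # verify installed versions satisfy the peer range
```
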
Imports
- test
  import { test, expect } from 'sunpeak/test'; // ESM only — const { test, expect } = require('sunpeak/test') fails in ES module scope
- useToolData
  import { useToolData, useAppState } from 'sunpeak/react';
- defineConfig
  import { defineConfig } from 'sunpeak/test/config'; // formerly 'sunpeak/test/live/config' (moved in v0.19.1)
Quickstart
import { test, expect } from 'sunpeak/test';
import { generateText } from 'ai';
import { createOpenAI } from '@ai-sdk/openai';

// Initialize a new Sunpeak project:
//   npx sunpeak new
// Then add a test file (e.g., tests/e2e/my-app.test.ts).

// Use a stub API key so tests stay reproducible and never consume real credits.
const openai = createOpenAI({ apiKey: process.env.OPENAI_API_KEY ?? 'mock-key' });

test('My MCP App loads and renders a tool', async ({ page, inspector, mcp }) => {
  // Register the tool-call mock before rendering, so it is in place for any calls.
  mcp.useTool(async ({ name, params }) => {
    if (name === 'my-awesome-tool') {
      const { text } = await generateText({
        model: openai('gpt-4o'),
        messages: [{ role: 'user', content: params.query as string }],
      });
      return { result: text };
    }
    return { result: 'Unknown tool' };
  });

  // Render a specific tool within the Sunpeak inspector.
  await inspector.renderTool({ name: 'my-awesome-tool', params: { query: 'test' } });

  // Wait for the tool's UI to appear, then interact with it.
  await expect(page.locator('text=Welcome to My Awesome Tool')).toBeVisible();
  await page.locator('button', { hasText: 'Execute' }).click();
  await expect(page.locator('text=Execution complete')).toBeVisible();
});

// Run the tests:
//   npx sunpeak test --e2e