# Getting Started

Set up Compose in your Next.js project in under 2 minutes.
## Quick Start

The fastest way to add Compose to your project is with the CLI scaffold:
```bash
npx @prototyperco/cli add compose
```

This creates two files and adds your API key placeholder to `.env.local`:

- `app/api/compose/route.ts` — server handler that streams UI specs from an AI model
- `app/compose/page.tsx` — ready-to-use page with the `<Compose />` component
## What was scaffolded

The CLI creates a minimal but complete setup:
**`app/api/compose/route.ts`** — a Next.js API route that calls an AI model and streams back JSONL patches:

```ts
import { createComposeHandler } from "@prototyperco/compose/server";

export const POST = createComposeHandler({
  provider: "anthropic",
});
```

**`app/compose/page.tsx`** — a client page that renders the streamed UI:
```tsx
"use client";

import { Compose } from "@prototyperco/compose/react";

export default function ComposePage() {
  return (
    <div className="mx-auto max-w-4xl p-8">
      <h1 className="mb-4 text-2xl font-bold">Compose</h1>
      <p className="mb-8 text-muted-foreground">
        Describe a UI and watch it appear in real-time.
      </p>
      <Compose endpoint="/api/compose" />
    </div>
  );
}
```

## The Compose Component
<Compose /> is an all-in-one component that handles prompt input, streaming, spec assembly, and rendering. It connects to your API route and manages the full lifecycle.
```tsx
import { Compose } from "@prototyperco/compose/react";

<Compose endpoint="/api/compose" />;
```

### Props
| Prop | Type | Default | Description |
|---|---|---|---|
| `endpoint` | `string` | — | **Required.** URL of the compose API route. |
| `placeholder` | `string` | `"Describe a UI..."` | Placeholder text for the prompt input. |
| `onSpec` | `(spec: Spec) => void` | — | Called when a new spec is received. |
| `onError` | `(error: Error) => void` | — | Called on streaming errors. |
| `className` | `string` | — | Additional CSS classes for the wrapper. |
## Server Handler

`createComposeHandler` creates a Next.js-compatible POST handler. It builds a system prompt from the component catalog, calls the AI provider, and streams JSONL patches back to the client.
```ts
import { createComposeHandler } from "@prototyperco/compose/server";

export const POST = createComposeHandler({
  provider: "anthropic",
  model: "claude-sonnet-4-20250514", // optional, this is the default
  maxTokens: 4096, // optional
  rules: ["Use a dark color scheme"], // optional extra instructions
});
```

| Option | Type | Description |
|---|---|---|
| `provider` | `"anthropic" \| "openai"` | **Required.** Which AI provider to use. |
| `apiKey` | `string` | API key. Falls back to the `ANTHROPIC_API_KEY` or `OPENAI_API_KEY` env vars. |
| `model` | `string` | Model identifier. Defaults to `claude-sonnet-4-20250514` (Anthropic) or `gpt-4o` (OpenAI). |
| `maxTokens` | `number` | Max tokens for the AI response. Default `4096`. |
| `rules` | `string[]` | Additional rules appended to the system prompt. |
| `systemPrompt` | `string` | Full system prompt override (bypasses catalog-based generation). |
| `mode` | `"generate" \| "chat"` | Prompt mode. `"generate"` outputs raw JSONL; `"chat"` wraps it in code fences. |
| `onRequest` | `(params) => override \| void` | Hook to modify prompts before sending to the AI. |
| `onResponse` | `(params) => void` | Hook called after the AI response completes. |
| `maxPromptLength` | `number` | Truncate user prompts to this character length. |
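The wire format is newline-delimited JSON. To illustrate the framing (this is not the library's internal code), here is a minimal, dependency-free sketch of splitting buffered stream text into complete JSONL patches; the `Patch` shape is hypothetical:

```ts
// Minimal JSONL framing sketch. The library handles this internally;
// this only illustrates the newline-delimited JSON wire format.
type Patch = Record<string, unknown>; // hypothetical patch shape

export function extractPatches(buffer: string): { patches: Patch[]; rest: string } {
  const lines = buffer.split("\n");
  const rest = lines.pop() ?? ""; // last piece may be an incomplete line
  const patches = lines
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as Patch);
  return { patches, rest };
}
```

A consumer feeds `rest` back in as the prefix of the next chunk, so a patch split across two network chunks is parsed only once it is complete.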
## Using OpenAI

To use OpenAI instead of Anthropic:
```bash
npx @prototyperco/cli add compose --provider openai
```

Or change the route manually:
```ts
import { createComposeHandler } from "@prototyperco/compose/server";

export const POST = createComposeHandler({
  provider: "openai",
});
```

Set the OpenAI key in `.env.local`:
```bash
OPENAI_API_KEY=sk-your-openai-key-here
```

## Manual Setup

If you prefer to set things up yourself instead of using the CLI scaffold:
```bash
# use your package manager of choice
pnpm add @prototyperco/compose
npm install @prototyperco/compose
yarn add @prototyperco/compose
bun add @prototyperco/compose
```

Compose requires `react@^19` and `react-dom@^19` as peer dependencies.
Create the two files shown in What was scaffolded above, then add your API key to `.env.local`.
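For the default Anthropic provider, the env var the handler falls back to is `ANTHROPIC_API_KEY` (the key value below is a placeholder):

```bash
# .env.local
ANTHROPIC_API_KEY=sk-ant-your-key-here
```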
## Low-level API

For more control, use the lower-level hooks and renderer directly:
```tsx
"use client";

import { Renderer, useUIStream } from "@prototyperco/compose";
import { prototyperComponents } from "@prototyperco/compose/components";

export function CustomCompose() {
  const { spec, isStreaming, error, send, clear } = useUIStream({
    url: "/api/compose",
  });

  return (
    <div>
      <button onClick={() => send({ prompt: "Build a login form" })}>
        Generate
      </button>
      {error && <p>Error: {error.message}</p>}
      {spec && (
        <Renderer
          spec={spec}
          registry={prototyperComponents}
          loading={isStreaming}
        />
      )}
    </div>
  );
}
```

The `Renderer` component accepts these props:
| Prop | Type | Description |
|---|---|---|
| `spec` | `Spec \| null` | The UI spec to render. Pass `null` to render nothing. |
| `registry` | `ComponentRegistry` | Map of component type names to renderer functions. |
| `handlers` | `Record<string, ActionHandler>` | Custom action handlers (merged with built-in actions). |
| `functions` | `Record<string, ComputedFunction>` | Named functions for `$computed` expressions. |
| `navigate` | `(url: string) => void` | Navigation callback for the `navigate` action. |
| `onConfirm` | `(confirm: ActionConfirm) => Promise<boolean>` | Confirmation dialog handler. |
| `onStateChange` | `(state: StateModel) => void` | Called whenever state changes. |
| `loading` | `boolean` | Passed through to components (e.g. for skeleton states). |
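To see what a registry is conceptually, here is a toy, dependency-free sketch that walks a tree-shaped spec and dispatches each node on its `type`. The real `Spec` and `ComponentRegistry` types are richer; the shapes below are hypothetical and render to strings instead of React elements:

```ts
// Toy sketch only: hypothetical node shape, not the library's real Spec type.
type ToyNode = { type: string; props?: Record<string, unknown>; children?: ToyNode[] };
type ToyRegistry = Record<string, (props: Record<string, unknown>, children: string[]) => string>;

// Render a toy spec tree by looking up each node's `type` in the registry,
// the same dispatch idea Renderer uses with React components.
export function renderToy(node: ToyNode, registry: ToyRegistry): string {
  const render = registry[node.type];
  if (!render) return `<unknown:${node.type}>`;
  const children = (node.children ?? []).map((child) => renderToy(child, registry));
  return render(node.props ?? {}, children);
}

const toyRegistry: ToyRegistry = {
  Card: (_props, children) => `[card ${children.join(" ")}]`,
  Text: (props) => String(props.value ?? ""),
};
```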
## Available Components

Compose includes wrappers for 20 Prototyper UI components:
| Category | Components |
|---|---|
| Actions | Button |
| Forms | Input, Textarea, Select, Checkbox, Switch, RadioGroup, Slider |
| Overlays | Dialog, Tooltip |
| Navigation | Tabs |
| Layout | Card, Accordion, Separator |
| Data Display | Heading, Text, Badge, Avatar, Alert |
| Feedback | Progress |
Import the full registry:

```ts
import { prototyperComponents } from "@prototyperco/compose/components";
```

Or import individual wrappers to build a custom registry:
```ts
import { buttonRenderer } from "@prototyperco/compose/components";

const myRegistry = {
  Button: buttonRenderer,
  // ... add only what you need
};
```

## Next Steps
- Spec Format — Deep dive into the spec structure
- Expressions — Dynamic values and data binding
- Streaming — Connect to any AI model
- Actions — Handle user interactions