MCP Integration
Connect AI assistants to Compose via Model Context Protocol
The Model Context Protocol (MCP) is an open standard that lets AI assistants connect to external tools and data sources. The Prototyper UI MCP server exposes Compose's component catalog and prompt generation as tools that any MCP-compatible client (Claude, Cursor, etc.) can use.
Available Tools
The MCP server provides 14 tools in three groups. Two are specific to Compose:
get_ui_catalog
Returns the Compose component catalog as JSON Schema. This includes all available components with their props, events, slots, and descriptions.
The output is the same as calling catalog.jsonSchema() — a machine-readable schema that AI models can use to understand what components are available and how to construct valid specs.
get_ui_prompt
Returns the pre-built system prompt generated by buildSystemPrompt(catalog). This is a complete instruction set that teaches the AI model:
- The JSONL streaming output format (RFC 6902 JSON Patch)
- The flat spec structure
- All dynamic expressions ($state, $bindState, $cond, etc.)
- Every available component with prop types and examples
- Built-in and custom actions
- Rules for valid spec generation
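To make the streaming format concrete, here is a hedged sketch of how JSONL patch lines accumulate into a flat spec. The node ids and component names are invented for illustration, and the applier handles only top-level add operations rather than full RFC 6902 semantics:

```typescript
// Each line of the model's output is one RFC 6902 JSON Patch operation
// against the flat spec. Two hypothetical lines:
const stream = [
  '{"op":"add","path":"/root","value":{"type":"Stack","children":["title"]}}',
  '{"op":"add","path":"/title","value":{"type":"Text","props":{"text":"Hello"}}}',
];

// Minimal applier for top-level "add" ops (illustration only).
const spec: Record<string, unknown> = {};
for (const line of stream) {
  const patch = JSON.parse(line) as { op: string; path: string; value: unknown };
  if (patch.op === "add") {
    spec[patch.path.slice(1)] = patch.value; // strip the leading "/"
  }
}

console.log(Object.keys(spec)); // node ids accumulated so far
```

Because each line is a self-contained patch, a renderer can apply them as they arrive and show the UI building up progressively.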
Live Canvas Tools
The MCP server also includes 7 tools for the Live Canvas — a live AI-to-browser design system that pushes spec and theme changes over WebSocket:
| Tool | Description |
|---|---|
| design_create | Create a new live design session with preview URL |
| design_update | Push spec changes to the browser in real time |
| design_theme | Update theme tokens (hue, chroma, radius, font) |
| design_get | Get current session state (spec, theme, revision) |
| design_list | List all active design sessions |
| design_close | Close a design session |
| design_export | Export session as standalone HTML or Compose spec |
In normal agent flows, the bridge auto-starts on the first design_create call when the local bridge package is available. For local docs development you can still run pnpm dev, which serves the docs on port 3333 and the bridge on port 4321.
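A typical Live Canvas session strings these tools together. The argument shapes below are hypothetical (consult the server's tool schemas for the real parameter names); only the call sequence is meant to be illustrative:

```typescript
// A sketch of a Live Canvas session as an ordered list of MCP tool
// calls. Argument names here are invented for illustration.
const session = [
  { tool: "design_create", args: { name: "pricing-page" } },
  { tool: "design_update", args: { patches: ['{"op":"add","path":"/root","value":{"type":"Stack"}}'] } },
  { tool: "design_theme", args: { hue: 260, radius: 8 } },
  { tool: "design_export", args: { format: "html" } },
];

console.log(session.map((call) => call.tool).join(" -> "));
```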
Other Tools
The server also includes general Prototyper UI tools:
| Tool | Description |
|---|---|
| list_components | List all available Prototyper UI components by category |
| get_component | Get full docs, source code, and examples for components (batch up to 5) |
| get_theme | Get CSS design tokens (OKLCH colors, surfaces, shadows, easings) |
| search_docs | Full-text search across all documentation |
| get_install_command | Get the CLI install command for components |
Setup
Add the MCP server to your AI assistant:
```shell
claude mcp add prototyper-ui -- bunx @prototyperco/mcp@latest
```

Alternatively, add it to your claude_desktop_config.json:
```json
{
  "mcpServers": {
    "prototyper-ui": {
      "command": "bunx",
      "args": ["@prototyperco/mcp@latest"]
    }
  }
}
```

Custom Base URL
By default, the MCP server fetches from https://prototyper-ui.com. To point it at a local dev server or custom deployment, set the PROTOTYPER_UI_BASE_URL environment variable:
```shell
PROTOTYPER_UI_BASE_URL=http://localhost:3333 bunx @prototyperco/mcp@latest
```

Workflow
A typical AI-assisted UI generation flow using MCP:
Discover Components
The AI calls get_ui_catalog to receive the full component catalog as JSON Schema. This tells it what components exist, what props they accept, and what events they emit.
Get the System Prompt
The AI calls get_ui_prompt to receive the system prompt. This prompt contains the complete spec format documentation, streaming instructions, and all the rules for generating valid specs.
Generate the Spec
Using the catalog knowledge and system prompt, the AI generates a streaming spec (JSONL patches) that builds a UI matching the user's request.
Render with Compose
The generated spec is passed to <Renderer> with the matching component registry. The UI renders progressively as patches stream in.
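Conceptually, the renderer looks each spec node's type up in the registry and invokes the matching implementation with that node's props. The SpecNode and Registry shapes and the Text component below are assumptions for illustration, not the real Compose API:

```typescript
// Hypothetical shapes: a flat spec keyed by node id, and a registry
// mapping component type names to implementations.
type SpecNode = { type: string; props?: Record<string, unknown> };
type Registry = Record<string, (props: Record<string, unknown>) => string>;

const registry: Registry = {
  Text: (props) => String(props.text ?? ""),
};

const spec: Record<string, SpecNode> = {
  title: { type: "Text", props: { text: "Hello" } },
};

// What a renderer does at its core: resolve type -> implementation,
// then call it with the node's props.
function renderNode(id: string): string {
  const node = spec[id];
  const component = registry[node.type];
  if (!component) throw new Error(`Unknown component type: ${node.type}`);
  return component(node.props ?? {});
}

console.log(renderNode("title")); // "Hello"
```

This is also why the registry must match the catalog the AI saw: a spec referencing a component type missing from the registry cannot be rendered.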
Resources
The MCP server also exposes resources that clients can read directly:
| Resource | URI | Description |
|---|---|---|
| Design tokens | prototyper://tokens/css | Full CSS design tokens file |
| Component index | prototyper://docs/index | Index of all components and documentation pages |
| Compose catalog | prototyper://compose/catalog | Component catalog as JSON Schema |
| Component docs | prototyper://docs/components/{name} | Per-component documentation (dynamic) |
Prompts
The server includes two built-in prompt templates:
| Prompt | Description |
|---|---|
| build_ui | Generate a UI component or page using Prototyper UI components |
| review_usage | Review code for correct Prototyper UI component usage |
These can be invoked directly by MCP-compatible clients that support the prompts capability.
Direct API Access
If you are not using MCP, you can access the same data through HTTP endpoints:
| Endpoint | Content |
|---|---|
| /compose/catalog.json | Component catalog as JSON Schema |
| /compose/prompt.txt | Pre-built system prompt |
| /llms.txt | Component index for LLMs |
| /llms-full.txt | Full documentation for LLMs |
| /llms/{component} | Per-component documentation |
| /prototyper-tokens.css | Extractable CSS design tokens |
These endpoints return the same data that the MCP tools serve, making it straightforward to integrate with any AI pipeline.
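For instance, a pipeline might resolve these paths against a base URL. The sketch below uses the documented default base; substitute your own deployment or the PROTOTYPER_UI_BASE_URL value as needed:

```typescript
// Resolve the documented endpoint paths against a base URL.
const baseUrl = "https://prototyper-ui.com"; // default; override for local dev

const endpoints = {
  catalog: new URL("/compose/catalog.json", baseUrl).href,
  prompt: new URL("/compose/prompt.txt", baseUrl).href,
  llmsIndex: new URL("/llms.txt", baseUrl).href,
};

// Usage (in an async context):
//   const catalog = await fetch(endpoints.catalog).then((r) => r.json());
//   const prompt = await fetch(endpoints.prompt).then((r) => r.text());
console.log(endpoints.catalog);
```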