# MCP Integration

Connect AI assistants to Stream UI via the Model Context Protocol.
The Model Context Protocol (MCP) is an open standard that lets AI assistants connect to external tools and data sources. The Prototyper UI MCP server exposes Stream UI's component catalog and prompt generation as tools that any MCP-compatible client (Claude, Cursor, etc.) can use.
## Available Tools
The MCP server provides 7 tools. Two are specific to Stream UI:
### `get_ui_catalog`
Returns the Stream UI component catalog as JSON Schema. This includes all available components with their props, events, slots, and descriptions.
The output is the same as calling `catalog.jsonSchema()`: a machine-readable schema that AI models can use to understand what components are available and how to construct valid specs.
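As a sketch of how a client might consume that schema, the snippet below walks an illustrative catalog excerpt and lists each component with its prop names. The exact shape shown here is an assumption — the real `get_ui_catalog` output is larger and its structure is defined by `catalog.jsonSchema()`:

```typescript
// Illustrative excerpt of a component catalog. The real get_ui_catalog
// output is larger; this shape is an assumption for demonstration only.
const catalogSchema = {
  components: {
    Button: {
      description: "A clickable button",
      props: { label: { type: "string" }, disabled: { type: "boolean" } },
      events: ["onClick"],
    },
    Card: {
      description: "A content container",
      props: { title: { type: "string" } },
      events: [],
    },
  },
};

// Summarize each component as "Name(prop, prop)" — the kind of inventory a
// model builds before constructing a valid spec.
function summarize(schema: typeof catalogSchema): string[] {
  return Object.entries(schema.components).map(
    ([name, def]) => `${name}(${Object.keys(def.props).join(", ")})`
  );
}

console.log(summarize(catalogSchema)); // ["Button(label, disabled)", "Card(title)"]
```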
### `get_ui_prompt`
Returns the pre-built system prompt generated by `buildSystemPrompt(catalog)`. This is a complete instruction set that teaches the AI model:
- The JSONL streaming output format (RFC 6902 JSON Patch)
- The flat spec structure
- All dynamic expressions (`$state`, `$bindState`, `$cond`, etc.)
- Every available component with prop types and examples
- Built-in and custom actions
- Rules for valid spec generation
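To make the streaming format above concrete, here is a minimal sketch of how a JSONL stream of RFC 6902 `add` operations assembles a flat spec. The node shapes and paths are illustrative assumptions (the authoritative format is whatever `get_ui_prompt` specifies), and the applier handles only the `add` op:

```typescript
// One JSON Patch operation per JSONL line. The node shape ({type, props,
// children}) and the flat, id-keyed layout are illustrative assumptions.
const jsonl = [
  '{"op":"add","path":"/root","value":{"type":"Card","children":[]}}',
  '{"op":"add","path":"/root/children/0","value":"btn1"}',
  '{"op":"add","path":"/btn1","value":{"type":"Button","props":{"label":"Save"}}}',
].join("\n");

type AddPatch = { op: "add"; path: string; value: unknown };

// Minimal applier for the "add" op only (RFC 6902 defines five more ops).
function applyAdd(doc: Record<string, any>, patch: AddPatch): void {
  const keys = patch.path.split("/").slice(1);
  let target: any = doc;
  for (const key of keys.slice(0, -1)) target = target[key];
  const last = keys[keys.length - 1];
  if (Array.isArray(target)) target.splice(Number(last), 0, patch.value);
  else target[last] = patch.value;
}

// Apply patches in stream order; the spec is usable after every line.
const spec: Record<string, any> = {};
for (const line of jsonl.split("\n")) applyAdd(spec, JSON.parse(line));

console.log(spec.root.children); // ["btn1"]
```

Because each line is a self-contained patch, a renderer can apply them as they arrive rather than waiting for the full stream.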
### Other Tools
The server also includes general Prototyper UI tools:
| Tool | Description |
|---|---|
| `list_components` | List all available Prototyper UI components by category |
| `get_component` | Get full docs, source code, and examples for components (batch up to 5) |
| `get_theme` | Get CSS design tokens (OKLCH colors, surfaces, shadows, easings) |
| `search_docs` | Full-text search across all documentation |
| `get_install_command` | Get the CLI install command for components |
## Setup
Add the MCP server to your AI assistant:
```bash
claude mcp add prototyper-ui -- npx -y @prototyperai/mcp-server@latest
```

Add to your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "prototyper-ui": {
      "command": "npx",
      "args": ["-y", "@prototyperai/mcp-server@latest"]
    }
  }
}
```

### Custom Base URL
By default, the MCP server fetches from `https://prototyper-ui.com`. To point it at a local dev server or custom deployment, set the `PROTOTYPER_UI_BASE_URL` environment variable:

```bash
PROTOTYPER_UI_BASE_URL=http://localhost:3333 npx @prototyperai/mcp-server@latest
```

## Workflow
A typical AI-assisted UI generation flow using MCP:
### 1. Discover Components

The AI calls `get_ui_catalog` to receive the full component catalog as JSON Schema. This tells it what components exist, what props they accept, and what events they emit.
### 2. Get the System Prompt

The AI calls `get_ui_prompt` to receive the system prompt. This prompt contains the complete spec format documentation, streaming instructions, and all the rules for generating valid specs.
### 3. Generate the Spec
Using the catalog knowledge and system prompt, the AI generates a streaming spec (JSONL patches) that builds a UI matching the user's request.
### 4. Render with Stream UI

The generated spec is passed to `<Renderer>` with the matching component registry. The UI renders progressively as patches stream in.
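The four steps can be sketched end to end. `callTool` and `generateSpec` below are hypothetical stand-ins (stubbed here) for a real MCP client and LLM call — only the tool names come from this page, and the stub payloads are placeholders:

```typescript
type ToolName = "get_ui_catalog" | "get_ui_prompt";

// Stub: a real client would invoke these tools on the MCP server.
async function callTool(name: ToolName): Promise<string> {
  return name === "get_ui_catalog"
    ? '{"components": {}}' // catalog as JSON Schema (truncated stub)
    : "You emit RFC 6902 patches as JSONL..."; // system prompt (stub)
}

// Stub: a real implementation would call an LLM with the system prompt
// and the user's request, and stream back JSONL patch lines.
async function generateSpec(systemPrompt: string, request: string): Promise<string> {
  return '{"op":"add","path":"/root","value":{"type":"Card"}}';
}

async function buildUi(request: string): Promise<string[]> {
  await callTool("get_ui_catalog");                  // 1. discover components
  const system = await callTool("get_ui_prompt");    // 2. get the system prompt
  const jsonl = await generateSpec(system, request); // 3. generate the spec (JSONL)
  return jsonl.split("\n");                          // 4. stream each patch to <Renderer>
}

buildUi("a card with a save button").then((patches) => console.log(patches.length));
```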
## Resources
The MCP server also exposes resources that clients can read directly:
| Resource | URI | Description |
|---|---|---|
| Design tokens | prototyper://tokens/css | Full CSS design tokens file |
| Component index | prototyper://docs/index | Index of all components and documentation pages |
| Stream UI catalog | prototyper://stream-ui/catalog | Component catalog as JSON Schema |
| Component docs | prototyper://docs/components/{name} | Per-component documentation (dynamic) |
## Prompts
The server includes two built-in prompt templates:
| Prompt | Description |
|---|---|
| `build_ui` | Generate a UI component or page using Prototyper UI components |
| `review_usage` | Review code for correct Prototyper UI component usage |
These can be invoked directly by MCP-compatible clients that support the prompts capability.
## Direct API Access
If you are not using MCP, you can access the same data through HTTP endpoints:
| Endpoint | Content |
|---|---|
| `/stream-ui/catalog.json` | Component catalog as JSON Schema |
| `/stream-ui/prompt.txt` | Pre-built system prompt |
| `/llms.txt` | Component index for LLMs |
| `/llms-full.txt` | Full documentation for LLMs |
| `/llms/{component}` | Per-component documentation |
| `/prototyper-tokens.css` | Extractable CSS design tokens |
These endpoints return the same data that the MCP tools serve, making it straightforward to integrate with any AI pipeline.
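As a sketch of targeting these endpoints from your own code, the helper below resolves them against a configurable base URL, mirroring the server's `PROTOTYPER_UI_BASE_URL` override. The helper itself is illustrative, not a Prototyper API:

```typescript
// Default base matches the MCP server's default; pass a different base to
// target a local dev server (the server reads PROTOTYPER_UI_BASE_URL for this).
const DEFAULT_BASE = "https://prototyper-ui.com";

const ENDPOINTS = {
  catalog: "/stream-ui/catalog.json",
  prompt: "/stream-ui/prompt.txt",
  llmsIndex: "/llms.txt",
} as const;

function endpointUrl(name: keyof typeof ENDPOINTS, base: string = DEFAULT_BASE): string {
  return new URL(ENDPOINTS[name], base).toString();
}

console.log(endpointUrl("prompt")); // https://prototyper-ui.com/stream-ui/prompt.txt
console.log(endpointUrl("catalog", "http://localhost:3333")); // http://localhost:3333/stream-ui/catalog.json
```

The resolved URL can then be fetched directly, e.g. to feed the prompt text into any LLM pipeline without an MCP client.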