Prototyper UI

MCP Integration

Connect AI assistants to Stream UI via Model Context Protocol

The Model Context Protocol (MCP) is an open standard that lets AI assistants connect to external tools and data sources. The Prototyper UI MCP server exposes Stream UI's component catalog and prompt generation as tools that any MCP-compatible client (Claude, Cursor, etc.) can use.

Available Tools

The MCP server provides 7 tools. Two are specific to Stream UI:

get_ui_catalog

Returns the Stream UI component catalog as JSON Schema. This includes all available components with their props, events, slots, and descriptions.

The output is the same as calling catalog.jsonSchema() — a machine-readable schema that AI models can use to understand what components are available and how to construct valid specs.
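
For illustration, the returned schema is shaped roughly like this (the Button component and its props here are hypothetical examples, not the actual catalog contents):

```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "definitions": {
    "Button": {
      "type": "object",
      "properties": {
        "type": { "const": "Button" },
        "label": { "type": "string", "description": "Visible button text" },
        "onClick": { "description": "Action to run when clicked" }
      },
      "required": ["type"]
    }
  }
}
```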

get_ui_prompt

Returns the pre-built system prompt generated by buildSystemPrompt(catalog). This is a complete instruction set that teaches the AI model:

  • The JSONL streaming output format (RFC 6902 JSON Patch)
  • The flat spec structure
  • All dynamic expressions ($state, $bindState, $cond, etc.)
  • Every available component with prop types and examples
  • Built-in and custom actions
  • Rules for valid spec generation
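
As a sketch of the streaming format the prompt teaches, a generated spec might begin with JSONL lines like the following (the node ids, paths, and component names are illustrative, not taken from the actual prompt):

```jsonl
{"op":"add","path":"/root","value":{"type":"Stack","children":["title"]}}
{"op":"add","path":"/title","value":{"type":"Heading","text":"Hello"}}
{"op":"replace","path":"/title/text","value":"Hello, world"}
```

Each line is one RFC 6902 JSON Patch operation against the flat spec, so the UI can be updated incrementally as lines arrive.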

Other Tools

The server also includes general Prototyper UI tools:

| Tool | Description |
| --- | --- |
| list_components | List all available Prototyper UI components by category |
| get_component | Get full docs, source code, and examples for components (batch up to 5) |
| get_theme | Get CSS design tokens (OKLCH colors, surfaces, shadows, easings) |
| search_docs | Full-text search across all documentation |
| get_install_command | Get the CLI install command for components |

Setup

For Claude Code, add the server from the command line:

claude mcp add prototyper-ui -- npx -y @prototyperai/mcp-server@latest

For Claude Desktop, add it to your claude_desktop_config.json:

{
  "mcpServers": {
    "prototyper-ui": {
      "command": "npx",
      "args": ["-y", "@prototyperai/mcp-server@latest"]
    }
  }
}

Custom Base URL

By default, the MCP server fetches from https://prototyper-ui.com. To point it at a local dev server or custom deployment, set the PROTOTYPER_UI_BASE_URL environment variable:

PROTOTYPER_UI_BASE_URL=http://localhost:3333 npx @prototyperai/mcp-server@latest
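
When configuring via claude_desktop_config.json, the same variable can be set with the standard MCP env field:

```json
{
  "mcpServers": {
    "prototyper-ui": {
      "command": "npx",
      "args": ["-y", "@prototyperai/mcp-server@latest"],
      "env": { "PROTOTYPER_UI_BASE_URL": "http://localhost:3333" }
    }
  }
}
```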

Workflow

A typical AI-assisted UI generation flow using MCP:

Discover Components

The AI calls get_ui_catalog to receive the full component catalog as JSON Schema. This tells it what components exist, what props they accept, and what events they emit.

Get the System Prompt

The AI calls get_ui_prompt to receive the system prompt. This prompt contains the complete spec format documentation, streaming instructions, and all the rules for generating valid specs.

Generate the Spec

Using the catalog knowledge and system prompt, the AI generates a streaming spec (JSONL patches) that builds a UI matching the user's request.
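
To sketch what a consumer does with that stream, here is a minimal applier that folds add/replace/remove patch lines into a flat spec object. This is a simplified illustration, not the library's API — the real Stream UI renderer handles patch application internally:

```typescript
type Spec = Record<string, unknown>;

interface PatchOp {
  op: "add" | "replace" | "remove";
  path: string; // e.g. "/title" for a node, "/title/text" for a prop
  value?: unknown;
}

// Apply one RFC 6902-style patch line to a flat spec keyed by node id.
// Simplified: supports top-level nodes and one level of prop paths only.
function applyPatch(spec: Spec, patch: PatchOp): Spec {
  const [id, prop] = patch.path.replace(/^\//, "").split("/");
  const next: Spec = { ...spec };
  if (patch.op === "remove" && prop === undefined) {
    delete next[id];
  } else if (prop === undefined) {
    next[id] = patch.value;
  } else {
    next[id] = { ...(next[id] as object), [prop]: patch.value };
  }
  return next;
}

// Fold a JSONL stream of patches into a spec object.
function applyStream(jsonl: string): Spec {
  return jsonl
    .trim()
    .split("\n")
    .map((line) => JSON.parse(line) as PatchOp)
    .reduce(applyPatch, {});
}
```

Because each line is applied independently, a renderer can update the UI after every line rather than waiting for the full response.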

Render with Stream UI

The generated spec is passed to <Renderer> with the matching component registry. The UI renders progressively as patches stream in.

Resources

The MCP server also exposes resources that clients can read directly:

| Resource | URI | Description |
| --- | --- | --- |
| Design tokens | prototyper://tokens/css | Full CSS design tokens file |
| Component index | prototyper://docs/index | Index of all components and documentation pages |
| Stream UI catalog | prototyper://stream-ui/catalog | Component catalog as JSON Schema |
| Component docs | prototyper://docs/components/{name} | Per-component documentation (dynamic) |

Prompts

The server includes two built-in prompt templates:

| Prompt | Description |
| --- | --- |
| build_ui | Generate a UI component or page using Prototyper UI components |
| review_usage | Review code for correct Prototyper UI component usage |

These can be invoked directly by MCP-compatible clients that support the prompts capability.

Direct API Access

If you are not using MCP, you can access the same data through HTTP endpoints:

| Endpoint | Content |
| --- | --- |
| /stream-ui/catalog.json | Component catalog as JSON Schema |
| /stream-ui/prompt.txt | Pre-built system prompt |
| /llms.txt | Component index for LLMs |
| /llms-full.txt | Full documentation for LLMs |
| /llms/{component} | Per-component documentation |
| /prototyper-tokens.css | Extractable CSS design tokens |

These endpoints return the same data that the MCP tools serve, making it straightforward to integrate with any AI pipeline.
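
For example, a small helper can resolve these routes against a configurable base URL, mirroring the PROTOTYPER_UI_BASE_URL override used by the MCP server (the helper name is ours, not part of the package):

```typescript
// Resolve a documented endpoint path against a base URL.
// Defaults to the public deployment; pass a base to target a dev server.
function endpointUrl(path: string, base = "https://prototyper-ui.com"): string {
  return new URL(path, base).toString();
}

// e.g. point the same paths at a local dev server:
endpointUrl("/stream-ui/catalog.json", "http://localhost:3333");
```

The resulting URLs can then be fetched with any HTTP client — for instance, fetching /stream-ui/prompt.txt to seed a system prompt in a custom pipeline.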
