In active development

Speak it. See it. Ship it.

SurfaceAI is an open architecture that turns natural language into live, interactive presentations — no code, no build step, no deploy pipeline.

you: "Show me a revenue dashboard with quarterly trends"

// SurfaceAI translates intent to IR...
Layout resolved — split panel with metrics
Chart bound to live data
Interactions wired — filters, drill-down
Persistence scoped — shared across sessions

surface: Dashboard is live and interactive in seconds.

The Problem

Software has too many steps

Today, turning an idea into something people can see and use requires a long, fragile pipeline.

Intent → Code → Compile → Package → Deploy → Browser

SurfaceAI collapses this into something fundamentally simpler:

Conversation → Stream → Runtime

Core Concepts

What we're building

A minimal, complete instruction set that AI can speak natively to create anything a presentation needs.

AI-Native IR

A compact intermediate representation designed specifically for AI to produce. Not generated code — a direct stream of typed instructions that the runtime understands immediately.
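
To make that concrete, a stream of typed instructions might look roughly like the TypeScript sketch below. The operation names and fields are illustrative placeholders, not drawn from the specification.

// Illustrative only: hypothetical operation names and fields,
// not the actual SurfaceAI instruction set.
type Instruction =
  | { op: "panel"; id: string; layout: "split" | "stack" }   // layout container
  | { op: "chart"; id: string; bind: string }                // visual bound to a data source
  | { op: "set"; key: string; value: unknown };              // write to the state table

// The model emits instructions one at a time; the runtime applies each
// directly, with no source code or compile step in between.
const stream: Instruction[] = [
  { op: "panel", id: "root", layout: "split" },
  { op: "chart", id: "revenue", bind: "metrics.quarterly" },
  { op: "set", key: "filter.range", value: "2024" },
];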

37 Operations. That's It.

Six primitives and thirty-one macros cover every layout, content, data, input, navigation, and transition pattern. A complete visual language in a single byte per instruction.
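
Assuming "a single byte per instruction" refers to a one-byte opcode identifier, the opcode space could be pictured as in the sketch below. The names and numeric values are placeholders, not the real assignments.

// Placeholder names and opcode values; the specification defines the real ones.
enum Op {
  // primitives (the spec defines six)
  Node = 0x01, Text = 0x02, Bind = 0x03, State = 0x04, Event = 0x05, End = 0x06,
  // macros (the spec defines thirty-one); a few hypothetical examples:
  SplitPanel = 0x10, Chart = 0x11, Table = 0x12, Form = 0x13,
}

// One opcode byte identifies the instruction; operands, if any, follow it.
function encode(op: Op, operands: Uint8Array = new Uint8Array(0)): Uint8Array {
  const out = new Uint8Array(1 + operands.length);
  out[0] = op;
  out.set(operands, 1);
  return out;
}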

Streaming Rendering

Content appears as it's generated. The runtime renders incrementally — no waiting for a full response. The moment the AI thinks it, you see it.
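
As a rough sketch of the idea, not the runtime's actual API, incremental rendering is a loop over an asynchronous stream of instructions that applies each one the moment it arrives:

// The Instruction shape and function names here are stand-ins.
type Instruction = { op: string; args?: unknown[] };

async function renderIncrementally(
  stream: AsyncIterable<Instruction>,   // instructions arrive as they are generated
  apply: (ins: Instruction) => void,    // the runtime's render step
): Promise<void> {
  for await (const ins of stream) {
    apply(ins); // no buffering: each instruction updates the surface immediately
  }
}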

Reactive State

A built-in state table with observable bindings. User interactions, computed values, and data flows are first-class concepts — not afterthoughts bolted onto a UI framework.
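
A minimal sketch of the concept, assuming a key-value state table with per-key observers; the real runtime's types and methods will differ:

// Sketch only: a state table whose entries can be observed, so bound
// elements update whenever a value changes.
type Listener = (value: unknown) => void;

class StateTable {
  private values = new Map<string, unknown>();
  private listeners = new Map<string, Set<Listener>>();

  set(key: string, value: unknown): void {
    this.values.set(key, value);
    this.listeners.get(key)?.forEach((fn) => fn(value)); // notify every binding
  }

  observe(key: string, fn: Listener): void {
    if (!this.listeners.has(key)) this.listeners.set(key, new Set());
    this.listeners.get(key)!.add(fn);
  }
}

// A chart bound to "metrics.quarterly" redraws on every update:
const state = new StateTable();
state.observe("metrics.quarterly", (v) => console.log("redraw chart with", v));
state.set("metrics.quarterly", [120, 135, 150, 162]);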

Scoped Persistence

Every piece of state declares how long it lives and who can see it — from session-only locals to permanent shared records. No separate database or API layer required.
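
Sketched in TypeScript with made-up scope names, the idea is that every value carries its lifetime and visibility with it:

// Hypothetical scope names; the specification defines the real ones.
type Lifetime = "session" | "persistent";
type Visibility = "private" | "shared";

interface ScopedValue<T> {
  key: string;
  value: T;
  lifetime: Lifetime;     // discarded when the session ends, or kept indefinitely
  visibility: Visibility; // visible only to this user, or to everyone with access
}

// A session-only local next to a permanent shared record:
const draftFilter: ScopedValue<string> = {
  key: "filter.range", value: "2024", lifetime: "session", visibility: "private",
};
const teamDashboard: ScopedValue<object> = {
  key: "dashboard.layout", value: {}, lifetime: "persistent", visibility: "shared",
};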

Open Architecture

An open specification anyone can implement. Runtimes can be built for the browser, native platforms, embedded systems — anywhere a surface can appear.
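
As an illustration of what "open" means in practice, an independent runtime would implement a small contract against the specification. The interface below is hypothetical:

// Hypothetical contract; the open specification defines the real one.
interface SurfaceRuntime {
  apply(instruction: Uint8Array): void;   // consume one encoded instruction
  onEvent(handler: (event: { name: string; payload: unknown }) => void): void;
}

A browser runtime, a native shell, and an embedded display would each implement the same contract against their own rendering layer.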


Principles

What we believe

The design decisions behind SurfaceAI come from a few strong convictions.

01

Presentations are a solved domain

Everything a human needs to show, interact with, or persist fits within a small, finite set of patterns. We don't need infinite flexibility — we need the right constraints.

02

AI should speak directly to runtimes

Generating source code so a compiler can turn it back into instructions is unnecessary indirection. Let the model produce the instructions directly.

03

Latency is a design choice

Streaming IR means the first pixel appears in milliseconds, not after a full round-trip. Progressive rendering turns generation time into a feature.

04

State is not someone else's problem

Reactivity, data binding, and persistence belong inside the runtime — not delegated to external frameworks, databases, or duct-tape integrations.

05

Stable means forever

Once an opcode is assigned, it never changes. Content created today will render correctly on any compliant runtime, indefinitely.


Early Access

Be part of what's next

We're building in the open. Join the list to get updates on the specification, early runtime access, and a seat at the table as the architecture evolves.


No spam, ever. We'll only email about meaningful milestones.