Living Draft



Living Draft is an autonomous pipeline that generates production-ready websites from physical environments captured through smart glasses — no templates, no manual decisions. Capture, sketch your intent, publish. Built for the Figma Makeathon 2026. The design challenge was orchestrating a system where every stage compounds the uncertainty of the last — and still produces something coherent and intentional on the other end.

Role — Solo Designer & Developer · Figma Make · Claude by Anthropic · Autonomous AI Pipeline · Spatial Computing · Generative UI





The Problem
Translating physical space into digital presence requires a chain of interpretive decisions — what matters, what to emphasize, how to express it visually. Today that chain requires a human at every link. Living Draft compresses the whole chain into a single step.

The Pipeline
Capture → environmental analysis → content extraction → layout generation → live artifact. Each stage runs autonomously. The system doesn't ask for confirmation — it commits to interpretations and produces something you can publish.
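The stage-chaining behavior described above — each stage committing to one interpretation and passing its artifact forward with no confirmation step — can be sketched as a minimal pipeline runner. This is an illustrative sketch, not the production code: the stage functions, field names, and confidence heuristic here are all hypothetical stand-ins for the real Claude-driven stages.

```python
from typing import Callable

# Hypothetical stages; each one commits to a single interpretation
# of its input and returns an enriched artifact for the next stage.

def analyze_environment(capture: dict) -> dict:
    # Commit to the highest-confidence scene reading; never ask the user.
    scene = max(capture["scene_candidates"], key=lambda c: c["confidence"])
    return {**capture, "scene": scene["label"]}

def extract_content(artifact: dict) -> dict:
    # Stand-in for content extraction: derive copy from the committed scene.
    return {**artifact, "copy": f"A site about {artifact['scene']}"}

def generate_layout(artifact: dict) -> dict:
    # Stand-in layout decision, made from the artifact alone.
    layout = "single-page" if len(artifact["copy"]) < 80 else "multi-page"
    return {**artifact, "layout": layout}

STAGES: list[Callable[[dict], dict]] = [
    analyze_environment,
    extract_content,
    generate_layout,
]

def run_pipeline(capture: dict) -> dict:
    """Run every stage autonomously; there is no confirmation loop."""
    artifact = capture
    for stage in STAGES:
        artifact = stage(artifact)
    return artifact

if __name__ == "__main__":
    capture = {"scene_candidates": [
        {"label": "coffee shop", "confidence": 0.8},
        {"label": "bookstore", "confidence": 0.6},
    ]}
    site = run_pipeline(capture)
    print(site["scene"], site["layout"])  # the committed interpretation
```

The design point is in `run_pipeline`: uncertainty from one stage is not surfaced for review, it is baked into the artifact the next stage consumes — which is why each stage has to commit rather than hedge.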



The Output
A live, navigable website generated entirely from a glasses capture. No templates selected. No copy written. No layout decisions made by hand.

↓ Live demo — scroll + click to explore




Why It Matters
This is a proof of concept for a broader design principle: the most powerful AI tools don't assist the creative process — they compress it. Living Draft demonstrates what happens when the entire loop from perception to artifact is automated, and what new design decisions emerge when your job is to orchestrate that loop rather than execute it.


— Artifacts —




Want to build on this?

Duplicate it to your Figma drafts and make it your own.

Duplicate in Figma →

BUILT WITH

Ray-Ban Meta Smart Glasses · Claude by Anthropic · Figma Make · Playwright · fswatch · iCloud Drive · ntfy.sh
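One plausible reading of this stack — an assumption about the wiring, not a documented architecture — is that the glasses sync captures into an iCloud Drive folder, `fswatch` reports new files as they land, and `ntfy.sh` pushes a notification once a site is live. A sketch under those assumptions; the folder path, topic name, and file-extension filter are all made up for illustration:

```python
import subprocess
import urllib.request
from pathlib import Path

# Assumed sync folder and notification topic — both hypothetical.
WATCH_DIR = Path.home() / "Library/Mobile Documents/com~apple~CloudDocs/Captures"
NTFY_TOPIC = "living-draft"

def is_capture(path: str) -> bool:
    """Keep only media files (assumed extension set); skip sync artifacts."""
    return Path(path).suffix.lower() in {".jpg", ".heic", ".mp4"}

def notify(message: str) -> None:
    # ntfy.sh publishes via a plain HTTP POST to https://ntfy.sh/<topic>.
    req = urllib.request.Request(f"https://ntfy.sh/{NTFY_TOPIC}", data=message.encode())
    urllib.request.urlopen(req)

def watch() -> None:
    # fswatch -0 emits NUL-separated paths as the synced folder changes.
    proc = subprocess.Popen(["fswatch", "-0", str(WATCH_DIR)], stdout=subprocess.PIPE)
    buffer = b""
    while True:
        chunk = proc.stdout.read(4096)
        if not chunk:
            break
        buffer += chunk
        *events, buffer = buffer.split(b"\0")
        for raw in events:
            path = raw.decode()
            if is_capture(path):
                # Hand the new capture to the pipeline here, then notify.
                notify(f"Living Draft: new capture {Path(path).name}")
```

The NUL separator (`fswatch -0`) matters because iCloud paths routinely contain spaces; splitting on anything else would fragment them.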