Make Kits Shipped. Sora Collapsed. The Lesson Is the Same.
In the same week in late March 2026, two things happened in AI-generated design. Figma shipped Make Kits, a feature that constrains its AI prototyping tool to production design system components. And OpenAI shut down Sora, its video generation product, after a collapse so fast it should be studied in business schools.
One product built governance into the generation process. The other did not. The outcomes diverged exactly as you would predict.
What Make Kits Actually Do
Figma Make, the company’s AI prototyping tool, had a problem common to every AI generation product: unconstrained output. Ask it to build a login screen and it would produce something that looked plausible but matched nothing in your design system. Custom buttons. Invented spacing. Typography that existed nowhere in your token library. The prototype looked like a product. It was not your product.
Make Kits solve this by letting design system teams package their production components, styles, and guidelines into reusable kits that Figma Make consumes at generation time. When a designer prompts Make to build a screen, the AI starts from real components, not imagined ones. The button is your button. The spacing follows your tokens. The layout respects your grid.
Ben Smit, the PM behind the feature (previously at Slack), and Darragh Burke, the engineer (previously at Tinder and Microsoft), described the core value in their announcement: teams can “generate in parallel without drifting from the design system.” Engineers “recognize the components and patterns immediately,” reducing the “Is this custom?” questions that slow every handoff.
Make Attachments, shipped alongside kits, let designers attach real project files (PDFs, markdown, datasets) to prompts. Compliance requirements, brand guidelines, legal constraints: these become part of the generation context, not something checked after the fact.
The principle is simple. Constrain the inputs, govern the outputs.
What Sora’s Collapse Actually Shows
Now consider the opposite case.
Sora peaked at 6 million monthly downloads in November 2025. One month later, that number fell to roughly 1.5 million. A 75% drop. By March 2026, OpenAI announced the shutdown.
The financials were brutal. Daily operating costs reached an estimated $15 million. A $1 billion deal with Disney died. Jensen Huang, in the same week he declared “AGI achieved,” offered this assessment of AI-generated video quality: “I don’t love slop myself.”
Brian Merchant, writing in Blood in the Machine, argued that Sora failed because AI-generated imagery is “fundamentally unsettling to users.” The aesthetic failure was not a bug to be patched. It was structural. Without constraints on what the model could produce, every output carried the uncanny signature of ungoverned generation.
The parallel to code is exact. Ungoverned code generation produces technical debt. Ungoverned design generation produces slop. Ungoverned video generation produces content that repels the audience it was supposed to attract. The failure mode is the same across domains: generation without constraints degrades quality until users leave.
The Governance Mechanism Is the Product
This is the point that separates what Figma built from what OpenAI shipped.
Make Kits function as a governance mechanism packaged as a product feature. The kit defines what the AI can use. Components not in the kit do not exist for the model. Tokens not in the library cannot be applied. The constraint is structural, not advisory.
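The "components not in the kit do not exist for the model" rule is, structurally, an allowlist. A minimal sketch of that idea in Python; the component names, token names, and node shape here are hypothetical illustrations, not Figma's actual data model or API:

```python
# Hypothetical sketch: a kit as a structural constraint on generated output.
# Names are illustrative placeholders, not Figma's real component/token IDs.

ALLOWED_COMPONENTS = {"Button/Primary", "Input/Text", "Card/Default"}
ALLOWED_SPACING_TOKENS = {"space-100", "space-200", "space-400"}

def validate_generated_node(node: dict) -> list[str]:
    """Return violations: anything outside the kit is simply not permitted."""
    violations = []
    if node["component"] not in ALLOWED_COMPONENTS:
        violations.append(f"unknown component: {node['component']}")
    for token in node.get("spacing", []):
        if token not in ALLOWED_SPACING_TOKENS:
            violations.append(f"unknown spacing token: {token}")
    return violations
```

The design point is the direction of enforcement: the check runs against the kit definition at generation time, not against a reviewer's memory at handoff time.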
We wrote about this principle when Figma launched its MCP beta: design systems become constraint layers for autonomous software. Make Kits take that principle further. MCP let agents read your component library. Make Kits let teams define exactly which components, with which guidelines, the AI should use for prototyping. The constraint surface got more specific.
Sora had no equivalent mechanism. There was no “production standard” for AI video. No component library of acceptable visual patterns. No token system defining what lighting, motion, or composition should look like for a given brand. Every generation was unconstrained. Every output was a roll of the dice.
The market’s response was unambiguous. Constrained generation (Figma Make with kits) produces output that engineers recognize and teams can build on. Unconstrained generation (Sora) produces output that users download once, find unsettling, and abandon.
Why “Structured Context” Is the Right Frame
Figma’s announcement uses a specific phrase worth adopting: “structured context.” Make Kits and Attachments bring structured context into the generation process. The components are structured. The guidelines are structured. The attached compliance documents are structured. The AI receives context that has been deliberately organized to constrain its output toward production standards.
This is governance vocabulary, whether Figma uses the word or not.
As we argued in our five-level design-first framework, the cheapest time to govern a decision is before it becomes an artifact. Make Kits apply that principle to design generation. The governance happens before the first pixel appears, embedded in the kit definition, not applied after the prototype exists.
And as we explored in the McKinsey critique, design without governance is decoration. Make Kits are the mechanism that prevents AI-generated design from becoming exactly that. They are not the only mechanism. But they are the first one shipped at platform scale by a company that owns the design workflow for millions of teams.
The Pattern Across Domains
Pull the lens back and a pattern emerges across every domain where AI generates artifacts.
In code, the equivalent of Make Kits is the combination of linting rules, architecture constraints, and project conventions that tools like Claude Code and Cursor consume through configuration files. When an AI coding assistant generates code within those constraints, the output is recognizable, reviewable, and mergeable. When it generates without them, you get the anti-patterns that Ox Security found in 80-100% of enterprise AI-generated codebases.
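The configuration-as-constraint idea can be sketched as a small context loader: gather the project's own convention files and hand them to the model as structured context. The file names below are common examples of such files, not a standard consumed by any specific tool:

```python
from pathlib import Path

# Illustrative sketch: assembling "structured context" for a coding agent
# from a project's own convention files. The file names are examples only.
CONSTRAINT_FILES = [".eslintrc.json", "ARCHITECTURE.md", "CONTRIBUTING.md"]

def load_generation_context(project_root: str) -> dict[str, str]:
    """Collect whichever convention files exist; they become the constraint layer."""
    root = Path(project_root)
    return {
        name: (root / name).read_text()
        for name in CONSTRAINT_FILES
        if (root / name).exists()
    }
```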
In content, the equivalent is a style guide loaded into the generation context. A brand voice document. Editorial standards. Without them, AI-generated marketing copy converges toward the same bland, buzzword-heavy mediocrity regardless of which company publishes it.
In video, there was no equivalent. That is why Sora failed.
The pattern: generation quality is a function of constraint quality. Better constraints produce better outputs. Zero constraints produce slop. This holds for code, design, content, and video. It will hold for whatever domain AI generation enters next.
What This Means for Design Operations
Three practical implications follow.
First, design system teams need to think of themselves as governance teams. The kit you publish to Figma Make is a constraint definition. Every component you include or exclude shapes what AI can produce across your entire organization. Component coverage is no longer a design maturity metric. It is a governance surface area metric. If your kit covers 60% of your interface patterns, the AI will improvise the other 40%.
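The 60/40 arithmetic above is plain set coverage. A minimal sketch, with pattern names as placeholders:

```python
def kit_coverage(kit_components: set[str], interface_patterns: set[str]) -> float:
    """Share of interface patterns the kit can express.

    Whatever falls outside this ratio is the surface the AI will improvise.
    """
    if not interface_patterns:
        return 1.0  # nothing to cover
    return len(interface_patterns & kit_components) / len(interface_patterns)
```

Example: a kit covering three of five patterns yields 0.6, which means 40% of generated screens will be improvised rather than governed.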
Second, the Make Attachments feature deserves more attention than it received. Attaching compliance requirements, brand guidelines, and legal constraints directly to generation prompts is governance-as-context in its most literal form. Organizations in regulated industries (finance, healthcare, government) should treat attachment templates as governance artifacts, version-controlled and audited alongside the kit itself.
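Treating attachments as version-controlled, auditable artifacts can start with something as simple as content-hashing them, so any generation can be tied to the exact guideline versions it saw. A hypothetical sketch, not a Figma feature:

```python
import hashlib

def attachment_manifest(attachments: dict[str, bytes]) -> dict[str, str]:
    """Map attachment name -> SHA-256 of its content.

    Stored alongside the kit, this lets an audit confirm which version of a
    compliance document or brand guideline shaped a given generation.
    """
    return {
        name: hashlib.sha256(body).hexdigest()
        for name, body in sorted(attachments.items())
    }
```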
Third, Sora’s collapse is a case study, not a cautionary tale. It is evidence. When your board asks why design governance matters for AI adoption, the answer is a $15-million-per-day product that went from 6 million downloads to shutdown in four months because nobody built the constraint layer. The cost of ungoverned generation is not theoretical. It is quantified.
The Building Materials Changed Again
In our earlier analysis of Figma MCP, we described the trajectory: design systems moved from style guides to component libraries to engineering infrastructure to constraint layers for agents. Make Kits represent the next step. The design system is no longer just a constraint the agent reads. It is a package the platform consumes, with guidelines that shape generation behavior at the system level.
Sora’s shutdown represents the trajectory’s shadow: what happens when a generation product ships without ever building the constraint layer. Not “builds it badly.” Never builds it at all.
Both stories landed in the same week. The lesson is the same lesson we keep finding across every domain where AI generates artifacts. The governance infrastructure is not optional. It is not a phase-two concern. It is the product. Without it, you get slop. With it, you get prototypes that engineers recognize, components that match production, and teams that can generate in parallel without drifting.
Figma understood this. OpenAI, at least with Sora, did not.
This analysis synthesizes Figma’s Make Kits and Attachments announcement (April 2026) and Brian Merchant’s AI’s Aesthetics of Failure (March 2026).
Victorino Group helps enterprises turn design systems into governance infrastructure for AI-generated output. Let’s talk.
All articles on The Thinking Wire are written with the assistance of Anthropic's Opus LLM. Each piece goes through multi-agent research to verify facts and surface contradictions, followed by human review and approval before publication. If you find any inaccurate information or wish to contact our editorial team, please reach out at editorial@victorinollc.com.