Sprint efficiency
~40%
less time on repeatable tasks -- copy, edge case mapping, specs, documentation
Token violations
Zero
reach production -- caught by AI audit at PR review, not discovered post-launch
Dev rebuild cost
Eliminated
Code Connect replaces manual component reconstruction from Figma inspection every sprint
Product coverage
3x
product areas supported by one team via AI-assisted Advisory tier -- no headcount added

On the ~40% figure: this is an estimate based on the known time cost of four tasks that AI now handles -- microcopy drafting (~1.5 hrs/sprint), edge case mapping (~1 hr), handoff spec writing (~1.5 hrs), and documentation (~2 hrs). Actual savings vary by sprint complexity. The zero-violation and eliminated-rebuild claims are structural, not estimated.

Five categories of advantage, each grounded in a specific change to the workflow. These are the frames that resonate outside the design and development team.

Faster delivery
Shorter path from brief to production-ready code
AI handles the translation work between each phase -- brief to copy, design to spec, spec to component scaffold. Developers receive answers to their questions before they ask them. Fewer mid-sprint blockers mean fewer scope slips.
How: AI generates all microcopy and edge cases before Figma opens. Handoff specs are auto-generated. Code Connect eliminates manual component inspection.
Cost efficiency
Same team. More products. No headcount growth.
The Advisory + AI engagement tier makes it viable for this team to support three product areas simultaneously -- Licensing, Affiliate, and Publisher -- at different levels of investment. Previously, meaningful design support required full embedding. Now lightweight AI-assisted contributions are genuinely productive.
How: AI handles the repeatable execution work. Team judgment is reserved for decisions that actually require it. Contractor hours go further.
Fewer defects
Quality gates built into the process, not bolted on at QA
Token compliance is checked at every PR, not discovered during visual regression or post-launch. Accessibility is reviewed at the design phase before a line of code is written. Copy is consistent across all states because it was generated from the same prompt, not written piecemeal across multiple Figma files.
How: DESIGN.md enforces token rules at the repo level. AI a11y audit runs at design phase. Copy generation covers all states upfront.
Risk reduction
Institutional knowledge is documented, not in people's heads
Every architectural decision has an ADR. Every component has a Storybook story and a Confluence page. Every design decision has a rationale that the next team member can read. When a contractor rolls off, the knowledge stays. When a new developer joins, they can understand the system without a weeks-long onboarding.
How: AI drafts ADRs, Storybook stories, Confluence pages, and release notes every sprint. Team reviews and publishes -- does not write from scratch.
Scale without risk
The system gets more capable as it grows -- the investment compounds
Each Palak research summary that goes into the AI context stack makes every future design decision more informed. Each ADR that Manish writes prevents the same architectural debate from happening again. Each component Manish builds correctly in Storybook is one fewer component that gets rebuilt manually by a future developer. The workflow does not just maintain quality at scale -- it improves quality as more artifacts are created.
Concrete trajectory: Sprint 1 -- team learns the workflow. Sprint 4 -- context stack is rich enough that AI output requires minimal rework. Sprint 12 -- the documentation backlog is closed, every component has a Storybook story, every major decision has an ADR. New team members ramp in days, not months.
Before
Research → Brief → Design (copy written here) → Handoff (manual inspection) → Build → Ship
After
Research (AI synthesis) → Brief (AI context) → Design (AI copy + audit) → Prototype (AI-generated) → Handoff (Code Connect) → QA (AI token scan) → Ship
DESIGN.md (AI-assisted)
Toolchain improvements
Handoff
Manual → Code Connect
Before: Devs inspect Figma manually and rebuild MUI components from scratch every sprint.
After: Figma Code Connect maps components to live code. Zero manual translation.
Token compliance
Unchecked → Caught at PR
Before: Off-token hex values enter production undetected, causing visual inconsistency.
After: DESIGN.md + AI audit flags every violation before merge.
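The PR-level audit can be sketched as a simple lint step. This is a minimal illustration only: the token palette, the hex-literal check, and the `find_violations` helper are assumptions for the sketch, not the team's actual DESIGN.md rules or tooling.

```python
import re

# Hypothetical approved token values, as a DESIGN.md might declare them.
ALLOWED_TOKEN_VALUES = {"#1a73e8", "#f5f5f5", "#202124"}

# Matches six-digit hex colour literals in source or style files.
HEX_LITERAL = re.compile(r"#[0-9a-fA-F]{6}\b")

def find_violations(source: str) -> list[str]:
    """Return hard-coded hex colours that are not in the approved token set."""
    return [h for h in HEX_LITERAL.findall(source) if h.lower() not in ALLOWED_TOKEN_VALUES]

# An off-token value sneaks into a style change:
css = ".card { background: #f5f5f5; border-color: #ff00aa; }"
print(find_violations(css))  # ['#ff00aa'] -- flagged before merge, not post-launch
```

A real check of this kind would run in CI on every PR and fail the build whenever the list is non-empty.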
Microcopy
Late → Before Figma opens
Before: Copy is written during or after design review, often incomplete across states.
After: All states generated by AI before the designer touches Figma.
Research
Siloed → Feeds AI context
Before: Research summaries are produced and referenced once, then disconnected from the design brief.
After: Palak's synthesis feeds AI context every sprint -- not just at kickoff.
Documentation
Skipped → Generated every sprint
Before: ADRs, Storybook stories, and Confluence pages are perpetually deferred under time pressure.
After: AI drafts all documentation. Team reviews and publishes.
UX engineering
None → Manish owns the toolchain
Before: No dedicated UXE. Developers rebuild the component library and toolchain ad-hoc.
After: Manish owns Tonic, Storybook, Code Connect, CI, and DESIGN.md.
Kelsey — UX Head
Full-time attention to direction, not execution
AI handles the repeatable work. Kelsey and Joey spend their time on what actually requires senior judgment: product direction, stakeholder relationships, and prioritisation. The workflow does not add overhead to them -- it removes it.
Joey — Design Systems
System design work, not maintenance triage
Manish owns the engineering layer. Joey focuses on component design decisions, Figma library governance, and usage rules -- not on chasing token drift or writing Storybook stories from scratch.
Tejas — UX Lead
Design with full context before Figma opens
Microcopy, edge cases, and a design brief exist before the first frame is placed. Design reviews move faster because the basics are pre-handled. Tejas focuses on layout, hierarchy, and decisions -- not on writing error message copy at 11pm before a review.
Palak — Research
Research that actually shapes design, every sprint
Synthesis docs feed AI context directly rather than being referenced once and forgotten. Palak's research work has a measurable downstream effect on every design decision, not just at project kickoff.
Bansi — UX Designer
Design work, not copy coordination
All states and all copy exist before Bansi opens a frame. Component usage questions are answered by the rules doc and Storybook -- not by pinging Joey or Tejas mid-design.
Manish — UX Engineer
Build the system once, benefit every sprint
Code Connect means every component Manish builds correctly benefits every future sprint. The investment compounds. Developers stop rebuilding from Figma inspection and start pulling from a live, versioned library that Manish owns and maintains.

These are the gains that do not show up in the first sprint but compound significantly over 6--12 months.

Compound 01
Onboarding cost drops to near zero
DESIGN.md, Storybook, Confluence pages, and ADRs exist. A new designer or developer reads the system, understands the decisions behind it, and sees the current state of any product without a weeks-long onboarding or a dedicated handoff session from Tejas or Kelsey.
Compound 02
AI output quality improves as context grows
Sprint 1 AI output needs moderate rework. By Sprint 6, the context stack -- Palak's research summaries, the component rules doc, brand voice guidelines -- is rich enough that AI-generated copy and specs need minimal editing. The investment in building the context stack pays forward indefinitely.
Compound 03
Decisions stop getting relitigated
Every ADR prevents the same architectural debate from happening again. "Why do we use single-scroll instead of tabs?" has a written answer in Confluence. "Why did we choose this token structure?" has an ADR. New stakeholders and developers inherit the reasoning, not just the outcome.
Compound 04
Contractor continuity risk is neutralised
When a contractor rolls off, the knowledge stays in the documentation. The system is not dependent on any individual's memory of why decisions were made. A replacement contractor can become productive within days because the context is written down and AI-accessible.
Compound 05
Scope of the Advisory tier expands naturally
As the context stack grows, the Advisory + AI tier gets more capable. Lightweight contributions to Publisher Portal or new onboarding flows improve in quality without additional embedded time investment. The team's effective reach expands without the headcount growing.
Compound 06
Visual regression becomes automatic
Once Storybook and Chromatic are live, every component has a visual baseline. Regressions are caught automatically on every PR, not during QA by a human reviewer comparing Figma to a staging environment. The QA phase shrinks as the automated surface grows.

The workflow changes. The tools, team structure, and relationships do not.

Figma for UI design
MUI as the component foundation
Sprint ceremonies
Jira for backlog
FullStory for analytics
Confluence for documentation
Palak leads research
Tejas leads Licensing UX
Kelsey owns team direction
Designers make design decisions
Developers own implementation
The ~40% efficiency estimate
Based on four identifiable tasks now handled by AI: microcopy drafting (~1.5 hrs/sprint), edge case mapping (~1 hr), handoff spec writing (~1.5 hrs), and documentation (~2 hrs) -- about 6 hours in total. Assumes a designer spends roughly 15 hours per sprint on repeatable execution work, so these four tasks account for roughly 40% of it. Actual savings vary by sprint complexity and how well the context stack is maintained.
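The arithmetic behind the estimate, using the per-task hours above (the 15-hour baseline is the stated assumption, not a measured figure):

```python
# Per-sprint hours for the four tasks AI now handles, from the estimate above.
ai_handled = {
    "microcopy drafting": 1.5,
    "edge case mapping": 1.0,
    "handoff spec writing": 1.5,
    "documentation": 2.0,
}

baseline_hours = 15  # assumed repeatable-work hours per designer per sprint

saved = sum(ai_handled.values())
efficiency = saved / baseline_hours
print(f"{saved:g} hrs saved / {baseline_hours} hrs -> {efficiency:.0%}")  # 6 hrs saved / 15 hrs -> 40%
```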
The 3x product coverage claim
This team currently supports three product areas at different tier levels. "3x" reflects coverage breadth, not output volume. The Advisory tier is lighter than Embedded by design -- lightweight AI-assisted contributions are genuinely productive for specific work types (onboarding flows, catalog redesigns), not for complex interaction design.
Timeline to full benefit
The context stack (brand voice guidelines, component rules doc, research summaries) takes 2--3 sprints to build. Code Connect requires upfront setup work from Manish before it eliminates handoff overhead. Full benefit is not immediate -- it compounds over 2--3 months as the foundational assets are completed.
What AI does not replace
Visual layout judgment, stakeholder relationships, research moderation, deciding what to build, and the intuition developed from shipping products and watching users struggle with them. AI handles generation and compliance. Humans handle judgment. That distinction is structural to this workflow -- not a workaround.