Design Review Framework

What is a design review?

A design review is a structured critique of UI work against a defined set of criteria — before it ships. Unlike a casual feedback round, a review framework gives the team a consistent, repeatable process for catching visual inconsistencies, accessibility failures, unclear copy, broken flows, and missing edge cases.

Use it when: you're reviewing high-stakes screens (pricing, checkout, onboarding, sign-up), releasing a significant UI change, or handing off work from design to engineering.

A design review is not a stakeholder sign-off meeting. It's a quality check run by design, with engineering and QA input, to ensure the work is ready for sign-off.

Review framework template

Work through each category. Score each item: ✅ Green (pass), ⚠️ Yellow (note + fix before ship), ❌ Red (blocker — must fix before handoff).
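If you track reviews in a script or spreadsheet export, the three-level scheme maps to a small data model. A minimal sketch (names are illustrative, not from any particular tool):

```python
from enum import Enum

class Score(Enum):
    """The three review scores from the framework."""
    GREEN = "pass"
    YELLOW = "note + fix before ship"
    RED = "blocker"

def blocks_handoff(scores):
    """A single Red score fails the review and blocks handoff."""
    return any(s is Score.RED for s in scores)
```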

1. Visual consistency

  • Colours match design system tokens
  • Typography uses system type scale
  • Spacing uses defined grid/units
  • Icons are from the approved icon set
  • Component states (hover, focus, disabled) are all designed
  • No one-off styles invented for this screen
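The token check can be partially automated when your design system exports token values. A sketch, assuming a hypothetical palette (token names and hex values are invented for illustration):

```python
# Hypothetical design-token palette; real token names and values vary by system.
TOKENS = {
    "color.primary": "#0052CC",
    "color.surface": "#FFFFFF",
    "color.text": "#172B4D",
}

def off_token_colours(used_colours):
    """Return colours used on the screen that match no design-system token."""
    token_values = {v.upper() for v in TOKENS.values()}
    return [c for c in used_colours if c.upper() not in token_values]
```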

2. Clarity — copy, hierarchy, and CTA

  • Primary CTA is unambiguous (one clear action per screen)
  • Button and link labels say what they do (not "click here")
  • Headings establish clear hierarchy
  • Error messages are specific: what went wrong + how to fix
  • Microcopy (labels, hints, confirmations) is plain language
  • No jargon or internal language visible to users
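Label checks like "not 'click here'" lend themselves to a simple lint pass. A sketch with an invented deny-list; a real content style guide will define its own:

```python
# Hypothetical deny-list of vague labels; extend per your content style guide.
VAGUE_LABELS = {"click here", "learn more", "submit", "here", "go"}

def lint_label(label):
    """Flag button or link labels that don't say what they do."""
    issues = []
    normalised = " ".join(label.lower().split())
    if normalised in VAGUE_LABELS:
        issues.append(f"vague label {label!r}: say what the action does")
    return issues
```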

3. Accessibility

  • Colour contrast meets WCAG AA (4.5:1 text, 3:1 UI components)
  • All interactive elements have a visible focus state
  • Touch targets ≥ 44×44px on mobile
  • Images have alt text defined
  • Form inputs have visible labels (not placeholder-only)
  • Reading order makes sense without visual layout
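The contrast criterion is fully mechanical: WCAG 2.x defines relative luminance and contrast ratio precisely, so it can be checked in code rather than by eye. A sketch of the standard formula:

```python
def _linearise(channel):
    """Convert one sRGB channel (0-255) to linear light per WCAG 2.x."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two sRGB colours, in the range 1-21."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

def passes_aa(fg, bg, large_text=False):
    """4.5:1 for normal text; 3:1 for large text and UI components."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)
```

Black on white yields the maximum ratio of 21:1; a mid-grey like rgb(150, 150, 150) on white fails AA for normal text.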

4. User flows

  • Happy path is complete: start → success
  • Entry points are designed (where do users come from?)
  • Exit points are handled (cancel, back, close)
  • Confirmation and success states are designed
  • Recovery paths from errors are designed
  • No dead ends in the flow
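The "no dead ends" check is a reachability question: from every screen in the flow, can the user still reach the success state? A sketch over a hypothetical flow graph (screen names are invented for illustration):

```python
from collections import deque

# Hypothetical flow graph: screen -> screens reachable from it.
FLOW = {
    "start": ["plan_select"],
    "plan_select": ["checkout", "start"],
    "checkout": ["success", "error"],
    "error": ["checkout"],  # recovery path back into the flow
    "success": [],          # terminal state: the goal
}

def dead_ends(flow, goal="success"):
    """Return screens from which the goal state is unreachable."""
    stuck = []
    for screen in flow:
        seen, queue, reached = {screen}, deque([screen]), False
        while queue:
            node = queue.popleft()
            if node == goal:
                reached = True
                break
            for nxt in flow.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        if not reached:
            stuck.append(screen)
    return stuck
```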

5. Edge cases

  • Empty state is designed (no data, new user, first visit)
  • Loading state is designed (skeleton, spinner, or progressive)
  • Error / 404 state is designed
  • Long content handled (long names, overflow, truncation)
  • Short content handled (one item in a list, one-character names)
  • Mobile viewport tested (320px minimum)
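"Long content handled" usually means a truncation rule the design specifies rather than leaves to the browser. A sketch of middle-ellipsis truncation on a character budget (a real implementation would measure rendered width, not characters):

```python
def truncate_middle(text, max_chars):
    """Keep the start and end of a long name, eliding the middle."""
    if len(text) <= max_chars:
        return text
    keep = max_chars - 1          # reserve one slot for the ellipsis
    head = (keep + 1) // 2
    tail = keep - head
    return text[:head] + "…" + (text[len(text) - tail:] if tail else "")
```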

Scoring guide

  • All green: ready for engineering handoff.
  • Yellow items: document, assign, and fix before end of sprint. Non-blocking but tracked.
  • Red items: design review fails. Must be resolved and re-reviewed before handoff.
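The scoring guide maps directly to a decision rule. A minimal sketch:

```python
def review_verdict(item_scores):
    """Map a scored checklist to the guide's three outcomes.

    item_scores: dict of check name -> "green" | "yellow" | "red".
    """
    values = set(item_scores.values())
    if "red" in values:
        return "fail: resolve red items, then re-review before handoff"
    if "yellow" in values:
        return "pass with follow-ups: fix yellow items before end of sprint"
    return "ready for engineering handoff"
```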

Output: priority list

After scoring, produce a short priority list:

Fix first (Red — blocks handoff):

  1. [Issue] → [Owner] → [Resolution]

Fix before launch (Yellow — tracked):

  1. [Issue] → [Owner] → [Resolution]

Nice to have (Green — improvements for next iteration):

  1. [Issue] → [Note]
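Producing the priority list from scored issues is a straightforward grouping step. A sketch (issue tuples and bucket labels are illustrative):

```python
HEADINGS = {
    "red": "Fix first (Red)",
    "yellow": "Fix before launch (Yellow)",
    "green": "Nice to have (Green)",
}

def priority_list(issues):
    """Render scored issues as the review's priority-list output.

    issues: list of (score, issue, owner, resolution) tuples.
    """
    lines = []
    for score in ("red", "yellow", "green"):
        group = [i for i in issues if i[0] == score]
        if not group:
            continue
        lines.append(f"{HEADINGS[score]}:")
        for n, (_, issue, owner, resolution) in enumerate(group, 1):
            lines.append(f"  {n}. {issue} -> {owner} -> {resolution}")
    return "\n".join(lines)
```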

Why design review frameworks matter

  • Catch issues early: finding an accessibility failure in a review is 10× cheaper than finding it in production or a compliance audit.
  • Standardise quality: without a framework, review quality depends on who's in the room and what they happen to notice.
  • Create shared ownership: when design, engineering, and QA review against the same criteria, "who owns quality" is no longer ambiguous.
  • Protect the design system: the framework surfaces one-off styles before they become technical debt.
  • Speed up handoff: engineering trusts the designs more when they've been through a documented review process.

Example: pricing page redesign

The design team reviews the new pricing page before handoff.

  • Visual consistency: ⚠️ Yellow — the "Most popular" badge uses a colour not in the design system; update it to the correct token.
  • Clarity: ❌ Red — the primary CTA reads "Get started" on all three plans, so it's unclear which plan users are choosing. Revised to "Start [Plan Name]".
  • Accessibility: ❌ Red — the savings callout ("Save 40%") fails colour contrast at 2.8:1. Updated to dark-on-light.
  • Flows: ✅ Green.
  • Edge cases: ⚠️ Yellow — no loading state for the "Compare all features" expand; added to backlog.

Outcome: both red blockers resolved before handoff; yellow items logged for next sprint.

Common pitfalls

  • Running the review too late: reviewing after engineering has started building makes red-flag fixes expensive. → Do this instead: run the review at the end of design, before handoff begins.
  • Skipping edge cases: happy-path-only reviews leave empty states and error flows for engineers to invent. → Do this instead: always include section 5 of the framework; edge cases are design decisions.
  • Informal feedback as a review: "everyone liked it" is not a design review. → Do this instead: use the framework checklist, document scores, and produce a written priority list.
  • One person reviewing their own work: the same person who designed it will miss the same things they missed while designing. → Do this instead: at minimum, have one other designer and one engineer review against the framework.
  • No follow-up on yellow items: yellow issues get forgotten once the design ships. → Do this instead: log yellow items in your task tracker; assign owners; review at sprint retro.
Design review vs adjacent practices

  • Design review vs design critique: a critique is developmental (improving ideas in progress); a design review is evaluative (checking finished work against standards before it ships).
  • Design review vs QA: QA checks whether the built product matches the spec; a design review checks whether the spec is correct before building starts.
  • Design review vs usability testing: testing reveals whether users can succeed with the design; a review checks whether the design meets quality standards. Both are necessary.

Related concepts

  • Design system – the source of truth for visual consistency checks in section 1.
  • Accessibility – section 3 of the framework applies WCAG criteria.
  • Usability testing – complements the review; testing validates with real users after the review confirms quality.
  • QA guardrails – automated checks that catch issues after handoff; the design review catches them before.
  • Design brief – the brief defines what the review should evaluate against (scope + success signal + constraints).
  • Prototype – what the review is usually conducted on before implementation.

Next step

Run the framework on your next design before it goes to handoff. Start with sections 2 (Clarity) and 3 (Accessibility) — these catch the highest-impact issues fastest. Track every red and yellow item in your task manager before calling the review complete.

For high-stakes screens (pricing, checkout, onboarding), consider a structured screen review with your full team. The Unicorn Club newsletter covers practical design quality patterns in depth — subscribe to get the next issue.