Ethical Design
What is ethical design?
Ethical design is the practice of considering the impact of your design decisions on people and society, and choosing options that respect dignity, autonomy, and fairness. It includes avoiding dark patterns, being transparent about data and behaviour, protecting privacy, and designing for accessibility and inclusion. It asks “is this the right thing to do?” as well as “does it work?”
Use it when: you’re making choices about persuasion, data, defaults, or inclusion. Ethics isn’t a one-off checklist; it’s part of how you frame problems, run user research, and ship product.
Copy/paste checklist (ethical design in practice)
- [ ] No dark patterns – Don’t trick users into signing up, subscribing, or sharing more than they intend. Clear choices and an easy exit.
- [ ] Transparency – Explain how the product works and how data is used; don’t hide important info in fine print.
- [ ] Meaningful consent – Consent for data and tracking is informed and revocable; no pre-ticked boxes or vague wording.
- [ ] Fair and inclusive – Design for a range of users; consider accessibility, cost, and context. Don’t discriminate or exploit.
- [ ] Responsible defaults – Defaults that protect privacy and well-being where it matters (e.g. privacy settings, notification frequency); see the sketch after this checklist.
- [ ] Own impact – Ask “who could this harm?” and “what would we do if this went wrong?”; be willing to change.
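A minimal sketch of what responsible defaults can look like in code, assuming a hypothetical TypeScript settings model (the type and category names are illustrative, not from any specific framework): new accounts start private, usage-data sharing is off, and only critical notifications are on.

```typescript
// Hypothetical settings model – names and categories are illustrative only.
type NotificationCategory = "security" | "billing" | "marketing" | "social";

interface UserSettings {
  profileVisibility: "private" | "contacts" | "public";
  shareUsageData: boolean;
  notifications: Record<NotificationCategory, boolean>;
}

// Defaults protect privacy and attention: private profile, no usage-data sharing,
// and only critical notifications (security, billing) on by default.
const DEFAULT_SETTINGS: UserSettings = {
  profileVisibility: "private",
  shareUsageData: false,
  notifications: {
    security: true,   // critical: account safety
    billing: true,    // critical: payments and renewals
    marketing: false, // opt-in only, never pre-ticked
    social: false,    // opt-in only
  },
};

// New accounts start from the protective defaults; users opt in explicitly.
function createAccountSettings(overrides: Partial<UserSettings> = {}): UserSettings {
  return { ...DEFAULT_SETTINGS, ...overrides };
}
```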
Why ethical design matters
- Builds trust; deceptive or opaque design erodes it.
- Reduces harm (e.g. addiction, privacy violations, exclusion) and legal/reputational risk.
- Aligns with accessibility and inclusive design so more people are treated fairly.
- Helps teams make defensible choices when “growth” or “engagement” conflicts with user welfare.
What good ethical design includes
Checklist
- [ ] Avoid dark patterns – No disguised ads, fake urgency, hidden costs, or hard-to-cancel flows. See dark patterns.
- [ ] Privacy by design – Collect only what you need; explain use; secure and retain appropriately. Consent where required.
- [ ] Transparent behaviour – Algorithmic or automated decisions that affect users (e.g. ranking, recommendations) are explainable where feasible; users aren’t deceived. See the sketch after this checklist.
- [ ] Inclusive and accessible – Accessibility and inclusion are part of “doing right by users.”
- [ ] Accountable – Someone owns the impact; there’s a way to raise concerns and revisit decisions.
- [ ] Documented – Principles or guidelines (e.g. “we don’t use dark patterns”) so the team can align and review.
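One way to keep automated behaviour transparent is to carry a plain-language reason with every ranking decision so the UI can show it. A sketch under that assumption, with hypothetical types and signals, including disclosing paid placements rather than disguising them:

```typescript
// Hypothetical example: each ranking decision carries a reason the UI can show,
// so users aren't left guessing why they see what they see.
interface RankedItem {
  id: string;
  score: number;
  reason: string; // plain-language explanation surfaced next to the item
}

interface Signals {
  sponsored: boolean;           // paid placement
  recentlyViewed: boolean;      // the user looked at this item before
  boughtBySimilarUsers: boolean;
}

function rankItem(id: string, signals: Signals): RankedItem {
  let score = 0;
  const reasons: string[] = [];

  if (signals.sponsored) {
    score += 3;
    reasons.push("This is a paid placement."); // disclosed, never a disguised ad
  }
  if (signals.recentlyViewed) {
    score += 2;
    reasons.push("You viewed this recently.");
  }
  if (signals.boughtBySimilarUsers) {
    score += 1;
    reasons.push("People with similar purchases bought this.");
  }

  return { id, score, reason: reasons.join(" ") || "Shown in default order." };
}
```

The point is not the scoring itself but that every signal affecting the ranking produces a reason a user can read.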
Common formats
- Principles: Short statements (e.g. “We don’t use dark patterns,” “We are transparent about data”). Use for alignment.
- Review: Ethics or impact considered in design review or go/no-go (e.g. “Could this harm or mislead?”).
- Privacy by design: Data minimisation, purpose limitation, and user control built into features, not bolted on (sketched below).
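A small illustration of data minimisation and purpose limitation, assuming a hypothetical user record: the shipping feature receives only the fields its stated purpose needs, not the whole profile.

```typescript
// Hypothetical user record; the shipping feature never sees most of it.
interface FullUserRecord {
  id: string;
  email: string;
  fullName: string;
  dateOfBirth: string;
  postalAddress: string;
  browsingHistory: string[];
}

// Purpose limitation: shipping a physical order needs a name and an address, nothing more.
type ShippingDetails = Pick<FullUserRecord, "fullName" | "postalAddress">;

function toShippingDetails(user: FullUserRecord): ShippingDetails {
  return { fullName: user.fullName, postalAddress: user.postalAddress };
}
```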
Examples
Example (the realistic one)
- Subscription: Cancel is in the same area as subscribe; no extra steps or phone-only cancellation.
- Consent: Cookie or tracking consent is clear (“We use X for Y”); accept and reject are equally easy; no pre-ticked “marketing” (see the sketch below).
- Defaults: New accounts get stricter privacy defaults; notifications are off by default for anything non-critical.
- Content: You avoid patterns that exploit anxiety or addiction (e.g. infinite scroll with no pause, or “only 1 left!” when it isn’t true).
- You add “Ethical design” to your design standards or team principles and revisit it in retros when something feels off.
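A sketch of what meaningful consent can look like in code, using a hypothetical consent model: nothing is pre-ticked, rejecting is as easy as accepting, and revoking consent is a single action.

```typescript
// Hypothetical consent model: nothing pre-ticked, reject as easy as accept, revocable later.
interface ConsentState {
  analytics: boolean;
  marketing: boolean;
  updatedAt: string; // ISO timestamp, kept so consent decisions are auditable
}

// Before the user chooses anything, everything is off.
function noConsent(): ConsentState {
  return { analytics: false, marketing: false, updatedAt: new Date().toISOString() };
}

// "Accept all" and "Reject all" are single, equal actions.
function acceptAll(): ConsentState {
  return { analytics: true, marketing: true, updatedAt: new Date().toISOString() };
}

function rejectAll(): ConsentState {
  return noConsent();
}

// Revoking consent is one call, not a support ticket or a buried settings page.
function revokeConsent(): ConsentState {
  return rejectAll();
}
```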
Common pitfalls
- “We’re not doing anything illegal.” – Legal isn’t the same as ethical. → Do this instead: Ask “would we be comfortable if this were public?” and “who could this harm?”
- Dark patterns for short-term gain – Tricking users into sign-up or retention. → Do this instead: Clear value and clear choices; growth through trust, not deception.
- No transparency – Data use or algorithms hidden. → Do this instead: Explain in plain language; give users control where appropriate.
- Ignoring accessibility and inclusion – “We’ll do it later.” → Do this instead: Accessibility and inclusion are part of ethical design; build them in. See accessibility.
- No ownership – Nobody asks “is this right?” → Do this instead: Make ethics part of design standards or review; assign someone to raise concerns.
Ethical design vs. related concepts
- Ethical design vs accessibility: Accessibility is one dimension of ethical design (fair access); ethical design also covers privacy, transparency, and avoiding harm.
- Ethical design vs UX: Good UX can still be unethical (e.g. very usable but deceptive). Ethical design adds the “right thing to do” lens.
- Ethical design vs legal: Compliance is necessary but not sufficient; ethical design goes beyond what’s legally required.
Related terms
- Accessibility – part of treating users fairly.
- Design standards – can include ethical principles.
- Human-centred design – puts people at the centre; ethics extends to impact on society.
- User research – understanding users includes not exploiting them.
- Microcopy – transparent, clear copy supports ethics.
- Dark patterns – what to avoid.
Next step
Add one ethical principle to your team or design standards (e.g. “We don’t use dark patterns” or “We are transparent about data use”). In your next design or product review, ask “Could this mislead or harm anyone?” and document the answer. Read Accessibility to connect ethics and inclusion.