Summary
Traditional design metrics and KPIs are often geared toward measuring product success. Dark Metrics challenges this paradigm by proactively measuring the unintended yet harmful psychological, social, and physical effects of our technologies. The examples within digital health are plentiful. From accelerating burnout among clinicians to widening racial disparities in quality of care, we can only reach the height of our most courageous solutions when we expose our deepest failures.
Key Insights
- Traditional product metrics, like the Google HEART framework, often miss broader impacts on users, focusing narrowly on product success rather than holistic well-being.
- Dark Metrics is a framework designed to measure the negative unintended effects of digital health across four dimensions: disempowerment, exclusion, addiction, and distraction.
- Disempowerment occurs when technology removes users’ autonomy, such as opaque black-box AI systems that undermine clinician or patient decision-making.
- Exclusion can be subtle: algorithms that proxy biased variables such as healthcare cost can reproduce racial disparities without explicit intent.
- Racial equity in design can be assessed using heuristics or rubrics co-created by diverse teams, as demonstrated by Raven’s IBM colleagues Dre Barbara, Sherees Cooper, and Morgan Foreman.
- Addiction to technology is an overused concept in consumer health; distinguishing healthy from excessive use requires linking engagement data to well-being measures (a minimal sketch follows this list).
- Distraction from core tasks is common in clinical environments when new technology disrupts workflows, as shown in studies with ER staff and clinical trial recruitment tools.
- Ethics frameworks like the Institute for the Future’s Ethical OS help anticipate risks such as surveillance, bias, and data control, and these inform the design principles behind Dark Metrics.
- Engaging diverse stakeholders and building co-creation into research early helps uncover biases and unintended consequences before launch.
- Addressing negative impacts requires transparency with clients and a strong ethical posture, even when business priorities may conflict with user protection.
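The talk stays at the level of principles, but the addiction insight above hints at a simple analysis: pair product engagement logs with a self-reported well-being measure and check whether heavier use tracks with lower well-being. The sketch below only illustrates that idea; the dataset, field names, and score scale are hypothetical and are not part of the Dark Metrics framework.

```python
# Hypothetical sketch: linking engagement data to a well-being measure.
# Neither the dataset nor the field names come from the talk; they only
# illustrate pairing usage logs with self-reported scores.
from statistics import correlation  # Python 3.10+

# Per-user averages: daily minutes in the app and a 0-100 well-being score
# (e.g., from a periodic in-app survey). Values are invented for illustration.
usage_minutes = [12, 45, 95, 20, 140, 60, 30]
wellbeing_scores = [78, 70, 52, 81, 40, 66, 74]

# A negative correlation suggests heavier use tracks with lower well-being --
# a prompt for deeper research, not a verdict of "addiction".
r = correlation(usage_minutes, wellbeing_scores)
print(f"Pearson r between usage and well-being: {r:.2f}")
```

In practice this would use real telemetry, a validated instrument, and far more than a single correlation, but it makes the underlying point concrete: engagement numbers alone cannot distinguish healthy use from harmful overuse.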
Notable Quotes
"The traditional product metrics focus narrowly on the product or near-term impact but fail to capture what success means for the whole person."
"Within IBM Watson, we prefer the term augmented intelligence rather than artificial intelligence to emphasize support, not takeover."
"An AI algorithm that didn’t explicitly consider race still produced racial disparities by using healthcare costs as a proxy."
"I am a Black person, but I do not have every Black experience. Not having experienced something is not proof that it doesn’t exist."
"The difference between technology and slavery is that slaves are fully aware they are not free."
"Doctors want to help people, not be on a machine all day; many health technologies are more distracting than helpful."
"We can assess distraction by observing time spent on screens versus with patients, and self-reported mental effort and stress."
"It’s important to ask, before any new release, what are all the things that could possibly go wrong?"
"Our jobs are to protect users from harm. If clients don’t care about side effects, it may be necessary to draw a line and walk away."
"Storytelling is highly effective in helping stakeholders understand the complete performance of products, including the darker sides."