Summary
Traditional design metrics and KPIs are often geared toward measuring product success. Dark metrics challenge this paradigm by proactively measuring the unintended yet harmful psychological, social, and physical effects of our technologies. The examples within digital health are plentiful, from accelerating clinician burnout to widening racial disparities in quality of care, and we can only reach the height of our most courageous solutions when we expose our deepest failures.
Key Insights
- Traditional product metrics, like the Google HEART framework, often miss broader impacts on users, focusing narrowly on product success rather than holistic well-being.
- Dark Metrics is a framework designed to measure negative unintended effects in digital health across four dimensions: disempowerment, exclusion, addiction, and distraction.
- Disempowerment occurs when technology removes users’ autonomy, such as opaque black-box AI systems that undermine clinician or patient decision-making.
- Exclusion can be subtle: algorithms that rely on biased proxy variables, such as healthcare cost, can reproduce racial disparities without explicit intent.
- Racial equity in design can be assessed using heuristics or rubrics co-created by diverse teams, as demonstrated by Raven’s IBM colleagues Dre Barbara, Sherees Cooper, and Morgan Foreman.
- Addiction to technology is an overused concept in consumer health; distinguishing healthy from excessive use requires linking engagement data to well-being measures (see the sketch after this list).
- Distraction from core tasks is common in clinical environments when new technology disrupts workflows, as shown in studies with ER staff and clinical trial recruitment tools.
- Ethics frameworks like the Institute for the Future’s Ethical OS help anticipate risks such as surveillance, bias, and loss of data control, and these risks inform the design principles behind Dark Metrics.
- Engaging diverse stakeholders and building co-creation into research early helps uncover biases and unintended consequences before launch.
- Addressing negative impacts requires transparency with clients and a strong ethical posture, even when business priorities conflict with user protection.
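The addiction insight above hinges on pairing two data sources: engagement logs and some measure of well-being. The sketch below is a minimal, hypothetical illustration of that pairing, not an implementation from the talk; the field names (minutes_in_app, wellbeing_score), thresholds, and survey scale are assumptions, and it requires Python 3.10+ for statistics.correlation.

```python
# Minimal sketch: relating logged engagement time to a self-reported well-being measure.
# Field names, thresholds, and the 0-100 survey scale are illustrative assumptions.
from dataclasses import dataclass
from statistics import correlation  # Python 3.10+

@dataclass
class UserWeek:
    user_id: str
    minutes_in_app: float    # logged engagement time for the week
    wellbeing_score: float   # self-reported well-being, assumed 0-100 scale

def engagement_wellbeing_correlation(rows: list[UserWeek]) -> float:
    """Pearson correlation between weekly engagement and well-being.
    A strongly negative value suggests heavier use tracks with lower well-being."""
    minutes = [r.minutes_in_app for r in rows]
    scores = [r.wellbeing_score for r in rows]
    return correlation(minutes, scores)

def flag_possible_overuse(rows: list[UserWeek],
                          minutes_threshold: float = 600,
                          wellbeing_threshold: float = 40) -> list[str]:
    """Flag users whose high engagement coincides with low reported well-being.
    Thresholds are placeholders; real cutoffs would come from clinical guidance."""
    return [r.user_id for r in rows
            if r.minutes_in_app >= minutes_threshold
            and r.wellbeing_score <= wellbeing_threshold]

if __name__ == "__main__":
    sample = [
        UserWeek("u1", 120, 78),
        UserWeek("u2", 950, 35),
        UserWeek("u3", 400, 60),
        UserWeek("u4", 1100, 30),
    ]
    print("engagement/well-being correlation:",
          round(engagement_wellbeing_correlation(sample), 2))
    print("flagged for review:", flag_possible_overuse(sample))
```

In practice the pairing would come from analytics logs joined to periodic survey responses, and a negative correlation or a flagged user is one signal for further research, not a verdict on healthy versus excessive use.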
Notable Quotes
"The traditional product metrics focus narrowly on the product or near-term impact but fail to capture what success means for the whole person."
"Within IBM Watson, we prefer the term augmented intelligence rather than artificial intelligence to emphasize support, not takeover."
"An AI algorithm that didn’t explicitly consider race still produced racial disparities by using healthcare costs as a proxy."
"I am a Black person, but I do not have every Black experience. Not having experienced something is not proof that it doesn’t exist."
"The difference between technology and slavery is that slaves are fully aware they are not free."
"Doctors want to help people, not be on a machine all day; many health technologies are more distracting than helpful."
"We can assess distraction by observing time spent on screens versus with patients, and self-reported mental effort and stress."
"It’s important to ask, before any new release, what are all the things that could possibly go wrong?"
"Our jobs are to protect users from harm. If clients don’t care about side effects, it may be necessary to draw a line and walk away."
"Storytelling is highly effective in helping stakeholders understand the complete performance of products, including the darker sides."