Summary
You should not be doing research for the sake of doing research. Research takes time and needs to be well thought out. More importantly, you need to determine whether your findings are actually meaningful to the organization. In this session we look at statistical significance and meaningfulness when reporting research findings.
Key Insights
• Research must be tied to business imperatives and KPIs to deliver value, not done arbitrarily.
• Qualitative research is crucial for interpreting quantitative metrics like NPS meaningfully.
• Statistical significance depends on sample size, variability, and the magnitude of change.
• A statistically significant change may not always be meaningful in the business context.
• Meaningfulness is determined by alignment with organizational goals and market conditions.
• Increasing sample size alone can make every change appear significant, which is misleading.
• Driver modeling helps prioritize initiatives by linking key metrics to business outcomes.
• Benchmarking against competitors and cross-industry experiences anchors understanding of CX scores.
• Effective communication involves framing findings differently for executives, managers, and individual contributors.
• Frequent measurement without making changes leads to wasted effort and misinterpretation.
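The point about sample size can be made concrete with a quick simulation. The sketch below (a hypothetical illustration, not from the session) runs a two-sample z-test on survey-like scores where the true difference is a trivial 0.05 points on a 0-to-10 scale: with a huge sample the tiny shift comes out "significant" even though no business would call it meaningful.

```python
import math
import random

random.seed(0)

def z_test_two_means(a, b):
    """Two-sample z-test for a difference in means (large-sample approximation).

    Returns the z statistic and a two-sided p-value.
    """
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variance of a
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)  # sample variance of b
    se = math.sqrt(va / na + vb / nb)              # standard error of the difference
    z = (mb - ma) / se
    # Two-sided p-value from the normal CDF, via the error function.
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# A trivially small true difference: mean 7.00 vs 7.05 with the same spread.
for n in (100, 1_000_000):
    before = [random.gauss(7.00, 2.0) for _ in range(n)]
    after = [random.gauss(7.05, 2.0) for _ in range(n)]
    z, p = z_test_two_means(before, after)
    verdict = "significant" if p < 0.05 else "not significant"
    print(f"n={n:>9}: z={z:6.2f}  p={p:.4f}  -> {verdict}")
```

At n = 1,000,000 the 0.05-point shift is virtually guaranteed to clear p < 0.05; whether a 0.05-point shift matters is a separate, business-context question.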
Notable Quotes
"Research is not as easy as putting on a shoe and just doing it; it requires careful consideration and alignment with imperatives."
"Every board member can interpret an NPS score differently depending on their role and responsibilities."
"Statistical significance tells you if a change is likely not random, but it does not tell you if the change matters to your business."
"If you measure an infinite number of customers, every change would be statistically significant, but not every change would be meaningful."
"Meaningfulness is linked directly to what your organization cares about in terms of goals and priorities."
"Driver modeling tells you which research initiatives matter most based on their impact on key outcome metrics."
"Customers benchmark your company to their last best experience, often outside of your industry."
"When executives start questioning your research details, that’s a good sign they’re engaged and trusting the process."
"You shouldn’t send another survey unless you’ve implemented changes and allowed time for behavior to adapt."
"Mixed methods research, combining qualitative and quantitative, gives the fullest understanding of customer experience."