
Have fun with statistics?
Thursday, December 12, 2024 • Rosenfeld Community
Speakers: Caroline Jarrett and Erin Weigel

Summary

Let’s face it: many of us feel daunted by statistics. But we also know that colleagues and clients ask whether our research has “statistically significant” results. Erin’s book Design for Impact helps you test hypotheses about improving a design, and she guides you through choosing effect sizes so that you can reach statistically significant results. Caroline’s book Surveys That Work talks about “significance in practice,” and she is not convinced that aiming for statistical significance is always worth it. Watch this lively session in which Erin and Caroline compare and contrast their ideas and approaches, helped by audience questions and contributions.

Key Insights

  • Statistical significance often confuses practitioners because it requires mentally flipping hypotheses and disproving nulls, which is cognitively demanding.

  • Effect size is critical to understanding whether a change detected by statistics is meaningful in practice, a concept often neglected in statistics education (see the sketch after this list).

  • Fast progress isn't necessarily good progress; teams benefit from slowing down and using statistics to ensure they're moving in the right direction.

  • Engineers can be reluctant to implement experiments due to the extra coding load, but they respond well when they understand the learning value gained.

  • Survey results (the numerical outcomes) are often conflated with the number of respondents needed to reach statistical significance, which leads to misunderstandings.

  • Statistical thresholds like 95% confidence can be adjusted depending on project needs; lower confidence levels are sometimes acceptable.

  • A good hypothesis often starts as an intuitive guess, which gets refined over time through repeated testing and data collection.

  • Qualitative and quantitative research should be viewed as complementary tools in a holistic research approach rather than opposed methods.

  • AI tools can help generate first drafts of survey questions, but human-centered pilot testing is essential to avoid errors and misinterpretations.

  • It's common and acceptable to act on results that are significant in practice but not statistically significant, especially when outcomes clearly affect users.
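
To make the distinction between statistical significance and effect size concrete, here is a minimal sketch in Python. The conversion counts are made up for illustration and are not from the talk: with a large enough sample, even a tiny lift clears the p < 0.05 bar while remaining too small to matter in practice.

    import math

    def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
        # Pooled two-sided z-test for the difference between two conversion rates
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        # Two-sided p-value from the standard normal CDF
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return p_b - p_a, p_value

    # Hypothetical A/B test: half a million visitors per variant, tiny lift
    effect, p = two_proportion_ztest(conv_a=10_000, n_a=500_000,
                                     conv_b=10_300, n_b=500_000)
    print(f"effect size: {effect:.4%}  p-value: {p:.4f}")
    # Statistically significant (p is roughly 0.03), yet the lift is only
    # 0.06 percentage points, which may not be significant in practice.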

Notable Quotes

"Statistics is hard because you have to flip flop in your head: think of a hypothesis, then a null hypothesis, then try to disprove the null."

"People confuse statistical significance with significance in practice — they want to know if the change is meaningful, not just mathematically significant."

"Fast is not a virtue in and of itself; moving slower and acting with intention ensures you go in the right direction."

"Engineers hate writing more code, so getting them to buy into experiments means showing the value of the learning on the other side."

"You can have an effect size that matters in practice but isn’t statistically significant, like five users failing a key task in usability testing."

"Most science starts with somebody pulling a number out of their ass — it’s okay to start with a gut instinct or guess."

"Statistics is another tool in our toolbox, part of a hierarchy of evidence that includes qualitative and quantitative methods."

"AI can create first drafts of survey questions, but unless you pilot test with real humans, you won’t know if your audience gets it."

"A lot of people think 95% confidence is the only way, but you can adjust confidence levels based on your situation and needs."

"Start with basics like means, minimums, and ranges — statistics rapidly becomes less mysterious and more useful with practice."
