Summary
Quantitative instruments are frequently sought because (1) they can be fielded quickly to large numbers of people, and (2) when carefully sampled, their results can generalize to the population of users or customers. However, because decision-makers need results quickly, the focus is often on speed to launch, leaving little depth in instrument development and little investigation of validity evidence. In this session, I will share a framework that centers validity and is necessarily a mixed-methods approach to research. I will also share ideas on how to scale the research over time so that findings and insights can be delivered iteratively to stakeholders, while also iteratively informing one another in a qual-quant research dance that brings more trustworthy, user-centered evidence to decision-makers. Finally, I will share ideas for a course I am developing to support qualitative researchers in becoming more mixed in their approach.
Key Insights
- Validity involves five evidence sources: content, response processes, internal structure, relations to other variables, and consequences of testing.
- Qualitative methods, especially cognitive interviews, are crucial for understanding how respondents interpret survey items, supporting validity.
- Surveys should be treated as products that need ongoing iteration and testing, not one-off tools.
- Ethical considerations extend beyond data privacy to how survey results affect user experience and product decisions.
- Mixed methods approaches leverage both qualitative insights and quantitative analyses to build a stronger validity argument.
- Breaking down survey validation work across multiple teams and "bite-sized" efforts makes the process manageable.
- Revising surveys over time to improve validity complicates measuring change longitudinally but increases trustworthiness.
- Stakeholder buy-in improves when validity processes are communicated as phased insights offering tangible results quickly.
- Analyses such as factor analysis and Rasch modeling reveal survey internal structure and help identify item bias across subpopulations.
- It is important to revisit validity considerations continuously, especially after product or user base changes.
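To make the internal-structure point above concrete, here is a minimal, hypothetical sketch of checking whether survey items cluster onto latent constructs, using scikit-learn's `FactorAnalysis`. The simulated respondents, item count, and two-factor structure are invented for illustration and are not from the talk.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Simulate 200 respondents answering 6 survey items, where items 0-2 are
# driven by one latent construct and items 3-5 by another (hypothetical data).
rng = np.random.default_rng(0)
n = 200
f1 = rng.normal(size=n)  # latent construct 1 (e.g., "ease of use")
f2 = rng.normal(size=n)  # latent construct 2 (e.g., "trust")
noise = rng.normal(scale=0.5, size=(n, 6))
X = np.column_stack([f1, f1, f1, f2, f2, f2]) + noise

# Fit a two-factor model and inspect the loadings: rows = items,
# columns = factors. Items measuring the same construct should show
# large loadings on the same factor.
fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(X)
loadings = fa.components_.T
print(np.round(loadings, 2))
```

If items that were written to measure the same construct do not load together, that is evidence the internal structure does not match the intended design, and the items need revision before scores are interpreted.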
Notable Quotes
"Validity is the degree to which evidence and theory support the interpretations of test scores for proposed uses."
"Qualitative research in a mixed methods setting needs to think about validity to support the bigger validity argument."
"What would happen if you do think about validity? Would it change your process or research plans?"
"Surveys are products too, so they need to be iteratively tested."
"You don’t know what you don’t know. Let’s write surveys that cover those blind spots."
"If the survey wording changes, measuring change over time becomes difficult, but improving the survey builds trust."
"Ethical considerations should go beyond typical privacy reviews and think deeply about user impact throughout the process."
"It is better to partner with someone who has quantitative expertise to interpret internal structure analyses."
"Management and stakeholders often care more about getting usable information quickly than understanding the full validity process."
"We can get quick insights in phases to keep teams engaged and slowly build a complete validity argument."