Summary
Quantitative instruments are frequently sought for two reasons: (1) they can be fielded quickly to large numbers of people, and (2) when carefully sampled, their results can be generalized to the population of users or customers. However, because decision-makers often need results quickly, the focus falls on speed to launch, leaving little depth in the instrument's development and little investigation of its validity evidence. In this session, I will share a framework that centers validity and is necessarily a mixed methods approach to research. I will also share ideas on how to scale the research over time so that findings and insights can be delivered to stakeholders iteratively, while also iteratively informing one another in a qual-quant research dance that brings more trustworthy, user-centered evidence to decision-makers. Finally, I will share ideas for a course I am developing to support qualitative researchers in becoming more mixed in their approach.
Key Insights
• Validity involves five evidence sources: content, response processes, internal structure, relations to other variables, and consequences of testing.
• Qualitative methods, especially cognitive interviews, are crucial for understanding how respondents interpret survey items, supporting validity.
• Surveys should be treated as products that need ongoing iteration and testing, not one-off tools.
• Ethical considerations extend beyond data privacy to how survey results affect user experience and product decisions.
• Mixed methods approaches leverage both qualitative insights and quantitative analyses to build a stronger validity argument.
• Breaking survey validation work into "bite-sized" efforts shared across multiple teams makes the process manageable.
• Revising surveys over time to improve validity complicates measuring change longitudinally but increases trustworthiness.
• Stakeholder buy-in improves when validity processes are communicated as phased insights offering tangible results quickly.
• Analyses such as factor analysis and Rasch modeling reveal a survey's internal structure and help identify item bias across subpopulations.
• It is important to revisit validity considerations continuously, especially after product or user base changes.
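The internal-structure analyses mentioned above (factor analysis, Rasch modeling) typically require a quantitative partner, but a first, simpler internal-consistency check is within reach of most researchers. As a hypothetical illustration only (not part of the talk), the sketch below computes Cronbach's alpha for a small matrix of Likert-style survey responses using numpy; the `responses` data are invented toy values.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of survey items
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy 5-point Likert responses: 6 respondents x 4 items (hypothetical data)
responses = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 2, 3, 3],
])
print(round(cronbach_alpha(responses), 2))  # → 0.93
```

A high alpha suggests the items hang together as one scale, but it says nothing about what the scale measures; that is why the framework pairs analyses like this with cognitive interviews and the other validity evidence sources.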
Notable Quotes
"Validity is the degree to which evidence and theory support the interpretations of test scores for proposed uses."
"Qualitative research in a mixed methods setting needs to think about validity to support the bigger validity argument."
"What would happen if you do think about validity? Would it change your process or research plans?"
"Surveys are products too, so they need to be iteratively tested."
"You don’t know what you don’t know. Let’s write surveys that cover those blind spots."
"If the survey wording changes, measuring change over time becomes difficult, but improving the survey builds trust."
"Ethical considerations should go beyond typical privacy reviews and think deeply about user impact throughout the process."
"It is better to partner with someone who has quantitative expertise to interpret internal structure analyses."
"Management and stakeholders often care more about getting usable information quickly than understanding the full validity process."
"We can get quick insights in phases to keep teams engaged and slowly build a complete validity argument."