Summary
Quantitative instruments are frequently sought because 1) they can be fielded quickly to large numbers of people, and 2) when carefully sampled, their results can generalize to the population of users or customers. However, because the focus is so often on launching quickly to give decision-makers fast results, little depth goes into developing these instruments or into investigating their validity evidence. In this session, I will share a framework that centers validity and is necessarily a mixed methods approach to research. I will also share ideas for scaling the research over time so that findings and insights can be delivered iteratively to stakeholders while also informing one another, in a qual-quant research dance that brings more trustworthy, user-centered evidence to decision-makers. Finally, I will share ideas for a course I am developing to support qualitative researchers in becoming more mixed in their approach.
Key Insights
- Validity involves five evidence sources: content, response processes, internal structure, relations to other variables, and consequences of testing.
- Qualitative methods, especially cognitive interviews, are crucial for understanding how respondents interpret survey items, supporting validity.
- Surveys should be treated as products that need ongoing iteration and testing, not one-off tools.
- Ethical considerations extend beyond data privacy to how survey results affect user experience and product decisions.
- Mixed methods approaches leverage both qualitative insights and quantitative analyses to build a stronger validity argument.
- Breaking down survey validation work across multiple teams and 'bite-sized' efforts makes the process manageable.
- Revising surveys over time to improve validity complicates measuring change longitudinally but increases trustworthiness.
- Stakeholder buy-in improves when validity processes are communicated as phased insights offering tangible results quickly.
- Analyses such as factor analysis and Rasch modeling reveal survey internal structure and help identify item bias across subpopulations (see the sketch after this list).
- It is important to revisit validity considerations continuously, especially after product or user base changes.
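To make the internal-structure insight above more concrete, here is a minimal sketch, not code from the session, of an exploratory factor analysis on survey responses using scikit-learn. The item names, the simulated Likert-scale data, and the two-factor choice are all assumptions made for illustration.

```python
# Minimal sketch (illustrative only): exploratory factor analysis on
# hypothetical Likert-scale survey responses to examine internal structure.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Hypothetical data: rows are respondents, columns are survey items (1-5 Likert).
rng = np.random.default_rng(0)
items = [f"q{i}" for i in range(1, 7)]
responses = pd.DataFrame(rng.integers(1, 6, size=(200, 6)), columns=items)

# Fit a two-factor model; in practice the factor count would come from theory
# and diagnostics such as scree plots or parallel analysis.
fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(responses)

# Loadings show which items cluster onto which latent factor; items that load
# weakly or onto an unexpected factor are candidates for rewording or removal.
loadings = pd.DataFrame(fa.components_.T, index=items, columns=["factor_1", "factor_2"])
print(loadings.round(2))
```

In practice, a researcher would pair results like these with Rasch or differential item functioning analyses, ideally in partnership with someone who has quantitative expertise, as the talk suggests.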
Notable Quotes
"Validity is the degree to which evidence and theory support the interpretations of test scores for proposed uses."
"Qualitative research in a mixed methods setting needs to think about validity to support the bigger validity argument."
"What would happen if you do think about validity? Would it change your process or research plans?"
"Surveys are products too, so they need to be iteratively tested."
"You don’t know what you don’t know. Let’s write surveys that cover those blind spots."
"If the survey wording changes, measuring change over time becomes difficult, but improving the survey builds trust."
"Ethical considerations should go beyond typical privacy reviews and think deeply about user impact throughout the process."
"It is better to partner with someone who has quantitative expertise to interpret internal structure analyses."
"Management and stakeholders often care more about getting usable information quickly than understanding the full validity process."
"We can get quick insights in phases to keep teams engaged and slowly build a complete validity argument."
More Videos
"Alternative navigation users pick different tools based on the task and how they feel at that moment, not just one technology."
Sam Proulx - SUS: A System Unusable for Twenty Percent of the Population
December 9, 2021
"There are lots of little mushroom patches of design sprouting up, but they're disconnected and not sharing."
Michael Land - Establishing Design Operations in Government
February 18, 2021
"Ownership of VOC is a hot potato, so we built a coalition with a facilitator to keep it collaborative and effective."
Shipra Kayan - How we Built a VoC (Voice of the Customer) Practice at Upwork from the Ground Up
September 30, 2021
"Your career is a design project. It’s the only one you own, so own it like you would a product."
Ian Swinson - Designing and Driving UX Careers
June 8, 2016
"Communication is an integral part of the chief of staff role, whether with leadership, the org, or partners."
Isaac Heyveld - Expand DesignOps Leadership as a Chief of Staff
September 8, 2022
"Change is messy and it can be uncomfortable, much like baking bread—it’s hard to imagine sticky dough turning into a perfect loaf."
Amy Evans - How to Create Change
September 25, 2024
"Design ops is undergoing a transformation into a CX focused team with a refined mission to drive efficiency and customer experience excellence."
Kate Koch and Prateek Kalli - Flex Your Super Powers: When a Design Ops Team Scales to Power CX
September 30, 2021
"The only way to get outside your bubble is to act as if an alternative belief is true and test it."
Dave Gray - Liminal Thinking: Sense-making for Systems in Large Organizations
May 14, 2015
"Curation is super important, but also super hard."
Matt Duignan - Atomizing Research: Trend or Trap
March 30, 2020