Summary
Quantitative ethnography is a niche subfield you may never have heard of, but it's one you've been under growing pressure to practice over the past couple of years. It's the math that turns words into numbers underneath generative AI, and LLMs have been standing between you and a radically new way of working with verbatims, transcripts, and other texts.

Business stakeholders always push for greater efficiency and faster turnarounds. Qualitative researchers always want more contact with users and deeper engagement with findings and reporting. Quantitative ethnography (and epistemic network analysis) offers a compromise: by trading structure and semantics for human sensemaking in the analysis phase of research, perhaps both groups can get what they want.

I've had the opportunity to conduct quantitative ethnographic analyses in enterprise studies involving dozens of products and impacting hundreds of thousands of end users. Stakeholders were willing to accept a different kind of analysis, and to engage more deeply with the process, in exchange for quicker answers.

In this talk, I'll share how quantitative ethnography differs from qualitative ethnography, the tradeoffs you'll have to make, and the kinds of results you can expect. This isn't a tools talk, but you won't need to do any math, either. I'll close with a look into the near future: one where you can talk with as many users as will take your call with effectively zero additional analysis work; where the analysis runs live during your session and the user participates in the sensemaking process on the fly; and, the dream of every product manager, where stakeholders have dashboards of evidence updated live as users talk.
Key Insights
- Quantitative ethnography unifies qualitative ethnographic methods with quantitative statistical validation, avoiding typical mixed-methods back-and-forth.
- Formalizing coding rules in a detailed code book is essential to scale qualitative insights and enable automation.
- Defining mechanistic signifiers, such as keywords or phrase rules, is necessary to automate qualitative coding effectively.
- Intra-sample statistical analysis uses each coded line as a data point rather than each respondent, enabling meaningful stats from small sample sizes.
- Partnering with data scientists is critical because quantitative ethnography requires specialized, adjusted statistical methods that differ from conventional ones.
- Researchers must regularly validate coding accuracy and statistical assumptions over time, a process called closing the interpretive loop.
- Quantitative ethnography can scale from a handful of interviews to thousands of verbatim responses, maintaining rigor at all scales.
- Epistemic network analysis helps identify and quantify relationships between qualitative codes within the text data.
- Large language models can automate parts of quantitative ethnography but require sacrificing some control over code definitions and initial synthesis.
- Quantitative ethnography opens the possibility for near-real-time insights by automating coding and saturation metrics during ongoing data collection.
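To make the "mechanistic signifiers" and "each coded line is a data point" ideas concrete, here is a minimal sketch in Python. The code book, the two codes, and the keyword rules are all hypothetical examples invented for illustration; a real code book would document much richer rules, and automated coding would be validated against human coders.

```python
import re

# Hypothetical mini code book: each code is defined by a mechanistic
# signifier -- here, a keyword/phrase regex standing in for the fuller
# coding rules a real code book would spell out.
CODE_BOOK = {
    "frustration": re.compile(r"\b(frustrat|annoy|confus)\w*", re.IGNORECASE),
    "workaround":  re.compile(r"\b(work.?around|hack|manual)\w*", re.IGNORECASE),
}

def code_lines(transcript_lines):
    """Apply every code rule to every line.

    Each coded line -- not each respondent -- becomes a data point,
    which is what makes intra-sample statistics possible even with a
    small number of interviewees.
    """
    coded = []
    for line in transcript_lines:
        codes = {name for name, rule in CODE_BOOK.items() if rule.search(line)}
        coded.append((line, codes))
    return coded

lines = [
    "The export kept failing and it was really frustrating.",
    "So I built a manual workaround in a spreadsheet.",
    "Overall the dashboard is fine.",
]
for text, codes in code_lines(lines):
    print(sorted(codes), "-", text)
```

A single interview transcript of a few hundred lines yields a few hundred such data points, which is the scale at which the adjusted statistical tests mentioned above start to apply.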
Notable Quotes
"Business stakeholders push researchers for faster turnarounds and numbers, often favoring surveys over deep interviews."
"Quantitative ethnography isn’t mixed methods; it’s a unified method using both qualitative theory and quantitative validation."
"If you can’t come up with a rule for something, you can’t code it."
"Each coded line is a data point, which enables statistical power even with small numbers of respondents."
"Partner with data scientists to pick and adjust statistical tests because quantitative ethnography requires new assumptions."
"Closing the interpretive loop means regularly checking that your coding and stats hold up as new data arrives."
"Epistemic network analysis reveals meaningful connections between codes, suggesting but not proving why ideas cluster."
"Large language models cluster text using semantic relationships rather than shared vocabulary like traditional QDA."
"Using generative AI math lets you skip stats, but you lose control over what codes start your synthesis."
"If rules and stats update in real time, you could know when saturation is reached as data streams in."