Summary
Drs. DeSutter and Scopelitis discussed how User Experience (UX) researchers can triangulate and enrich information from one-on-one interviews by attending to users’ co-speech gestures—the spontaneous movements that humans make with their hands and body when communicating. Gestures are a “window to the mind” and can reveal unspoken information about users’ emotional states as well as the structure and composition of their mental models. They concluded with a practical guide for efficiently implementing gesture research.
Key Insights
- Gestures provide a non-verbal window into users' mental models, often revealing thoughts not expressed in speech.
- Representational gestures, especially those made in personal gesture space, indicate cognitive processes and implicit imagery.
- Users commonly hold multiple, context-dependent mental models rather than a single static one.
- In interviews, interviewer gesturing increases participant gesturing and improves conversational rapport.
- Video interviews pose challenges for capturing gestures fully; camera positioning and prompting can mitigate this.
- Speech-gesture mismatches often signal that a user is still constructing a mental model or searching for words.
- Gestures can reveal emotional attachment to or disengagement from technology, influencing adoption and retention.
- Mental models can be anchored by recent technology prototypes, such as ChatGPT for understanding AI.
- Structured interview protocols that elicit gesturing and separate talking from tool use optimize gesture data collection.
- Open source motion tracking and gesture analysis tools can aid qualitative research by quantifying gesture patterns (see the sketch after this list).
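The talk does not name a specific toolkit, but as one hedged illustration of the last point, a minimal sketch using the open source MediaPipe Hands model and OpenCV (assumptions, not the speakers' own tools; "interview.mp4" is a placeholder filename) could count the frames of an interview recording in which a participant's hands are visible, a rough proxy for how much they gesture:

```python
# Minimal sketch: count frames of an interview video in which hands are
# detected, as a rough proxy for gesture activity.
# Assumes the open source mediapipe and opencv-python packages are installed;
# the filename below is a placeholder, not from the talk.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

cap = cv2.VideoCapture("interview.mp4")
total_frames = 0
hand_frames = 0

with mp_hands.Hands(static_image_mode=False,
                    max_num_hands=2,
                    min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        total_frames += 1
        # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            hand_frames += 1

cap.release()
print(f"Hands visible in {hand_frames} of {total_frames} frames")
```

A count like this only flags when hands are on screen; qualitative review of what those gestures depict remains the researcher's job.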
Notable Quotes
"Gestures are a window to the mind."
"Gesture and speech form an integrated system; they reinforce one another."
"We’re really leaving half of our data on the table by not attending to gesture when eliciting mental models."
"Gesture is not computer and smartphone gestures, but spontaneous movements people make with hands and arms."
"Four me gestures happen in that personal gesture space and serve as thinking tools for the speaker."
"When gestures and speech mismatch, it often means the speaker is still refining their mental model."
"Without looking at the gesture, we would have come to a less complete mental model."
"Users have more than one mental model; they can be constructed on the fly depending on context."
"The degree to which the user feels in control with an intelligent agent brings up conversational mental models."
"The more you gesture, the more your interviewee will gesture."