Summary
Drs. DeSutter and Scopelitis discussed how User Experience (UX) researchers can triangulate and enrich information from one-on-one interviews by attending to users’ co-speech gestures—the spontaneous movements that humans make with their hands and body when communicating. Gestures are a “window to the mind” and can reveal unspoken information about users’ emotional states as well as the structure and composition of their mental models. They concluded with a practical guide for efficiently implementing gesture research.
Key Insights
• Gestures provide a non-verbal window into users' mental models, often revealing thoughts not expressed in speech.
• Representational gestures, especially those made in personal gesture space, indicate cognitive processes and implicit imagery.
• Users commonly hold multiple, context-dependent mental models rather than a single static one.
• In interviews, interviewer gesturing increases participant gesturing and improves conversational rapport.
• Video interviews pose challenges for capturing gestures fully; careful camera positioning and prompting can mitigate this.
• Speech-gesture mismatches often signal that a user is still constructing a mental model or searching for words.
• Gestures can reveal emotional attachment to, or disengagement from, a technology, influencing adoption and retention.
• Mental models can be anchored by recent, prominent technologies, such as ChatGPT for understanding AI.
• Structured interview protocols that elicit gesturing and separate talking from tool use optimize gesture data collection.
• Open-source motion-tracking and gesture-analysis tools can aid qualitative research by quantifying gesture patterns.
Notable Quotes
"Gestures are a window to the mind."
"Gesture and speech form an integrated system; they reinforce one another."
"We’re really leaving half of our data on the table by not attending to gesture when eliciting mental models."
"Gesture is not computer and smartphone gestures, but spontaneous movements people make with hands and arms."
"'For-me' gestures happen in that personal gesture space and serve as thinking tools for the speaker."
"When gestures and speech mismatch, it often means the speaker is still refining their mental model."
"Without looking at the gesture, we would have come to a less complete mental model."
"Users have more than one mental model; they can be constructed on the fly depending on context."
"The degree to which the user feels in control with an intelligent agent brings up conversational mental models."
"The more you gesture, the more your interviewee will gesture."