Summary
Under biometric privacy laws like BIPA and CCPA, user research recordings containing users’ faces or voices can put your company at risk for lawsuits and fines. Legal departments are increasingly requiring more stringent redaction, and in some cases banning recording outright. This comes at a high cost for UX teams who are already being asked to do more with less, as losing access to recordings can increase duplicative research effort and reduce the accuracy of results. AI offers new solutions for UX teams who want to keep research recordings longer without violating biometric privacy laws. In this demo, we’ll show how we used off-the-shelf tools to intelligently redact users’ voices, faces, and bodies in research videos. By removing biometric identifiers, you can compliantly archive research recordings indefinitely, enabling your team to mine them for insights for years to come.
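To make the voice-redaction step concrete, here is a minimal sketch of re-synthesizing a session transcript with a stock synthetic voice through ElevenLabs' v1 text-to-speech REST endpoint. This is an illustration rather than the exact pipeline from the demo: the file names are placeholders, the voice ID must be filled in with any non-cloned stock voice, and the endpoint details reflect the public v1 API at the time of writing and may change.

    # Minimal sketch: replace a participant's recorded voice by re-synthesizing
    # their transcript with a stock synthetic voice (ElevenLabs v1 TTS endpoint).
    # Assumes ELEVENLABS_API_KEY is set; file names are placeholders.
    import os
    import requests

    API_KEY = os.environ["ELEVENLABS_API_KEY"]
    VOICE_ID = "your-stock-voice-id"  # any non-cloned stock voice

    with open("session_transcript.txt", encoding="utf-8") as f:
        transcript = f.read()

    response = requests.post(
        f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
        headers={"xi-api-key": API_KEY, "Accept": "audio/mpeg"},
        json={"text": transcript, "model_id": "eleven_monolingual_v1"},
        timeout=120,
    )
    response.raise_for_status()

    # The synthetic narration can then be muxed over the avatar video
    # in place of the participant's original audio.
    with open("synthetic_voice.mp3", "wb") as f:
        f.write(response.content)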
Key Insights
• Biometric privacy laws increasingly restrict recording and retaining user faces and voices in UX research, exposing companies to large fines.
• Replacing users with AI-generated avatars preserves the richness of user recordings while protecting privacy.
• Wonder Studio can automatically segment a user's body and face and animate a 3D avatar in their place, simplifying production.
• ElevenLabs technology enables seamless voice replacement with AI-generated speech in user research videos.
• Replacing the entire body, not just the face, mitigates risks from distinctive identifying marks such as tattoos and from future privacy laws.
• Traditional redaction methods, such as blurring faces or keeping only transcripts, are either insufficient or lose valuable context (a minimal sketch of the blurring approach appears after this list).
• Current AI tools have usage limits, require legal review, and sometimes carry restrictive licensing terms that affect enterprise use.
• Hosting AI models in-house offers more control over data but adds technical complexity compared with cloud-based solutions.
• Using synthetic avatars raises complex ethical questions about authenticity, identity representation, and users' right to be forgotten.
• Legal risk tolerance varies, so negotiating with your legal team is essential before implementing AI avatar redaction in research workflows.
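For contrast with the avatar approach, the sketch below shows what the traditional blur-based redaction mentioned above typically looks like: detect faces frame by frame and blur the detected regions. It is a minimal illustration using OpenCV's bundled Haar cascade, with placeholder file names, and it leaves the participant's voice, body, and other identifying marks untouched, which is exactly the gap the avatar approach is meant to close.

    # Minimal sketch of conventional redaction: blur detected faces in each frame.
    # Uses OpenCV's bundled Haar cascade; input/output file names are placeholders.
    import cv2

    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    video_in = cv2.VideoCapture("research_session.mp4")
    fps = video_in.get(cv2.CAP_PROP_FPS)
    width = int(video_in.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(video_in.get(cv2.CAP_PROP_FRAME_HEIGHT))
    video_out = cv2.VideoWriter(
        "research_session_blurred.mp4",
        cv2.VideoWriter_fourcc(*"mp4v"),
        fps,
        (width, height),
    )

    while True:
        ok, frame = video_in.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 5):
            # Heavy Gaussian blur over the detected face region only.
            roi = frame[y:y + h, x:x + w]
            frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
        video_out.write(frame)

    # Note: cv2.VideoWriter writes video only; the original audio track, and
    # therefore the participant's voice, is not redacted by this approach.
    video_in.release()
    video_out.release()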
Notable Quotes
"User recordings are your most valuable asset but have become riskier due to biometric privacy laws."
"Even if you’re not doing facial recognition, storing face and voice data is under growing legal scrutiny."
"Blurring user faces loses context and doesn’t address voice privacy, making it an inadequate solution."
"Wonder Studio automatically segments actors, maps their movements to a 3D model, and renders a synthetic avatar video."
"Replacing the entire body with an avatar future-proofs against unanticipated identifiers like tattoos or moles."
"DeepFakes can look too realistic and might introduce new privacy risks if donor faces come from real people."
"Processing limits and licensing terms currently restrict how much video and audio these AI tools can handle."
"Ultimately, AI avatar tools offer UX teams options to keep recordings while meeting legal compliance."
"If you can’t see a human in a video, how do you know the entire conversation wasn’t fabricated?"
"Synthetic duplicates living on after data deletion raise ethical questions about users’ right to be forgotten."