Rosenverse


[Demo] Deploying AI doppelgangers to de-identify user research recordings
Gold
Wednesday, June 5, 2024 • Designing with AI 2024
Speakers: Llewyn Paine
Summary

Under biometric privacy laws like BIPA and CCPA, user research recordings containing users’ faces or voices can put your company at risk for lawsuits and fines. Legal departments are increasingly requiring more stringent redaction, and in some cases banning recording outright. This comes at a high cost for UX teams who are already being asked to do more with less, as losing access to recordings can increase duplicative research effort and reduce the accuracy of results. AI offers new solutions for UX teams who want to keep research recordings longer without violating biometric privacy laws. In this demo, we’ll show how we used off-the-shelf tools to intelligently redact users’ voices, faces, and bodies in research videos. By removing biometric identifiers, you can compliantly archive research recordings indefinitely, enabling your team to mine them for insights for years to come.
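The specific off-the-shelf tools used in the demo aren't named on this page. As a rough illustration of the redaction step they describe, here is a minimal sketch that pixelates face bounding boxes in a single video frame; the `pixelate_regions` helper and its `(x, y, w, h)` box format are assumptions for illustration, and a real pipeline would first detect faces and bodies with a vision model rather than take boxes as input.

```python
import numpy as np

def pixelate_regions(frame, boxes, block=8):
    """Redact rectangular regions of a grayscale frame by pixelating them.

    frame: H x W numpy array of pixel intensities
    boxes: list of (x, y, w, h) bounding boxes to redact
    block: size of the pixelation tiles
    """
    out = frame.copy()
    for x, y, w, h in boxes:
        region = out[y:y + h, x:x + w]  # view into the output frame
        for by in range(0, h, block):
            for bx in range(0, w, block):
                tile = region[by:by + block, bx:bx + block]
                # Replace every pixel in the tile with the tile's mean,
                # destroying the fine detail that identifies a face.
                tile[...] = int(tile.mean())
    return out

# Example: redact a 4x4 region of a synthetic 10x10 frame.
frame = np.arange(100, dtype=np.uint8).reshape(10, 10)
redacted = pixelate_regions(frame, [(2, 2, 4, 4)], block=4)
```

Applying the same operation to every frame (and an analogous transform, such as pitch shifting, to the audio track) is the general shape of the de-identification pipeline the talk demonstrates with higher-level tools.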

Key Insights

  • User recordings are invaluable to UX teams and stakeholders, and can drive cost savings for companies.

  • Biometric privacy laws, such as BIPA and CCPA in the US and GDPR in the EU, are imposing new restrictions on recording.

  • Some companies prohibit recording user research due to legal concerns, leading to repeated, costly research.

  • Transcripts and notes are less effective than video recordings at capturing context during UX research.

  • AI tools can effectively replace user voices and bodies to maintain anonymity without compromising research quality.

  • Using synthetic avatars can help sidestep future legal concerns around identification.

  • There is a trade-off between user-friendliness and legal compliance of the available AI tools.

  • As AI technologies advance, the processing limitations currently present in video tools may improve over time.

  • Ethically, designers must consider the implications of creating synthetic duplicates of users in their research.

  • Collaboration with legal teams is vital to implement new technologies responsibly.

Notable Quotes

"User recordings are arguably your most valuable asset."

"This situation is going on at companies all over."

"Storing face and voice information is coming under more scrutiny from legal departments."

"Transcripts often contain errors and lose a lot of context."

"That means if they get pushed back on a research finding after that time, there's no way to prove they're right."

"What if you could automatically remove face and voice information and still have a rich detailed recording?"

"Work that used to require multiple different types of specialized artists is now accessible to anyone."

"If you can't see a human in a video, how do you know the entire conversation wasn't fabricated?"

"If we remove visual aspects of identity, then we can miss out on recognizing the unique perspective that different groups have to offer."

"These tools may fulfill our legal obligations, but what is our ethical obligation to users?"
