Summary
We’re all aware of a big push to implement AI everywhere, including in the services that many of us are working on. It seems only fair to give the AI some good-quality input in the hope of getting decent output from it. Or, more pessimistically: we probably expect some level of errors from the AI, but what do we know about the error rates in what we’re putting into it? In this session, we will compare our ideas on identifying errors and measuring error rates, thinking about errors in six ways:
1) Problems along the way
2) Wrong result
3) Unnecessary action
4) Delayed-impact problem
5) Non-uptake or over-uptake
6) Technology problem
We’ll wrap up with “tips and next steps”: an opportunity to consider what we now need to find out or do differently.
Key Insights
- Errors in data collection and user input are foundational issues that compromise AI and service outcomes.
- Users often “fudge” answers due to ambiguous questions, privacy concerns, or to achieve a desired outcome.
- Non-uptake, where users abandon a form or process, is a major source of error but is rarely published or measured.
- Mistakes can be categorized as problems along the way, wrong results, unnecessary actions, and delayed-impact issues.
- Users often create multiple accounts because they forget their existing ones, leading to duplicate data and service inefficiencies.
- Measuring error rates is complex; different metrics (per person, per attempt, completions vs. starts) yield different perspectives.
- Elections provide a useful model for measuring data quality, using turnout, participation, and eligibility rates.
- Data quality deteriorates over time due to changes such as moving, name changes, loss of documents, or organizational restructuring.
- AI initiatives can provide a compelling rationale and funding opportunity for fixing longstanding data quality problems.
- Frameworks like the UK Government Data Quality Framework help organizations systematically assess and address data issues.
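The per-person vs. per-attempt distinction above can be made concrete with a small sketch. The data and variable names here are purely illustrative, not from the talk:

```python
# Illustrative attempt log: each record is (person_id, succeeded).
# One person may make several attempts at the same form.
attempts = [
    ("alice", False), ("alice", True),   # two attempts, eventually succeeds
    ("bob", False), ("bob", False),      # two attempts, both fail
    ("carol", True),                     # succeeds on the first try
]

# Per-attempt error rate: failed attempts / all attempts.
failed = sum(1 for _, ok in attempts if not ok)
per_attempt_error = failed / len(attempts)                # 3/5 = 60%

# Per-person error rate: people who never succeeded / all people.
people = {pid for pid, _ in attempts}
succeeded = {pid for pid, ok in attempts if ok}
per_person_error = len(people - succeeded) / len(people)  # 1/3 ≈ 33%

print(f"per attempt: {per_attempt_error:.0%}, per person: {per_person_error:.0%}")
```

The same log yields a 60% error rate per attempt but only 33% per person, which is why the choice of denominator changes the story an organization tells about its forms.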
Notable Quotes
"If we get garbage in, we get garbage out — this is true for AI as much as for surveys or forms."
"People can make all sorts of inventive mistakes on their forms that AI struggles to interpret."
"Sometimes a form forces you into a wrong answer by giving inappropriate options."
"I’ve seen people fudge their date of birth so their child can attend a summer camp they aren’t technically eligible for."
"A major error in many services is users creating multiple accounts because they can’t find or reuse existing ones."
"An error might not be immediate; data can be fine when collected but deteriorate over time and cause problems later."
"Completion rates (conversion rates) and dropout rates are simple metrics but often not tracked or shared."
"Organizations rarely know their error rates, which limits their ability to improve user experience or data accuracy."
"Linking data quality efforts to AI initiatives can help secure attention and budget for necessary improvements."
"Data quality involves accuracy, completeness, uniqueness, timeliness, and representativeness—not just error reduction."