This video is featured in the AI and UX playlist.
Summary
AI-enabled systems that are responsible and human-centered will be powerful partners to humans. Building systems that people are willing to take responsibility for, and that are trustworthy to those who use them, enables that partnership. Carol will share guidance, grounded in UX research, for operationalizing the work of making responsible, human-centered AI systems. She will cover methods UX teams can use to identify bias, prevent harm, and support human-machine teaming by designing appropriate evidence of system capabilities and integrity into the interaction. Once these dynamic systems are out in the world, critical oversight activities are needed for them to remain effective. This session will introduce each of these complex topics and provide references for further exploration.
Key Insights
- Responsible AI systems require humans to retain ultimate responsibility and control.
- Bias in AI data is inevitable, but awareness and mitigation of harmful bias is crucial.
- Human-machine teaming must be designed with clear responsibilities and transparency.
- AI systems are dynamic and constantly evolving, making continuous oversight essential.
- Speculative exercises like "What Could Go Wrong" support anticipating harms proactively.
- Calibrated trust in AI means users neither overtrust nor undertrust the system.
- Ethical frameworks such as the Three Q Do No Harm help plan for impact on vulnerable groups.
- Diverse teams improve innovation by being more aware of biases and ethical variation.
- UX practitioners should understand AI concepts to contribute effectively without needing deep technical skills.
- Designing safe AI includes making unsafe actions difficult and safe states easy to maintain.
Notable Quotes
"Responsible systems are systems that keep humans in control."
"Data is a function of our history; it reflects priorities, preferences, and prejudices."
"AI will ensure appropriate human judgment, not replace it."
"We want people to gain calibrated levels of trust, not overtrust or undertrust."
"If the system is not confident, it should transparently communicate that and hand off to humans."
"Ethical design is not superficial; if we don't ask the tough questions, who will?"
"We need to be uncomfortable and get used to asking hard questions about AI."
"Humans are still better at many activities and those strengths should be prioritized."
"Adopting technical ethics gives teams permission to question implications beyond opinions."
"These systems aren’t stable like old software; they change as data and models evolve."