
AI: Passionate defenses and reasoned critique [Advancing Research Community Workshop Series]
Wednesday, September 18, 2024 • Advancing Research Community

This video is featured in the AI and UX playlist.

Speakers: Rachael Dietkus, LCSW; Llewyn Paine; Nishanshi Shukla; and David Womack

Summary

AI adoption is rapidly accelerating in the insights space, and researchers are rushing to explore the possibilities and pitfalls it presents. Without a doubt, it will change the nature of our work, but where do we stand now? Our panelists will examine passionate defenses of the value of AI, offer reasoned critiques, discuss practical applications, and explore how we can collectively move forward in an ethical and human-centered manner.

Attend all of our Advancing Research community workshops. Each free virtual workshop is made up of panelists who share short provocations on engaging ideas to discuss as a group, along with a leader in our field to moderate. If you're looking for discussions that challenge the status quo and can truly advance research, look no further than our workshop series. (P.S. We’ll be drawing most of our Advancing Research 2025 conference speakers from those who present at upcoming workshops, so tune in for a sneak peek of what's to come from #AR2025!)

  • July 24, 4-5pm EDT: Theme 1: Democratization (Working with it, not against)

  • August 7, 11am-12pm EDT: Theme 2: Collaboration (Learning from market research, data science, customer experience, and more)

  • August 21, 4-5pm EDT: Theme 3: Communication (Innovative techniques for making your voice heard)

  • September 4, 11am-12pm EDT: Theme 4: Methods (Expanding the UXR toolkit beyond interviews)

  • October 2, 11am-12pm EDT: Theme 6: Junctures for UXR (Possible futures and the critical decisions to move us forward)

  • October 16, 4-5pm EDT: Theme 7: Open Call (Propose ideas that don’t match our other workshops’ themes)

Key Insights

  • AI is essential for modeling complex natural systems but tends to generalize towards the center, ignoring critical edge cases where innovation happens.

  • Bias in AI is often unintentional but reflects dominant cultural and power structures, disproportionately harming marginalized groups.

  • Addressing bias requires interdisciplinary collaboration and including voices from impacted communities, especially those historically excluded.

  • There is a scarcity of positive, concrete examples of AI used ethically and effectively, contributing to public fear and skepticism.

  • Ethics and responsibility must be central in AI design, guided by questions about who benefits, who is harmed, and who participates in the process.

  • AI’s promise in scientific research lies in enabling new types of comprehensive analysis and modeling previously impossible for humans alone.

  • Inclusivity efforts in AI can sometimes perpetuate existing power imbalances rather than eliminate them if not critically examined.

  • Bias assessment involves self-reflection on positionality, rigorous questioning, and iterative validation with diverse teams.

  • Human-to-human interaction remains essential to complement AI tools and counterbalance their limitations and biases.

  • Balancing AI’s environmental costs with its potential to solve urgent problems like climate change is a complex but critical discussion.

Notable Quotes

"AI is both absolutely necessary and completely terrifying for science."

"The greatest scientific breakthroughs tend to come from edge cases, which AI tends to ignore."

"AI reflects dominant hegemonic views, creating virtual worlds where counter views do not exist."

"Who was involved in the process? Who benefited? Who was harmed? These are essential questions in AI design."

"Nothing is inherently better because it was produced by human intention or machine learning; interrogate the goal first."

"There is always going to be a power gap in inclusiveness efforts unless we critically question who is missing."

"AI allows us to explore multiple imaginaries and possibilities, expanding how we question and understand the world."

"Bias is constantly evolving; awareness requires trusted human relationships, not just technology validation."

"Sometimes the most ethical and just path for humans is also the most effective for preserving natural systems."

"It's all about balance: being aware of AI’s issues while remaining open to its incredible opportunities."
