Summary
The Internet and Web have reached a tipping point. We’re now witnessing the surfacing of harmful patterns and norms that we designed—often unintentionally—into our products, services, and communities, and into the world we live in. Designers who work in the enterprise are, like their peers in startups and big dotcoms, both vulnerable and culpable, and they need to consider some big questions: How well do we manage our data? How inclusive are our development practices? How broadly and deeply do we think about the impact of what we build and deploy before we scale it for our customer base? We need to move forward with intent. We need to govern our digital spaces. A necessary first step toward that goal is for designers to examine—with honesty and introspection—our role in the creation of what’s online. The World Wide Web is nothing more than the accumulation of what digital makers have put there. We made this mess, and we need to talk about how we are going to clean it up. Digital governance expert Lisa Welchman will reflect on how 25 years of passionate and agile web development got us where we are today, and on the consequences of the digital maker community’s lack of self-governance. She will show us a path forward from this mess, outlining questions we can ask and steps we can take to better govern what we have created and what we will create in the future.
Key Insights
- Digital governance is fundamentally about decision making and organizational responsibility, not just tools or workflows.
- Many digital governance failures stem from unclear ownership of strategy, policy, and standards within organizations.
- Collaborative governance involves multiple levels: core strategy teams, distributed content makers, working groups, and community contributors.
- External vendors often deepen digital silos if not properly integrated into governance frameworks.
- Governance can be designed to enable speed and innovation, not just control or restriction.
- The internet and digital technologies undergo a lengthy maturation cycle similar to historic technologies like automobiles.
- Algorithmic biases often reflect organizational biases; fixing algorithms requires fixing institutions.
- Proactive digital safety can be conceptualized like crash-test dummies for online systems, focusing on inclusivity, morality, and safety.
- Participation in internet and web governance organizations like W3C or the Internet Society is crucial but underutilized by digital professionals.
- Generosity and sharing cultures, as exemplified by the development of the three-point seatbelt, are critical for progressing digital governance.
Notable Quotes
"People can have the same values and ideas but if you don’t tune them properly, you just don’t get what you want."
"Digital governance is about who’s supposed to make the decision, not what the decision is."
"Governance isn’t the byproduct of a project; digital is a system you have to design and iterate continuously."
"You can’t expect people to comply with standards if you don’t know who they are."
"Your external vendors may not have your organizational best interests at heart because it’s not their business model."
"Governance frameworks can facilitate whatever pace or style of work an organization wants."
"Every bad thing that can happen in the real world can now happen on the internet — and every good thing too."
"Human biases are the real problem behind algorithmic bias because organizations embed those biases first."
"We are the fix — everything online is made by people, so we can change it together."
"Governance participation isn’t optional if you want to avoid reactive impositions down the line."