Summary
The Internet and the Web have reached a tipping point. We’re now witnessing the surfacing of harmful patterns and norms that we designed—often unintentionally—into our products, services, and communities, and into the world we live in. Designers who work in the enterprise are, like their peers in startups and big dotcoms, both vulnerable and culpable, and they need to consider some big questions: How well do we manage our data? How inclusive are our development practices? How broadly and deeply do we think about the impact of what we build and deploy before we scale it to our customer base? We need to move forward with intent. We need to govern our digital spaces. A necessary first step toward that goal is for designers to examine—with honesty and introspection—our role in the creation of what’s online. The World Wide Web is nothing more than the accumulation of what digital makers have put there. We made this mess, and we need to talk about how we are going to clean it up.
Digital governance expert Lisa Welchman will reflect on how 25 years of passionate and agile web development got us where we are today, and on the consequences of the digital maker community’s lack of self-governance. She will show us a path forward from this mess, outlining questions we can ask and steps we can take to better govern what we have created and what we will create in the future.
Key Insights
• Digital governance is fundamentally about decision making and organizational responsibility, not just tools or workflows.
• Many digital governance failures stem from unclear ownership of strategy, policy, and standards within organizations.
• Collaborative governance involves multiple levels: core strategy teams, distributed content makers, working groups, and community contributors.
• External vendors often deepen digital silos if not properly integrated into governance frameworks.
• Governance can be designed to enable speed and innovation, not just control or restriction.
• The internet and digital technologies undergo a lengthy maturation cycle similar to historic technologies like automobiles.
• Algorithmic biases often reflect organizational biases; fixing algorithms requires fixing institutions.
• Proactive digital safety can be conceptualized like crash-test dummies for online systems, focusing on inclusivity, morality, and safety.
• Participation in internet and web governance organizations like the W3C or the Internet Society is crucial but underutilized by digital professionals.
• Generosity and sharing cultures, as exemplified by the development of the three-point seatbelt, are critical for progressing digital governance.
Notable Quotes
"People can have the same values and ideas but if you don’t tune them properly, you just don’t get what you want."
"Digital governance is about who’s supposed to make the decision, not what the decision is."
"Governance isn’t the byproduct of a project; digital is a system you have to design and iterate continuously."
"You can’t expect people to comply with standards if you don’t know who they are."
"Your external vendors may not have your organizational best interests at heart because it’s not their business model."
"Governance frameworks can facilitate whatever pace or style of work an organization wants."
"Every bad thing that can happen in the real world can now happen on the internet — and every good thing too."
"Human biases are the real problem behind algorithmic bias because organizations embed those biases first."
"We are the fix — everything online is made by people, so we can change it together."
"Governance participation isn’t optional if you want to avoid reactive impositions down the line."