Summary
The Internet and Web have reached a tipping point. We’re now witnessing the surfacing of harmful patterns and norms that we designed—often unintentionally—into our products, services, and communities, and the world we live in.

Designers who work in the enterprise are, like their peers in startups and big dotcoms, both vulnerable and culpable, and they need to consider some big questions: How well do we manage our data? How inclusive are our development practices? How broadly and deeply do we think about the impact of what we build and deploy before we scale it for our customer base?

We need to move forward with intent. We need to govern our digital spaces. A necessary first step towards that goal involves designers examining—with honesty and introspection—our role in the creation of what’s online. The World Wide Web is nothing more than the accumulation of what digital makers have put there. We made this mess, and we need to talk about how we are going to clean it up.

Digital governance expert Lisa Welchman will reflect on how 25 years of passionate and agile web development got us where we are today, and on the consequences of the digital maker community’s lack of self-governance. She will show us a path forward from this mess, outlining questions we can ask and steps we can take to better govern what we have created and what we will create in the future.
Key Insights
- Digital governance is fundamentally about decision making and organizational responsibility, not just tools or workflows.
- Many digital governance failures stem from unclear ownership of strategy, policy, and standards within organizations.
- Collaborative governance involves multiple levels: core strategy teams, distributed content makers, working groups, and community contributors.
- External vendors often deepen digital silos if not properly integrated into governance frameworks.
- Governance can be designed to enable speed and innovation, not just control or restriction.
- The internet and digital technologies undergo a lengthy maturation cycle similar to historic technologies like the automobile.
- Algorithmic biases often reflect organizational biases; fixing algorithms requires fixing institutions.
- Proactive digital safety can be conceptualized like crash-test dummies for online systems, focusing on inclusivity, morality, and safety.
- Participation in internet and web governance organizations like the W3C or the Internet Society is crucial but underutilized by digital professionals.
- Cultures of generosity and sharing, as exemplified by the development of the three-point seatbelt, are critical for progressing digital governance.
Notable Quotes
"People can have the same values and ideas but if you don’t tune them properly, you just don’t get what you want."
"Digital governance is about who’s supposed to make the decision, not what the decision is."
"Governance isn’t the byproduct of a project; digital is a system you have to design and iterate continuously."
"You can’t expect people to comply with standards if you don’t know who they are."
"Your external vendors may not have your organizational best interests at heart because it’s not their business model."
"Governance frameworks can facilitate whatever pace or style of work an organization wants."
"Every bad thing that can happen in the real world can now happen on the internet — and every good thing too."
"Human biases are the real problem behind algorithmic bias because organizations embed those biases first."
"We are the fix — everything online is made by people, so we can change it together."
"Governance participation isn’t optional if you want to avoid reactive impositions down the line."