Move Fast, Break Things — And Be Accountable For It
What I learned about mistakes, power, and responsibility in tech.
One of my favorite photos comes from a friend of mine who used to work at Facebook — the company that coined the phrase “move fast and break things.” The photo? A new sign that read: Slow down and fix your shit.
That about sums up where we are.
“Move fast and break things” has always struck me as obnoxiously aggressive. At Google, we had a different phrase: launch and iterate. That’s a better way to capture the good part of this idea — that if you’re too afraid of making a mistake, you can’t innovate. You can’t fix things.
It needs to be okay to make a mistake; you can’t innovate if it isn’t. It even needs to be okay to admit mistakes in situations where making them is really not okay, like hospitals. That’s what’s behind Amy Edmondson’s work on psychological safety: if you can’t admit a mistake, you’re doomed to repeat it. Paradoxically, she found that the hospitals where the most mistakes were reported were also the safest, which was not at all what she expected.
Launch and iterate works when you’re refining search results. It does not work at a nuclear power plant. You don’t launch and iterate on nuclear safety. Or in hospitals. Or, for that matter, when you are laying off thousands of people or dismantling public health infrastructure.
These mistakes have consequences. Real ones. Human ones.
And even when the consequences of failure are not that dire, launch and iterate isn’t always the right approach. Apple, for instance, was never a launch-and-iterate kind of culture, because they were making hardware; you couldn’t just push out a fix. Apple was much more of a measure-a-hundred-times, cut-once kind of culture.
I’ve spent a lot of time reflecting on my career in tech — and on what has gone wrong in recent years. I have to go all the way back to my first job out of business school, at the FCC in 1996, when the Telecommunications Act passed. That’s where Section 230 was born: the provision that lets tech platforms off the hook for accountability when it comes to the content they host and how they moderate it.
At the time, it made a certain amount of sense. These companies didn’t even exist yet — Google hadn’t been founded; Zuckerberg was in middle school. But in retrospect, we wrote a blank check for all time. And now the check is coming due.
Mistakes made by tech companies today don’t just “break things.” They break people. They break democracy.
Facebook’s own research has shown that their algorithm promotes polarizing content because it gets more engagement. Engagement is what drives their business. People tend to engage more with fear, outrage, and FOMO — fear of missing out. The result? More eating disorders. More political polarization. Genocide planned on their platform.
And we, the public, bear the cost.
It’s very hard to measure the value of a well-functioning society. But I promise you — it matters more than engagement metrics.
This is not theoretical for me. At Google, I managed AdSense. I was responsible both for growing revenue and for policy enforcement — taking down sites that violated our policies. I believed — and still believe — that you weed your garden. If you let your garden get overrun by weeds because you’re chasing growth at all costs, you lose the garden.
But here’s what I learned: It’s not enough for leaders inside companies to debate these issues privately. These decisions are too important. There needs to be democratic oversight. There needs to be public debate.
I’ll never forget when my friend Josh Cohen told me, after I described a tough content moderation decision at Google: You are making those decisions? At first I was offended. But he was right. Decisions that affect the entire fabric of our society shouldn’t be made behind closed doors in conference rooms at tech companies.
Yet here we are. And the consequences are everywhere.
Slow down; fix shit.
I want to be clear — I’m not here to bash Facebook or lionize Google. I know good people at both companies who care deeply about these issues. But as long as the market rewards short-term earnings, as long as companies operate without accountability, as long as engagement tops everything — we are headed in the wrong direction.
“Move fast and break things” might have made sense when these companies were small. When Facebook’s mistakes didn’t really matter because Facebook didn’t really matter.
But that is not the world we live in now.
It is time to slow down. Fix our shit. And start paying the price for what’s been broken.
Radical Respect is a weekly newsletter I am publishing on LinkedIn to highlight some of the things that get in the way of creating a collaborative, respectful working environment. A healthy organization is not merely an absence of unpleasant symptoms. Creating a just working environment is about eliminating bad behavior and reinforcing collaborative, respectful behavior. Each week I’ll offer tips on how to do that so you can create a workplace where everyone feels supported and respected. Learn more in my new book Radical Respect, available wherever books are sold! You can also follow Radical Candor® and the Radical Candor Podcast for more tips about building better relationships at work.