Are you breeding snakes?
We love machine metaphors.
We wish life could be neat like that. Press a button, or pull a lever, or flip a switch … and something predictable happens. Something turns on or off, goes left or right, begins or stops.
Clean, predictable, and safe.
There’s little at risk, because it does just what it promised it would.
And then, beyond turning on the light or heating up the oven or typing on the keyboard, there’s the rest of messy life.
You do something involving people and the system you’re in … and all kinds of complications follow.
It starts well enough. After you’ve taken some action, there’s some minor variation, plus or minus, on the expected outcome. It’s not exactly what you thought might happen. But it’s close-ish.
But then comes a whole grab bag of unintended consequences.
The “cobra effect” is a story, maybe true and maybe not, that’s sometimes told to make the point. When India was under British rule, a bounty was offered for every dead cobra.
That worked for a bit, until it didn’t. The Indians, seeing a good chance to make some easy money, started breeding cobras to keep up the supply.
When the British realized they were being fleeced, they stopped paying the bounty. The Indians, now with an excess of stock, released all the snakes they’d bred.
The cobra problem ended up worse, not better. (It’s but one example of many.)
We tend to focus hard on the intended consequences, and ignore for as long as possible the unintended consequences.
The question that I ask to imagine the unseen disaster is this: What’s the opposite outcome I’m hoping for? And how might the thing I’m trying to do here lead to that very outcome?
It’s the flip side of Roger Martin’s famous strategic question, “What needs to be true for this to happen?” In both cases, you start at the end point and, with curiosity and bemusement, work back.