Recently it struck me that there is a metaphorical similarity between technological disasters and geological earthquakes. The last 100 or so years have seen a big increase in the amount of damage caused by earthquakes, both to lives and to property. The US Geological Survey maintains a list of the most destructive earthquakes in history, those that caused more than 50,000 deaths. I counted 22 earthquakes. Of those, exactly half occurred since 1900.
So why are these geological phenomena killing so many more now than in the past?
I’m sure a lot of details are hiding somewhere in the platonic fold, but here are a few that come to mind:
- Some external force is causing the earthquakes (oil drilling, fracking, Dr. Evil, …)
- We have more sensitive monitoring equipment (if a tree falls in the forest…)
- Global population is higher - thus there are more people to affect
- More people live closer to seismically active regions
I don’t have the scientific know-how to determine what is causing the earthquakes, and equipment sensitivity just doesn’t belong in a discussion about earthquakes causing more than 50,000 deaths. I vividly remember being violently shaken awake by the Northridge CA quake in 1994 - my sensory equipment was working just fine, thank you very much.
The final two items on that list are related, and I’m going to zero in on the last one: Many people choose to live on fault lines. Why some bungling band of boneheads decided to build a city in an area known for seismic disturbances is a discussion for another time (and another person). David Wald of the US Geological Survey says that these tectonic nightmares are going to occur once every 250 years or so on average. That’s not so bad, right? Well, when you have about 25 such cities, a devastating earthquake will occur about once every 10 years.
“You can name about 25 cities that are like Port-au-Prince. They’re not going to shake but every 250 years [on average]. But if you can name 25 of them, you’re going to have an event like this every 10 years”
David Wald, a seismologist with the U.S. Geological Survey
So the fact that more people than ever have clustered around likely earthquake epicenters is one of the factors that amplifies their destructiveness.
What does this have to do with technology?
Just think back over the last several months. A number of tremors have sent shock waves of fear, uncertainty, and doubt rippling through the techscape. Think of Heartbleed, Target, eBay - or maybe don’t, if you suffer from hypertension. idtheftcenter.org maintains a list of all discovered data breaches. So far this year it has uncovered 335 such breaches. How do you measure the damage caused by cyber-attacks, outages, and digital housekeeping negligence? We’ve had our collective consciousness flooded with images of death and destruction caused by literal earthquakes; a cyber attack can crash through your life like a tsunami and leave you clinging for dear life to a telephone pole. But in the aftermath, nobody sees mud and dried blood caked on your face. The foreclosure sign in front of your house isn’t quite as foreboding as a heaving pile of rubble. And FEMA doesn’t come around passing out water bottles.
Why are these disasters on the rise? For the same reason that earthquakes are killing more people: More and more of the global village is packing up and migrating to the fault line that is communications technology. Those who don’t aren’t affected (as much) by these tremors. Just like you can’t die at sea unless you’re at sea in the first place, you won’t feel the tech-quake if you’re too far away from the epicenter.
A hundred years ago, these types of technological tremors didn’t exist. The fault hadn’t opened up yet. Even by the middle of the 20th century, only governments and universities were beginning to set up a makeshift camp around this curiosity. Fast forward to 2014 and, if you’re reading this article, you’re renting a little hovel and living next door to me here in the ghetto. We’re all in it, huddling together around a fire in a garbage can with a moth-eaten military blanket draped over our shoulders. And for the most part, we’re loving every minute of it. It’s all fun, games, and productivity until the poo hits the propeller - some big incident comes along and rocks the bedrock of our networked world.
Again, why are these tech-tremors occurring with such unnerving frequency? Why is the crust of our software planet so fragile and fractured?
The software industry has made huge strides in the last 60 years, but it’s still a young industry with much to learn - too much to learn. It’s nigh impossible for one person to know it all. Even if it were possible, he or she wouldn’t be infallible. We’ve leveraged the limited brainpower we do have to hide all of the mind-blowing complexity behind layer after layer of abstraction and encapsulation, and to a large extent we simply trust that the object/service/API we’re programming against just works. But the result is that our software and networking infrastructure more resembles the shanties of Port-au-Prince than the spires and colonnades of Rivendell. Seriously, if we could analyze their source code, how many of the apps to which we’ve entrusted so much are little better than prototypes in production?
Our software and networking infrastructure more resembles the shanties of Port-au-Prince than the spires and colonnades of Rivendell.
Automated tests have helped but the tests only catch what we remember to test for. In many cases testing for every possible permutation of input and output is combinatorially impossible. So we do the best that we can, oftentimes with the rod of tight schedules on our backs and the whip of unstable requirements ringing in our ears.
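To make the combinatorial point concrete, here is a toy sketch (the flag-configured routine is hypothetical, not drawn from any real system): a function driven by just a handful of independent boolean options already demands an exhaustive test case for every combination, and that count doubles with each option added.

```python
from itertools import product

def exhaustive_cases(n_flags):
    """Every combination of n independent boolean flags.

    Exhaustively testing a routine configured by n such flags
    means one test case per combination: 2**n cases in total.
    """
    return list(product([False, True], repeat=n_flags))

# 3 flags: 8 cases - easy enough to write by hand.
print(len(exhaustive_cases(3)))

# 20 flags: over a million cases - already impractical to enumerate
# thoughtfully, and a single 64-bit integer input alone has 2**64
# possible values before we even consider combinations of inputs.
print(len(exhaustive_cases(20)))
print(2 ** 64)
```

The flags-only model understates the problem, of course - real inputs are strings, numbers, and timing-dependent states - which is exactly why test suites sample the input space rather than cover it.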
But of course the migration will continue. We will keep clustering closer and closer around the grinding and churning tectonic platters of networked software. So when the next inevitable slip happens and the platter clatters, data splatters, and the industry’s in tatters, be sure that the software you built… was up to code.