Whenever I hear about a new technology that "empowers the individual," I know that one thing is likely to be true about it: It will soon (if not already) be turned to negative and harmful ends. And yet, we as a society keep falling for the line that every new technology will somehow give us more control over our lives and make us happier, more connected, safer and more powerful (but only in a good way).
It's true that practically any technology can be turned toward harmful ends; we haven't banned knives simply because they can be used both to cut food and to kill people. But it is the scale of damage that can be done by an individual that is changing.
Newspaper columnist Molly Ivins used to joke that she was not anti-gun, but pro-knife. In a 1993 column she wrote:
In the first place, you have to catch up with someone in order to stab him. A general substitution of knives for guns would promote physical fitness. We'd turn into a whole nation of great runners. Plus, knives don't ricochet. And people are seldom killed while cleaning their knives.
Ivins was getting at the increased scale of damage that can be done by, say, automatic weapons versus a knife.
Guns have been around for centuries and have been made more lethal over time. But their lethality may someday soon seem quaint given the future of "empowerment" that awaits us.
I start with unmanned aerial vehicles, which are more familiar to us as drones. Their initial use case was actually as toys: remote-controlled model airplanes for which there remains robust demand among hobbyists. How innocent all that seems compared to the killer drones now deployed by militaries around the world! Hardly a week goes by without a report about what is called a "drone strike." Last week was no exception.
As terrible as the power to kill individuals or groups remotely from thousands of miles away seems, more terrible still is the evolution of military drones toward autonomous attacks without any contemporaneous human supervision. The ultimate expression of this evolution comes from a chilling short video seemingly depicting a sales presentation for so-called "slaughterbots," cheap, small, artificial intelligence (AI)-enabled drones that can be programmed to seek out and kill specific individuals or large groups. The video is, of course, fictional, but it is not science fiction according to Stuart Russell, the computer science professor at the University of California, Berkeley who helped create it. The technology to make this type of drone a reality is already available.
To take things one step further, imagine a private individual, not affiliated with any military or police organization, who wants to kill a rival, a spouse, or people of a race or ethnicity he doesn't like. He may soon be able to outfit a drone to do his mayhem for him while never getting near the site of the murder (or mass murder, as the case may be).
The combined lethality of drones and AI, along with the ubiquity of cheap drones, may turn what is also a convenient way to deliver goods, rescue people, take aerial photos or send aid to remote areas into a cause for constant surveillance of everyone, just to make sure no one launches a private drone strike.
Another example of "empowerment" now portrayed as a "hobby" could lead to catastrophic consequences, whether intentional or unintentional. For some time, genetic engineering kits have been available online for do-it-yourselfers. But what exactly will those do-it-yourselfers do with them? Making yeast glow, as one kit allows, seems harmless.
The technology, however, could certainly be adapted to other forms of life; viruses come to mind. Playing around with viruses for fun could get tricky, and no one is selling hobby kits to do that (none that I can find, anyway). But if a virus hobbyist kit arrives, it will be hard to distinguish ahead of time those simply seeking entertainment from those hoping to be dangerous. The ability to make dangerous designer viruses has been around for a while now, and the consequences could be civilization-destroying. What advances in the life sciences could be worth risking that? The question is almost never asked.
"The technology and economics of large-scale DNA synthesis have driven the cost of gene synthesis down approximately 250-fold in just 10 years," according to this 2018 research article. As it becomes even cheaper to engage in what is called synthetic biology, more people will have access to it—and not necessarily well-intentioned ones.
Our lust for the new and the advanced is multiplying the systemic and catastrophic risks we face as a global society. In my view, if we as a species want to survive the century, we must do the unthinkable: Abandon technologies that pose the risk of systemic ruin or, at the very least, severely restrict and monitor their use. One group is calling for a global ban on autonomous weapons. Regulating synthetic biology will be tricky, as laid out in this piece.
Finally, it is important to realize that threats from various novel technologies do not stand in isolation. These threats can be combined to compound their dangers. I'm imagining drones that simultaneously deliver a lethal designer virus to a group of cities targeted by an adversary, whether a country or a non-state group. You can bet that someone else is imagining that, too!
Kurt Cobb is a freelance writer and communications consultant who writes frequently about energy and environment. His work has appeared in The Christian Science Monitor, Resilience, Common Dreams, Naked Capitalism, Le Monde Diplomatique, Oilprice.com, OilVoice, TalkMarkets, Investing.com, Business Insider and many other places. He is the author of an oil-themed novel entitled Prelude and has a widely followed blog called Resource Insights. He can be contacted at kurtcobb2001@yahoo.com.
1 comment:
You may be unaware of your own recency bias. It's only human to think we know the breadth of all history (which we cannot) and to then make assumptions followed by assertions that are incorrect. Those who do not learn from history are destined to repeat it, ad infinitum. Such is the condition of all people: we are merely looking through a keyhole at an entire universe of human history. In this piece you tend toward focusing concern on issues that you think belong only to this moment in history, which may be contributing to the unhappiness your father warned you about in your 2014 piece on the subject of writing hand-wringing material as you sometimes do (a good read, btw). Perhaps most obviously laced with recency bias is your assertion that the original use case for drones was hobbyist in nature. JFK's brother died executing a WWII drone strike against the Nazis during Project Aphrodite, which was certainly not hobbyist in nature. For reference: https://airandspace.si.edu/stories/editorial/remembering-death-lt-joe-kennedy-jr-and-americas-first-combat-drones