To bridge the gap between now and then, the 63-year-old Kurzweil downs 200 pills a day consisting of various herbs, vitamins, and other supplements to "reprogram" his body's biochemistry and improve his chances of reaching what he calls the "singularity," a time after which technological change will occur at a pace so fast that the only way we will be able to understand it is to merge with our machines. Humans will at that point become human-machine hybrids.
(It would not matter much what Kurzweil thinks were it not for his globetrotting speaking tours and widely read books, which have influenced much of the world's elite, who seem similarly bereft of a suitable education in the relevant sciences. Moreover, his meta-message seems to be that we should just sit back and let technological geniuses like him fix every problem, including climate change and resource depletion.)
What Kurzweil misses is that humans became human-machine hybrids with the first stone spear tip, and that the results of our marriage with tools have been mixed. Not to worry, Kurzweil tells us in the film, "technology has been the only thing that's enabled us to overcome problems." There's not a hint of recognition that technological solutions have a habit of spawning new problems. There's not a hint of recognition that as we catapult into the digital and biotech ages, we are actually losing basic knowledge about how to interact with the Earth around us in ways not dependent on fragile, hypercomplex industrial systems.
The ultimate expression of Kurzweil's vision is his desire to resurrect his beloved father from the grave using technology that will supposedly become available by the time the singularity arrives. It reminds me a bit of Jurassic Park, the Michael Crichton novel depicting the resurrection of dinosaurs from DNA preserved in ancient mosquito blood. It seems the premature death of Kurzweil's father continues to be a cause of genuine heartache for him to this day. Who wouldn't want to bring back deceased loved ones so as to enjoy their company again?
Kurzweil's misguided lunacy is summarized by a neuroscientist interviewed late in the film. Kurzweil has misunderstood death as a technological problem with a potential technological solution when, in fact, it is a spiritual problem without any technological solution. Can human life be extended by technology? Of course. Can the quality of human life be improved in old age by technology? Of course. Can death be avoided altogether by technology? Of course not.
Still, Kurzweil insists on the website for his book The Singularity Is Near: When Humans Transcend Biology that death will be overcome:
We will be able to assume different bodies and take on a range of personae at will. In practical terms, human aging and illness will be reversed; pollution will be stopped; world hunger and poverty will be solved. Nanotechnology will make it possible to create virtually any physical product using inexpensive information processes and will ultimately turn even death into a soluble problem.
The film makes a point of emphasizing Kurzweil's faith in the continued exponential growth in the power of information technologies. Whereas a competent ecologist rightly fears the results of exponential growth, Kurzweil embraces it as the solution to everything. And, he extrapolates the rapid change we've seen in IT to nearly every aspect of our society without any discussion of the physical speed limits that the environment and society place on such change. He predicts that solar power--which he regards as part of information technology--will expand so fast in the next 20 years that it will supply all of our energy needs. It's a nice thought.
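Projections of this kind rest on simple doubling arithmetic: if something doubles every couple of years, eight doublings multiply it 256-fold. Here is a minimal sketch of that extrapolation; the starting share of 0.4 percent and the two-year doubling period are illustrative assumptions, not figures from the film:

```python
import math

# Illustrative sketch of Kurzweil-style exponential extrapolation.
# The starting share and doubling period below are assumptions
# chosen for illustration, not data from Kurzweil or the film.

def doublings_needed(start_share: float, target_share: float = 1.0) -> int:
    """Number of doublings for start_share to reach target_share."""
    return math.ceil(math.log2(target_share / start_share))

# Assume solar supplies 0.4% of world energy and doubles every 2 years.
n = doublings_needed(0.004)   # 0.004 * 2**8 > 1.0, so 8 doublings
years = n * 2                 # 16 years on a 2-year doubling period

print(n, years)  # -> 8 16
```

The arithmetic itself is trivial, which is exactly the point of the paragraphs above: the curve says nothing about where the materials, factories, and installers come from.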
But the physical constraints of extracting the minerals needed--rare elements are currently essential--are not considered. Nor is the needed ramp-up of manufacturing and installation capacity, assuming the necessary materials were even available. Nanoengineered materials that have yet to be developed will supposedly make all this possible. Naturally, there won't be any negative, unintended side effects.
Kurzweil's view really depends on the concept that we can "manage" the biosphere and even the universe through information technology, and that assumes that we can input all the knowledge necessary to do this "managing" into that technology. Never does it occur to him that some or most of that knowledge is still unknown and that perhaps most of it could never be digitized in a way that would give us complete mastery over the universe.
And that is in the end what Kurzweil is really about. His views are the logical extension of Enlightenment ideas that rational thought will ultimately lead to complete mastery of all physical processes and even the ability to transcend death. He predicts that we will send nanotechnology-based probes into the universe to harness its materials and to infuse it with "intelligence." He states flatly:
In the future everything will become intelligent. Nanobots will infuse all the matter around us with information. Rocks, trees, everything will become these intelligent computers. So at that point we are going to expand out into the universe...The universe will wake up. It will become intelligent, and that will multiply our intelligence trillions of trillions-fold.
(I've always thought that the universe was pretty intelligent without our help, and that it is we who must learn from it.) To accept Kurzweil's view, one must throw out much of what we know about biology, ecology and evolution. Perhaps he knows something about settled facts in those disciplines that the rest of us don't. More likely, he doesn't.
The "information" in information technology is information as we humans with our limited powers of perception define it. And our power to gather that information and categorize it usefully has limits as well. Naturally, Kurzweil assumes that the machines of tomorrow will do that for us and better. But he assumes that humans know what they are talking about when it comes to the workings of a vast universe. And, he assumes that we can never run out of the resources or energy to make his future happen.
Our history belies that. Everything we do has unforeseen, hidden and occasionally devastating consequences because of our puny understanding. Complex societies have disappeared before, and history tells us that they will again.
To imagine otherwise is to live in the same virtual reality that appears to inform Ray Kurzweil. To assume we humans have no limits is a dangerous mind game that the ancient Greeks long ago recognized as hubris. And, hubris, as everyone knows, is always followed by nemesis.
3 comments:
I first read about the so-called singularity only last year, coming across his site by way of an ecology blog. I honestly thought, at first, that the site was based on satire. As I read further, I slowly came to realise that he was serious. Meanwhile the only image I could conjure in my mind was that of a virus.
We can't live respectfully within our present environments on the good spaceship earth. Imagine us mindlessly mining the universe and inhabiting it with hybrid humans whose sole/primary capacity is to become supremely efficient at extraction and consumption. The Borg (Star Trek critters) look absolutely benign compared to the singularity.
I fear the man is obsessed with demise. His own. As with all the obsessed, his vision is limited and his understanding warped. Alas, it turns out he is only human after all.
He is a genius; however, he is not immune to magical thinking. He did create a great digitally sampled piano which captured the nuances of the various changes of attack and decay depending on how hard you hit a key. He did this before anyone else.
You are assaulting Kurzweil in a very arrogant manner, emphasizing how stupid he is compared to your sober view.
I don't see him pretending to be an all-knowing analyst, but rather a fiction writer and philosopher, where provocative thinking is very acceptable.
(Think of Jules Verne.. a launch to the moon from a cannon.. how stupid he must be to write it.)
The singularity is a conceptual category - technology driven by technology. Kurzweil himself has explained that:
1) technology develops (is developed) regardless of whether we individually try to stimulate it or block it.
2) it may as well never happen, because of a collapse of civilization or technology, or just plain laws of nature.
3) if it happens - only a few scenarios are relatively happy for then-living humans.
I found it thought-provoking.