The vast quantities of information now available require some kind of filtering, and so various filtering services including news aggregators, weblogs, and specialty sites of all kinds have arisen. All of that is to the good. True, much of the information on the Internet is of questionable veracity. And, much of what passes for information not only on the Internet, but also in the broader media is nothing more than polemic dressed up as analysis. And, of course, the sheer volume of it all would be overwhelming were it not mitigated by the available filters or by simply turning away from the computer, the television and the radio.
But so many people cannot or will not turn away for any extended period of time. Instead, they believe they need to be "updated" on a regular basis. I put "updated" in quotes since to me the news seems more or less the same every day with a few widely spaced and prominent exceptions. It is these exceptions that I pay attention to. But most stories fit into rather predictable categories which I label as follows:
- Prices are going up (or, more rarely, down).
- There's corruption in government. (Who knew?)
- The corporations are out to get us.
- It's dangerous out there. (Crime stories)
- Isn't that weird? (Human interest stories)
- GI Joe. (War coverage)
- How to lose 10 pounds without dieting. (Service stories)
Perhaps you can think of other categories. And, while stories in some of these categories are indeed important, they rarely provide the context or the intelligent analysis required to make them useful. Crime stories, for instance, are usually just sensationalism designed to attract subscribers and viewers.
Putting into the proper context what information we actually do need for something other than aiding and abetting our consumption--for, say, understanding public policy--requires conceptual training that can only come from reading well-written books and articles and engaging with other rigorous minds who challenge our own point of view. That is a much slower training process, and it will never occur at Internet speeds.
Environmental education giant David Orr likes to say that what we lack is "slow" knowledge. It is easy to learn how to take down a whole forest with a chainsaw. That's fast knowledge. But as I wrote in a previous post:
> Teaching people the importance of trees in creating and protecting the soil, encouraging biodiversity, preventing runoff, storing carbon and influencing climate is a task that requires time, concentration and reflection. It assumes a body of knowledge about the natural world that most people simply don't have and therefore must acquire. And, it assumes an eye trained to look for subtleties in the natural landscape. Moreover, such learning does not yield the immediate and visible economic benefits of the chainsaw.
But even if we take the time to acquire the slow knowledge we need, we cannot solve the knowledge problem with more information. The world is too complex to comprehend by merely apprehending its parts. And, no human being can see all of the universe or even his or her part of it well enough to give anything but a very fragmentary account. We will always have huge areas of ignorance, particularly about the long-term consequences of the actions we take to reshape the ecosphere to our purposes.
And, even where we believe we have a lot of information--for example, the confident predictions about world oil and natural gas reserves or about the amount of uranium that can be extracted from the Earth's crust--we ought to look not to what we know for confirmation, but to what we don't know for guidance regarding the risks we face. Orr suggests that those lacunae in our knowledge should prompt us to employ wide margins of safety both in our daily actions and even more so in our collective policies.
It is possible, for example, that the optimistic estimates of the world's energy supplies are correct. If they are, business as usual could proceed for a few more decades, during which we could take a very leisurely attitude toward the transition to a new energy economy. (I am, of course, setting aside the very serious risks related to climate change in this illustration.) The consequences of being wrong, however, could include catastrophic collapse. Hence, Orr's suggestion that we employ wide margins of safety when acting on what we think we know.
The hubris of the information society is that it imagines that data matter more than understanding and that we are moving closer and closer every day to completing the book of knowledge. The truth is we are creating vast new areas of ignorance. Two examples, one domestic and one industrial, illustrate the problem. Our highly productive modern farming and food production system has allowed the vast majority of people to forgo learning which plants in their immediate area are edible. And, since public policy in the United States (but no longer in Europe) puts the onus on the public to prove that a new chemical is harmful before it is banned (rather than putting the onus on industry to prove it is safe), industry releases thousands of new chemicals each year into the environment ignorant of their possible negative effects on humans and on ecosystems.
The most important first step in countering this trend is to recognize it and to act with the heightened sense of attentiveness, care and prudence which that recognition demands.