Sunday, April 15, 2018

Fake news, algorithmic sentinels, and facts from the future

The suggestion that social media outlets need to police so-called "fake news" rings true on its face. Who wants to read news coverage known to be false? But what rates as "fake news" will be harder to define than we think.

And, putting algorithms in charge of policing those vast information flows claiming to be news will almost certainly not solve the problem. In a piece reflecting on artificial intelligence (AI) on the 50th anniversary of the release of the film "2001: A Space Odyssey," writer Michael Benson tells us that "[d]emocracy depends on a shared consensual reality."

Well, actually, everything we do in groups, whether it's democracy or going to a hockey game, depends on shared consensual reality. And, therein lies the problem. We are now in a fight not over opinions concerning the import of agreed-upon facts, but over the consensus itself—whether scientific findings can be trusted, whether corporate-owned media can be believed, whether "objective" reporting is even possible, whether the history we were taught is indeed the "true" history of our country and our world.

Which consensus prevails will be crucial to every facet of our society. It is true that consensus views are constantly being challenged by events. To the extent that events can be fit into consensus views, the consensus can survive. In fact, the consensus can be tweaked when necessary. The idea that free trade is always good has been tweaked in the past to admit that it is not good for everyone and that those who lose their jobs need special assistance. The consensus survived and free trade agreements continued to flourish.

Now, the consensus is vanishing. Large parts of society do not believe that the current system serves them well. Wealth is being shifted up the income ladder as middle- and low-income families find their wages stagnant or declining. Wars that purport to defend America and Europe from terrorism don't seem to have any definable benefit (other than to military contractors). Rural areas are neglected as large cities grow wealthier. Free trade seems only to have devastated formerly prosperous industrial cities in wealthy countries.

As the declining personal circumstances for so many people pound down on our frayed consensus and those people seek answers for their plight, they are prone to believe even wild tales concocted by commentators and bloggers to explain the baffling decline. Whether people see through these tales depends on their access to other views and their willingness to check things out for themselves.

Returning to Michael Benson's piece for a moment, he writes: "We still have it in our power to purge malicious abuse of these systems, but Facebook, Twitter, YouTube and others would need to plow much more money into policing their networks — perhaps by themselves deploying countermeasures based on A.I. algorithms." Sounds like a solution. But is it?

I wonder what will count as "fake news" in these algorithms. My special interest is in sustainability-related claims. Will statements that human societies are facing demonstrable limits in available resources be allowed through the news firewall of tech titans who believe we have unlimited growth ahead? (Check out the singularity idea.) Will suggestions that we must move toward an economy that uses less, not more, resources be filtered out? Will analyses which show links between climate change and disruptions of agriculture through flooding and drought be rejected?

If you believe this is an unwarranted concern, you need look no further than ThinkProgress—publishers of ClimateProgress, an outstanding site covering climate change news—which is being blocked from receiving advertising because its coverage is deemed "too controversial" by online advertising networks.

Now, here are a few samples of what I believe will not get filtered: 1) Straight line projections of world economic growth through 2050, 2) population projections through the same year, and 3) the notion that "humans will be living and working on Mars in colonies entirely independent of Earth by the 2030s."

All of these things are stated as facts even though they lie far in the future. I'm putting them in a special category which I call "future facts." These are claims which cannot possibly fit the category of facts since they have not happened. And yet, the media and many others treat these projections as if they are simply facts.

Now, it turns out that anything we call a fact does not actually stand alone. It stands in relationship to many other facts which we must accept first. Climate change is a fact only because it stands in relationship to literally millions of other facts that have been meticulously observed and verified by methods accepted by the scientific community worldwide. We should not pull an isolated "fact" out of its context. And yet, that is what happens every day in our daily discourse and in our media.

What I'm getting at is that the superficial way in which many people evaluate facts fails to recognize that facts are contingent and not absolute. That doesn't make facts less useful. But, it should make all of us much more careful about what we accept as facts.

As I'm looking at my comparison of things which might get filtered versus those which I believe most assuredly will not get filtered, I'm asking myself which set of so-called "facts" looks more like "fake news."

Definitions of "fake news" depend entirely on context and assumptions. In other words, they are contingent. I'm not giving up on distinguishing fact from fiction. What I'm calling for is a careful look at big pronouncements about the future of humankind and the biosphere and the context and assumptions behind such pronouncements: Are such pronouncements mere extrapolations of present trajectories? Are they mere statements of faith about the future?

This kind of analysis is demanding. It requires some digging and above all a lot of thinking. But that's what most of those who are feeding us what passes for news are hoping we won't do. And, it's doubtful that any algorithm can do the sorting and analysis for us without excluding a lot of real news about our climate and environmental predicament.

Kurt Cobb is a freelance writer and communications consultant who writes frequently about energy and environment. His work has appeared in The Christian Science Monitor, Resilience, Common Dreams, Le Monde Diplomatique, Oilprice.com, OilVoice, TalkMarkets, Investing.com, Business Insider and many other places. He is the author of an oil-themed novel entitled Prelude and has a widely followed blog called Resource Insights. He is currently a fellow of the Arthur Morgan Institute for Community Solutions. He can be contacted at kurtcobb2001@yahoo.com.