The question of whether you own your own face may not be as clear as you might think. Companies are already buying and selling information worldwide based on facial recognition technology. In January of this year I proposed that the United States adopt a constitutional amendment giving each person ownership of his or her information, including facial likenesses and any other biometric data. Now, some U.S. senators think that those gathering your likeness into their databases should first obtain your permission to do so.
Those senators are not alone. In September, Portland, Oregon, passed a sweeping ban on facial recognition technology for both government and businesses. San Francisco, Boston and Oakland have passed bans as well, but only for governmental agencies.
Those supporting such bans have cited racial and gender biases built into the algorithms controlling the technology as a central reason. Beyond this, a California legislator who led the fight to ban such technology for use in conjunction with police body cameras (including passing recordings through facial recognition software after the fact) found out something even more disturbing: the technology, which depends on a variety of algorithms, is woefully inaccurate. The legislator and 25 of his colleagues were misidentified as persons listed in a law enforcement database as having criminal records.
That suggests another question: Would it be okay to deploy facial recognition technology everywhere governments and companies would like to if that technology were, say, 99.9 percent accurate?
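It is worth spelling out the arithmetic hiding in that question. Even a 99.9 percent accurate system, applied to an entire population rather than a handful of suspects, produces a flood of false matches. The sketch below uses assumed, purely illustrative numbers (a hypothetical city of one million people and a 0.1 percent false-positive rate); it is not a claim about any real system.

```python
# Back-of-the-envelope base-rate arithmetic (hypothetical numbers):
# a system that is wrong only 0.1% of the time still flags a large
# absolute number of innocent people when everyone is scanned.

def expected_false_matches(population: int, false_positive_rate: float) -> float:
    """Expected number of innocent people flagged when the whole population is scanned."""
    return population * false_positive_rate

city_population = 1_000_000   # assumed: a city of one million residents
fp_rate = 0.001               # assumed: "99.9% accurate" -> 0.1% false positives

flagged = expected_false_matches(city_population, fp_rate)
print(f"Expected false matches: {flagged:,.0f}")  # prints "Expected false matches: 1,000"
```

In other words, under these assumptions a single citywide scan would wrongly flag on the order of a thousand people, which is part of why "very accurate" is not the same as "harmless."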
There are those who would say, "If you aren't doing anything wrong, then you don't have anything to worry about." There are three problems with this statement. First, there are so many laws—traffic laws, sanitation laws such as laws against spitting in public, seatbelt laws, ordinances about conduct in public places including parks—that all of us are bound to violate some law at some time every day. Have you ever walked across the street at some place other than a designated crosswalk? Do we want the police to be able to hassle anyone at any time for a minor infraction as a way of intimidating the public?
Despite claims that facial recognition technology will be focused on solving major crimes such as murder, rape and robbery, it turns out that it is being used in New York City mostly for low-level crimes. And the technology continues to be error-prone, frequently fingering the wrong person.
The second problem is context. If I hand you a large stack of cash across a table in a café, I could be concluding a drug deal. I might also be buying a used car for cash. I could be paying off a bet (which might be illegal in some states that outlaw gambling). Then again, I might just be a friend giving you a personal loan or paying you back for a loan you gave to me. Algorithms won't necessarily be able to distinguish whether what I'm doing deserves the attention of law enforcement, and I might end up under more intensive surveillance without good cause.
The third problem is that some information will be gathered surreptitiously for business purposes, for instance, to identify me and my activities while in a store. That information will be sent to a database and used to target me for offers and promotions via email, regular mail and online advertising. Ask yourself: If a store had to declare at the front door that you are under surveillance for the purpose of identifying you and observing your behavior while in the store and then using that observed and recorded behavior to target you for its various marketing efforts, would you enter?
Facial recognition, it turns out, is a double-edged sword for law enforcement. The technology is now accessible to activists who want to hold the police accountable. Portland, Oregon, for example, allowed police officers to tape over their names during the recent demonstrations that have rocked the city, so that the officers would not be targeted for reprisals. But aggrieved demonstrators who want to report officers for improper conduct are now seeking to discover their identities through facial recognition systems.
And therein lies the problem. If we don't own our own faces, then others can use them for whatever purposes they choose, not just governments or businesses, but also private individuals. The destiny of so-called surveillance capitalism is to put everyone's face and thus identity up for sale in ways that subject people to increasing manipulation and even physical danger.
However, if I own my own face, it is far less likely to be for sale, and I will have recourse to have it removed from places where I don't want it to be (including surveillance databases). Models and other celebrities have enjoyed this protection for a very long time. Will we all have to sign on with the nearest modeling agency to get it?
P.S. Actually, we all have this protection if we only choose to exercise it. I have wondered for some time if some enterprising civil liberties organization might develop a test case to determine if every person owns his or her own face and can therefore prevent its storage in law enforcement databases when there is no probable cause or in commercial databases for marketing purposes. It has long been the practice of photographers and videographers to get releases from people they shoot regardless of whether those people hold themselves out as models. The only exception to this practice is for bona fide news coverage.
Kurt Cobb is a freelance writer and communications consultant who writes frequently about energy and environment. His work has appeared in The Christian Science Monitor, Resilience, Common Dreams, Naked Capitalism, Le Monde Diplomatique, Oilprice.com, OilVoice, TalkMarkets, Investing.com, Business Insider and many other places. He is the author of an oil-themed novel entitled Prelude and has a widely followed blog called Resource Insights. He is currently a fellow of the Arthur Morgan Institute for Community Solutions. He can be contacted at firstname.lastname@example.org.