Celebrate the Friction: Dark Facebook, like Dark Google, stands tall in the lineup of new absolutist information powers that imagine us, their users, as passive sheep so hungry for their services that we’ll acquiesce to any manipulation. But worldwide outrage over Facebook’s secret “research” into emotional contagion is a sign that we are not passive and will not quietly accept a narcotized robot world owned and operated by Big Tech. This friction is cause for celebration. The absolutism and concentration of information power represented by Facebook, Google, and other tech companies will not stand without a contest. We are in the realm of politics now, and that’s exactly where we should be.

There are many reasons for outrage in this stew, and the experiment itself is only one. The experiment does more than violate well-established principles of human subjects research. It provides a window into Facebook’s absolutist assumptions and how those are expressed in its daily policies and practices. Despite its history of apologizing for privacy violations, Facebook once again appears to regard its users as so many anonymous eyeballs and revenue streams that will happily succumb to any manipulation as long as they get to play online. But the more we learn about the true price we pay for the ‘free’ services of Facebook, Google, and the rest, the more we contest their legitimacy. This contest is the key to a future in which the networked world is compatible with principles of democracy and individual rights.

The Experiment: Facebook’s internal research board signed off on research in 2012 that aimed to test for mechanisms of emotional contagion through online text. The research systematically manipulated the emotional content of daily feeds without users’ knowledge, let alone their informed consent. To do this, the researchers varied the number of posts with positive or negative words in the daily feeds of nearly 700,000 users for one week. They found that exposure to more positive words produced more positive words in users’ own communications, and exposure to more negative words produced more negative ones. They drew the inference that their manipulation of the content of daily feeds affected the psychological states of users: their moods and emotions. Ergo, “emotional contagion.” You can read the article in the Proceedings of the National Academy of Sciences here.
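To make the design concrete, here is a minimal sketch of the manipulation’s logic in Python. Everything in it is my own illustrative stand-in: the tiny word lists (the study scored emotion with the LIWC word-count software), the flat 50% omission rate (the study varied omission probabilities per user), and all of the function names. It is not Facebook’s code; it only shows the shape of the intervention and the outcome measure.

```python
# Hypothetical sketch of the experiment's logic -- not Facebook's actual code.
import random

POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}   # stand-in lexicon
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}      # stand-in lexicon

def contains_words(post: str, lexicon: set) -> bool:
    """True if the post contains any word from the lexicon."""
    return any(word in lexicon for word in post.lower().split())

def filter_feed(posts: list, suppress: set, omission_rate: float) -> list:
    """Probabilistically omit posts containing words from `suppress`,
    mirroring the study's omission of emotional posts from the News Feed."""
    return [p for p in posts
            if not (contains_words(p, suppress) and random.random() < omission_rate)]

def emotion_rate(posts: list, lexicon: set) -> float:
    """Fraction of a user's own posts containing words from `lexicon` --
    the outcome measure behind the 'contagion' claim."""
    if not posts:
        return 0.0
    return sum(contains_words(p, lexicon) for p in posts) / len(posts)

# Treatment: a user in the 'reduced positivity' condition sees a feed with
# positive posts probabilistically removed; the outcome is the emotional
# tenor of what that user then writes.
feed = ["What a wonderful day", "Traffic was terrible", "I love this song"]
treated_feed = filter_feed(feed, suppress=POSITIVE_WORDS, omission_rate=0.5)
```

Even in this toy form, the crucial point is visible: the variable being manipulated is not ad placement or page layout but the emotional composition of what a person sees from friends and family.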

The article itself makes bold claims for its findings that I do not find persuasive, but that’s a subject for another discussion. The key point here is twofold. 1) Facebook’s research manipulated not the quantity of messages or the design of the page, both routine, but the actual content of feeds. This is messing with the intimate reality of a user’s contact with his or her personal world. One wonders how many other instances of such secret manipulation there are. The history of this subject suggests that once such capabilities are developed they do not lie fallow; they grow in scale and scope as they become available for colonization by ever more powerful interests. 2) The researchers, and the Facebook internal review board that cleared the research, believed that it was okay to conduct such research secretly, without the knowledge or consent of the nearly 700,000 users whose feeds would be affected. They claimed that Facebook’s generic user agreement covered the problem of informed consent. That’s not acceptable. But even on Facebook’s own terms, subsequent reporting reveals that “research” was not even included in those terms until four months after the experiment was conducted. The researchers, Adam D. I. Kramer from Facebook’s Core Data Science Team, Jamie E. Guillory from UCSF, and Jeffrey T. Hancock from Cornell, and their academic editor, Susan Fiske from Princeton, should have known better. They probably did know better. Indeed, Adam Kramer issued an apology of sorts, a strange document for many reasons. The grandiose claims of the article are repudiated and recast as weak results. But more interesting still is the obvious fact that he did not anticipate the wave of outrage that greeted his triumphant publication in a respected academic journal. All those supposedly hapless users were not fooled by the academic whitewash. Kramer tried to put the toothpaste back in the tube, but it can’t be done.

Facebook Absolutism: As I wrote in my recent Frankfurter Allgemeine Zeitung essay, “Dark Google,” Facebook, like Google, represents a new kind of business entity whose power is simultaneously ubiquitous, hidden, and unaccountable. I encourage you to read the full article (drawn from my new book, still in progress), but I’ll summarize a few of its themes here.

The procedures and effects of the new information barons are not well understood, and therefore they cannot be effectively restrained. Individuals and societies are vulnerable to these new powers in ways that are only gradually coming to light. Facebook, Google, and other tech companies provide services that many have come to regard as essential for basic social participation. Indeed, many of us viewed these firms as harbingers of a new, more democratic world enabling unprecedented voice and connection. We thought they were a new kind of company, aligned with our interests. Now the hidden costs of their new goods, in lost privacy and autonomy at the hands of commercial and even state-sponsored surveillance, are slowly being revealed. The new firms have reverted to the old GM paradigm, but with powers and scope beyond anything the world has known. Increasingly, the ambition of these companies is shifting from collecting information about us to intervening in and shaping our daily reality for their commercial benefit. I have called this “the reality business” because “reality” is the next big thing that the tech firms want to carve up and sell. In the data business, the payoff is in data patterns that help target ads. In the reality business, the payoff is in shaping the real-life behaviors of people and things in the millions of ways that drive revenue. The business model is expanding to encompass the digital ‘you’ as well as the ‘actual’ you. This is precisely what Facebook’s secret experiment exemplifies.

The “reality business” reflects a shift in the frontier of data science from data mining to “reality mining,” described in one academic paper as “a God’s eye view.” But the reality business aims beyond the God’s eye view to the God-like interventions that can shape and control reality. Facebook’s evident interest in mastering the mechanisms of emotional contagion shares a purpose with Google Glass, self-driving cars, and investments in everything from the wired home to drones and satellites: to influence and shape human behavior in ways that feed the bottom line.

This brings us to the precipice of a new development in the scope of the market economy. A new “fictional commodity,” in Karl Polanyi’s sense of the term, is emerging as a dominant characteristic of market dynamics in the 21st century. Just as Polanyi saw land, labor, and money transformed into commodities as the market economy rose, “reality” is about to undergo a fictional transformation and be reborn as “behavior.” This includes the behavior of creatures, their bodies, and their things. It includes actual behavior and data about behavior. It is the world-spanning organism, and all the tiniest elements within it, reborn as information. This dominion over “reality” is the new currency of the networked surveillance sphere.

The big question we are left with: Is reality for sale? Right now the outrage of Facebook users says NO! The question remains: how do we translate our outrage into the kinds of institutional and legal frameworks that ensure democratic principles and humanistic values in a networked world?