News & Views
Can Facebook control your feelings?

Facebook has been asking us what’s on our minds for a decade now. But simply knowing our emotional state is no longer enough. The next step, apparently, is to predict it, or even to control it. Following an experiment carried out in partnership with academics from Cornell and the University of California, Facebook now believes it holds the key to making users happy or sad by moderating what they see in their news feed.

By filtering the content visible on the homepage feeds of 689,000 users, Facebook determined that individual posting behaviour could be influenced. Says the study: “Emotions expressed by friends, via online social networks, influence our moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.”

It’s all very dystopian-sounding, and predictably, there has been a giant backlash from journalists, tech experts and politicians. Labour MP Jim Sheridan has spoken out: “They are manipulating material from people’s personal lives and I am worried about the ability of Facebook and others to manipulate people’s thoughts in politics or other areas… This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people.”

Kate Crawford, Principal Researcher at Microsoft and Visiting Professor at MIT, has made a career studying the political and ethical implications of a world ruled by data. She tweeted the following: “Let’s call the experiment what it is: a symptom of a much wider failure to think about ethics, power and consent on platforms.”

Facebook maintains that there was “no unnecessary collection of people’s data” and that “none of the data used was associated with a specific person’s Facebook account”, but that seems rather to miss the point. The wider implications of the study, and how its findings may one day be put to use, are chilling.

Co-author Adam Kramer has since come forward to clarify the purpose of the study: “We felt it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook… I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused.” Crawford was quick to point out that while Kramer apologised for the way his team’s motivations were presented, there has yet to be an apology for the lack of consent on the part of the study’s 689,000 unwitting participants.

Not that we should be at all shocked or surprised, says psychologist Katherine Sledge Moore. Speaking to the BBC, she stated that “based on what Facebook does with their news feed all of the time, and based on what we’ve agreed to by joining Facebook, this study really isn’t that out of the ordinary… The results are not even that alarming or shocking.”
