Facebook in trouble over emotion experiment

Social network “manipulated” news feeds in 2012 to test emotional impact

Facebook has come under attack after it emerged that it had run a psychological experiment on nearly 700,000 users without their permission.

The experiment saw Facebook deliberately “manipulate” users' news feeds to control which posts were displayed to them. The goal of the project, according to the social network, was to see whether exposure to posts with a particular emotional tone would affect the emotional tone of users' own posts.

The research was done in partnership with two US universities, but is believed to have affected users around the world. According to Facebook, it took place over the course of one week during 2012.

Facebook defended the experiment, claiming there was “no unnecessary collection of people's data”.

“None of the data used was associated with a specific person's Facebook account,” it added in a statement.

The two universities involved in the study were Cornell University and the University of California at San Francisco.

Dislike

The revelation has caused a backlash against the social network. Facebook may now be called to appear in front of a Commons media select committee to explain itself to the UK government.

At least one MP – Labour's Jim Sheridan – has called for an investigation into the experiment. Sheridan is a member of the Commons media select committee.

“This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people,” he was quoted as saying by The Guardian newspaper.

"They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas.

“If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it.”

Some have questioned whether Facebook could also be in trouble with the UK's Advertising Standards Authority.

“Despite Facebook's insistence that this was merely an academic experiment, it sails perilously close to the illegal world of subliminal advertising,” commented Brett Dixon, director of digital marketing agency DPOM.

"There's a reason this insidious form of manipulation is banned - it is an abuse of people's freedom to choose.

“Whether it appeals to the head or the heart, all advertising seeks to influence people's mood. But there's a big difference between influence and control.”

Sorry

“We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out,” commented Adam Kramer, a Facebook employee who co-authored the report.

“At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook.”

However, he admitted that the firm did not “clearly state our motivations in the paper”.

“I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused.”
