Well, in a not entirely unexpected manipulation of users’ personal information, Facebook is facing criticism after it emerged it had conducted a psychology experiment on nearly 700,000 users without their knowledge.
The test saw Facebook modify news feeds to tailor which emotional expressions the users were exposed to. The research was done in collaboration with Cornell University and the University of California at San Francisco to assess if “exposure to emotions led people to change their own posting behaviours”.
Facebook said there was “no unnecessary collection of people’s data”. Quite who the arbiter of “unnecessary” is, we are not told. “None of the data used was associated with a specific person’s Facebook account,” the social networking behemoth added.
Some, though, have criticised the way the research was conducted and raised concerns over the impact such studies could have.
“Let’s call the Facebook experiment what it is: a symptom of a much wider failure to think about ethics, power and consent on platforms,” Kate Crawford posted on Twitter. Meanwhile, Labour MP Jim Sheridan, a member of the Commons media select committee, has called for an investigation into the matter.
“This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people,” he was quoted as saying by The Guardian newspaper. “They are manipulating material from people’s personal lives and I am worried about the ability of Facebook and others to manipulate people’s thoughts in politics or other areas. If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it.”
The research was conducted on 689,000 Facebook users over a period of one week in 2012.
According to the report on the study: “The experiment manipulated the extent to which people were exposed to emotional expressions in their News Feed”.
The study found that users who had fewer negative stories in their news feed were less likely to write a negative post, and vice versa.
I think, in the REAL world, we call that “empathy”.
Adam Kramer of Facebook, who co-authored the report on the research, said: “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”
“I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused.”
So, no apology for the gross infringement of privacy, or the cynical manipulation of users in a social experiment of almost zero validity, then.