
The Globe and Mail

Facebook psychology experiment raises ire

A smartphone user shows the Facebook application on his phone in the central Bosnian town of Zenica, in this photo illustration, May 2, 2013.

© Dado Ruvic/Reuters

A Facebook scientist says he is sorry for performing an experiment in which he and two other social researchers covertly tried to influence the mood of thousands of Facebook users, making them either happier or sadder by tweaking what they saw in their social-media accounts.

Two weeks after their study was first made public, anger over Facebook users being used as unwitting guinea pigs ballooned to the point that Facebook issued a statement this weekend and the lead researcher apologized.

"My co-authors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety," Facebook data scientist Adam Kramer posted on his Facebook account Sunday afternoon.


Ethical concerns were raised after Mr. Kramer and two university researchers published a paper earlier this month, revealing that they manipulated the Facebook news feeds of 689,003 users as part of a psychology experiment.

The users were not notified, and the researchers argued that they had obtained consent because the fine print users agree to when signing up for an account mentions that information could be used for research.

"We did this research because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out," Mr. Kramer argued in his Sunday note.

The study's results were published in the June 17 issue of Proceedings of the National Academy of Sciences. In an e-mail to The Globe and Mail, Princeton University psychology professor Susan Fiske, who edited the article, said the experiment was approved at Cornell University by an ethical review board, known in the United States as an institutional review board (IRB).

Some people knowledgeable about research ethics, however, were dubious about the way the study was conducted.

"This study attempted to manipulate participants' emotional experience. No IRB I have ever worked with would waive consent or debriefing for such an intervention," Jeffrey Sherman, a psychology professor at the University of California, Davis, said in an e-mail interview.

According to Cornell University, the Facebook experiment was funded in part by the U.S. Army Research Office.


University of Ottawa law professor Gordon DuVal, who chairs the research ethics board at the National Research Council Canada, said the experiment doesn't seem to follow current standards, which require informed consent from the subjects. Even in experiments where deception is required, the subjects must be debriefed and notified afterward, Prof. DuVal said.

Furthermore, he added, there was no indication that the researchers bothered to identify whether any of the Facebook users they manipulated were minors.

The paper initially claimed that Facebook's data use policy, to which all users agree prior to creating an account, constituted "informed consent for this research."

As part of the agreement's 2,263-word section on what data Facebook collects from users, the company warns that it could use information "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

In a statement to the media, Facebook noted that the study was conducted for a single week in 2012. "None of the data used was associated with a specific person's Facebook account," the company said.

Facebook didn't address more specific questions sent by The Globe and Mail, referring a reporter to the public note Mr. Kramer wrote Sunday.


Mr. Kramer and his fellow researchers didn't see the actual posts because of privacy concerns. Instead, the three million posts that were analyzed were automatically sorted according to whether they contained words identified as positive or negative.
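The automated sorting described above can be illustrated with a minimal sketch. The word lists and function below are hypothetical placeholders for illustration, not the actual lists or code the researchers used:

```python
# Illustrative word-list sentiment tagging, similar in spirit to the
# automated sorting of posts described above. The tiny word lists here
# are placeholders; the study relied on much larger curated lists.

POSITIVE = {"happy", "great", "love", "wonderful"}  # placeholder list
NEGATIVE = {"sad", "terrible", "hate", "awful"}     # placeholder list

def classify_post(text: str) -> str:
    """Label a post 'positive' if it contains any positive word,
    'negative' if it contains any negative word, else 'neutral'."""
    words = set(text.lower().split())
    if words & POSITIVE:
        return "positive"
    if words & NEGATIVE:
        return "negative"
    return "neutral"
```

This kind of classification never requires a human to read the post itself, which is consistent with the privacy rationale the researchers gave.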

"These data provide, to our knowledge, some of the first experimental evidence to support the controversial claims that emotions can spread throughout a network," the article said. "… Given the massive scale of social networks such as Facebook, even small effects can have large aggregated consequences."
