

Facebook experiment put users at risk

Emotional Contagion (Reuters/Dado Ruvic)

Who will protect us from social media sites like Facebook, Google, and Twitter? When is enough enough? That is a billion-user question, folks, and it should not go unanswered.

While social media sites are busy playing with their “human” dolls and playing scientist, we need to take a good hard look at what they have been doing in secret, without the consent of their users.

Back in 2012, Facebook decided to run a massive-scale experiment involving 689,003 people (account holders). The resulting paper was reviewed on October 23, 2013, approved in March 2014, and published online on June 2, 2014.

Did no one think the public had the right to know this was going on? An even better question: did no one think billions of users had a right to know they were being manipulated and steered into doing something they didn’t even know they were doing?

The experiment in question, which seems to be breaking news all over the world, was published in the Proceedings of the National Academy of Sciences of the United States of America, vol. 111, no. 24, pp. 8788-8790, doi:10.1073/pnas.1320040111.
The authors are Adam D. I. Kramer of Facebook’s Core Data Science Team; Jamie E. Guillory of the Center for Tobacco Control Research and Education, University of California; and Jeffrey T. Hancock of the Department of Communication and Information Science, Cornell University, Ithaca, New York.

The team designed the research plan, analyzed the data, and wrote the paper. Though they claim no conflict of interest, that claim is highly questionable, since the experiment shows evidence that emotional states can be transferred to others via emotional contagion, leading people to experience similar emotions without being aware it is happening.

Jamie E. Guillory was with the Center for Tobacco Control Research and Education, University of California; if he had nothing to gain from the experiment on behalf of his role at the university, it is highly unlikely he would have been there to take part at all, even if it was just a paper he was writing and data he was collecting.

Binghamton Online Marketing Examiner Wendy Spickerman got in touch with Adam Kramer, asking:

Dear Mr. Kramer,

I found the information about your massive scale emotional contagion experiment quite interesting. I was wondering if you could possibly provide me with more information.

First off, why did Facebook not inform its users that they were part of a test study Facebook was/is running, or even ask users to participate?

There are many people willing to take part in studies. Is it because Facebook felt it should not be held accountable or obligated to compensate such people, and believed people would not volunteer?

This type of study raises a huge question, coming on the back of the widespread issue that Facebook has no known system in place for dealing with the security and safety of its users.

During the time of the study, was Facebook careful to pick users for this study who did not have mental health issues?

Did it not cross the minds of those conducting the study that it could put users at risk and in harm’s way, causing someone to act out?

Your study clearly shows emotional states can be transferred without people being aware. So during the time frame in which your experiment caused negative emotions to be transferred, how can you be sure you didn't cause someone to beat their spouse, go off their medication, or even commit suicide?

Mr. Adam Kramer has not yet replied to this line of questioning. If we do hear back from him, we will gladly share his reply, though that seems unlikely.

On June 29, 2014, Adam Kramer posted this message of apology on his Facebook page. However, a Facebook page post is not really an apology to billions of users, or to the 689,003 users who were manipulated in this case study, since not all of them even know he has a Facebook page.

Adam D. I. Kramer in Floyd, VA
June 29 at 4:05pm

“A lot of people have asked me about my and Jamie and Jeff's recent study published in PNAS, and I wanted to give a brief public explanation. The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product.

At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper.

Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). Nobody's posts were "hidden," they just didn't show up on some loads of Feed.

The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices.”
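For readers who want to picture the mechanics Kramer describes, here is a minimal sketch of what probabilistically deprioritizing emotional posts on a single feed load could look like. The word lists, function names, and omission probability are illustrative assumptions, not Facebook's actual code; the published paper does say emotional words were identified with the LIWC dictionary, which covers thousands of terms.

import random

# Illustrative word lists; the actual study used the much larger LIWC
# dictionaries (an assumption made only for this sketch).
POSITIVE_WORDS = {"happy", "love", "great"}
NEGATIVE_WORDS = {"sad", "angry", "awful"}

def contains_word(post_text, words):
    """Return True if any word from the set appears in the post."""
    tokens = post_text.lower().split()
    return any(token.strip(".,!?") in words for token in tokens)

def load_feed(posts, suppress_words, omit_probability=0.1):
    """Simulate one load of a News Feed, probabilistically skipping posts
    that contain words from the suppressed emotion category. Skipped posts
    are not deleted; they can still appear on a later load."""
    shown = []
    for post in posts:
        if contains_word(post, suppress_words) and random.random() < omit_probability:
            continue  # deprioritized on this load only
        shown.append(post)
    return shown

# Example: a user in a hypothetical "reduced negativity" condition loads their feed once.
feed = ["I love this sunny day", "Feeling sad and awful today", "Lunch was great"]
print(load_feed(feed, NEGATIVE_WORDS, omit_probability=0.5))

The design question the study then asked was whether users who saw fewer negative (or fewer positive) posts went on to write slightly more positive (or more negative) posts themselves.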

Yahoo Finance Senior Columnist Rick Newman called it “some people overreacting,” but said, “This was an interesting case study in how Facebook should disclose things to users a little differently. This might be a lesson learned for Facebook,” as reported by the Daily Ticker on July 1, 2014.

One user suggested the following:

  • Participants should be given the option to opt in after being advised of the risks of participating.
  • Participants should complete a mental health questionnaire prior to participating.
  • Those with depression or suicidal ideation should be excluded from participating.
  • A follow-up survey should be used to ascertain the emotional effects of the study after it is completed.

Maybe Facebook should have asked her to be a part of the research group. Another user stated: “Using humans in social science research where conditions are manipulated ethically requires informed consent from the test subjects. This may have been legal, but it was most definitely not ethical.”

Apparently Facebook did gain something from all this: more bad PR, and it managed to manipulate its users into being outraged by the experiment.

Sources: PNAS, Adam Kramer, Susan Lien Whigham, Cynthia Hendrickson, and the Daily Ticker.