
Facebook releases statement on "emotional manipulation"

On July 1, 2014, Facebook released a statement to Binghamton Online Examiner Wendy Spickerman regarding the recent outbreak of disapproval and scrutiny it has faced over a study published in the Proceedings of the National Academy of Sciences of the United States of America: Adam D. I. Kramer et al., "Experimental evidence of massive-scale emotional contagion through social networks," vol. 111, no. 24, pp. 8788-8790, doi:10.1073/pnas.1320040111.

When asked why Facebook did not inform its users that they were part of the recent study, a Facebook spokesperson stated, “When someone signs up for Facebook, we’ve always asked permission to use their information to provide and enhance the services we offer. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow.”

The fact that Facebook did not inform its users ahead of time has people questioning Facebook’s ethical values. Furthermore, this statement brings into question just what else the company has been doing. One would like to know whether users also gave Facebook the right to have their children named after Mark Zuckerberg.

To recap, so everyone is on the same page: this is an update to our last article by Wendy Spickerman here at examiner.com, "Facebook Experiment Puts User's at Risk." Facebook's statement was received at 10:52 a.m. today.

Back in 2012, Facebook decided to run a massive-scale experiment involving 689,003 people, using account holders as guinea pigs. The paper was reviewed on October 23, 2013, approved in March 2014, and published online on June 2, 2014.

We here at Examiner questioned the risk Facebook took by manipulating its users' emotions; users were unaware Facebook was purposely provoking their reactions by, let's call it, "moving things around."

Of the authors, Adam D. I. Kramer works on Facebook’s Core Data Science Team; Jamie E. Guillory is with the Center for Tobacco Control Research and Education, University of California, San Francisco; and Jeffrey T. Hancock is from the Department of Communication and Information Science at Cornell University in Ithaca, New York. The team designed the research plan, analyzed the data, and wrote the paper.

The experiment in question shows evidence that emotional states can be transferred to others via emotional contagion, leading people to experience similar emotions without being aware it is happening. We asked Facebook, “During the time of the study, was Facebook careful to pick users who had no known mental health issues?”

"Your line of questioning ignores the fact this research involved showing different kinds of content in News Feed to a certain number of users. To be clear: this was content users already had access to on Facebook. Every time someone visits News Feed, they see a sampling of content we think they’ll be interested in. For this research, we did not manipulate content, delete content, or create content. We simply prioritized certain kinds of content and de-prioritized others, and measured the impacts.” A Facebook spokesperson stated.

When asked if those conducting the study realized they were putting users at risk and in harm’s way, Facebook did not comment.
Facebook says it did not create content, only used what was already there by prioritizing certain content and de-prioritizing other content. Maybe the focus should have been on prioritizing users’ safety, as Facebook cannot be certain the study in question did not cause harm to others.

Furthermore, Facebook’s assertion that “this was content users already had access to on Facebook” brings to light a question we have been asking for some time, since Facebook has no known algorithm in place to deal with threatening content other than user reporting. So does that mean, Facebook, that because you didn’t create the content, you take no responsibility for how this study affected a person’s emotions, causing them to react with similar emotions?

While Facebook tries to put out the fire that one employee and two university researchers created, it still seems to be ignoring the fact that this research involved manipulating its users.
