
Facebook secretly manipulated nearly 700,000 news feeds for mood experiment


No matter what you may think, Facebook isn't necessarily yours to control, even if it is your profile, your pictures, and your information. The Orlando Sentinel reported on June 28, 2014, that Facebook conducted a secret psychological experiment in which it manipulated 689,003 news feeds.

According to various reports, the experiment, first published in PNAS, was designed to study "emotional contagion through social networks."

In other words, Facebook was checking to see how social media can affect your mood. It did this by adjusting the news feed formula that determines how much positive and negative content Facebook users see.

From there, that information was used to see whether the change affected what users posted. Facebook users didn't notice because they were still seeing content from their friends, just a filtered selection of posts and updates.

This secret Facebook experiment took place over one week in January 2012. The thing is, even if you felt "used," so to speak, the experiment was completely legal. Facebook's terms of service, which most users have likely never read, state:

"...in addition to helping people see and find things that you do and share, we may use the information we receive about you ... for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

Even though Facebook never formally owned up to the experiment or offered proof of how it was run, the company isn't denying it either. A Facebook spokesman released a statement:

"This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

When all was said and done, the measured effect was tiny: as little as one-tenth of a percent of observed behavior actually changed. One interesting finding was that users whose news feeds were manipulated didn't lean in a positive or negative direction; they simply posted less.

Whatever the case may be, Facebook conducted a legal but secret experiment on hundreds of thousands of users. Many aren't happy, but the company was within its rights to do what it did.
