Facebook changed over 680,000 users' news feeds for psychology experiment

Users may want to hit "Dislike" after Facebook manipulated their data for a psychology experiment.
mkhmarketing via Flickr.com

Facebook manipulated users’ news feeds in order to conduct a “massive” psychology experiment on how the content of posts influences emotions, according to a new paper published in the journal Proceedings of the National Academy of Sciences. And of course, Facebook being Facebook, they didn’t bother to ask first.

Over a one-week span in January 2012, Facebook cooked the news feeds of 689,003 users, hiding certain posts and displaying others according to the needs of the study. Only English-speaking users were affected.

Researchers sought to determine whether seeing mostly happy posts in your feed would make you feel more positive, while a string of sad posts would lower your mood. The unsurprising conclusion: News from your friends does, indeed, affect your emotions.

Previous studies had found that talking face-to-face with a person will make you feel better or more depressed, according to your friend’s mood. But those studies failed to rule out the personal interaction as a factor. If a person’s mood improved, for instance, it might be because they spent time with their friend, not because their friend was upbeat or shared good news.

Facebook had the answer: Rig the posts on news feeds. There’s no interaction, so changes in mood would be purely the result of the content.

If it seems self-evident that reading something from a friend expressing positive or negative emotions would have an impact on your own emotions, remember that psychology researchers discover the obvious all the time. Recent studies, for instance, have determined that some people finish difficult tasks first, many children watch TV, and sleep is important.

Why? Because We Can!

Lead researcher Adam Kramer, identified as part of “Core Data Science Team, Facebook, Inc,” is Facebook’s on-staff data scientist and the study’s designer, according to The Verge, which broke the story yesterday.

The study is unapologetic about its methods, claiming that users agreed to be used as lab rats when they signed up for Facebook. “All users agree [to the Terms and Conditions] prior to creating an account,” the authors wrote. And checking that box “constitute[d] informed consent.”

Actually, true informed consent is informed - it includes, among other things, the risks and benefits. Since researchers believed that negative posts would lower a user's mood, all users - but especially those at high risk for depression - should have been told of that risk, and given the opportunity to opt out.

Informed consent also includes consent - to this study, not other uses. "You said 'yes' to something we asked, a long time ago; therefore, you agreed to whatever we want, forever" is not informed consent. Ethical scientists know this.

Other justifications include: “Content [we hid] was always available” - if you left your News Feed and visited the page of the friend who posted it. Further, it “may have appeared” on later News Feeds - or earlier ones.

In case you’re not reassured, they added: “The experiment did not affect any direct messages sent from one user to another.” Disappointingly, they didn't go on to list all the other things they did not do, such as “feed you poison” and “cut off your left foot.”

Study data were anonymized, and researchers did not see your personal details. Not that Facebook has a history of keeping your private data private - EPIC is a good source for the whole appalling history.

Playing With Emotions

Facebook’s privacy policy (which you agreed to, all those years ago when you signed up, and surely remember perfectly) says they may use your information “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.” That statement does include “research,” but it’s unclear how a study for publication in a non-Facebook scientific journal, in which three of the four authors had no affiliation with Facebook, is an “internal operation.”

The full Data Use Policy is mainly concerned with who gets to see the data you share, not whether Facebook can mess with your ability to see posts shared by others. And certainly not whether Facebook can mess with your emotions.

Because that’s exactly what they did. When users’ news feeds were artificially loaded with negative posts, researchers found those users became quiet, withdrawn. They didn’t post as much, or make as many comments on their friends’ posts. What they did say was more negative.

That’s depression. And deliberately inducing depression in another human being? That's just sick.

+++

Want to tell Facebook a thing or two?

(From Facebook's site) “If you have questions or complaints regarding our Data Use Policy or practices, please contact us by mail at 1601 Willow Road, Menlo Park, CA 94025 if you reside in the U.S. or Canada, or at Facebook Ireland Ltd., Hanover Reach, 5-7 Hanover Quay, Dublin 2 Ireland if you live outside the U.S. or Canada. Anyone may also contact us through this help page: https://www.facebook.com/help/contact_us.php?id=173545232710000"