Facebook secret psych test: Facebook has been toying with our emotions in secret

Facebook / Wikipedia

A secret Facebook psych test was conducted on nearly 700,000 users, landing the social media giant in hot water after the covert experiment surfaced. Users are wondering why the emotion test was run behind the scenes, and the fact that the social site, with its hundreds of millions of active accounts, is flexing its muscle in ways that raise potential privacy issues leaves many Facebookers a little uneasy.

Reports WindowsITPro:

“For all its success in attracting over one billion users, Facebook is surprisingly tone deaf when it comes to meeting their expectations. And this week, public uproar over the social networking giant's policies reached a new apex when it was revealed that Facebook had secretly and without permission altered how the service worked for almost 700,000 members so it could psychologically gauge their reactions to overly positive and negative news.”

Facebook actually manipulated the news feeds delivered to users' accounts. It skewed feeds toward overly positive or negative news and then monitored users' reactions by examining how the stories affected them and their subsequent posts.

“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,” one of the study's co-authors explained. “We were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. [But] we didn't clearly state our motivations.”

Writes the Inquisitr:

What did the Facebook secret psych test achieve, anyway? Apparently, emotions expressed by other people that you are friends with on Facebook affect your own mood. The findings have been dubbed the first “massive-scale emotional contagion” ever on social media. In short, moods are contagious.

But some are upset with being fed skewed news items without their permission. What if the steady stream of negative news reports was the push that sent someone over the edge?

Facebook clearly crossed some lines, ethically and, it appears, legally as well.

"The goal of all of our research at Facebook is to learn how to provide a better service," the explanation continues. "Our goal was never to upset anyone."

Too late for that.

As reported by The Guardian, University of Maryland law professor James Grimmelmann claims Facebook failed to obtain the "informed consent" of the study subjects as required by US laws that create an "ethical and legal standard for human subjects research."

"Federal law requires informed consent," he writes. "The study harmed participants ... The unwitting participants in the Facebook study were told (seemingly by their friends) for a week either that the world was a dark and cheerless place or that it was a saccharine paradise. That's psychological manipulation."