Facebook has done psychological testing on its users, and you may have been one of them. The revelation that Facebook set out to manipulate people’s moods by using the news feed to send grim news to some and upbeat news to others has sparked outrage, according to “Fox and Friends” live on Monday morning, June 30.
According to Today, Facebook is feeling the backlash after the public learned 600,000 users were unknowingly manipulated in this study. Who gave them the right to do this? You did, when you clicked “I agree” upon first joining the social media giant. That “I agree” comes after you have supposedly read the contract, which very few people actually do.
The study tracked these 600,000 users by first sending them negative or positive news through the Facebook news feed in 2012. Researchers then monitored the users’ posts to see which direction their moods went after they read the hand-picked stories of doom or uplifting fare.
The study unsurprisingly found that those who were fed negative stories posted more negative updates on their Facebook pages throughout the day, while those who were fed positive stories seemed to post more upbeat updates.
According to the Globe and Mail, the findings of this study were published in the Proceedings of the National Academy of Sciences. The article was titled “Experimental evidence of massive-scale emotional contagion through social networks.”
It doesn’t take a rocket scientist to surmise that if you are reading news filled with gloom and doom, your mood will follow suit. The same goes for reading uplifting articles; you’re probably in a good mood once you’ve finished.
Facebook users are furious that this was done without their knowledge. How far did this go? Did any of the mood altering Facebook carried out in this experiment end in tragedy? Did it leave someone out there so depressed that they suffered real consequences from this experiment?
“I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it’s possible,” privacy activist Lauren Weinstein wrote in a Twitter post.
No one will ever know, and that is what is wrong with this experiment. The unwilling participants were not aware that their moods were possibly being altered. The backlash from this shifty study prompted an apology from one of its authors, Adam Kramer.
He took to his Facebook page to say:
“I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”
There are too many outside factors that the Facebook study couldn’t control. How do they know that a person who appears to be in a negative mood from their Facebook postings didn’t just watch a newscast about a killing? How do they know the happy campers out there didn’t just win the lottery or hit their dieting weight goal, putting them in that positive mood?
Maybe Facebook users should be angered that this study insinuates Facebook is the lifeblood of the user and that real life is secondary to the online world of social media. Wouldn’t folks’ moods tend to follow what they are experiencing in real life rather than what they are reading on Facebook?
There are just too many things that could have happened in real life to suggest that people who read negative news tend to be in a more negative mood, and vice versa. By the time they had read the stories Facebook pumped into their news feeds, the unknowing participants had most likely been exposed to many real-life events that would affect them far more than reading a story.
Do you accept this apology? What about a reassurance that they won’t try something like this again? That would carry more weight than the words “in hindsight, the research benefits of the paper may not have justified all of this anxiety.” What do you think?