Facebook is trying to alter your mood

In a shocking revelation exposed in The Atlantic, Robinson Meyer reports that Facebook conducted a study designed to alter users' moods through the content funneled into their News Feeds. In his article, "Everything We Know About Facebook's Secret Mood Manipulation Experiment," Meyer writes:

Facebook’s News Feed—the main list of status updates, messages, and photos you see when you open Facebook on your computer or phone—is not a perfect mirror of the world.

But few users expect that Facebook would change their News Feed in order to manipulate their emotional state.

We now know that’s exactly what happened two years ago. For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves.

This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it. 
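For context on what "content analyzed as sadder than average" means in practice: the published study classified a post as positive or negative if it contained at least one word from the LIWC (Linguistic Inquiry and Word Count) sentiment dictionaries, and then randomly withheld posts of one type or the other from users' feeds. Below is a minimal sketch of that style of word-count classification. The word lists here are toy stand-ins invented for illustration, not the actual LIWC dictionaries, and the function name is hypothetical:

```python
import re

# Toy stand-ins for sentiment dictionaries; the actual study used the
# licensed LIWC2007 word lists, which are far larger.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "lonely"}

def classify_post(text: str) -> str:
    """Label a post by whether it contains at least one positive or
    negative dictionary word, mirroring the paper's per-post rule."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    has_pos = bool(words & POSITIVE_WORDS)
    has_neg = bool(words & NEGATIVE_WORDS)
    if has_pos and has_neg:
        return "both"
    if has_pos:
        return "positive"
    if has_neg:
        return "negative"
    return "neutral"

# In the experiment, posts classified this way were then randomly
# omitted from some users' News Feeds to skew the emotional mix.
print(classify_post("So happy to see everyone this weekend!"))  # positive
print(classify_post("Feeling lonely and sad tonight."))         # negative
```

The crudeness of this approach is part of what critics found unsettling: a simple word count, applied at scale, was enough to measurably shift what hundreds of thousands of people posted.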

Even the Princeton professor who edited the research study had serious concerns and thought it was "creepy." In a follow-up article, "Even the Editor of Facebook's Mood Study Thought It Was Creepy," Adrienne LaFrance writes:

Catching a glimpse of the puppet masters who play with the data trails we leave online is always disorienting. And yet there’s something new-level creepy about a recent study that shows Facebook manipulated what users saw when they logged into the site as a way to study how it would affect their moods. 

But why? Psychologists do all kinds of mood research and behavior studies. What made this study, which quickly stirred outrage, feel so wrong?

Even Susan Fiske, the professor of psychology at Princeton University who edited the study for the Proceedings of the National Academy of Sciences, had doubts when the research first crossed her desk.

“I was concerned,” she told me in a phone interview, “until I queried the authors and they said their local institutional review board had approved it—and apparently on the grounds that Facebook apparently manipulates people’s News Feeds all the time… I understand why people have concerns. I think their beef is with Facebook, really, not the research.”

Needless to say, the implications of this research are outrageous. For what purpose does Facebook feel the need to actively alter people's moods? To throw an election, perhaps? Or to neutralize political dissidents? Or to support Facebook-backed politicians like Cory Booker?

This may be only the tip of the iceberg of Facebook's behavior, and it could be the beginning of Mark Zuckerberg's undoing. He has been very weak in standing up to the government's overreach, and his firm continues to give the government unauthorized and unwarranted access to user data: location information, all messages, status updates, photos, and even graphical maps of an individual's social network.

The one thing about Facebook that appears certain: its founder sounds squirrelly, and the company cannot be trusted.
