Did you volunteer to be a guinea pig for the
research study designed to determine whether negative posts on
your News Feed can make your spirits plummet, and cause you to post
negative comments to all of your friends, possibly getting them
bummed out as well?
And yet, we may have been among the 700,000 Facebook users who unwittingly participated in a scientific experiment in "emotional contagion."
The News Feeds of these 700,000 randomly selected Facebook users were manipulated to prioritize the display of posts that used negative, positive or neutral words. Facebook-affiliated researchers at Cornell University and the University of California at San Francisco then studied the posts subsequently made by the test subjects to see whether their mood was affected, and for how long.
The study results indicate that, yes, Facebook can yank your chain
remotely, any time it wants.
This happened for one week back in 2012, but only became known publicly when an article titled "Experimental evidence of massive-scale emotional contagion through social networks" appeared online in the journal Proceedings of the National Academy of Sciences.
The article concludes that, even in the absence of face-to-face
contact, "emotional states can be transferred to others via
emotional contagion, leading people to experience the same emotions
without their awareness."
Slate.com and other news sites broke the story over the weekend.
So, Facebook seems to have found a new use for the algorithm it uses to show viewers the content they may find most interesting.
The incident is particularly striking because Facebook did not just hand over a batch of data it had collected on its users. It does that all the time, as do many other web companies.
It secretly changed the service it provides to users in order to
test their responses to it.
As the story blew up over the weekend, Facebook management
apparently couldn't understand what all the fuss was about. Its
blog insists that the study was just part of its ongoing mission of
"understanding how people respond to different types of content,
whether it's positive or negative in tone, news from friends, or
information from pages they follow."
And anyway, the company's post said, the study was vetted by its "strong internal review process."
Great. So, maybe Facebook can start programming its News Feed like
Muzak, moving from frisky beats to soothing melodies depending on
how the company judges we should be feeling at this hour.
Some of the baby geniuses at Facebook seem to have grasped that
they have truly stepped in it this time. One of the paper's authors posted an apology of sorts on his Facebook page, in which he said that the paper should have made it clear that the company only did the research because "we care about the emotional impact of Facebook and the people who use our product."
Yeah. They care so much that, as Slate.com notes, "Facebook intentionally made thousands upon thousands of people sad." And it did so without requesting or receiving "informed consent" from its test subjects, a standard requirement for social scientists who want to go messing with people's heads.
Facebook asserts that any permission it needed is covered by the language of its standard terms-of-service agreement, in which users agree
to the use of their data for analysis, testing and research.
So, does this kind of publicity hurt Facebook at all long-term? It depends on how people around the world react to headlines like this one, from a British newspaper: "Facebook let shrinks MESS WITH YOUR HEAD, sans permission."
In the meantime, perhaps Facebook's ace researchers would like to analyze another incident of emotional contagion: the fury of Facebook users. Test samples can be found on Twitter as #FacebookExperiment.