
Did Facebook Cross The Line This Time?

Facebook went and freaked a bunch of people out again. They were about due, weren’t they? This time, the freak-out comes from an academic study, of all things, looking at how Facebook can manipulate users’ emotions based on the posts it chooses to show in the News Feed.

Some people feel Facebook has crossed a line here, while others essentially consider it par for the course on the Internet of today (not to mention on Facebook itself).

Are you comfortable knowing that Facebook can potentially alter your mood by showing you certain types of posts? Did Facebook cross the line? Share your thoughts in the comments.

The study is called “Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks”. The abstract explains:

Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. Emotional contagion is well established in laboratory experiments, with people transferring positive and negative emotions to others. Data from a large real-world social network, collected over a 20-y period suggests that longer-lasting moods (e.g., depression, happiness) can be transferred through networks…although the results are controversial. In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.

“We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” the researchers say in the “significance” section. “We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.”

You can dig in here.

Naturally, people are a little uncomfortable with Facebook taking these liberties.

It’s important to keep this in perspective, though. Facebook did this with a reported 0.04% of users over a single week two years ago. That might not make you feel any less dirty, but chances are you weren’t included, and even if you were, it happened a long time ago and is probably of little significance to you now beyond a general creepy feeling. Of course, you never know when Facebook is running any kind of experiment, so things like this could happen at any time without notice.

Facebook’s Adam Kramer, who co-authored the study, took to Facebook to respond to the outrage:

The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.

Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). Nobody’s posts were “hidden,” they just didn’t show up on some loads of Feed. Those posts were always visible on friends’ timelines, and could have shown up on subsequent News Feed loads. And we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it.

And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it — the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.

The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper.

Feel better about it now?

As others have pointed out, Facebook’s terms of service pretty much allow it to do this type of thing as it pleases. Unfortunately, it has come to light that Facebook made changes to its terms to cover the experiment four months after it was actually conducted.

Sam Biddle at Valleywag writes, “The most valuable lesson for the company might be that it can keep creeping us out and violating its customers, over and over again, and none of us will ever delete our accounts. I’d love to read that study.”

Let’s just hope nobody involved in the experiment killed themselves. It certainly wouldn’t be the first time we’ve heard of suicides related to Facebook.

At least it’s probably in Facebook’s interest to keep you happy rather than depressed. If you read too much depressing stuff on Facebook, you might decide you don’t want to use it so much. And that, of course, would mean you’re not clicking on ads.

Facebook continues to tweak its algorithm based on your behavior. While the study was conducted in 2012, just last week the company announced changes to how it shows videos to users. Facebook knows how much video you’re watching, and will show you more if you watch more, and vice versa.

In reality, as an internet user, you’re subject to all kinds of tests from the various services you use, not to mention advertisers, at any point in time. It’s the trade-off we make in exchange for the sites and apps we use every day.

Is using Facebook worth letting it decide what content it wants to show you at any given moment? Let us know what you think.