I prefer to do my research before leaping to outrage.
Of course, I'm referring to the recent revelation that Facebook manipulated users' news feeds to see if it had any emotional impact on them.
It's worth asking the question... just what the hell did Facebook experiment with? And why did they do it? And should I really be mad because of it?
If you're wondering what my credentials are to offer this analysis, there's not much. I'm an armchair cognitive and social psychologist, so studies like this always pique my interest. So I'll try to be a bit silly (though not as silly as usual), and I'm certain to skip over lots of relevant details, and hopefully you'll learn a thing or twenty along the way.
The social sciences are currently in the midst of a fad: studying how emotions and other things spread through social groups. The fancy term for this is "emotional contagion," which describes how emotions can be shared among people both verbally and non-verbally.
I first heard about this from the Framingham Heart Study, which showed how obesity and happiness spread through family and social connections. If you read the details, you'll learn that you're at an increased likelihood of gaining weight if your friend, spouse, or family member gains weight. The same study shows happiness spreads through those ties in a similar, "more likely" fashion.
Further studies have shown that yes, emotions are contagious. For example, this effect has been seen in lab settings. Researchers in China studied Weibo posts to show how anger spreads faster and farther than happiness or sadness in social networks. If you're a believer in evolutionary psychology like me, then you'd say that empathy evolved to maintain healthy social groups (read as: mates).
By contrast, some people have argued that simply using Facebook makes us less happy, regardless of what content we're exposed to. In her book Alone Together, Sherry Turkle argues that replacing rich, face-to-face interactions with computer-mediated communication leads to an illusion of intimacy; the convenience of communication technology actually causes us to be more isolated. (Yeah, this is a slapdash summary. Just read the book.)
There's some scientific research to support her perspective. A University of Michigan study found that Facebook use predicted a decline in a person's well-being. They polled students throughout the day, asking for their mood and how much they had used Facebook since the previous poll. They found that the more someone had used Facebook, the worse their mood would be. Sadly, the press misinterpreted this as "using Facebook makes you unhappy" because they don't understand the difference between correlation and causation.
We get it. What did Facebook do?
Most experiments to date have looked for correlation between an emotion in a social network and that same emotion in the experimented individual. Facebook sought to answer a variant of the question: if you decrease seeing a certain emotion in a social network, does it result in an increase of the opposite emotion in the individual?
I'll spare you the brain-numbing experience of reading the paper with this hasty summary. Facebook, with help from some folks at Cornell, tested two conditions -- whether seeing fewer positive posts would lead to more negative postings, and whether seeing fewer negative posts would lead to more positive postings. They randomly picked about 689,000 Facebook users from the FB database for the study.
How did they test this? Facebook manipulated the news feeds of those users, showing them fewer positive or fewer negative posts based on words that matched those emotional categories. Then they checked those users' subsequent posts to see how the emotional content changed. The experiment lasted a week.
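To make the mechanics concrete, the filtering boils down to matching a post's words against emotion word lists and hiding the matches. Here's a minimal sketch of that idea; the word lists and function names below are invented for illustration (the actual study used the LIWC word-counting dictionaries, and suppressed matching posts probabilistically rather than hiding every one):

```python
# Toy sketch of emotion-word filtering, loosely modeled on the study's
# word-matching approach. Word lists are invented for illustration; the
# real study used the much larger LIWC dictionaries.
POSITIVE = {"happy", "great", "love", "awesome", "fun"}
NEGATIVE = {"sad", "angry", "hate", "awful", "terrible"}

def classify_post(text):
    """Return 'positive', 'negative', or 'neutral' for a post."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    if words & POSITIVE:
        return "positive"
    if words & NEGATIVE:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress):
    """Hide posts matching the suppressed emotion (the manipulation)."""
    return [p for p in posts if classify_post(p) != suppress]

feed = ["What a great day!", "I hate Mondays.", "Meeting at 3pm."]
print(filter_feed(feed, suppress="positive"))
# -> ['I hate Mondays.', 'Meeting at 3pm.']
```

The researchers then ran the same kind of word counting over the users' own subsequent posts to measure the outcome.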
It worked. Compared to control groups, people exposed to fewer positive words expressed more negative words in their posts, and vice versa. The researchers noted that this effect was greater for the group whose positive posts were hidden than for the group whose negative posts were hidden.
Another takeaway is that this showed how emotions can be affected non-verbally, independent of face-to-face contact. Are the emotional outcomes in our own Facebook posts caused by content we see on Facebook, or do they reflect, for example, emotional outcomes in our face-to-face relationships as filtered through our posts? This seems to demonstrate the former: Facebook posts alone can affect our emotional state (at least as expressed in our posts).
Also, people in the experiment group were less expressive overall, while people in the control group (who saw more emotional content in their feeds) used more emotional terms in their own posts. The researchers posit that this rebuts the arguments by Turkle and others that using Facebook makes us less intimate with others.
And here's the part where you get mad
Reason number one to be mad -- there was no need to manipulate users' feeds to prove the point. Instead, they could have run a lab study to see how decreasing emotional input affects your emotional state. Such a large-scale study with all the data manipulation was unnecessary. I wonder if that occurred to them before they started this study.
Number two -- the experiment worked. Kinda. In the various experiment conditions, the percentage of positive or negative words budged by at most 0.1%. That means this change affected the emotion of about 1 out of every 1,000 words posted on Facebook. That's not much of a change. Even if it's a "statistically significant" result, I wouldn't brag about it. Also, that change doesn't mean users were happier or sadder. It only shows that they changed their emotional word choice.
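To spell out the "one word in a thousand" claim, here's the trivial arithmetic behind it (using the 0.1% shift cited above; this says nothing about the paper's baseline word rates):

```python
# Back-of-the-envelope arithmetic for the reported effect size.
# A 0.1 percentage-point shift in emotional-word rate works out to
# roughly one word changed per thousand words posted.
shift_percentage_points = 0.1
words_changed_per_thousand = shift_percentage_points / 100 * 1000
print(words_changed_per_thousand)  # -> 1.0
```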
Getting angrier, the researchers stretch to shoot down arguments that using Facebook makes you less happy, like those in Turkle's book or the uMich study I cited earlier. But that's comparing apples to oranges. Facebook was checking post content days after the experiment went into effect. The Michigan study was checking on the emotional states of people a short time after checking Facebook. Turkle's book was ethnographic (interviews). You can't refute one with the other. This only leads to more questions about why these research results are so different, or why they needed to conduct an experiment of this scale to prove the point.
There's also the question of researcher ethics -- specifically that they went forward with a research plan that they intuitively knew would make one group less happy. Honestly, I'm not that bothered by an experiment that would make people happier as a result; we could use more of that in the world. Cornell held the experiment at arm's length, saying that their researchers "did not participate in data collection and did not have access to user data". Seems rather disingenuous to wash your hands like that when you could be all but certain of the negative effects that some in the experiment would face.
So who in Facebook should have stopped this? Nobody. As XKCD perfectly noted, Facebook would have done this research anyway. And they probably have done this research before but never made the results public. If the results of this experiment lead to people staying on the site longer or clicking more ads, then it was worth it to Facebook.
This leads to another point. If you've read the paper, you might have noticed that the most interesting results are missing. Did users in the experiment group come back to Facebook more or less often? Did the experiment have a positive or negative effect on Facebook's bottom line? Were there any long-term effects? In other words, was it worth ($$$) it for Facebook? The paper is painfully silent.
This experiment is just a drop in the bucket. We're the subjects of hundreds of online experiments on a daily basis. Google might be the most notable of the experimenting bunch (for example, testing 40 shades of blue to see which performs best), and other companies have followed Google's lead hoping this ruthless pursuit of optimization will lead them to similar success. However, few if any of those studies are made public, and even fewer get the attention that Facebook's study did.
So why did this experiment in particular trigger so much anger? My hunch is that the outrage lies somewhere between "can we really trust Facebook?" and "fear of missing out on what my friends wrote." It's the same outrage that underlies big data's ability to manipulate our lives which danah is all over. We're quite helpless against the machinations of their algorithms. And thanks to that research I cited, we know that the anger about this issue can travel very quickly via social networks.
Even if Facebook is manipulating your emotions, then so is Google's search algorithm, that game you just played on your iPhone, and your circle of friends. To have long term effects, you'd need consistent exposure to the same emotional outcome over long periods of time, in the same way you get lung cancer not from one cigarette but from years of smoking. Facebook is just a small part of your total emotional well-being. Emotional reciprocity over long periods of time is a much more important part of that well-being.
That's why I see no harm -- and lots of potential good -- in letting people choose to be part of tests like this. Why not let people opt in to a Facebook experiment to prove that you can be demonstrably, permanently happier by manipulating Facebook posts for years? Add a "Make Your Facebook Happier" button. I could do with fewer curmudgeons and negative posts in my news feed. I'm sure lots of others would agree.
Look, I'm sympathetic about what happened here. There's never a straightforward answer when trying to balance the needs of your audience with the needs of your employer. I've had to make many ethical decisions about the products I've managed in my time. You make the best choice you can, apologize if you make a mistake, and keep moving. Even if the decision seems significant, it's usually small potatoes in the end. Time will pass, people will forget.
So what's really important? The people around you. If you remember only one thing from this rant, make it this. If you want to change yourself -- emotionally, physically, anything-ly -- the best thing you can do is to change the people around you. Facebook posts, violent video games, and how much porn you watch are small potatoes compared to the company you keep.
Who you hang out with determines what you dream about and what you collide with.
And the collisions and the dreams lead to your changes.
And the changes are what you become.
Change the outcome by changing your circle.