Thursday, July 3, 2014

On the Facebook Emotions Experiment OR In Praise of the IRB

There are now hundreds of articles and blogs on the Facebook emotions experiment. Yesterday's non-apology apology from COO Sandberg only intensifies the discussion. She said that she's sorry we got upset about it. I suspect that she's sorry we found out about it. She hasn't said that she's sorry they did the experiment. 

As for me, I'd like to use this moment to think about the importance of that most annoying academic institution - the Institutional Review Board.


*

The history of science is filled with people who had good intentions and interesting research questions conducting research that we have later judged to be unethical. As researchers, no matter our intentions, we are pretty bad at assessing ourselves.

The history of science is also filled with pretty awful people doing terrible things. Scientists have psychologically damaged patients, given them dangerous drugs, and even infected them with syphilis, all in the interest of increasing knowledge.

In response to such terrible abuses, and to the difficulty of objectively reviewing even far less dangerous work, scientists of all kinds collaborated over the second half of the 20th century to develop robust systems for assessing the ethics of experiments.

Over the last few days, revelations about a Facebook experiment designed to manipulate their users’ emotions have sparked considerable outrage and debate among scientists and researchers. The episode raises questions about how well our review systems are functioning today, especially given the ease and power with which companies like Facebook can intrude into our private lives and thoughts.

Academic researchers have to get their work approved by Institutional Review Boards (IRBs). IRBs ensure that researchers don’t expose their subjects to undue risks, consider whether informed consent is necessary, and otherwise protect subjects (i.e., us) from risky experimentation. It’s bureaucratic but important work.

Honestly, like many researchers, I find the process of getting IRB approval to be really annoying. Instead of filling out paperwork, I just want to collect data and do my research (I study student learning, among other things). I know I’m a good guy and not going to hurt anyone.  

History teaches us, though, that good people can make mistakes and expose research subjects to risks. Research subjects are much safer with IRB protocols in place, and so are researchers, really, if we don't want to go down in history as unethical.

This window into academic bureaucratic procedure matters because the people studying you right now, who might even be manipulating you, may not have gone through any such process. They may assume you agreed to be a guinea pig when you clicked “I accept” on an End User License Agreement (EULA), no matter how long or how obscurely written it was.

In January 2012, Facebook collaborated with a group of academic researchers to study emotional contagion. For one week, 689,003 users saw news feeds skewed toward either more negative or more positive posts. The researchers wanted to see whether emotions transfer from one person to another online: if you saw only happy posts, were you more likely to write happy posts? If you saw negative posts, would you, likewise, go negative? The answer to both questions is yes. The effect was modest but statistically significant.

Companies test their customers and users all the time to see how we react. Are we more likely to click on the funny ad or the serious ad? Are we more likely to share a post shown against a light background or a dark one? But by deliberately making some users sadder, Facebook did something that feels different.
Facebook’s data scientist, Adam Kramer, has explained that they were worried people would stop using Facebook if their news feeds were too negative, so they deliberately made some feeds more negative. We only found out about it because Kramer collaborated with two academic researchers and sought publication in an academic journal.

The journal, for its part, really should have seen this as fruit of the poisoned tree, but I can't blame the editor for not seeing it that way initially.

I am bothered by the idea of Facebook making people sadder. Yes, the effect was tiny, but I know people who are depressed or experiencing suicidal ideation who do not need even the most modest nudge toward negativity. It’s also an ethical problem that they didn’t screen minors out of the experiment. I hope we can all agree that psychological manipulation experiments on children require some pretty serious oversight. Facebook didn’t get any.

Since the story broke, there has been a lot of confusion about who reviewed the experiment. The academic journal cited Cornell’s IRB, which released a statement saying that because its researcher had only results and not individual data, no IRB approval was needed.

Facebook claims that the research went through an internal review process but has not specified the details. Facebook maintains that they have come a long way since 2012 and now put every research project through comprehensive internal review. My request for clarification of that internal review process went unanswered.
 
When experimenting on people, there are two options. First, you can get informed consent. Some have argued that agreeing to Facebook’s Terms of Service counts, but let’s be clear: there’s nothing “informed” about clicking “I agree.” As technology ethicist Catherine Flick notes, modern EULAs bear no resemblance to informed consent as either good research or good medical practice requires it. Moreover, as Kashmir Hill discovered, Facebook only added “research” to its terms in mid-2012, four months after the emotion study.

But informed consent is not always necessary. A properly constituted review board can determine that the risks are minimal and that consent isn’t needed. That’s just what happened with other experiments that used Facebook and generated no outrage. Before Timothy Ryan, a political scientist, studied “what makes us click,” he got IRB approval.

Adam Kramer didn’t.

The Facebook emotion experiment is not a disaster. It probably wasn’t dangerous. It did, however, involve deliberately manipulating people in order to see what happened next.

Corporations like Facebook, whether they interact with academia or not, need to embrace more rigorous and independent ethical oversight of their experimentation. 

It may end up being annoying, and it may slow them down a little, but the history of science and the depth of access these companies have to our innermost lives demand it.