If Facebook’s Secret Study Bothered You, Then Quit Now

There are more than 1 billion Facebook users—I’m one of them—and the reality is that all of us are test subjects for the website’s not-so-grand experiments.

Is there a better way to get users to watch videos or convince them to buy something? Is there anything to learn from where a user’s cursor lands?

Facebook has done all of these tests—and likely hundreds more—in the past few months alone. And almost no one made a peep.

But for one week in January 2012, nearly 700,000 Facebook users were part of a small A/B experiment: to see whether changing a user’s News Feed to show a slightly higher proportion of positive or negative status updates would affect their behavior.

We know this now because, in a first, Facebook published the results in the Proceedings of the National Academy of Sciences. It turns out that tweaking the News Feed algorithm had a mildly contagious effect on users, who were a bit more likely to post their own positive or negative status updates.
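
The paper describes the design only in broad strokes, but for readers who want a concrete picture of the mechanics, here is a minimal Python sketch of how that kind of bucket-and-filter experiment could be wired up. Every detail here—the condition names, the omission rate, the sentiment labels—is an illustrative assumption; Facebook’s actual code is not public.

```python
import hashlib
import random

# Hypothetical sketch, not Facebook's implementation: assign each user to a
# condition, then withhold a fraction of posts of one emotional polarity
# from that user's feed.

CONDITIONS = ("control", "fewer_positive", "fewer_negative")
OMIT_RATE = 0.10  # assumed: drop roughly 10% of the targeted posts


def assign_condition(user_id: str) -> str:
    """Deterministically bucket a user into one of the test conditions."""
    digest = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return CONDITIONS[digest % len(CONDITIONS)]


def filter_feed(user_id: str, posts: list[dict]) -> list[dict]:
    """Return the feed a user would see under their assigned condition.

    Each post is a dict like {"text": ..., "sentiment": "positive" | "negative" | "neutral"}.
    """
    condition = assign_condition(user_id)
    if condition == "control":
        return posts
    target = "positive" if condition == "fewer_positive" else "negative"
    rng = random.Random(user_id)  # per-user seed keeps the omissions stable across reloads
    return [p for p in posts if p["sentiment"] != target or rng.random() > OMIT_RATE]


if __name__ == "__main__":
    feed = [
        {"text": "Great day!", "sentiment": "positive"},
        {"text": "Ugh, traffic.", "sentiment": "negative"},
        {"text": "Lunch.", "sentiment": "neutral"},
    ]
    print(assign_condition("user-123"), [p["text"] for p in filter_feed("user-123", feed)])
```

The measurement side of the study then compared how often users in each bucket posted positive or negative words of their own—the “emotional contagion” effect the paper reports.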

And the response to the study has been overwhelming—overwhelmingly negative for Facebook, that is.

“Anger Builds Over Facebook’s Emotion-Manipulation Study,” Colin Daileda reports at Mashable. “Even the Editor of Facebook’s Mood Study Thought It Was Creepy,” Adrienne LaFrance writes at The Atlantic.

(Kashmir Hill, who’s been all over this story for Forbes, has a great summary of the study and captured Facebook’s nonplussed reaction.)

Did Facebook mistreat its users?

Facebook’s study was almost certainly legal. Users agree to terms of service that permit the company to run these sorts of experiments. And Facebook isn’t the only Web platform testing all kinds of interventions to see how customers respond. (More on that later.)

The more interesting debate: Was it ethical?

And that question’s struck a nerve, with passionate arguments on both sides.

Why Facebook acted unethically

In a scathing piece for Slate, Katy Waldman writes that the study’s “methodology raises serious ethical questions. The team may have bent research standards too far, possibly overstepping criteria enshrined in federal law and human rights declarations.”

Waldman adds that Facebook didn’t sufficiently warn users that their data might be used for research; that acknowledgment is buried in the fine print of the company’s Data Use Policy.

Why Facebook’s behavior was defensible

In response to Waldman and others, researcher Tal Yarkoni offers a cogent defense of Facebook on his blog. (Yarkoni’s a PhD who runs UT-Austin’s Psychoinformatics Lab; he was unaffiliated with the study.)

According to Yarkoni:

  • Manipulating a News Feed isn’t so different from what Facebook does all the time, given its ongoing tests.
  • Many firms already experiment on customer behavior—in fact, the tactic is often celebrated—and this isn’t necessarily a bad thing if it improves the product.
  • We shouldn’t punish Facebook for being transparent about its data and tests. We should celebrate its decision to share the findings.

Many critics are condemning Facebook because users weren’t given “informed consent”—essentially, they didn’t know that they were test subjects. Informed consent is a key element of clinical research studies, and an essential prerequisite if researchers are trying to get approval from an institutional review board.

But some of the criticism around informed consent has been, well, misinformed. Take this reader comment on Yarkoni’s blog:

Experimenting on people requires informed consent. That’s not optional, even for [F]acebook…it doesn’t matter how interesting you think the results are, they should have had informed consent from study participants. They broke the law.

Here’s why that’s not true: Even in a clinical study, informed consent can be waived if the risks of the study are minimal. And contrary to early reports, it turns out that Facebook didn’t actually go through an institutional review board, according to Hill, which changes whether informed consent was required in the first place.

Why there’s been a firestorm of criticism

Even if the study was legal, that doesn’t dispel the concerns—legitimate in many cases—that the test was unnerving. And the massive critical response seems to be driven by a few factors, including:

1. Facebook users don’t like being messed with: Facebook’s algorithm-powered News Feed is no secret. But getting reminded that it’s an artificial environment, where Facebook’s software can radically customize your experience, can be unsettling—especially when it’s revealed that a test involved provoking an emotional response.

2. The test potentially put users at risk: There’s evidence that Facebook is already a depressing place to visit, and some critics say that making it a slightly more negative experience could have harmed vulnerable users.

“I wish someone were able to sue [Facebook] over this,” New Yorker writer Emily Nussbaum tweeted. For example, “if it triggered any significant depressive behavior” in users, she added.

3. Mucking around with users’ emotions goes against Facebook’s mission: It’s one thing if Facebook wanted to tweak the News Feed to sell something. But the company wants to be more than an ordinary Web business—Facebook aspires to be like the plumbing of the Internet, a service that you use constantly without even thinking about it. And it’s not ethical to intentionally turn off the hot water for some of your customers.

4. Media members feel threatened: Much of the negative response has been powered by journalists, but there’s a potential subtext to their criticism, reporter Darius Tahir points out: Facebook’s News Feed has become an essential tool for disseminating their work.

“It’s media members who know how much power Facebook has over their publications and hence working lives,” Tahir writes. And in theory, any change to the Facebook algorithm could be professionally troubling.

How to guard yourself moving forward

In response to the criticism, Facebook data scientist Adam Kramer put out a statement on Sunday, and perhaps the company will change its policy on testing in the future.

But let Facebook’s study serve as a wake-up call: If you’re actively surfing the Web, you’re a high-tech lab rat. At least 15% of the top 10,000 websites conduct A/B testing at any given time, the MIT Technology Review reported earlier this year.

Facebook’s just the only site to publish an academic paper about it.

Meanwhile, don’t expect the company to stop this sort of testing, no matter how loud the outcry over this study. There are too many business-driven reasons for Facebook to keep tweaking its platform and learning more about how its users respond to different triggers.

And if that concept bothers you—if you find Facebook’s artificial environment somehow less friendly today than yesterday—there’s a simple solution: Quit Facebook.
