Thursday, 3 July 2014

Facebook’s social experiments appear far more extensive than its now-infamous emotions study

Not everyone is OK with Facebook's latest exploits.

Facebook made headlines for a recently published study in which it altered the emotional content of people’s news feeds and measured the effect of that change on users, an experiment some condemned as an intrusion and a manipulation of unsuspecting members of the social network. The Wall Street Journal (paywall) now reports that the study was just one of “hundreds” of tests run by a highly active and not particularly well-supervised group of data scientists.


Though stricter guidelines are now in place, the Journal reports that Facebook’s data science team, formed in 2007, could run experiments with far less oversight than university researchers face: without the consent of study participants beyond their acceptance of Facebook’s terms of service, and without informing those under study. We have reached out to Facebook for comment and will update this post if we hear back.


Andrew Ledvina, a former data scientist at the company, paints a picture of an organization operating without much constraint. He told the Journal that “there’s no review process, per se”; that anyone on the team could run tests; and that he and a colleague had once run a test without telling anyone else at the company. Of the data science group, he said its members are “always trying to alter people’s behavior.” Research intended for publication is subject to stricter guidelines, the Journal reports, but that is presumably a relatively small fraction of the research the company does.


Most companies that harvest substantial user data will put it to use, testing small tweaks to product and design. But the intimate, personal nature of people’s interactions with Facebook makes unsupervised experimentation seem particularly invasive, even if the terms of service can be read as consent to it.


Facebook now requires any research to be reviewed by a panel of internal privacy experts. Still, the key word is “internal”: the company has not disclosed the panelists’ identities or addressed concerns about treating its terms of service as blanket consent for this kind of research.


That this research surprised users and caused such an uproar is a pretty good indication that people have little sense of what they’re signing up for. The company says it is considering further changes.


Facebook COO Sheryl Sandberg yesterday apologized for “poor communication” surrounding the study. But the communication wasn’t poor. When it came to the extent of the company’s testing, it was essentially nonexistent.


When the emotional content study first began attracting criticism, the company gave the following statement to the Atlantic:


This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.
