Facebook's New Privacy Issue Highlights Need for Transparency
Anyone signing up for Facebook must agree to the social network’s Terms of Use, which opens user-generated data to everything from data analytics to product testing. Most of the time, Facebook keeps its capitalization (some might call it exploitation) of user data firmly in the background; your postings might end up in a system-wide test, but you’ll never know about it. So it was surprising when, over the weekend, news broke that Facebook had manipulated the News Feeds of 689,003 users, adjusting the number of emotionally positive and negative postings seen by various groups. What’s more, the news emerged because Facebook’s researchers published a paper about their little experiment.

The study concluded that the emotional tone of a user’s News Feed shapes the tone of what that user subsequently posts (a toy sketch of the experiment’s design appears at the end of this article). “When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred,” the paper claimed. “These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”

The idea of the social network manipulating people’s emotions for profit led, inevitably, to protests online, along with a response from Facebook. “This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account,” a Facebook spokesperson wrote in a statement to Forbes’ Kashmir Hill. “We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow.”

Despite that defense, the study could exacerbate the perception of Facebook as an untrustworthy steward of users’ online information, a perception the company has taken great pains to push back against over the years. (Whether Facebook’s experiment was unethical will likely prove the subject of debate for some time.) The incident also doubles as a cautionary tale for companies that handle massive amounts of user data: when in doubt, it’s better to offer users the chance to opt out of an experiment than to treat the Terms of Service as a fait accompli.
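At its core, the experiment was an A/B test on feed composition: withhold a slice of positive (or negative) posts from one group’s feeds, then compare the sentiment of the posts those users go on to write against a control group. The sketch below is a toy simulation of that design, not Facebook’s actual code; the feed generator, the 50 percent drop rate, and the “contagion” parameter are all invented for illustration.

```python
# Toy simulation of the study's design (illustrative only, not Facebook's code):
# filter a user's feed by sentiment, then measure the sentiment of the posts
# that user produces afterward.
import random

random.seed(0)

def make_feed(n):
    """Generate a toy feed where each post is 'positive' or 'negative'."""
    return [random.choice(["positive", "negative"]) for _ in range(n)]

def filter_feed(feed, reduce_label, drop_rate=0.5):
    """Withhold a fraction of posts carrying the given sentiment label,
    mirroring the study's reduction of positive or negative expressions.
    The 50 percent drop rate is a made-up parameter."""
    return [p for p in feed
            if p != reduce_label or random.random() > drop_rate]

def simulate_user_posts(feed, n_posts=200, contagion=0.3):
    """Toy model of 'emotional contagion': the user's own posts drift
    toward the dominant sentiment of what they saw. The contagion
    strength is an invented parameter, not an estimate from the paper."""
    pos_share = feed.count("positive") / len(feed) if feed else 0.5
    p_positive = 0.5 + contagion * (pos_share - 0.5) * 2
    return [("positive" if random.random() < p_positive else "negative")
            for _ in range(n_posts)]

for condition in (None, "positive", "negative"):
    feed = make_feed(200)
    if condition:
        feed = filter_feed(feed, condition)
    posts = simulate_user_posts(feed)
    label = f"reduced {condition}" if condition else "control"
    share = posts.count("positive") / len(posts)
    print(f"{label:>17}: {share:.0%} of the user's posts were positive")
```

Running the sketch should reproduce the qualitative pattern the paper reports: the “reduced positive” condition yields a lower share of positive posts than the control, and the “reduced negative” condition a higher one.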

Image: Facebook