#HTE

Of Course Facebook Is Biased

Facebook must have thought the online news game was pretty easy. Two years ago, it plucked a small team of about a dozen bright, hungry twentysomethings fresh out of journalism school or entry-level reporting jobs. It stuck them in a basement, paid them contractor wages, and put them to work selecting and briefly summarizing the day’s top news stories and linking to the news sites that covered them. It called them curators, not reporters. Their work appeared in the “Trending” section of the Facebook home page and mobile app, where it helped to define the day’s news for millions of Facebook users.

That is, by any reasonable definition, a form of journalism. And it made Facebook a de facto news organization.

But Facebook refused to acknowledge that. It never bothered to reckon with the basic responsibilities that journalism entails, nor the ethical and epistemological challenges it presents—probably because they’re messy and inconvenient and might get in the way of optimizing engagement. And now it’s paying the price.

On Tuesday, Facebook became the subject of a Senate inquiry over claims of anti-conservative bias in its Trending section. Senate Commerce Committee Chairman John Thune, a South Dakota Republican, sent Mark Zuckerberg a letter asking a series of pointed questions about how Facebook chooses stories for the section, how it trains its curators, who’s responsible for their decisions, and what steps it’s taking to investigate the bias claims. He also asked for detailed records of stories that the company decided not to include in the Trending section despite their popularity among Facebook users.

The inquiry followed a report by Gizmodo’s Michael Nunez on Monday, in which anonymous former Facebook “curators” described the subjective process by which they assembled the Trending section. Facebook had publicly portrayed the section—which you can find near the top right of Facebook.com or under the search tab on the Facebook app—as an algorithmically driven reflection of the most popular stories its users are reading at any given time. But the ex-curators said they often filtered out stories they deemed questionable and added others they deemed worthy. One, a self-identified conservative, complained that this led to subtle yet pervasive liberal bias, since most of the curators were politically liberal themselves. Popular stories from conservative sites such as Breitbart, for instance, were allegedly omitted unless more mainstream publications such as the New York Times also picked them up.

None of this should come as a surprise to any thoughtful person who has worked as a journalist. Humans are biased. Objectivity is a myth, or at best an ideal that can be loosely approached through the very careful practice of trained professionals. The news simply is not neutral. Neither is “curation,” for that matter, in either the journalistic or artistic application of the term.

There are ways to grapple with this problem honestly—to attempt to identify and correct for one’s biases, to scrupulously disclose them, to employ an ideologically diverse staff, perhaps even to reject objectivity as an ideal and embrace subjectivity. But you can’t begin to address the subjective nature of news without first acknowledging it. And Facebook has gone out of its way to avoid doing that, for reasons that are central to its identity as a technology company.

Here’s how Facebook answers the question “How does Facebook determine what topics are trending?” on its own help page:

Trending shows you topics that have recently become popular on Facebook. The topics you see are based on a number of factors including engagement, timeliness, Pages you’ve liked and your location.

No mention of humans or subjectivity there. Similarly, Facebook told the tech blog Recode in 2015 that the Trending section was algorithmic, i.e., that the stories were selected automatically by a computer program:

Once a topic is identified as trending, it’s approved by an actual human being, who also writes a short description for the story. These people don’t get to pick what Facebook adds to the trending section. That’s done automatically by the algorithm. They just get to pick the headline.

Bias in the selection of stories and sources? Impossible. It’s all done “automatically,” by “the algorithm”! Which is as good as saying “by magic,” for all it reveals about the process.

It’s not hard to fathom why Facebook is so determined to portray itself as objective. With more than 1.6 billion active users, it’s larger than any political party or movement in the world. And its wildly profitable $340 billion business is predicated on its near-universal appeal. You don’t get that big by taking sides.

For that matter, you don’t get that big by admitting that you’re a media company. As the New York Times’ John Herrman and Mike Isaac point out, 65 percent of Americans surveyed by Pew view the news media as a “negative influence on the country.” For technology companies, that number is just 17 percent. It’s very much in Facebook’s interest to remain a social network in the public’s eyes, even in the face of mounting evidence that it’s something much bigger than that. And it’s in Facebook’s interest to shift responsibility for controversial decisions from humans, whom we know to be biased, to algorithms, which we tend to lionize.

But algorithms aren’t magic. They’re built by humans, they’re maintained and updated and overseen by humans, and they’re flawed like humans. Most importantly, they’re built to serve human ambitions, which are inherently subjective. I wrote in depth earlier this year about the human values and decisions that shape Facebook’s news feed algorithm. It’s built to execute goals that range from maximizing user engagement to making users feel good about the time they spend on Facebook.
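To make that concrete, here is a deliberately toy sketch, in Python, of the kind of human judgment any engagement-driven ranking encodes. The signals, the weights, and the numbers are all invented for illustration; nothing here comes from Facebook’s actual code. The point is simply that someone has to decide what “engagement” means and how much each signal counts.

```python
# Hypothetical sketch: an engagement-weighted story ranking.
# Every name and number below is invented, not taken from Facebook.

# Human-chosen weights. Deciding that a share "counts" five times as much
# as a like is an editorial judgment, not a law of nature.
WEIGHTS = {"likes": 1.0, "comments": 4.0, "shares": 5.0, "clicks": 0.5}

def engagement_score(story):
    """Score a story by summing its weighted engagement signals."""
    return sum(weight * story.get(signal, 0)
               for signal, weight in WEIGHTS.items())

stories = [
    {"title": "Celebrity breakup", "likes": 9000, "comments": 200,
     "shares": 100, "clicks": 50000},
    {"title": "Budget negotiations", "likes": 1200, "comments": 900,
     "shares": 700, "clicks": 8000},
]

# "The algorithm" is just this sort, plus every human decision above it.
for story in sorted(stories, key=engagement_score, reverse=True):
    print(story["title"], engagement_score(story))
```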

Human values shape the Trending section, too. The algorithm that surfaces the stories might skirt questions of bias by simply ranking them in order of popularity, thus delegating responsibility for story selection from Facebook’s employees to its users. Even that—the notion that what’s popular is worth highlighting—represents a human value judgment, albeit one that’s not particularly vulnerable to accusations of political bias. (That’s why Twitter isn’t in the same hot water over its own simpler trending topics module.)
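A raw popularity leaderboard of that sort, closer to Twitter’s simpler module, is easy to imagine. In this hypothetical sketch, even the “neutral” version rests on human choices: the time window, the cutoff, and the premise that sheer volume is what deserves to be surfaced. The topics and counts are made up.

```python
# Hypothetical sketch: a pure popularity leaderboard, with no curation.
from collections import Counter

# Invented mention counts over some time window. The window itself,
# and the decision to measure mentions at all, are human choices.
mentions = Counter({
    "#Election2016": 48210,
    "#CelebBreakup": 45990,
    "#Syria": 9120,
})

TOP_N = 3  # the cutoff is a human choice too
trending = [topic for topic, _count in mentions.most_common(TOP_N)]
print(trending)
```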

The problem with an algorithm that simply harnesses the wisdom of the crowd is that the crowd isn’t always wise. The most popular stories at any given time might well be misleading, or sensationalist, or even full of lies. That’s why Facebook felt the need to hire humans to oversee it. This is in keeping with the company’s broader push for what it calls quality content, another term that entails value judgments without copping to them.
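Put together, the pipeline Facebook itself described to Recode would look something like this hypothetical sketch: an algorithm proposes candidate topics, and a curator approves each one and writes the blurb. The function names and the rejection logic are invented; what matters is the structure, because the approval step is exactly where judgment enters.

```python
# Hypothetical sketch of a human-in-the-loop trending pipeline.
# All names and logic are illustrative, not Facebook's real system.

def algorithm_candidates():
    """Stand-in for the popularity-driven selection sketched above."""
    return ["#Election2016", "#Syria", "#CelebBreakup"]

def curator_review(topic):
    """A curator approves or rejects a candidate and writes the blurb.
    This approval step is where judgment, and therefore bias, enters."""
    if topic == "#CelebBreakup":  # one curator's idea of "quality content"
        return False, ""
    return True, f"What to know about {topic}"

trending_module = []
for topic in algorithm_candidates():
    approved, blurb = curator_review(topic)
    if approved:
        trending_module.append((topic, blurb))

print(trending_module)
```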

Facebook’s instinct to hire journalists was well-placed: As I’ve explained, no algorithm yet devised can fully substitute for a good human writer or editor.

But Facebook instead opted to hire cheap contractors and went on to claim that their role was simply to “confirm that the topics are in fact trending news in the real world and not, for example, similar-sounding topics or misnomers.” That’s a dubious claim, even if you think the allegations of liberal bias are trumped up. If the curators’ job were really just about cleaning up the data, Facebook seems to have forgotten to tell that to the curators themselves, who described their mandate very differently to Gizmodo. They said they were encouraged to prioritize stories from certain outlets deemed reputable; to avoid news about, among other topics, Facebook itself; and to replace the word Twitter in headlines with something vaguer, like social media. That may not be political bias, but it’s bias all the same. They also described choosing stories for the Trending section that the algorithm may not have surfaced but that seemed to them important or worthwhile, like stories about the conflict in Syria or the Black Lives Matter movement.
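Restated as code, those guidelines are unmistakably editorial rules. This sketch is assembled only from the allegations reported by Gizmodo, not from any real Facebook policy document; the outlet list and wording are placeholders.

```python
# Hypothetical restatement of the alleged curator guidelines as rules.
# The specific outlets and behavior are illustrative placeholders.

PREFERRED_OUTLETS = {"nytimes.com", "bbc.com", "washingtonpost.com"}  # "deemed reputable"
BLOCKED_SUBJECTS = {"Facebook"}  # curators said they avoided news about Facebook itself

def apply_guidelines(story):
    """Apply the alleged guidelines to one candidate story."""
    if story["subject"] in BLOCKED_SUBJECTS:
        return None  # filtered out entirely
    if story["outlet"] not in PREFERRED_OUTLETS:
        return None  # held until a mainstream outlet picks it up
    # Soften the headline: swap a rival's name for something vaguer.
    story["headline"] = story["headline"].replace("Twitter", "social media")
    return story

story = {"subject": "tech", "outlet": "nytimes.com",
         "headline": "Twitter redesigns its home page"}
print(apply_guidelines(story))  # headline becomes "social media redesigns..."
```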

Facebook can and will dispute the specifics of these claims, as the company’s vice president of search, Tom Stocky, did in a Facebook post Tuesday morning. But those rebuttals miss the point entirely. Facebook’s problem is not that its “curators” are biased. Facebook’s problem is that it refuses to admit that they’re biased—or even really human.

The Senate inquiry is pure political theater, a delicious opportunity for Republican politicians to fuel conservatives’ media-persecution complex. It is likely to answer few questions and solve fewer still. Yet Facebook brought this on itself by deliberately obscuring the process behind its Trending section and pretending to have a neutrality that its underpaid nonemployees couldn’t possibly earn.

The problem will not be solved by firing bad apples or instituting tougher guidelines. The only way for Facebook to extricate itself from this mess is to admit that journalism isn’t as simple as it thought. It’s to stop treating “curators” like drones and stop treating news like a data set to be optimized. It’s to build a real human curation team with a real editor in charge and an ethos and a mission and an understanding of the responsibilities involved in shaping how the news is framed to 1.6 billion people. Surely a company that pays its interns $11,000 a month in salary and benefits can afford it.

Either that or—what is more likely, given Facebook’s distaste for human judgment and fear of controversy—take the humans back out of the loop. Turn Trending into the pure algorithmic leaderboard that Facebook pretended it was all along, and accept that it will quite possibly be riddled with sensationalist junk. Or dispense with it entirely in its current form and bring it back later as a feature of the main news feed.

Even that won’t remove all the bias from your Facebook feed, because that’s impossible. It will simply bury that bias deep within the company’s proprietary machine-learning algorithms, where real journalists can’t get at it. 


http://www.slate.com/articles/technology/technology/2016/05/yes_facebook_is_biased_now_it_should_admit_it.html