Facebook is reportedly beginning “political bias” training for employees, after the controversy over conservative publications and topics being blacklisted on the social network. Too bad it won’t work.
The problem with Facebook Trending is not that it’s biased against conservatives. The problem is that it’s curated by humans. Facebook’s Trending Topics relies on summaries written by human editors, who were instructed to “inject” stories, “blacklist” topics, and generally ignore Twitter when possible. Leaked documents outline a job that a newspaper reporter from the 1950s would recognize: choose what goes on the front page, skip topics you don’t want to cover, and ignore your rivals at the cross-town daily.
This moved what could have been algorithmic decisions into a process influenced by human foibles and cultural bias. Gizmodo reported that Facebook’s Trending Topics section is run by young journalists from Ivy League schools who “worked at outlets like the New York Daily News, Bloomberg, MSNBC, and the Guardian” — and who avoided conservative news outlets as much by policy as by habit.
A cross-section of American political opinion this is not. My own startup, Recent News, takes a different approach. We've been operating our recommendation engine, which, like Facebook's, highlights trending topics, with zero human intervention since we launched late last year.
Since then, our recommendation engine has indexed millions of articles and sorted them into thousands of categories while automatically providing our users with trending topics, personal recommendations, suggested topics, local news, related stories, and so on. We analyze the day’s news and attempt to predict which articles you’ll like.
We don’t rely on a stable of newly graduated journalists selected by Accenture subcontractors working out of a basement in New York City. In our case, budget drove innovation: We’re a small startup without enough cash to hire a basement of human editors, so we had to invent algorithms to do the job instead.
As soon as you download our app and read a few articles, we begin to provide you with personal recommendations. Any of the hundreds of news organizations we index stands an equal chance of being recommended, based on signals including your interests, article quality, publication time, social sharing, your reading history, and so on.
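The ranking approach described above can be sketched in a few lines. This is a hypothetical illustration, not Recent News's actual code: the signal names, weights, and data fields are all assumptions chosen to show how a weighted combination of signals lets any indexed outlet compete on equal footing.

```python
# Hypothetical sketch of signal-based article ranking. Every signal and
# weight here is illustrative; the point is that the score depends on
# measurable properties of the article and reader, not on which outlet
# published it.

def score_article(article, user):
    """Combine ranking signals into a single score (weights are made up)."""
    signals = {
        "interest_match": len(set(article["topics"]) & set(user["interests"])),
        "quality": article["quality"],             # e.g. 0.0-1.0 from a classifier
        "freshness": 1.0 / (1.0 + article["hours_old"]),
        "social_shares": min(article["shares"], 1000) / 1000.0,  # capped signal
        "already_read": -5.0 if article["id"] in user["read_ids"] else 0.0,
    }
    weights = {"interest_match": 2.0, "quality": 3.0, "freshness": 1.5,
               "social_shares": 1.0, "already_read": 1.0}
    return sum(weights[k] * v for k, v in signals.items())

def recommend(articles, user, n=5):
    """Return the top-n articles, regardless of which outlet published them."""
    return sorted(articles, key=lambda a: score_article(a, user), reverse=True)[:n]
```

Because the publisher never appears in the scoring function, no outlet can be blacklisted by habit: exclusion would have to be written into the code deliberately, where it is visible and reviewable.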
Our recommendation engine certainly isn’t perfect. It’s still overly cautious when deciding whether a topic is trending or not, which means it’s slower than a human to highlight breaking news like last fall’s terrorist attacks in Paris. We think we can make it faster, though it’s not a trivial problem to solve.
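The caution-versus-speed trade-off can be made concrete with a toy detector. This is an assumed sketch, not our production system: a topic counts as trending only when its current mention rate spikes well above its recent baseline, and the threshold parameter is exactly where the trade-off lives.

```python
# Illustrative trend detection via a z-score against a recent baseline.
# A high z_threshold is "overly cautious": fewer false alarms, but slower
# to flag genuine breaking news. Lowering it reacts faster at the cost of
# noise.

from statistics import mean, stdev

def is_trending(hourly_counts, current_count, z_threshold=3.0):
    """Flag a topic when this hour's mentions exceed the baseline of
    recent hours by z_threshold standard deviations."""
    baseline = mean(hourly_counts)
    spread = stdev(hourly_counts) or 1.0  # avoid division by zero
    return (current_count - baseline) / spread >= z_threshold
```

A human editor effectively runs this test by intuition and can act on a single credible report; an algorithm tuned for precision waits for the statistics to accumulate, which is why making it faster without flooding users with false positives is not trivial.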
There are hints that Facebook also is moving in the direction of complete automation. The Washington Post, which interviewed former editors, reported that “they believed — and they suspect Facebook also believes — that one day the algorithms will be totally self-contained and self-sufficient, and Facebook would have no more need of their department.” (Facebook appears to customize each user’s trending stories based on gender, interests, and other signals, so even today its human editors are working to provide inputs into an algorithm.)
A recently released version of Facebook’s Trending Review Guidelines already is quasi-algorithmic. Here’s what it tells human editors to do: Mark a story as a top story if it’s the lead story on 5 of 10 specified news sites. More important “nuclear” stories must be the most prominent story on 8 of those 10 sites. Then add 1 to 3 keywords and a related photo. Repeat indefinitely until the end of your shift.
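The guidelines read almost like pseudocode already. Here is a literal, hypothetical translation of the rule as described; for simplicity this sketch treats "leading" a site and being its "most prominent" story as the same signal, which the actual guidelines may distinguish.

```python
# Hypothetical translation of the leaked Trending Review Guidelines rule:
# a story is a "top story" when it leads 5 of the 10 monitored sites,
# and "nuclear" when it leads 8 of them.

def classify_story(story_id, front_pages):
    """front_pages maps each of the 10 monitored sites to its current lead story."""
    lead_count = sum(1 for lead in front_pages.values() if lead == story_id)
    if lead_count >= 8:
        return "nuclear"
    if lead_count >= 5:
        return "top story"
    return "not trending"
```

When the editorial judgment reduces to counting front pages against a fixed threshold, the remaining human work (attaching keywords and a photo) looks less like editing and more like labeling training data.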
To be sure, it’s true that algorithms can be biased, too. In a way, all of them are. They reflect the values and goals of their designers.
Facebook’s News Feed is biased toward increasing user engagement, which can mean highlighting viral articles at the expense of sober ones. Google is biased in favor of web pages that load quickly, which is great for mobile users but not as delightful if your site has megabytes of unnecessary JavaScript. Our own machine learning algorithms are biased toward recommending articles we view as high quality, which isn’t wonderful if you publish mostly photo galleries.
While algorithmic bias in news selection may eventually become a real-world problem, there’s no evidence it is one today. If it does become an issue, it will hardly be a new one: Every news organization has always had its own sets of biases. Owners and publishers have agendas, and advertisers can unduly influence news decisions. So can individual journalists — only 7.1% of reporters identify as Republicans, after all.
Which is why it’s disturbing to see politicians — who would never dare interrogate the Wall Street Journal or the New York Times about their editorial practices — demanding answers from Facebook.
Sen. John Thune, the Republican chairman of the Senate Commerce committee, wrote a letter to Facebook on May 10 posing a series of pointed questions, including whether the social network singled out “conservative views for exclusion.”
Facebook replied by saying it would change some policies relating to trending stories. Colin Stretch, the company’s general counsel, said in a letter to the Senate that he could not rule out whether there were “isolated actions with an improper motive.” But, he said, there was no evidence of “systematic political bias in the selection or prominence of stories” listed as trending.
Even if Facebook did engage in “systematic political bias,” its editorial decisions remain none of the Senate’s business.
There’s no difference in principle between Facebook selecting headlines for its home page and traditional news organizations doing the same thing. They’re all shielded by the First Amendment from governmental probes and second-guessing — whether that editorial process involves only machines, only humans, or is a slightly messy mix of both. Facebook will serve its users best, however, when it simply lets the algorithms take the wheel.
Declan McCullagh is the co-founder and CEO of Recent Media Inc., which operates a news recommendation engine and has released a news app for iOS and Android.