When the issue of “fake news” came to prominence at the end of this year’s election cycle, traditional media outlets were the first to engage in relentless finger-pointing. Journalists everywhere deplored the spread of misleading, lazily reported and, in some cases, totally fabricated stories that had been facilitated by social media.
That, of course, ignores the fact that in a competitive media climate where virality and accuracy go head-to-head, editors will often choose the catchier headline at the expense of factual accuracy. And as one writer at HEqual noted, even the respectable BBC is not immune to this kind of sensationalism.
In February 2016, the media ran wild with a study that looked at gender bias on GitHub, a website where developers can submit code to collaborate on open-source projects. Once a code contribution is submitted, it has to be vetted by various members of the community before being approved. The aim of the study was to compare acceptance rates of contributions from men versus women.
The BBC wrote it up and published an article on Feb. 13th under the headline “Women write better code, study suggests.”
The lede read:
“Computer code written by women has a higher approval rating than that written by men – but only if their gender is not identifiable, new research suggests.”
To be fair, the BBC did a better job than other outlets at presenting the results in a somewhat factual and neutral manner. But as is often the case with these pop-science stories, many of the study’s findings were conveniently glossed over and later debunked.
One, for instance: yes, women from outside the GitHub community who conceal their gender have a higher chance of getting their code approved by moderators than those who don’t, but the same trend also applies to men, albeit by a smaller margin. Likewise, a quick look at the dataset for community members shows no evidence of blatant sexism against women; quite the opposite: women who clearly identified their gender were actually more likely to get their code approved.
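The comparison at the heart of the study boils down to simple arithmetic on acceptance rates, computed separately by gender and by whether the submitter's gender was identifiable. Here is a minimal sketch of that comparison; all of the counts below are invented for illustration and are not the study's actual data:

```python
# Hypothetical pull-request counts, loosely modeled on the study's setup.
# Every number here is made up for illustration only.
counts = {
    # (gender, gender_visible): (accepted, submitted)
    ("women", False): (780, 1000),  # gender hidden
    ("women", True):  (710, 1000),  # gender visible
    ("men",   False): (760, 1000),
    ("men",   True):  (730, 1000),
}

def acceptance_rate(accepted, submitted):
    """Fraction of submitted contributions that were approved."""
    return accepted / submitted

# Per-group acceptance rates.
for (gender, visible), (accepted, submitted) in sorted(counts.items()):
    label = "visible" if visible else "hidden"
    print(f"{gender:5s} ({label:7s}): {acceptance_rate(accepted, submitted):.1%}")

# The headline comparison: how much does the acceptance rate drop
# when gender becomes identifiable, for each gender?
for gender in ("women", "men"):
    hidden = acceptance_rate(*counts[(gender, False)])
    shown = acceptance_rate(*counts[(gender, True)])
    print(f"{gender}: drop of {hidden - shown:.1%} when gender is visible")
```

With numbers shaped like these, both groups see a drop when gender is visible, and it is the *relative size* of the two drops (not women's rate alone) that the study's headline claim rests on.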
Researchers concluded that this difference could be due to women actually being better at coding but falling prey to rampant discrimination. While that is one possible conclusion, it is far from the only one, or in this case the most accurate. In fact, the study’s authors themselves later admitted that they had left out a pretty crucial point from their analysis: their data also revealed that women coders are in fact “harder on other women than they are on men.”
While the first headline the BBC put out conveyed that nuance (“Women may write better code, study finds”), it took less than 35 minutes for an editor to ditch the “may,” thus dispelling any doubts about women’s superior coding ability.
After a reader complained that the headline was not only misleading but also showed a clear bias in its treatment of the topic, it was amended for the first time to read “Women write better-rated code, study suggests.” The change happened on July 15, five months after publication.
Unsatisfied with the change, however, the diligent reader pushed back. And it wasn’t until October 2016, a staggering eight months after the piece was first published, that the BBC completely rewrote the headline and issued a statement in response to the complaint, in which it absolved itself of responsibility because, as it put it, “the article did not deal with matters which were controversial in the sense which would require a balance of views.”
They nonetheless corrected the headline to “Github coding study suggests gender bias” and rewrote the first two paragraphs of the story to reflect the complaint.
As the recent surge in viral and poorly fact-checked stories by the world’s most respected outlets shows, media people don’t always get it right. In fact, there are plenty of times when they get it wrong. That’s what corrections and clarifications are for.
The BBC’s astonishing delay in retracting what was clearly a disingenuous headline wouldn’t be such a problem if the story hadn’t been picked up on an astronomical scale. The article went viral, drawing hundreds of retweets, including one from the executive chairman of Twitter himself. It was shared a total of 31,639 times on social media, according to Muck Rack. Worse, a quick Google search for “women write better code” still yields 328,000,000 results, with the BBC piece at the top of the list.
At a time of information overload, when six out of 10 of us read no more than a story’s headline before sharing it, words do matter.
Delaying corrections and continuing to spread questionable statements in the midst of the media’s ‘fake news’ soul-searching is not doing anyone any favors.
The BBC did not immediately return Heat Street‘s request for comment.