Accountability must become part of Silicon Valley’s culture, and we cannot leave editorial processes solely to algorithms.
Several years ago, Vint Cerf visited the Guardian in his capacity as Google’s “evangelist in chief” – the kind of Silicon Valley title you can carry off only if you have invented the internet, which, luckily for Cerf, he had. He showed us mini-robots and talked about building the internet in space. We smiled indulgently, inwardly questioning the robustness of his faculties, and talk turned inevitably to “the future of newspapers”. “Well,” said Cerf, rotating his robot, “the problem is there’s ‘news’ and there’s ‘paper’, and those are two separate things.”
It seemed so preposterously obvious as to be not worth further scrutiny. But Cerf’s pronouncement, like his tabletop robot and astral internet, was more profound than we realised. On 20 August, Dick Costolo, Twitter’s chief executive, tweeted: “We have been and are actively suspending accounts as we discover them related to this graphic imagery. Thank you …”. His tweet linked to the news that James Foley had apparently been executed, on video, by Isis.
This was a significant moment. For the first time, Twitter acknowledged it was a platform that exercises editorial judgment. Somehow it was not controversial for news organisations to censor the images, yet the debate raged for days about whether executives at software companies could decide what we see. Inside a newsroom, these decisions are called editorial judgments; outside, they are labelled censorship. In truth, both are forms of censorship, and equally both could be argued to be editing.
News, as Cerf said, is increasingly shared outside familiar formats, and therefore outside journalistic institutions and conventions. The most powerful distributor of news now is not News Corp trucks or Tesco. It is an algorithm governing how items are displayed to the billion active users on Facebook.
Sociologist Zeynep Tufekci noted after the riots in Ferguson that although many news items were being posted to Facebook, she initially saw none of them in her feed, just ice bucket challenges. That led her to speculate that algorithmic filtering could mute important stories.
Ferguson, she wrote, “is a net neutrality issue”. Yes it is. It highlights, as she said, how a news ecosystem that automatically favours one type of story over another can pour very cold water on democratic debate.
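To see how such muting can happen without anyone deciding to suppress a story, consider a deliberately simplified sketch of an engagement-ranked feed. Nothing here reflects Facebook’s actual system; the weights, posts and function names are invented purely to illustrate Tufekci’s point that a ranking tuned for likes and shares can bury hard news.

```python
# Toy illustration only: a hypothetical engagement-weighted feed ranker.
# The weights and example posts are invented; real platforms use far more signals.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Hypothetical weighting that rewards light, shareable content.
    return post.likes + 2 * post.shares + 1.5 * post.comments

posts = [
    Post("Friend's ice bucket challenge video", likes=480, shares=120, comments=60),
    Post("Breaking: protests and police response in Ferguson", likes=35, shares=20, comments=90),
    Post("Cat photo", likes=300, shares=40, comments=25),
]

# Rank purely by engagement: the hard-news item sinks to the bottom of the
# feed even though no human chose to hide it.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):7.1f}  {post.title}")
```

Under these made-up weights the breaking-news item scores lowest and appears last, which is the structural effect the column is describing: an editorial outcome produced by a formula rather than a person.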
It is also an editorial issue. Cries of “hold the front page” are no longer necessary, but a group direct message to “tweak the algorithm” must take their place. Platforms that want public trust should employ many more journalists than they presently do and use their knowledge to imbue automated processes with values. The paradigm of “objectivity” is as absurd a concept in software platforms as it was in 20th-century newspapers.
Accountability is not part of Silicon Valley’s culture. But surely as news moves beyond paper and publisher, it must become so. For a decade or more, news organisations have been obeisant to the power of corporate technology, nodding and genuflecting at the latest improbably impressive magic. But their editorial processes have something to offer technologists too.
Transparency and accountability have to accompany the vast, important role our key information providers now play in society. It is understandable why platforms such as Facebook strenuously resist being labelled as “publishers”, but it is no longer realistic. It takes very little narrative imagination to grasp the ethical complexities ahead: every policeman wearing a camera, every terror cell with a Twitter feed, every face in a crowd rendered recognisable.
Vint Cerf was right. We can build the internet in space, and robots are taking a central role in our lives. But we need an open conversation about who shapes their values.
Guardian.com