Can An Algorithm Be An Editor? (Or Can An Editor Be An Algorithm?)

May 19, 2016 • Digital News, Ethics and Quality

Trust is greater if we know how the algorithm – or editor – works

Can an algorithm be an editor? Or can an editor be an algorithm? These questions have become more timely since the launch of Facebook’s Instant Articles last year. Publishers now hand content to Facebook and the platform’s algorithm ‘chooses’ who will see it, when and how. Algorithms increasingly do the job of news editors. But is that necessarily bad?

In a recent column published in The Wall Street Journal, Jeffrey Herbst argues that platform operators such as Facebook, Google or Twitter are becoming less like neutral distribution channels and more like news companies, adopting an editorial stance on the information they distribute. The issue is not new, and Emily Bell, director of the Tow Center for Digital Journalism, in particular, has raised it repeatedly.

As other major distribution platforms seem to be heading in the same direction as Facebook’s Instant Articles, and with more news content being consumed inside the distribution platforms themselves, algorithms are increasingly making editorial decisions. But are they doing a worse or a better job than the human editors employed by tech companies? Recent reports that Facebook’s (human) editors displayed bias by routinely suppressing conservative stories have added to the debate.

Is the algorithm really an editor?

It would be naive to say that the algorithm is not an editor. When tech companies say they are not news companies they are right, in a way: they do not produce news. But they do edit the news in the distribution process, and editing is certainly part of the news process.

On the other hand, it would also be naive to maintain that the editor is neutral and does not interfere in the process. The choices a news editor makes regarding what should or should not be published are influenced by a multiplicity of factors, some formal, some personal, some known to us and others unknown. The difference – it is usually said – is that humans can be scrutinised whereas algorithms cannot. But is this really so?

Can we scrutinise the algorithm?

No, not completely. But some algorithms are more open to scrutiny than others. We, the users, know which factors Google considers in its search algorithm (time on page, pages per visitor, etc.), but we do not know how much weight it gives to each factor. Google also frequently changes the relative weighting of those factors, and we do not know by how much. Nor do we have any say when Google chooses to change those factors or incorporate new ones.
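To make the point concrete, here is a minimal sketch – with entirely invented factor names and weights, not Google’s actual system – of what such a weighted ranking might look like: the inputs are observable, but the weights (and any quiet changes to them) stay hidden from users.

```python
# Purely hypothetical sketch of a weighted ranking score.
# The factor names are illustrative; the weights are exactly the part
# users never see -- and the part the platform can change at will.

HIDDEN_WEIGHTS = {          # known to the platform, opaque to users
    "time_on_page": 0.5,
    "pages_per_visitor": 0.3,
    "freshness": 0.2,
}

def ranking_score(page_signals: dict) -> float:
    """Combine observable factors with weights users never see."""
    return sum(HIDDEN_WEIGHTS[f] * page_signals.get(f, 0.0)
               for f in HIDDEN_WEIGHTS)

# We can measure the inputs for our own pages...
print(ranking_score({"time_on_page": 0.8, "pages_per_visitor": 0.4, "freshness": 0.9}))
# ...but not the weights, nor when they change.
```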

We also have very little information on the criteria Facebook uses to display some stories more often or more prominently. But we do have some degree of control over it, by adjusting our settings and by liking, or not liking, certain types of content. So the answer is more nuanced than it might seem: no, we do not have complete scrutiny of the algorithm, but we have some degree of it.
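Again purely as an illustration, and assuming nothing about Facebook’s actual system, a sketch of how explicit user signals such as likes might feed back into what a platform decides to show us:

```python
# Hypothetical sketch: explicit user reactions nudging future ranking.
# Topic names and numbers are invented for illustration only.

from collections import defaultdict

affinity = defaultdict(float)   # per-topic preference the platform infers about us

def register_reaction(topic: str, liked: bool) -> None:
    """Liking (or ignoring) content shifts future ranking for that topic."""
    affinity[topic] += 0.1 if liked else -0.05

def feed_score(topic: str, base_score: float) -> float:
    """The platform's base score, adjusted by what it has learned about us."""
    return base_score + affinity[topic]

register_reaction("politics", liked=False)
register_reaction("science", liked=True)
print(feed_score("science", 0.5), feed_score("politics", 0.5))
```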

But, inversely, can we scrutinise the editor?

Well, the answer is the same: no, not completely. Of course we can say there is a professional (and sometimes ethical) code by which an editor abides, and we trust that code to serve as a guarantee that editing follows certain standards. But I know the code, not the human who applies it. It would certainly be naive of me, as an informed citizen, to think that editors have no influence on the news I read, watch or listen to.

We know the news editor respects a certain code of professional conduct (we take that for granted), but we also know that his or her professional choices can be (and always are) influenced by a myriad of factors: culture and upbringing, social and political affiliations, personal tastes and professional expertise, among others. It is because of a complex set of such factors that a human editor makes the editorial decisions he or she makes. And that is why a human editor’s professional choices work very much like an algorithm. Only a very opaque one.

Of course, professional news editors are trained to perform that job, and that training is in itself a significant part of the “algorithm” they execute on the job. But so are all the other factors that influence (or may influence) their professional choices. Citizens cannot control all those factors and cannot predict how a news editor is going to perform at an individual level. So we can certainly scrutinise the editor, but, again, not completely.

Algorithms are predictable, humans are not

Plus, there is another important difference: the algorithm always acts in the same way, except when tweaked by humans. Human editors always act differently, even though they work from a common code. In a way, there is more accuracy and reliability in a “system” that always performs a function in the same way (apart from deliberate variations) than in a “system” that always performs differently (despite sharing a common code).

Forget for a moment that we are talking about humans versus machines (human and artificial intelligence, if you will): a “system” that always performs differently on the basis of a given underlying code is less reliable and accurate than a “system” that always performs the same way unless specifically tweaked to perform differently.
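A toy contrast may help, with invented thresholds and no claim about any real newsroom or platform: the same decision rule applied deterministically versus applied with hidden, unpredictable variation.

```python
# Toy contrast: a deterministic decision rule versus one whose output
# drifts with hidden, unobservable factors. All numbers are invented.

import random

def algorithmic_editor(newsworthiness: float) -> bool:
    """Same input, same decision -- until a human deliberately retunes it."""
    return newsworthiness > 0.7

def human_editor(newsworthiness: float) -> bool:
    """Same professional code, but the effective threshold drifts with
    mood, context and a hundred other factors we cannot observe."""
    personal_drift = random.uniform(-0.2, 0.2)
    return newsworthiness > 0.7 + personal_drift

story = 0.75
print(all(algorithmic_editor(story) for _ in range(5)))   # always the same answer
print([human_editor(story) for _ in range(5)])            # may differ run to run
```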

In this context, accuracy and reliability mean… trust. And trust is a central element of any news ecosystem, both the one we had in the age of professional news editors and the one we seem to be entering in the age of news distribution platforms run by computer algorithms.

Can we trust algorithms more than editors?

Of course, trust is greater if we know how the editing works; that applies to both the human editor and the algorithmic editor. The issue is – I argue – that we know more about the algorithm than we know about the editor. As users of digital news platforms, we have more control over the algorithm than we, as consumers of news, ever had over the behavior of the editor. Does that mean we can trust the algorithms more than we can trust the editors? Not necessarily… but maybe.

We now live in a news ecosystem that is profoundly different from the one we used to know. The single most important element explaining that difference is the transition from analog to digital. When all information is digital – including news production and distribution – computers become the key players in the process, because they operate on digital code and can therefore perform all sorts of algorithmic operations on digital news content, including, of course, editorial choices carried out according to a given set of human instructions.

The other important change is that this occurs in a networked society, where every node – that is, every individual user – can produce and distribute information (even if just by sharing it, for example). The consequence is that the leading role of professional producers (including news editors) gives way to the users (you know, “the people formerly known as the audience”) as the driving force behind the new news distribution ecosystem.

The role played by users in distributing content is crucial

We should not forget that platforms like Facebook, Twitter or Instagram do not exist to produce content but to provide users with the tools to produce and distribute their own content. They are, in a way, the vehicle that operates that transfer of power from the producers to the users of information. Furthermore, these information platforms (each one really a set of information tools) depend on the users to operate. Which means that, strictly speaking, individuals have more control over those platforms and their editorial choices than they ever had over the traditional media or their editorial processes.

Old journalism is dying (of old age)

However, it is also true that journalism has a special role in society, and professional news editors are among the people who work to fulfill that special role. Doesn’t old journalism’s demise undermine the democratic functioning of our complex societies? Well, journalism is an institution (or, to be more precise, a complex arrangement of institutions) that we as societies “invented” to fulfill that special role within a certain news ecosystem and a certain technological environment. It was probably, for many years, the most rational and efficient answer we collectively had to the problem of producing and distributing reliable and trustworthy information.

But in the age of user platforms governed by algorithms we probably need new or reformed institutions. Maybe new forms of journalism, or maybe something new altogether. The truth is we cannot yet know what that will be. New institutions will probably emerge to respond to the challenges these new technologies present.

As for the algorithms, of course they should be open to scrutiny. As a matter of fact, they should be open, full stop.

 

Pic credit: CC0 Public Domain

Print Friendly, PDF & Email

Tags: , , , , , , , , , , ,

Send this to a friend