from The Conversation
This post was authored by Jane B. Singer, City, University of London.
Facebook’s recent decision to block a Norwegian user’s post containing the Pulitzer Prize-winning photo of children, one of them a terrified and naked girl fleeing a napalm attack during the Vietnam war, was met by a cry of outrage from journalists and other free speech advocates.
Norwegian writer Tom Egeland had posted the picture on his Facebook page as part of a discussion of “seven photographs that changed the history of warfare”. He was subsequently blocked from Facebook.
When Norwegian newspaper Aftenposten reported on this, including the image in its story and posting it on Facebook, the image was blocked there, as well. Facebook cited its policy of barring images of nude children as part of its defence against use of its platform for child pornography.
Facebook policy on sexual violence and exploitation. Facebook
Things escalated from there. The newspaper splashed the image across its front page (as did other news outlets, including the Guardian in the UK), accompanied by a “Dear Mark Zuckerberg” letter from editor Espen Egil Hansen. Hansen expressed strong concern that “the world’s most powerful editor”, in charge of “the world’s most important medium” was “limiting freedom instead of trying to extend it”.
Norway’s prime minister, Erna Solberg, also weighed in, calling the “highly regrettable” decision an attempt to “edit our common history”. The CEO of Index on Censorship, Jodie Ginsberg, was even more blunt: “Absolutely idiotic”, she declared. Journalists, politicians and others around the world republished the image in protest and as a sign of solidarity.
On Friday, Facebook backed down. It reinstated the picture, citing its “status as an iconic image of historical importance”, which, it said, “outweighs the value of protecting the community by its removal”.
The company pledged to “adjust our review mechanisms” and to engage with “publishers and other members of our global community on these important questions going forward”.
It was a good, if belated, decision. But was it a victory for free speech? Not inherently.
Two wrongs about a right
Facebook’s initial argument, that posting the iconic photo would make it more difficult to refuse other photos of naked children later on, was arguably disingenuous but also just plain faulty. Surely a company with the obvious, indeed mind-boggling, technical chops that Facebook possesses has the ability to create an algorithm that takes markers such as Pulitzer Prizes into account when making publication calls. Although there are bigger issues here about letting algorithms make such editorial judgements in the first place, with or without human assistance, the problem in this particular case really should not have arisen at all.
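To make the point concrete, here is a minimal sketch, purely an assumption about how such a check might look rather than a description of Facebook’s actual systems: a nudity-flagged upload is compared against a hypothetical registry of documented images of historical importance before any automatic removal. All names and fields below are invented for illustration.

    # Illustrative sketch only; this is not Facebook's moderation pipeline.
    from dataclasses import dataclass

    @dataclass
    class ImageMetadata:
        fingerprint: str          # hypothetical identifier for the uploaded file
        flagged_as_nudity: bool   # result of an upstream nudity classifier

    # Hypothetical registry of documented images of historical importance,
    # e.g. Pulitzer Prize-winning photographs, keyed by fingerprint.
    HISTORIC_IMAGE_REGISTRY = {
        "example-fingerprint-terror-of-war": "Nick Ut, 'The Terror of War', Pulitzer Prize 1973",
    }

    def moderation_decision(image: ImageMetadata) -> str:
        """Return 'allow', 'remove' or 'human_review' for an uploaded image."""
        if not image.flagged_as_nudity:
            return "allow"
        if image.fingerprint in HISTORIC_IMAGE_REGISTRY:
            # A documented historical image is routed to a person
            # instead of being removed automatically.
            return "human_review"
        return "remove"

    print(moderation_decision(ImageMetadata("example-fingerprint-terror-of-war", True)))
    # -> human_review

Even a crude check of this kind would have escalated the Aftenposten case to a human reviewer rather than triggering an automatic block.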
But editors also are on shaky ground in trying to dictate to Facebook what it should or should not publish. It is in fact ironic that they should think doing so is appropriate, let alone righteous, behaviour.
To understand why, consider the justifiable rage if the situation were reversed: if a third-party platform (or anyone else, for that matter) attempted to tell a journalist what stories to write and how to play them. Freedom of the press conveys the right to make independent decisions about what to cover, how to cover it and what to do with the information once it’s in hand. It is the freedom to decide what to say, as well as when and where and how to say it. It also conveys the right not to say something.
Global editor-in-chief? Facebook CEO Mark Zuckerberg. Brian Solis, www.briansolis.com and bub.blicio.us, CC BY
Every publisher must have that freedom if it is to have any meaning – including, yes, Facebook. Despite its rather convoluted recent attempts to define itself as a “tech company” or platform rather than a “media company”, it clearly is both. A decision by Facebook not to allow a particular bit of information to appear on its site may be a bad decision – whether based on policy or merely on an algorithm that needs to have considerably more nuance built in – but it is neither tyranny nor censorship. The company did not tell other people what they could or should do with the photo. It merely exercised its right to make the call in relation to the image on its own site.
Power of the platform
What makes this trickier, though, is that the Aftenposten editor is right about his broader charge: that Facebook holds unprecedented global power over the flow of information. But this power over the press, which is indeed significant, is actually quite different from censorship as understood by both tradition and law.
Facebook cannot prevent an item from being made visible to an audience, as it has no control over what publishers choose to publish or broadcasters choose to broadcast through their own distribution channels. (Even individuals, such as the writer Egeland here, can disseminate information through their own blogs, among other options.) The power it does have, however, is to expand an item’s visibility once it is published or broadcast. Conversely, if Facebook chooses not to exercise that power to extend visibility – as it has a right to do – then visibility is indeed curtailed. And significantly so: an estimated 40% of traffic to news sites now comes from Facebook, more even than from Google.
The issue for the commercial media, then, is primarily an economic one – their ability to generate revenue ultimately depends on people seeing (and, ideally, somehow engaging with) their wares, the information they produce and provide – and only by extension an editorial one. Current law tends to deal with the economic and editorial realms separately: the first primarily as a matter of commerce and the second as a civil liberties issue, for instance involving free speech.
Such a dichotomous understanding worked well enough when the same parties controlled both the creation of content and the means of distributing it. But over the past decade, with the inexorable rise and exponentially growing power of external platforms, that situation has changed. Media companies no longer create all their own content (for instance, they rely increasingly on material generated by users), and they control a diminishing number of the ways in which that content is accessed.
Their reach is thus limited by the availability of their content on someone else’s information delivery mechanism. In addition to Facebook and Google, those mechanisms include Twitter, YouTube (owned by Google), Yahoo! and a rapidly proliferating array of other “social sharing” technologies.
In other words, the effectiveness of news companies, and perhaps even their survival, is at least to some extent out of their hands. The situation is both frightening and frustrating. Aftenposten editor Hansen declared, in his front page “letter” to the Facebook boss, that “editors cannot live with you, Mark, as a master editor”. And, though he didn’t say it, as master publisher, too.
Yet live with Zuckerberg and his peers they must – somehow. For the foreseeable future, content will be shared, but the space in which that shared content appears will remain under the control of distinct entities, with distinct organisational cultures, ideas about what constitutes valuable content, and economic interests. The inevitable struggle over this highly contested ground has significant implications not just for the media and technology players directly involved but also for the millions of people who count on both of them to work – and to work together.
Jane B. Singer, Professor of Journalism Innovation, City, University of London
This article was originally published on The Conversation. Read the original article.