One of the most dangerous beliefs about photography is that pictures have agency, that, in other words, there is a picture, and it, and only it, will then determine everything that might follow. This is very obviously not the case. As I have argued before, our viewing of pictures determines what we make of them. We do not view pictures the way scientists study specimens. Instead, we look at them mostly to get our suspicions or ideas or stereotypes confirmed.
Beyond that, though, there are more aspects to the lives of pictures. A single week, number 36 in 2016, featured a variety of examples, all of which are critically important for our understanding of what pictures do and how they do it, provided they are in fact given a chance to do so. Two of these examples centered on Facebook, the social-media behemoth, whose influence and power have now come to create major problems for the world of photography.
The company had deleted one of the most iconic photographs of the 20th Century, Nick Ut’s famous Vietnam War image of children fleeing a napalm attack, from the page of a large Norwegian newspaper. The newspaper would have none of it, though, and responded with a front-page letter to Facebook’s CEO. Norway’s prime minister then posted the photo on her page, only to have Facebook remove it there as well.
You really want to wrap your head around this. A company decides it has the right to openly censor a newspaper and the head of government of a democratic country, removing one of the most well-known and historically relevant photographs made over the course of the past 100 years. To put this into a larger perspective: in 2016, Norway was ranked third in the World Press Freedom Index. Norwegians know a thing or two about press freedom. In that same index, you can find the US, home of Facebook, at position 41.
Facebook eventually reversed its position, perhaps not surprisingly, given that its decision had resulted in what could only be described as a major PR fiasco. But the reversal was not so much an acknowledgment of the underlying problem as an attempt to make the problem go away.
“It all comes down to algorithms,” wrote the Christian Science Monitor. But an uncharacteristically scathing in-depth report by NPR found that it wasn’t algorithms but people who were responsible for the removal: “in this instance, it was a human who flagged the post and a human inside the company who decided to hit the delete button.” The good news is that, according to the report, humans are given the job “because of the problem of context” (emphasis in the original): algorithms are unable to understand context. The bad news is that the humans tasked with removing photographs were unable to identify this particular photograph.
Very obvious alarm bells should be going off now, and not just in the offices of media corporations that increasingly rely on Facebook to reach their audiences. Alarm bells should be going off in the heads not only of every person who uses Facebook actively, but also of those who don’t. (I’m not on Facebook, for a variety of reasons, including this kind of censorship, but also the blatant privacy violations.) If Facebook is symptomatic of the state of tech companies, then we are in deep, deep trouble.
If the people tasked with monitoring photographs at Facebook can’t even identify one of the most iconic photographs in history, what confidence should we have in their overall qualifications? Are those the kinds of people who should be the most important editors in the world of photography? Given that even Facebook appears to be aware of how algorithms fall short when it comes to dealing with photographs, whichever way you want to think about this, there’s a huge problem.
If Facebook decides to remove one of your photographs, in all likelihood you won’t have a newspaper to raise a big stink on your behalf, and you won’t have your country’s head of government intervene for you. In all likelihood, you’ll merely get an automated message, and that will be it.
And will we see tomorrow’s iconic photograph, the one that shows us the outcome of some horrible event somewhere? Who is to say that a picture that could shake us to the core in much the same way as Ut’s did will make it past Facebook’s algorithms or censors?
Would Nick Ut’s picture have become iconic, had Facebook been around in 1972? I actually don’t think so. A chilling thought, isn’t it?
These are deeply troubling considerations, and we, the general public, are absolutely powerless. Unlike a government in a democratic country, which is accountable for its actions (however long that might at times take), Facebook is a privately owned corporation, and there is no such oversight.
When it comes to photography, Facebook can basically do whatever it wants. We don’t get to see all the pictures that are out there. We get to see the pictures that make it past flawed algorithms, or past people who are clearly unqualified to make the kinds of decisions veteran news-photography editors are used to making.
The Facebook problem gets worse, though. On the one hand, there is the censorship of images. This goes beyond the depiction of naked children (about which there are very valid and obvious concerns, given the problem of child pornography). For example, Facebook owns Instagram, where you cannot show a female nipple, for reasons that are too absurd and puritanical to take seriously.
On the other hand, what you can show without any major problems are depictions of, say, gruesome violence. In fact, as long as your photographs get past Facebook’s censors or algorithms, you’re free to show anything.
The same week Nick Ut’s picture didn’t make it, the small town of East Liverpool, Ohio, posted two photographs of a couple who had overdosed in their car, with a small child sitting right behind them. Addiction experts were quick to point out that such public shaming would very likely be counterproductive. In this case, it was reported, “a Facebook spokesperson said the photos did not violate the company’s community standards.”
As in the case of Ut’s picture, the decision over whether or not to publicly share photographs like the two East Liverpool ones ought to be in the hands of highly trained photo editors, people who not only have the knowledge to understand the “news value” of the photographs, but who have also wrestled with the different underlying ethical problems.
However flawed any editor’s decisions might be at times, at the very least we can be certain that they have thought about the underlying problems, that, in other words, we’re looking at the end result of an educated process (regardless of whether we end up agreeing with it). The world of Facebook does away with this.
Instead, it leaves many crucial decisions about whether photographs can or should be shown in the hands of people unqualified to make educated decisions or, even worse, of algorithms. And as long as those pictures conform to “community standards” that are very narrowly defined (American and European ideas of what constitutes an acceptable amount of nudity, for example, are very different), you’re good to go. Frankly, that’s absurd.
You could argue that, well, at least we get to have a discussion about whether or not the photographs from East Liverpool should be shown. But here’s the thing: once these photographs are out there, there’s no way back. Even if they were taken down, they would live on forever in some form on the internet.
Given Facebook’s reach, and given that media corporations are eager to work with the company, a tremendous problem now exists: news organizations have essentially lost control over what they can show. There are always “community standards,” plus algorithms or undereducated editors who might decide that a news photograph cannot be shown. At the same time, anything that makes it past the company’s “standards” is fine, whether it’s the public shaming of people who suffer from addiction or anything else.
The reality is that we won’t get the toothpaste back into the tube. The only solution I can think of is to try to raise the overall level of visual literacy: in other words, to discuss more widely what pictures do and how they do it, so that a larger segment of the public has a chance to understand what is going on. That’s why we desperately need to discuss photography more, in wider contexts, and more deeply. It’s crucially important.
And news organizations had better figure out a way to deal with Facebook’s censorship.