
Facebook Backs Down on Censoring ‘Napalm Girl’ Photo


Facebook Inc. reversed a decision to delete posts containing a famous Vietnam War photo of a girl fleeing napalm bombs after the move drew a rebuke from Norway’s prime minister and the nation’s largest newspaper. The flare-up highlighted Facebook’s powerful role as a host for news—even though it says it isn’t a media company.

In a letter on its front page, Norwegian daily Aftenposten lashed out at Facebook Chief Executive Mark Zuckerberg for “limiting freedom” after the social network removed the image of the naked girl from the newspaper’s profile page earlier this week, citing its policy against showing nudity.

Norwegian Prime Minister Erna Solberg weighed in, posting on her Facebook page, “Facebook gets it wrong when they censor such images.” She added: “I say no to this type of censorship.”

Hours later, Ms. Solberg’s post—which included the image—disappeared from her account.

After the public outcry, Facebook said Friday it would restore the image.

“An image of a naked child would normally be presumed to violate our community standards, and in some countries might even qualify as child pornography,” a Facebook spokeswoman said. “In this case, we recognize the history and global importance of this image in documenting a particular moment in time.”

Facebook added that it would adjust its review process, which relies on both software and human moderators, so that sharing of the photo would be allowed, though the change would take a few days to take effect. It also said it would discuss the issue with publishers.

“Facebook has guidelines, but beneath them it has layers and layers of people,” said Tarleton Gillespie, a communications professor at Cornell University who is also affiliated with Microsoft Research. “At each of these layers, someone could remove something the rules actually allow, or allow something the rules actually prohibit.”

“The real question is what are the public implications of a review process that must make these decisions, at scale and under pressure, across hundreds of reviewers, thousands of times a day,” he added.

The company relies on its users to flag objectionable content in the vast majority of cases. Those posts are then routed to Facebook’s “community operations” team, located in offices around the world. Reports are graded so that more serious ones, such as those involving terrorism, are dealt with first.
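
For illustration only: grading flagged posts so the gravest reports are reviewed first amounts to a priority queue. The sketch below is an assumption about how such triage could be structured, not Facebook’s actual system; the severity categories, their ordering and all names are hypothetical.

```python
import heapq
from dataclasses import dataclass, field

# Severity grades: lower number means reviewed sooner. These categories and
# their ordering are illustrative assumptions, not Facebook's actual scheme.
SEVERITY = {"terrorism": 0, "child_safety": 0, "hate_speech": 1, "nudity": 2, "spam": 3}

@dataclass(order=True)
class Report:
    priority: int
    post_id: str = field(compare=False)
    reason: str = field(compare=False)

def enqueue(queue, post_id, reason):
    """Grade a user flag and push it onto the review queue."""
    heapq.heappush(queue, Report(SEVERITY.get(reason, 3), post_id, reason))

def next_report(queue):
    """Pop the most serious outstanding report for a human reviewer."""
    return heapq.heappop(queue)

queue = []
enqueue(queue, "post-123", "nudity")
enqueue(queue, "post-456", "terrorism")
print(next_report(queue).reason)  # the terrorism-related report is handled first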

Facebook outlines its guidelines on nudity, terrorism and hate speech on its website, including strict rules around child pornography. It, along with other tech companies like Twitter Inc. and Alphabet Inc.’s YouTube, employs technology called PhotoDNA to scan for images related to child sexual exploitation based on a database created by the National Center for Missing and Exploited Children.

Hany Farid, chair of the computer-science division at Dartmouth College, who helped develop the PhotoDNA system, said the Vietnam image would never have been included in the child pornography database. “The standard that we used was 12 years and younger and clearly sexually explicit—this photo clearly does not meet that criteria,” Mr. Farid said.

“This is an issue of judgment of what does and does not violate terms of service, not of the technology that is used to enforce those terms.”
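
PhotoDNA itself is proprietary, but at a high level the scan amounts to comparing a fingerprint of each uploaded image against a database of fingerprints of known illegal images. The minimal Python sketch below illustrates only the shape of that lookup; the function names, placeholder database entries, and the exact-match hash standing in for PhotoDNA’s robust perceptual hash are all hypothetical.

```python
import hashlib

# Hypothetical fingerprint database of known child-exploitation images, of the
# kind built with the National Center for Missing and Exploited Children.
# The entries below are placeholders, not real values.
KNOWN_FINGERPRINTS = {
    "placeholder-fingerprint-1",
    "placeholder-fingerprint-2",
}

def fingerprint(image_bytes: bytes) -> str:
    """Exact-match stand-in for a robust image hash.

    PhotoDNA's real hash is proprietary and tolerates resizing and
    recompression; a plain SHA-256 of the raw bytes does not, and is used
    here only to show the structure of the lookup.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged_by_scan(image_bytes: bytes) -> bool:
    """Return True only if the image's fingerprint is already in the database."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

# Because "The Terror of War" was never added to such a database, a scan like
# this would not flag it; its removal was a judgment made elsewhere in the
# review process, not a database match.
```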

Facebook users, civil rights groups and media companies have often accused Facebook of applying its content policies overzealously. For instance, Facebook repeatedly took down photos of women breast-feeding their children. It eventually updated its restrictions to allow images of female nipples if a woman was “actively engaged in breast-feeding or showing breasts with post-mastectomy scarring.”

Website operators like Facebook, YouTube and Twitter all have rules for what can be posted on their platforms. In particular, Facebook, which is the largest driver of traffic to digital publishers, has faced scrutiny in recent months after several instances of news content disappearing from its site.

In July, Facebook took down a live video posted by a woman in Minnesota that showed her boyfriend’s last breath after he was fatally shot by police. Facebook later restored the video after a public outcry.

Two-thirds of Facebook’s U.S. users get news on the site, a group amounting to roughly 44% of the overall U.S. population, according to a Pew Research Center study in May.

Still, Mr. Zuckerberg often insists that Facebook is simply a platform. “We are a technology company, not a media company,” Mr. Zuckerberg said last month. “We build tools. We do not produce any of the content.”

Aftenposten editor in chief Espen Egil Hansen seized on the flaws in Facebook’s guidelines. “Listen, Mark, this is serious,” he wrote. “First you create rules that don’t distinguish between child pornography and famous war photographs. Then you practice these rules without allowing space for good judgment.”

The deleted image, known as “The Terror of War,” was shot by Nick Ut in 1972 and won a Pulitzer Prize. It depicts 9-year-old Kim Phuc fleeing a napalm bombing during the Vietnam War. She had ripped off her burning clothes.

The Facebook-Norway quarrel actually began last month, when Norwegian writer Tom Egeland saw Facebook had removed a post he wrote on famous war pictures, including the picture of Ms. Phuc. Facebook notifies users when their posts are removed for violating its standards, and Mr. Egeland said in a post this week that he was notified that the picture was removed due to nudity.

Mr. Egeland republished the photo on his Facebook account, adding links to newspaper articles in which Ms. Phuc, who now lives in Canada, reportedly supported its dissemination. Facebook again deleted the post and banned Mr. Egeland from the site for 24 hours.

Aftenposten began writing about Mr. Egeland’s story this week and, in turn, became entangled with Facebook itself.

“Mark Zuckerberg is becoming a supreme editor,” Mr. Egeland said.

The Norwegian Press Association appealed to Norway’s giant sovereign-wealth fund to exert ownership influence to make Facebook change its policy.

“We think the fund’s Council on Ethics should consider putting Facebook on the fund’s observation list, due to its censorship of these images and of the ensuing debate,” said Kjersti Loken Stavrum, secretary general of the Norwegian Press Association.

The fund, which owns $1.54 billion worth of Facebook shares, or a 0.52% stake, declined to comment.

Norway’s prime minister urged Facebook to review its policy.

“It is highly regrettable,” Ms. Solberg said in a text message sent through an aide. “What they do by removing such images, no matter what good intentions, is to redact our shared history.”

Write to Kjetil Malkenes Hovland at kjetilmalkenes.hovland@wsj.com and Deepa Seetharaman at Deepa.Seetharaman@wsj.com

