Facebook wants to create a ‘Supreme Court’ for content moderation. Will it work?

The World

Editor’s note: Facebook on Tuesday released more details about the structure and functions of its oversight board. The company said that, when fully operational, the board will consist of approximately 40 members who will be supported by a full-time staff and serve three-year terms for a maximum of three terms. To ensure the board’s independence, Facebook said it’s establishing an independent trust to oversee the compensation and formal appointments of board members and support staff. Initially, Facebook will select a small number of candidates to serve on the board and will work with those members to fill all 40 seats. After that, member selection will be up to the board itself. As part of its selection process, Facebook has opened a portal to solicit board member recommendations. During a press call Tuesday, Facebook’s Brent Harris said the company hopes to have the board up and running by early 2020, but noted that there’s still “a long way to go in making the board fully operational.”

A few years ago, Norwegian writer Tom Egeland posted on Facebook the photo widely known as the Napalm Girl. Nick Ut’s Pulitzer Prize-winning image is one of the most iconic of the Vietnam War. In it, a naked young girl screams in pain as she and several other children flee a napalm attack.

Egeland had shared the image as part of a series of photos that he felt had changed the history of warfare. Within hours, Facebook took the post down and kicked him off the platform, saying it violated the company’s rules, or community standards. “An image of a naked child would normally be presumed to violate our community standards, and in some countries might even qualify as child pornography,” Facebook said in a statement.

“I was definitely surprised and upset,” Egeland said. “This is an iconic war photograph, which has nothing to do with nudity or child pornography. It’s a historic photograph.”

When Aftenposten, one of Norway’s largest newspapers, reported on the incident and shared its article along with the Napalm Girl photograph on its Facebook page, the content was removed. Even the Norwegian prime minister’s post was taken down when she chimed in.

Egeland, meanwhile, grew more frustrated and appealed Facebook’s decision. But reaching out to the company was getting him nowhere.

“It felt pretty hopeless,” Egeland said. “It didn’t really matter to me that I was banned. I can live without Facebook. But I felt like [I was] fighting a system where nobody listens.”

Eventually, Facebook did listen. After a wave of negative media attention and global protests, the company acknowledged the photo’s “historical importance” and reversed its decision, saying that “the value of permitting sharing outweighs the value of protecting the community by removal.”

The incident illustrates a regular conundrum for Facebook. Every day, the company has to make difficult and consequential decisions about what should stay or go on its platform. Beyond the Napalm Girl, it has deliberated over a portrait of a starving girl in Yemen, as well as content said to have spread anti-Muslim rumors in Sri Lanka and stoked violence in Myanmar.

In recent years, the company’s content moderation practices have come under intense scrutiny for lacking transparency and for allowing Facebook — through its unilateral decision-making — to become an arbiter of global expression, wielding total control over what its more than 2 billion users can see and post.

Currently, Facebook relies on a combination of technical tools and human moderators to enforce its community standards. Soon, the company hopes, final decisions about flagged content will be made not by its own teams, but by a group of outsiders.

By the end of the year, Facebook plans to have an external, independent oversight board up and running to tackle its most challenging content moderation decisions — a body that Facebook CEO Mark Zuckerberg has said will function like an appeals court, or the US Supreme Court.

“First, it will prevent the concentration of too much decision-making within our teams. Second, it will create accountability and oversight. Third, it will provide assurance that these decisions are made in the best interests of our community and not for commercial reasons,” Zuckerberg wrote in a November 2018 blog post outlining the initiative.

Since then, Facebook has released a blueprint for the board, and recently concluded a global consultation period that included workshops, input sessions and board simulations in cities around the world to determine the body’s structure and function.

“It’s not like we have a governance system that governs the entire world,” said McKenzie Thomas, a business program manager at Facebook who’s been working on the oversight board project. “But Facebook has 2 billion-plus users, and it’s a new genre, I guess, of governance, that we’re kind of stepping into and I think that’s what’s most exciting about it.”

There are still many details to flesh out, but Facebook envisions a 40-member panel made up entirely of independent experts. They’ll take on appeals from Facebook users and, guided by the company’s community standards, issue binding decisions on the fate of content under review. The board will also advise the company on its policies and recommend changes.

“We are looking to basically utilize this body to make us better, but not at all to take the responsibility off of our plates,” Thomas said.

Facebook wants the board to have complete control over the cases it takes up, and acknowledges that the panel won’t be able to address every concern.

“It’s going to consider a selected subset of the most difficult and controversial issues on which reasonable people could reach different conclusions,” said Noah Feldman, a Harvard Law School professor who came up with the idea for the oversight board. “It can’t realistically fix every possible mistake that either computers or humans make. That would be a strange thing for it to do.”

Feldman said that when he came up with the idea for an independent oversight board in January 2018, he didn’t have a particular piece of contested content in mind. Rather, he said, he was thinking about the broader content moderation challenges facing the company: being criticized for taking down too much material, thereby stifling free speech, or for leaving too many posts intact and jeopardizing users’ safety.

When Feldman brought the oversight board idea to Facebook, he was pleasantly surprised by the reception. “I discovered, to my great interest, that Mark Zuckerberg had himself been thinking for a long time about different ways to devolve power, create accountability, put some kinds of decision-making outside of the company,” he said.

Zuckerberg’s embrace of the project came at a time of heightened criticism and increased government scrutiny of his company’s content moderation practices, and the real-world impact they can have on democracy, speech and user safety. (In recent years, some countries, including Germany, Australia and the UK, have introduced legislation to regulate Facebook and other social media companies.) Facebook also acknowledged that the platform was being weaponized by bad actors who wanted to cause harm.

Last year, at a tense Capitol Hill hearing, Zuckerberg admitted that Facebook hadn’t done enough to prevent that. “And that goes for fake news, for foreign interference in elections and hate speech … We didn’t take a broad enough view of our responsibility, and that was a big mistake. And it was my mistake. And I’m sorry,” he said.

All of this is perhaps most evident in Myanmar. There, Facebook is accused of playing a role in the brutal violence against the Rohingya Muslim minority by allowing hate-filled posts and false information to run rampant on its platform. Facebook has since admitted that it didn’t have enough people with the language skills and cultural understanding to deal with what was happening in Myanmar.

That’s an issue that the board will have to grapple with, said Nicolas Suzor, a professor of law at Queensland University of Technology in Australia who has advised the oversight board project.

“One of the challenges that Facebook is facing here is to make sure that the people who are on the board are sufficiently cognizant of the cultural contexts and the power imbalances of the people whose cases will come before them,” he said. “I think this is something that Facebook still needs to work through because ultimately, you can’t create a board that is sufficiently representative of the 2 billion people who use Facebook.”

Facebook knows this. That’s why it’s working on a roster of experts the board can turn to for input on specific cultures, regions and languages, the company said. Facebook also wants a regional representative present during deliberations to help inform the board’s decisions.

And there are many other challenges, too, including ensuring that the board is truly independent from Facebook.

In the coming weeks, Facebook will finalize board plans and then embark on a global hunt for panelists. Suzor wants to see all the details before he makes a final judgment. But overall, he’s optimistic because he thinks the body will address one of the biggest problems of content moderation: “That the decisions are made usually in secret by people that we don’t know, and we don’t have any real visibility into that process.”

When it makes a decision on a piece of content, the board will issue a written statement explaining its opinion, and if it recommends a change to Facebook’s policies, the company will have to respond publicly, in writing.

“It’s not like Facebook is giving up all of its power to make the rules, but having that conversation in public … I think that’s a really crucial part of this puzzle that brings these discussions out into the open,” Suzor said.

Suzor said this is only one step toward holding the company accountable for what happens on its platform. But it’s a good one.

Egeland, the Norwegian writer at the center of the Napalm Girl controversy, agreed. But he also wanted Facebook to acknowledge that it’s not just a tech company, but a media company, too.

That’s a label that would subject Facebook to increased government oversight and regulation — something the company has long resisted.
