Imagine this: Mark Zuckerberg extradited to Australia over a violent video a user posted on his platform.
Seems far-fetched, but it's exactly what Australian lawmakers want to be able to do through a new law they approved this week.
The point of the legislation is to get social media companies to move quickly to take down violent content that's posted on their platforms. The law, which passed Australia's parliament on Thursday, allows social media and web hosting companies to be fined up to 10% of their annual global turnover, and their executives to be imprisoned for up to three years, if videos showing terrorist attacks, murders, rape or kidnapping are not removed "expeditiously."
The new law is a response to last month's terror attack in neighboring New Zealand, where a lone gunman broadcast his attack live on Facebook and the video was widely shared on several platforms before being removed.
Australian Attorney General Christian Porter heralded the laws as a "world first in terms of legislating the conduct of social media and online platforms," but technology experts say the law is a "knee-jerk" overreaction by the government that fails to truly address the issue.
Nicholas Suzor, a law professor at Queensland University of Technology, says the legislation is too ambiguous.
“The real problem here is that it's actually quite vague about what platforms are required to do,” he said. “So, it's not clear how quickly a platform has to remove a video and particularly — even whether they need to know about it before they become criminally liable.”
Even Porter, while defending the law, couldn't say precisely what it requires.
“Using the Christchurch example, I can't precisely say what would have been the point of time at which it would have been reasonable for them to understand that this was livestreaming on their site or playable on their site, and they should have removed it,” Porter said at a Thursday news conference after the bill passed both houses of parliament. “But what I can say, and I think every Australian would agree — it was totally unreasonable that it should exist on their site for well over an hour without them taking any action whatsoever.”
Australia is not the first country to try to hold social media companies accountable for what's posted on their sites.
In Germany, for example, a law requires social media companies to remove hate speech, fake news and other problematic content within 24 hours — or face fines.
The passage of this latest piece of legislation in Australia comes as governments around the world are scrambling to figure out the right way to regulate the internet, says Suzor.
“What that means is that too often, we get knee-jerk reactions and sloppily designed laws that don't really understand how you could regulate tech companies in a way that preserves innovation and freedom of speech and access to information,” he says. “That's the big problem that we have. Not necessarily that countries want to regulate, but that they're going about it the wrong way.”
Under Australia's new legislation, Suzor says there's a danger that companies will go overboard and take down legal content.
"In the past we've seen that these types of laws threaten legitimate speech a lot and they most often are minorities and disadvantaged groups."
“In the past we've seen that these types of laws threaten legitimate speech a lot and they most often are minorities and disadvantaged groups,” Suzor said, pointing to a video depicting the aftermath of the shooting of Philando Castile as an example of content that could inadvertently get taken down because of the new law.
Castile was shot and killed by a police officer during a traffic stop in Minnesota in July 2016. His girlfriend live-streamed the aftermath of the shooting on Facebook, and the video sparked protests and helped galvanize the Black Lives Matter movement. The officer involved in the shooting was later acquitted.
But under Australia’s new law, if the video had shown Castile’s killing itself rather than just its aftermath, the footage would be considered illegal.
“This is something that is documented on social media,” Suzor says. “And it's not something that would be technically prohibited under this law. But there's no way that an algorithm can easily figure it out. So, my real concern here is that any technical response from the platforms will either be ineffective or much too broad in the type of content that it restricts.”
This kind of legislation also creates uncertainty for Australia's tech industry, says Sarah Moran, who runs a company focused on getting girls and women into tech called Girl Geek Academy.
“It makes my job really hard, particularly when you're talking to aspirational young people to say — I love that you want to build technology, but I cannot tell you what is on the horizon from the politicians that may choose to prevent you from doing that,” Moran says.
While technology firms have strongly opposed Australia’s content legislation, they say they are working to keep violent and problematic content off their platforms.
"We have zero tolerance for terrorist content on our platforms," said a spokesperson for Google in an emailed statement. "We are committed to leading the way in developing new technologies and standards for identifying and removing terrorist content."
Facebook said last week it was exploring restrictions on who can access its live video-streaming service, depending on factors such as previous violations of the site's community standards.
"With the vast volumes of content uploaded to the internet every second, this is a highly complex problem," said Sunita Bose, managing director of Digital Industry Group, Inc., of which Facebook, Apple, Google, Amazon and Twitter are members.
Bose said the laws fail to understand the complexity of removing violent content.
“Technology doesn’t fight back,” she says. “It’s much easier to pass a law to be able to restrict technology than to restrict the people that live and vote in Australia.”
Reuters contributed to this report.