Under the new Australian law, social media companies face large fines if they fail to "expeditiously" remove videos or photographs depicting murder, torture or rape.
“It is important that we make a very clear statement to social media companies that we expect their behavior to change,” Mitch Fifield, Australia’s minister for communications and the arts, told reporters in Canberra.
The Australian Senate passed the new Criminal Code Amendment (Sharing of Abhorrent Violent Material) Bill on Thursday; the bill goes to the House of Representatives on Friday.
Entities such as Facebook Inc and Alphabet’s Google, which owns YouTube, could face a fine of up to 10% of their annual global turnover or imprisonment of executives for up to three years for violent content on the platforms.
“We have zero tolerance for terrorist content on our platforms,” a spokesperson for Google stated in an email, adding that “we are committed to leading the way in developing new technologies and standards for identifying and removing terrorist content.”
Fines could reach $168,000 for an individual or $840,000 for a corporation.
Some critics say the government's response was rushed and not properly thought out.
“Laws formulated as a knee-jerk reaction to a tragic event do not necessarily equate to good legislation and can have myriad unintended consequences,” Arthur Moses, head of the Australian Law Council, said.
DIGI, a lobby group for tech companies including Facebook, Twitter and Google, agreed that the bill has far-reaching consequences.
"The bill is at risk of undermining Australia's important security co-operation with the United States under the CLOUD Act, by potentially requiring US Internet providers to share content data with the AFP in breach of US law," DIGI said in a statement.
"The bill does nothing to address hate speech, which was the fundamental motivation for the tragic Christchurch terrorist attacks," the statement adds, explaining that "the current legal definition of hate speech sits within the Racial Discrimination Act 1975 and therefore only applies to race-based hate speech, and does not include religious-based speech."
The new bill was tabled after a lone gunman attacked two mosques in Christchurch, New Zealand, on March 15, killing 50 people at Friday prayers. The gunman streamed the attack live on Facebook, and the footage was widely shared for more than an hour before being removed.
Last week, Facebook announced that it is exploring restrictions on who can access its live video-streaming service, based on factors such as previous violations.
Social media companies are also required to notify the Australian Federal Police of such material within a "reasonable" timeframe.