Two cases currently before the US Supreme Court are set to clarify who bears responsibility on the Internet. With potentially huge consequences.
Whether it's an opinion post on Facebook, a quick food photo on Instagram or a tutorial video on YouTube: the Internet as we know it thrives on the fact that everyone can participate, with contributions large and small, reaching sometimes more, sometimes fewer people. That could change decisively in the near future, thanks to proceedings in the USA.
A question that has long been kept vague is now to be decided there: who is responsible for the content we all upload and consume on social networks every day? It is not, of course, about the harmless lunch photo. In the lawsuit currently before the US Supreme Court, Google is accused of not doing enough to counter terrorist content on its platform and even of helping to distribute it. The decision, however, could have consequences for the entire web.
Who is responsible on the internet?
Specifically, the case is about a piece of legislation that made social networks possible in the first place: Section 230 of the Communications Decency Act. Put simply, since 1996 it has stipulated that the operators of websites cannot be held legally liable for content posted there by users, unlike newspapers and television stations, which are liable for the content they create. The networks do have to remove illegal content, but they are not prosecuted for hosting it. It is precisely this section that is now under scrutiny.
And this in two simultaneous proceedings. Both share the same core: relatives of terror victims, in one case from the 2015 attack on the Bataclan in Paris and in the other from a 2017 attack on a Turkish nightclub, accuse Google and Twitter of having contributed to the radicalization of the attackers. According to the plaintiffs, propaganda by the Islamic State was not only spread via YouTube and Twitter, but in some cases was even recommended by algorithms. The two companies are now supposed to pay damages.
Google warns of a “horror show”
This could have far-reaching consequences for the Internet as we know it. So far, social media has worked according to a simple principle: users are largely free to post, and moderation takes place only afterwards. If website operators were held accountable for that content, this could change radically.
Lisa Blatt, a lawyer for Google, warned during oral arguments before the Supreme Court on Tuesday that the Internet faces “a horror show”. Depending on the court’s decision, the company would be forced either to remove any content that appears even remotely problematic, or, on the contrary, to give up moderation and wave through even the worst content, she argued.
A difficult task
Moderating every piece of content is a Sisyphean task given the ever-growing volume of posts. Human employees could not even watch the vast number of videos in the time available. Attempts to automate moderation have also had only limited success so far. Keywords, known links, and photos and videos that have already been identified as problematic can be blocked automatically, as happens with known child sexual abuse material, for example. Artificial intelligence, however, is far from being able to do this for every piece of content.
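To make the “known content” matching described above more concrete, here is a minimal, purely illustrative Python sketch: uploads are compared against hashes and keywords that moderators have already flagged. All names and values are hypothetical, and real systems rely on perceptual fingerprints (in the style of PhotoDNA) so that slightly altered copies still match, which a plain SHA-256 comparison cannot do.

```python
import hashlib

# Hypothetical set of SHA-256 digests of files previously flagged by moderators.
BLOCKED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

# Hypothetical keyword list checked against caption text.
BLOCKED_KEYWORDS = {"example banned phrase"}


def is_blocked(upload: bytes, caption: str) -> bool:
    """Return True if the upload matches already-flagged material or keywords."""
    digest = hashlib.sha256(upload).hexdigest()
    if digest in BLOCKED_HASHES:
        return True
    caption_lower = caption.lower()
    return any(keyword in caption_lower for keyword in BLOCKED_KEYWORDS)


if __name__ == "__main__":
    # A harmless upload passes; a byte-identical copy of flagged content would not.
    print(is_blocked(b"holiday photo bytes", "my lunch today"))  # False
```

The gap the article points to lies in everything that is not already on such a list: novel content has no known hash or keyword to match against, and that is where automated moderation still falls short.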
For website operators, American law would then be almost impossible to comply with. Considering that almost all of the world’s most important sites are based in the USA, the consequences would be correspondingly large. Google itself would not be destroyed by this, Blatt replied when asked by Chief Justice John Roberts. “But smaller websites would be.”
Of course, even then the Internet would not be completely dead. But a massive change would be expected in any case. Sites from Europe or Asia could grow significantly in importance, and a move of US sites out of the country would also be conceivable. The consequences, however, cannot be reliably estimated.
“We are not prepared for this.”
The controversy around Section 230 is not new. And unlike many other political issues in the US, it does not follow party lines. Critics from the Republican camp see the section as the basis for an alleged suppression of conservative voices by the tech giants. For the Democrats, on the other hand, the rule is a thorn in the side because it means Google, Facebook and Co. cannot be held accountable for the abundant misinformation on their platforms.
After two days of oral arguments, however, it is already clear that the Supreme Court has little desire to decide single-handedly how things should continue on the Internet. “You know, we’re not nine experts on the Internet,” Justice Elena Kagan told a laughing audience. The 62-year-old was also clearly alluding to the high average age of the bench. Although some of the oldest justices have died in recent years, even the youngest sitting justice, Amy Coney Barrett, appointed by Donald Trump, is already 51 years old.
Justice Brett Kavanaugh also sees the potential for gigantic repercussions, and therefore favors leaving any change to a legislative initiative by Congress. Setting the wrong limits “would collapse the digital economy, with all the imaginable consequences for workers and consumers, pension plans and whatnot,” he said, explaining his skepticism. “We are not prepared for this.”
Sources: New York Times, TechCrunch, Deadline, NPR, CNN