Paris — The internet has been called "the world’s biggest sewer," where hate speech and fake news pollute what should be a global source of solid and valuable information.
While they can agree on that assessment, Americans and Europeans are taking different approaches to try to clean up the cesspool, based on differing laws, traditions and challenges.
In the United States, where the First Amendment provides vigorous legal defenses for free speech and limits state intervention, social media companies are stepping up voluntary measures to delete posts from terrorist organizations such as the Islamic State group and al-Qaida.
The European Union, whose members have laws barring hate speech, has been threatening Europe-wide laws against violent and extremist content. At a meeting with social media companies in Brussels on Wednesday (Dec. 6), the EU said the tech giants needed to do more if they want to avoid state regulation.
Germany, where many agree with British historian Timothy Garton Ash’s description of the web as a giant sewer, has now emerged as the trailblazer in countering hate speech and fake news.
Under a new law that takes effect Jan. 1, Germany will impose fines of up to 50 million euros (about $58 million) on any major online service that fails to take down illegal content within hours or days of a complaint.
"With this law, we will end the verbal law of the jungle on the web and protect the freedom of opinion for everybody," Justice Minister Heiko Maas said as he introduced the draft of the so-called "Facebook law" in the German parliament in June.
It’s hard to find reliable statistics about the types of hate speech circulating on the internet, but one project that tries — Hatebase, an initiative of the Sentinel Project in Canada — listed religion as the fourth most-frequent offending category after ethnicity, nationality and class, and ahead of sexual orientation, gender and disability.
Germany’s special role here stems from its post-war constitution. Just as the First Amendment shapes the U.S.’s light-handed approach to controlling the web, the opening sentence of Germany’s Basic Law steers it towards more interventionist policies.
"Human dignity is inviolable," reads its first article, which expresses a stand taken in reaction to the widespread violations of that principle during the Nazi years.
Several of the drafters who shaped the constitution of what became West Germany in 1949 were Christians, especially Catholics inspired by natural law theology. Free speech is guaranteed within the limits of the law and "the right to personal honor."
The constitution’s preamble opened in a similar vein, declaring "the German people" were "conscious of their responsibility before God and man."
Having seen democracy fail in the 1930s, the drafters added several passages on defending the "liberal democratic order," including the suspension of basic rights of individuals or groups judged to endanger it.
German criminal law against hate speech says a potential disturbance of the peace through calls for hate, violence or abuse against persons or groups can be punished by prison sentences of up to five years.
In addition, it spells out that violations of the human dignity of a person or group through insults, slurs or slander can also qualify as incitement to hatred.
Against this background, Germany’s new Network Enforcement Law requires social media companies with more than 2 million users to block "manifestly unlawful" content within 24 hours and possibly illegal content within seven days.
Explicit hate speech would fit the first category, while the second could include certain kinds of fake news. "The regulatory offense may be sanctioned even if it is not committed in the Federal Republic of Germany," the law adds.
If a website fails to take down objectionable material a week after a complaint, the Federal Office of Justice may intervene and impose fines when necessary.
The office, a Bonn-based department of the Justice Ministry in Berlin, is adding 50 new staffers to deal with the expected workload.
"Anyone can file a complaint," office spokeswoman Linda Schreiber told RNS. "The legal rights of that person do not have to be directly affected by the disputed content."
People can also file a separate criminal complaint with their local authorities, she added.
Hans-Georg Maassen, head of the domestic intelligence service known as the Federal Office for the Protection of the Constitution, explained the thinking behind the new law at a cybersecurity conference in Berlin on Nov. 27.
Polarizing debates on social media were often driven by "bot networks that spew out millions of soulless zombie opinions and only simulate political debates," he said.
"Democratic pluralism loses its foundations if it is no longer based on facts and reality is reduced to opinions," he said, citing the emergence of fake news in the U.S. presidential campaign and Britain’s Brexit referendum last year.
The European Commission has been stepping up pressure on social media organizations to block content that is illegal in Europe. Brussels reported in May that Facebook and Google removed about half of such posts, but Twitter lagged behind.
In September, it issued more explicit guidelines and said it would decide next May whether the companies have complied or stricter laws are needed.
"We cannot accept a digital Wild West," said Věra Jourová, EU commissioner for Justice, Consumers and Gender Equality. "If the tech companies don’t deliver, we will do it."
In June, Facebook, YouTube, Twitter and Microsoft created a joint forum to counter terrorism that would focus on technological solutions and partnerships with governments and civic groups to weed out offensive material.
In recent months, YouTube began removing extremist videos even if they did not show violence or promote hate. For example, it took down hundreds of lectures on Islamic history by al-Qaida recruiter Anwar al-Awlaki, who was killed in a 2011 U.S. drone strike in Yemen.
In a statement on Nov. 29, Facebook said: "Today, 99 percent of the ISIS and al Qaeda-related terror content we remove from Facebook is content we detect before anyone in our community has flagged it to us, and in some cases, before it goes live on the site."
This week’s EU meeting with technology companies reviewed a report urging social media to "remove terrorist content within 1-2 hours of upload, to the extent it is technically feasible, without compromising human rights and fundamental freedoms."
At the end of the Brussels meeting, EU security commissioner Julian King said that while a lot of progress had been made, additional efforts were still needed.
"We are not there yet," he said. We are two years down the road of this journey: to reach our final destination we now need to speed up our work."
Tough as it is, critics say the new German law does not go far enough.
Renate Künast, a Greens parliamentary deputy and former consumer affairs minister, voted against the law because, she said, it stopped short of barring online abuse that traditional media companies would not be allowed to publish.
She recently sued a far-right blog for inventing a quote in which she purportedly defended pedophilia. "It would be important to have a court decision because there are hardly any verdicts against fake news in Germany," she said.
Markus Beckedahl, founder of the tech blog netzpolitik.org, feared the law would lead to "overblocking," in which social media companies delete too much to avoid being fined.
He was also worried, he said, because "it takes the legal interpretation of obviously and potentially criminal content out of the responsibility of the courts and hands it over to these platforms."
(Tom Heneghan is a Paris-based correspondent.)