When it comes to responsibility for online content, where does the buck stop?

By: Enrique Dans
The Gonzalez v. Google LLC case before the U.S. Supreme Court looks set to be an acid test of technology companies’ liability for the content on their platforms. Several companies have submitted briefs to the court asking that the protections of Section 230, which shields online platforms from liability for content posted by their users, be upheld, warning that they will go out of business otherwise.
Gonzalez v. Google pits the relatives of an American woman killed in the November 2015 terrorist attacks in Paris against YouTube, which they accuse of helping radicalize the perpetrators through its recommendation algorithm. Concurrently, the Supreme Court is reviewing a similar case, Twitter Inc. v. Taamneh, a lawsuit against Twitter, Google and Facebook brought by the family of Jordanian citizen Nawras Alassaf, who was killed in a 2017 ISIS attack in Istanbul, accusing the companies of failing to control terrorist content on their platforms.
Platforms’ responsibility for the content they host or recommend through their algorithms is under the spotlight. On the one hand, it makes sense to give platforms some leeway, as long as they develop mechanisms to remove harmful or dangerous material, whether through user-flagging systems or active monitoring.
That said, it is harder to argue that a platform should be free to build its business model on algorithms that recommend content which creates echo chambers and encourages violence.
If platforms were to be held responsible for such behavior, we would likely see any number of multi-million dollar claims, forcing major changes to said business models.
Meanwhile, in Washington state, the Seattle Public School District has brought legal action against TikTok, Instagram, Facebook, YouTube and Snap for exploiting the vulnerable minds of children to generate profits, accusing them of using psychological tactics that have led to an epidemic of mental health problems in schools across the country, and calling on the companies to help pay for their treatment. According to the district, these companies are responsible for hooking tens of millions of students into feedback loops of excessive use and abuse. It is demanding the maximum statutory and civil penalties allowed by law on a single argument: that these platforms are detrimental to health, and that the social networking companies have therefore violated the state’s public nuisance law.
In the UK, the government wants to go a step further: in addition to holding platforms liable for content that may be harmful to children or that encourages illegal immigration, it is demanding prison sentences for company executives who collude or acquiesce in ignoring takedown notices for such content.
In short, a reckoning is coming for companies that have, for their own benefit, long ignored the potential effects of the content they publish and recommend. At the same time, this is hardly a black-and-white issue; the education system and parents must bear some responsibility.
But social networks are undoubtedly guilty of feeding the timeline of a potential pedophile with videos of young girls performing provocative dances, of repeatedly recommending Islamic State videos, and of building a bubble of supposed social approval around users so that they spend more time with their eyes glued to their platforms.
What we are seeing with these lawsuits is a shift in the regulatory winds that reflects growing public demand for more control over big tech. Should Facebook, YouTube, et al. bear responsibility for the effects of their content management? Probably yes. Should they bear full responsibility? Absolutely not. When it comes to apportioning responsibility, I’m afraid there are a few more culprits in all this. And that is an uncomfortable truth that few will want to hear, particularly the schools and parents who have made a very serious mistake by refusing to engage with the issue.