The company said losing 230 would create a “dystopia” in which providers face constant legal pressure to take down any remotely controversial content, while other apps and sites, facing the same pressures, would give up on moderation entirely and leave up any and all content, no matter how objectionable.
Google further argued that stripping those protections would return the internet to the “see-no-evil approach” of tech companies in the mid-1990s, which “risked a proliferation of pornography, hate speech, and illegality.” Of course, tech companies are still actively grappling with all of those issues. Major platforms like Twitter have struggled to stem the spread of child sexual abuse material, and Meta’s Facebook constantly faces tough questions about what is and isn’t allowed on the platform.
Google’s second big argument is that without 230 protections, individual users could be held liable for sharing or even liking articles. The company also argued that algorithm-based recommendation systems are the only way modern tech companies can handle the volume of content published daily, so if plaintiffs can target how websites sort content, “the internet would devolve into a disorganized mess and a litigation minefield.”
The case goes back to the 2015 terror attacks in Paris, France, that left 130 dead and many more injured. Nohemi Gonzalez, a U.S. citizen living in Paris, was killed in the attacks, and her family sued Google, arguing that YouTube was a main vehicle for radicalizing and recruiting new members to the Islamic State. The family further argues that 230 has been stretched beyond the law’s original intent and used to shield companies from responsibility for algorithms that recommend harmful content.
Google has previously said that it works to remove terrorist and other harmful content, and it has relied on Section 230 protections as the basis for its defense.
Content moderation is one of the most pressing tech policy issues heading into the new year. Texas and Florida have already passed laws restricting how tech companies can moderate content, and legal challenges to those laws are also making their way toward the Supreme Court.