Fixing Section 230
Recently the U.S. Supreme Court heard two cases concerning Big Tech and the extent of its liability for content posted by ISIS on YouTube and Twitter. Over the coming months and years, we can expect other cases challenging Big Tech’s content judgments and decisions to work their way through the judicial system, with some landing on the docket of the High Court. Given the paradoxical calls for both increased content moderation and an end to partisan censorship, the roles and responsibilities of Big Tech in hosting what has become the digital public square will inevitably need to be clarified. Whether the courts or the legislature will settle the debate remains to be seen. What is clear is that Section 230 of the Communications Decency Act of 1996, which has spawned the confusion, needs to be fixed if we are to have social media platforms that contribute positively to the growth of American democracy.