As companies develop more and more kinds of technology to find and remove content in different ways, the expectation grows that they should use them. Can moderate comes to imply ought to moderate. After all, once a tool has been put to use, it is hard to put it back in the box. But content moderation is now snowballing, and the collateral damage in its path is too often ignored.
There is an opportunity now for some careful thought about the way forward. With Trump’s social media accounts and the election in the rearview mirror, content moderation is no longer the constant A1 story. Perhaps that proves that the real source of much of the anxiety was politics, not platforms. But there is, or should be, some lingering unease about the remarkable display of power that a handful of industry leaders showed in deplatforming the leader of the free world.
The chaos of 2020 shattered any notion that there is a clear category of harmful “misinformation” that a few powerful people in Silicon Valley can simply stamp out, or even that there is a clean way to distinguish health from politics. Last week, for example, Facebook reversed its policy and said it would no longer remove posts claiming that Covid-19 is man-made or manufactured. Only a few months earlier, The New York Times had cited belief in this “baseless” theory as evidence that social media had contributed to an ongoing “reality crisis.” There was a similar flip-flop over masks. At the start of the pandemic, Facebook banned ads for them on its site. The ban lasted until June, when the WHO finally changed its guidance to recommend wearing masks, even though many experts had urged it to do so long before. The good news, I suppose, is that Facebook was not all that effective at enforcing the ban in the first place. (At the time, however, this was not seen as good news.)
As more comes out about what the authorities got wrong during the pandemic, or about cases where politics rather than expertise determined the narrative, there will naturally be more skepticism about trusting them, or private platforms, to decide when to shut a conversation down. Issuing public health guidance for a particular moment is not the same as defining the reasonable limits of debate.
Calls for further crackdowns also carry geopolitical costs. Authoritarian and repressive governments around the world have pointed to the rhetoric of liberal democracies to justify their own censorship. The comparison is obviously specious. Shutting down criticism of a government’s handling of a public health emergency, as the Indian government has done, is as clear an affront to free expression as ever. But there is some tension in shouting at platforms to take down more over here while demanding they take down less over there. So far, Western governments have refused to confront this. They have largely left the platforms to fend for themselves against the global rise of digital authoritarianism. And the platforms are losing. Governments need to walk and chew gum at the same time when they talk about platform regulation and free speech if they want to defend the rights of the many users outside their borders.
There are other trade-offs. Because content moderation at scale will never be perfect, the question is always which side of the line to err on when enforcing rules. Tighter rules and heavier enforcement necessarily mean more false positives: that is, more valuable speech will be taken down. The problem is exacerbated by the growing reliance on automated moderation to remove content at scale: these tools are blunt and stupid. If told to take down more content, the algorithms will not think twice. They cannot assess context or tell the difference between, for example, content that glorifies violence and content that documents evidence of human rights abuses. The toll of this approach became clear during last week’s Israeli-Palestinian conflict, as Facebook repeatedly removed essential content from and about Palestinians. This was not a one-off. Can moderate does not always imply should moderate, especially since we know these errors tend to fall disproportionately on already marginalized and vulnerable communities.