Discord bans malicious misinformation about Covid-19 on its platform

In a nutshell: Following other digital platforms such as Reddit and YouTube, Discord has updated its community guidelines with new rules regarding dangerous medical misinformation. The instant messaging service will prohibit users from posting “misleading health information that may cause physical or social harm.”

The policy change was announced on Friday in a blog post by Alex Anderson, a senior platform policy officer at Discord. The rules are clearly aimed at misinformation about Covid-19, which has become increasingly prevalent online.

“Ensuring the accuracy of online health information has never been more important than during the Covid-19 pandemic,” the post reads.

The post cited specific types of content subject to the new rules, including anti-vaccination rhetoric, dangerous cures for diseases, and anything that “could interfere with the resolution of the public health emergency.” Discord stated that medical information is misleading if it “directly and unequivocally contradicts the most recent consensus of the medical community.”

The company is committed to combating Covid-19 conspiracy theories, with these guidelines curbing baseless rumors and widely debunked health claims. However, the post also clarifies that Discord does not intend to “punish polarization or conflicting viewpoints”; non-misleading personal experiences, commentary, and satire are excluded from the rules.

These guidelines apply to both individual accounts and organized servers. Discord has clarified that enforcement action will be taken based on the severity and potential harm of the misinformation, with penalties ranging from warnings to permanent account or server bans.

In an interview with The Verge, Discord’s general counsel Clint Smith explained that “if someone posts on Discord, ‘drink four ounces of bleach and your body gets rid of the coronavirus,’ that calls for action.” He also mentioned that low-risk misinformation would most likely not prompt enforcement.

“If someone posts a message that you hold crystals to your chest for 5 minutes and your lung capacity improves, that’s not something Discord will take action against,” Smith said.

The community messaging platform is the latest in a string of tech companies that have tried to fight health-related misinformation. Following massive public pressure, YouTube, Reddit and Facebook have made policy changes designed to dampen the growing anti-vaccine rhetoric on their platforms. Conversely, streaming giant Spotify has refused to remove high-profile creators such as Joe Rogan over allegations of misleading claims about Covid-19.

This is not the first time the company has taken significant steps to curb the spread of malicious content, with prior operations to combat exploitative content, violent extremism, and illegal activities. In its transparency report for the first half of 2021, Discord said it had taken down over 43,000 servers and banned 470,000 accounts for violating its rules.

Only time will tell if Discord can effectively enforce this new policy across over 150 million active users and nearly 6 million servers. In the meantime, the company has published instructions for reporting malicious misinformation to its team.

Image credit: Alexander Shatov
