Telegram has quietly edited its FAQs to remove language stating that it doesn’t moderate private and group chats, as reported by CoinDesk. A section with the heading “There’s illegal content on Telegram. How do I take it down?” previously stated that content in chats and group chats remains between participants. Now, though, the section says that “all Telegram apps have ‘Report’ buttons” that let users flag illegal content for the app’s moderators. Users only have to tap a message on Android, or press and hold it on iOS, and choose the Report option. They can also copy links to the content they want to report and email them to the service’s takedown address (abuse@telegram.org).
The change comes after Telegram chief Pavel Durov published his first public comment on his channel since his arrest. Durov was arrested at an airport in France in late August as part of authorities’ investigation into the lack of moderation on the app and its failure to curb criminal activity. He has since been released from custody, but he was charged with “complicity in distributing child pornography, illegal drugs and hacking software” on the messaging app, as well as with “refusing to cooperate with investigations into illegal activity” on Telegram.
French authorities apparently told Durov that he was arrested because they didn’t receive any responses from Telegram about their investigation. That was surprising, the app’s founder explained in his post, because Telegram has an official representative in the EU and a publicly available email address anyone can use. He also said that French authorities had numerous ways to reach him for assistance and that he had even previously helped them establish a Telegram hotline to address terrorism threats in the country. In addition, he called the French authorities’ decision to “charge a CEO with crimes committed by third parties on the platform” they manage a “misguided approach.” No innovator will ever build new tools, he said, if they know they can be held responsible for the potential abuse of those tools.
Durov also talked about how Telegram defends the basic rights of people, especially in places where they’re violated. In Russia, for instance, Telegram was banned when the service refused to hand over encryption keys that would have allowed authorities to spy on users. He said the service takes down “millions of harmful posts and channels every day,” publishes transparency reports and maintains direct hotlines with NGOs for urgent moderation requests.
The CEO admitted, however, that Telegram has room for improvement. Its “abrupt increase in user count” to 950 million “caused growing pains” that made it easier for criminals to abuse its platform. Telegram aims to “significantly improve things in this regard” and has already started the process internally. Presumably, this change in its rules is part of the messaging service’s efforts to address authorities’ accusations that it has failed to prevent criminals from using its app. Notably, the service reported earlier this year that it has 41 million users in the European Union, but officials believe it lied about its user numbers to avoid being regulated under the Digital Services Act (DSA).