Telegram, the messaging platform co-founded by Russian-born billionaire Pavel Durov, has updated its policy on the moderation of private chats. The change follows a period of intense scrutiny of the platform's content moderation practices.
Previously, Telegram's frequently asked questions (FAQ) page assured users that private chats were not subject to moderation requests, implying that even illegal activity in those chats would draw no intervention. The company has now removed that assurance. The shift follows a public pledge Durov made after his arrest in France to improve content moderation on the platform.
Durov acknowledged that Telegram's rapid growth to 950 million users has made it easier for criminals to abuse the platform, and he assured users that efforts to improve the situation were already underway. The updated FAQ page now directs users to report illegal content to the platform's moderators.
Telegram has not explained the reasoning behind the policy change, but it likely stems from mounting pressure from authorities. French law enforcement has opened an investigation into crimes allegedly facilitated through the platform, the probe that preceded Durov's arrest, and South Korea has launched its own investigation into the spread of sexually explicit deepfake content on the service.
Questions have also been raised about Telegram's financial health, which could complicate a hoped-for initial public offering (IPO) at a valuation reportedly above $30 billion. Despite boasting more than 900 million users, Telegram reported a $108 million loss last year on revenue of $342 million.
The updated policy on private chat moderation is Telegram's attempt to address these concerns and repair its image amid growing scrutiny, and it marks a notable shift in the platform's approach to content control, illegal material, and user safety.