Telegram has removed about 15 million illicit groups and channels from its messaging platform using artificial intelligence (AI). Over the last few months, the platform has been under immense pressure to purge its messaging application of illicit content. That pressure led to the arrest of its CEO, Pavel Durov, in France, where he faces charges over harmful and illegal content shared on the application. While Durov, who has had his first day in court, remains under strict restrictions, the platform appears to have gotten to work. According to Telegram, it has now removed more than 15.4 million illicit groups and channels related to fraud and other illegal activities. In its statement, Telegram highlighted that the feat was achieved using cutting-edge artificial intelligence moderation tools. The move comes after the platform announced a crackdown in September, with Durov noting that the company was trying to comply with government requests.

Telegram launches new moderation page to monitor efforts

The new moderation page aligns with the company's policy of transparency in its dealings. According to a post from Durov's Telegram page, the company wants the public to see its efforts at combating these illicit activities. He noted that the moderation team had been working behind the scenes over the last few months, removing content that violated the platform's terms, and thanked users for their help in curbing the menace. Durov promised regular updates that users can see in real time, showing how much work the moderation team is doing.

Telegram moderation overview in 2024. Source: Telegram

According to the moderation page, the platform has stepped up enforcement since Durov's arrest. The page notes that the removal of illicit accounts has been in effect since 2015. Blocked illegal groups and channels currently stand at 15,474,022 for 2024 alone.
Telegram also increased its crackdown on child sexual abuse material (CSAM) this year, banning a total of 703,809 groups and channels. Aside from user reports and the proactive work of the moderation team, Telegram also reported thousands of instances processed via third-party anti-CSAM organizations, leading to instant bans. Breaking down the figure, it highlighted the four organizations that provided the most reports: the Internet Watch Foundation, the National Center for Missing and Exploited Children, the Canadian Centre for Child Protection, and Stichting Offlimits.

Since 2016, Telegram has actively combated violence and terrorist propaganda, providing daily updates on its efforts, which have been recognized by Europol. The company noted that it has worked with numerous organizations on these fronts since 2022, banning 100 million pieces of terrorist content. In 2024, Telegram revealed it blocked 129,099 pieces of terrorism-related content.

Durov's continued restriction amid moves to combat illicit materials

Durov will remain under the watchful eye of French authorities after his first day in court. The Telegram CEO, accompanied by his legal counsel, answered several questions related to the platform's activities. Although Durov declined to comment on the case after the session, he expressed his belief in the French justice system and his innocence. If found guilty, however, he could spend up to 10 years in prison, in addition to a $550,000 fine.

Meanwhile, Telegram recently announced a partnership with the Internet Watch Foundation (IWF), signaling an attempt to combat child sexual exploitation material shared on the platform. The charity will help identify and remove these materials using its cutting-edge AI tools. IWF's CEO, Derek Ray-Hill, highlighted the feats both organizations could achieve together.
“We look forward to seeing what further steps we can take together to create a world in which the spread of online sexual abuse material is virtually impossible,” he said.