Cryptopolitan
2026-02-10 05:09:59

Discord enforces teen-by-default settings globally, requiring age verification to access sensitive content.

Discord announced on Monday that it will soon ask all users globally to confirm their age, via a facial scan or by uploading a form of identification, before they can access adult content. In its press statement, Discord said the expanded teen safety features further the company’s long-standing goal of making the app safer and more welcoming for users aged 13 and older.

The chat platform said the update will automatically give all new and existing users worldwide a teen-appropriate experience, with revised communication settings, limited access to age-gated areas, and content filtering designed to protect the privacy and close connections that characterize Discord.

Discord expands global age checks and safety controls

“Discord will soon be expanding teen safety protections worldwide including teen-by-default settings and age assurance designed to create safer experiences for teens. We’re also launching recruitment for Discord's first Teen Council, creating a space for teen voices to help shape…” – Discord Support (@discord_support), February 9, 2026. pic.twitter.com/CW7G4sO38R

Discord allows people to create and join groups based on their interests. The group messaging platform says it has more than 200 million monthly users.

Discord already requires certain users in the UK and Australia to confirm their age to comply with online safety regulations. Now, the platform says it will roll out age checks for all new and existing users globally starting in early March this year. Some users will need to complete an age-verification process to change certain settings or access sensitive content, such as servers, age-restricted channels, app commands, and certain message requests.

“Nowhere is our safety work more important than when it comes to teen users, which is why we are announcing these updates in time for Safer Internet Day. Rolling out teen-by-default settings globally builds on Discord’s existing safety architecture, giving teens strong protections while allowing verified adults flexibility.” – Savannah Badalich, Head of Product Policy at Discord.

The company said the new default settings will limit what users can see and how they can communicate. Only users who verify as adults will be able to access age-restricted forums and unblur sensitive content. Until users pass Discord’s age checks, they also won’t be able to view direct messages sent by unknown users.

Drew Benvie, head of social media consultancy Battenhall, said efforts to make social media a safer place for all users are worth supporting.

Discord’s move comes amid growing global concern over how social media platforms expose children and teenagers to harmful content and addictive design features. Governments, regulators, and courts are increasingly scrutinizing tech companies to determine whether they are doing enough to protect young users, and recent measures show mounting pressure to raise industry-wide online safety standards. On February 6, the European Union accused TikTok of breaching the bloc’s digital regulations with “addictive design” features that lead to compulsive use by children.
EU regulators said their two-year probe found that TikTok has not done enough to evaluate how features like autoplay and infinite scroll may affect users’ physical and emotional health, particularly that of children and “vulnerable adults.” The European Commission said it believes TikTok should change the “basic design” of its service.

Social media giants face landmark child addiction trial

The world’s largest social media companies, including TikTok, face a number of historic trials in 2026 that aim to hold them accountable for harm to children who use their services. Opening arguments in one such trial began on February 9 in Los Angeles County Superior Court, where Google’s YouTube and Instagram’s parent company, Meta, are accused of intentionally harming and addicting children. TikTok and Snap, which were originally named in the lawsuit, reached settlements for undisclosed amounts.

Attorney Mark Lanier said in his opening statement that the case is as “easy as ABC,” which he said stands for “addicting the brains of children.” He also called Google and Meta “two of the richest corporations in history” that have “engineered addiction in children’s brains.” Plaintiffs’ attorney Donald Migliori said in his opening statement that Meta misrepresented the safety of its platforms, designing its algorithms to keep young people online despite knowing they are vulnerable to sexual exploitation on social media.
