Cryptopolitan
2026-02-10 05:09:59

Discord is enforcing teen-by-default settings globally and requiring age verification to access sensitive content.

Discord announced on Monday that it will soon ask all users globally to confirm their age, either through a facial scan or by uploading a form of identification, before they can access adult content. In its press statement, Discord said the expanded teen safety features being rolled out globally further the company's long-standing goal of making the app safer and more welcoming for users aged 13 and older. The chat platform said the update will automatically give all new and existing users worldwide a teen-appropriate experience, with revised communication settings, limited access to age-gated areas, and content filtering designed to protect the privacy and close connections that characterize Discord.

Discord expands global age checks and safety controls

"Discord will soon be expanding teen safety protections worldwide including teen-by-default settings and age assurance designed to create safer experiences for teens. We're also launching recruitment for Discord's first Teen Council, creating a space for teen voices to help shape…" — Discord Support (@discord_support) February 9, 2026

Discord allows people to create and join groups based on their interests, and the group messaging platform says it has more than 200 million monthly users. Discord currently requires certain users in the UK and Australia to confirm their age in order to comply with online safety regulations. The platform announced, however, that it will roll out age checks for all new and existing users globally starting in early March this year. Some users will therefore need to complete an age-verification process to change certain settings or to access sensitive content in servers, age-restricted channels, app commands, and certain message requests.

"Nowhere is our safety work more important than when it comes to teen users, which is why we are announcing these updates in time for Safer Internet Day. Rolling out teen-by-default settings globally builds on Discord's existing safety architecture, giving teens strong protections while allowing verified adults flexibility." – Savannah Badalich, Head of Product Policy at Discord.

The community server app said the new default settings will limit what users can see and how they can communicate. Only users who verify as adults will be able to access age-restricted forums and unblur sensitive content. Discord also said that until users pass its age checks, they will not be able to view direct messages sent to them by unknown users.

Drew Benvie, head of social media consultancy Battenhall, said he supports efforts to make social media a safer place for all users. Discord's move comes amid growing global concern over how social media platforms expose children and teenagers to harmful content and addictive design features. Governments, regulators, and courts are increasingly scrutinizing tech companies to determine whether they are doing enough to protect young users. Recent measures reflect mounting pressure to raise industry-wide online safety standards. On February 6, the European Union accused TikTok of breaching the bloc's digital regulations with "addictive design" features that lead to compulsive use by children.
EU regulators said their two-year probe found that TikTok has not done enough to evaluate how features like autoplay and infinite scroll may affect users' physical and emotional health, particularly that of children and "vulnerable adults." The European Commission said it believes TikTok should change the "basic design" of its service.

Social media giants face landmark child addiction trial

The world's largest social media corporations, including TikTok, face a number of historic trials in 2026 that seek to hold them accountable for injuries to children who use their services. Opening arguments in one such trial began on February 9 in Los Angeles County Superior Court, where plaintiffs allege that Google's YouTube and Instagram's parent company, Meta, intentionally injure and addict children. TikTok and Snap, originally named as defendants in the lawsuit, reached settlements for undisclosed amounts.

American lawyer Mark Lanier said in his opening statement that the case is as "easy as ABC," which he said stands for "addicting the brains of children." He also called Google and Meta "two of the richest corporations in history" that have "engineered addiction in children's brains." Plaintiffs' attorney Donald Migliori said in his opening statement that Meta made false claims about the safety of its platforms by designing its algorithms to keep young people online despite knowing that they are vulnerable to sexual exploitation on social media.
