
Discord Continues to Face Serious Child Safety Concerns

by admin

Discord’s child safety problems are back in the spotlight following a troubling new report. Over the past six years, NBC News identified 35 cases in which adults were prosecuted for offenses such as “kidnapping, grooming, or sexual assault” that were allegedly facilitated through communication on Discord. At least 15 of those cases have ended in guilty pleas or verdicts, while several others remain pending.

The report also identified an additional 165 cases, including four crime rings, in which adults were prosecuted for sharing child sexual abuse material (CSAM) on Discord or for using the platform to coerce children into sending sexually explicit images of themselves, a practice known as sextortion. According to the report, these activities often take place in hidden communities and chat rooms.

A simple search for “site:justice.gov Discord” returns a distressing number of results, underscoring the scale of the problem. NBC News highlighted one particularly disturbing case in which a teenager was groomed on Discord for months, taken across state lines, raped, and, according to police, eventually found locked in a backyard shed.

“The visible cases are just the tip of the iceberg,” Stephen Sauer of the Canadian Centre for Child Protection told NBC News. This is not the first time Discord has faced criticism over its handling of child abuse complaints. CNN previously reported on numerous instances of CSAM on the platform, with some parents saying Discord offered little assistance.

Earlier this year, the National Center on Sexual Exploitation (NCOSE) issued a statement titled “Discord Dishonestly Responds to How it Handles Child Sexual Abuse Material After Being Named to 2023 Dirty Dozen List.” It described instances in which reported CSAM links remained accessible on Discord’s servers for more than two weeks. NCOSE criticized Discord’s passive approach and recommended that the platform ban minors until it makes transformative changes, including proactively searching for and removing exploitative material.

In a recent transparency report, Discord said it has significantly increased its investment in and prioritization of child safety, disabling 37,102 accounts and removing 17,425 servers for child safety violations. John Redgrave, Discord’s VP of trust and safety, expressed confidence that the platform’s approach has improved since it acquired the AI moderation company Sentropy in 2021. Discord now uses multiple proactive detection systems to identify CSAM and analyze user behavior, and Redgrave said the company currently detects most material that has already been identified, verified, and indexed.

However, those systems cannot detect CSAM or abusive messages that have not previously been identified. In a review of the platform over the past month, NBC News found 242 Discord servers using thinly veiled terms to advertise CSAM.
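
Redgrave’s description points to hash-matching: each uploaded file is compared against an index of material that has already been identified, verified, and indexed, so anything absent from that index slips through. Below is a minimal Python sketch of the idea; the index, function name, and use of SHA-256 are illustrative assumptions, not Discord’s actual implementation (production systems rely on perceptual hashes such as PhotoDNA so that re-encoded or cropped copies still match, whereas a plain SHA-256 only matches byte-identical files).

    import hashlib
    from pathlib import Path

    # Hypothetical index of hashes of previously identified, verified,
    # and indexed material; in practice this would be populated from a
    # shared industry hash database.
    KNOWN_HASHES: set[str] = set()

    def is_known_material(path: Path) -> bool:
        """Flag a file only if its hash already appears in the index."""
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        return digest in KNOWN_HASHES

    # A file whose hash was never indexed, that is, previously
    # unidentified material, returns False here and passes unflagged,
    # which is exactly the detection gap described above.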

Discord is not the only social media company grappling with CSAM: a recent report highlighted Instagram’s role in fostering a vast network of accounts dedicated to underage sexual content. Discord, however, has posed particular difficulties for law enforcement; in one incident cited in the report, the company requested payment when the Ontario Police sought to have records preserved. BuyTechBlog has reached out to Discord for comment on these matters.

Frequently Asked Questions (FAQs) about child safety

What are the main concerns regarding Discord’s child safety?

Discord’s child safety concerns center on adults using the platform for kidnapping, grooming, and sexual assault, as well as sharing child sexual abuse material (CSAM) and coercing children into sending explicit images of themselves, a practice known as sextortion. These activities often take place in hidden communities and chat rooms.

How has Discord responded to these concerns?

Discord says it has significantly increased its investment in and prioritization of child safety. It has disabled tens of thousands of accounts and removed thousands of servers for child safety violations, and it has implemented proactive detection systems to identify CSAM and analyze user behavior. However, those systems cannot detect previously unidentified abuse material or messages.

Has Discord faced criticism for its handling of child abuse complaints before?

Yes. Discord has previously faced criticism over its handling of child abuse complaints. There have been reports of child sexual abuse material on the platform, and some parents have said Discord offered limited assistance in addressing these issues.

What recommendations have been made to Discord regarding child safety?

The National Center on Sexual Exploitation (NCOSE) recommended that Discord ban minors until significant changes are implemented. It criticized Discord’s passive approach and called for proactive searching for and removal of exploitative material. The issue has prompted broader calls for a radical transformation in the platform’s approach to child safety.

Are other social media platforms facing similar child safety issues?

Yes. Other social media platforms, such as Instagram, have faced similar child safety issues; a recent report highlighted Instagram’s role in promoting a network of accounts dedicated to underage sexual content. However, each platform faces its own specific challenges and has responded to these concerns in its own way.


1 comment

socialmediafan June 23, 2023 - 3:43 am

It’s not just Discord, other platforms have similar issues. But Discord should do better! Let’s protect these vulnerable kids! #childsafety
