New Jersey Sues Discord For Exposing Children To Harmful Content

New Jersey became the first state to sue Discord on Thursday, accusing the private social messaging app of exposing children to harmful content, including graphic violence, as well as sexual abuse and exploitation. 

“Discord markets itself as a safe space for children, despite being fully aware that the application’s misleading safety settings and lax oversight has made it a prime hunting ground for online predators seeking easy access to children,” New Jersey Attorney General Matthew J. Platkin said in a statement.

The lawsuit argues that Discord’s popularity, combined with limited safety controls and private chat rooms, has made it easy for predators to target younger users. Although the platform bars users under 13, Platkin contends that children can easily create accounts, while adults often pose as children. 


The lawsuit follows previous accusations against the platform. In 2022, for example, Discord was named in a suit filed in California by the Social Media Victims Law Center on behalf of an underage girl who was sexually targeted by adult men. That same year, Discord was cited in six cases involving child sexual abuse material and the grooming of children. 

At the start of 2025, Discord was also named in a lawsuit in which a 13-year-old boy was sexually exploited by an adult stranger, and this past weekend, a California man was arrested and charged with kidnapping and engaging in sexual conduct with a minor after a 10-year-old girl was reported missing and found to be chatting with the man on Discord. 

In many of these cases, Roblox, a virtual gaming platform, was named alongside Discord, as the two platforms in particular have gained a reputation for safety loopholes that allow adult users to chat directly with children, often persuading them to open accounts on other social apps. 

New Jersey’s suit cites several criminal cases against adult residents accused of engaging in explicit communication with children, soliciting and sending nude pictures, and taking part in sexual acts on video chat. 

The lawsuit argues that Discord was aware of its young users’ vulnerability and marketed the platform as safe to parents. 

“Discord claims that safety is at the core of everything it does, but the truth is the application is not safe for children,” Cari Fais, director of the state’s Division of Consumer Affairs, said in the statement.

A Discord spokesperson issued a vague response, stating that the company is “proud” of its “continuous efforts and investments in features and tools that help make Discord safer.”

In 2023, NBC News found 35 cases of grooming, sexual assault or kidnapping that involved communication on Discord over a six-year period.  
