’10 minutes of murder’: Why one family is speaking out about the online extremist network 764
Victims and lawmakers are pleading with platforms to do more to keep kids safe.
By Mike Levine, ABC News
Wednesday, November 19, 2025, 11:32 AM
For more than three years, Colby and Leslie Taylor have quietly waited for the day that justice would be delivered for their 13-year-old son, Jay, who in early 2022 was allegedly pushed into killing himself — and streaming it live on social media — by an online tormentor associated with the extremist network known as “764.”
“It’s almost biblical in its definition of evil, what happened,” Colby Taylor said.
Authorities in the United States have yet to file any charges in the case. It’s unclear if they ever will, as current U.S. law doesn’t specifically criminalize such online coercion.
But five weeks ago, authorities nearly 5,000 miles away in Germany, where the alleged online tormentor lives, filed murder charges against him, alleging that the 20-year-old medical student abused more than 30 kids online, and left one American, Jay Taylor, dead.
For the Taylors, it’s finally time to speak out, after remaining so silent — and so anonymous — for so long.
As they see it, the public needs to know about 764, described by authorities as a loosely knit network of online predators. They say online platforms need to do more to protect their users, Congress needs to act, and someone needs to pay for what happened to their child.
“I couldn’t live with myself not making this push now, making it public,” Colby Taylor said in an exclusive TV interview with ABC News. “Because if I read [another] story like Jay’s, after Jay passed — we failed Jay.”
‘It seemed so innocent’
Growing up with his parents and three siblings in the picturesque, waterside town of Gig Harbor, Washington, Jay was “funny” and “sweet,” and he had a knack for drawing and crafts, especially crochet, according to his mother.
But by the start of 2021, the COVID-19 pandemic had left Jay feeling isolated and lonely. And though he was assigned female at birth, he was in the midst of a gender transition, exacerbating his feelings of loneliness, his parents said.
He became anorexic and began cutting himself, a desperate attempt to release the “complex feelings” churning inside, his mother said.
His parents sought professional help and “locked down” Jay’s time on his computer, limiting it to one hour each day and tracking the websites and chat rooms he visited, said his father, Colby, an engineering manager at a big company in Seattle.
Jay appeared to be getting better.
“Everything seemed like we were in a nice, healthy place,” Colby said.
According to Leslie, a high school teacher and crisis counselor, 99% of what they saw when they checked Jay’s computer was “craft related.”
“Light and airy, just people trying to find each other through crafts,” she said. “It seemed so innocent.”
But members of the sadistic extremist network 764 were anonymously lurking online, from all corners of the world.
Members of 764 find vulnerable children on popular platforms like Discord and Roblox, befriend them, and then coerce them into producing sexually explicit content and committing acts of gruesome violence against themselves or others, including pets, siblings and even strangers.
“They’re seeking the end of the world,” corrupting future generations and desensitizing them to violence and gore, explained Pat McMonigle, who was one of the FBI agents investigating Jay Taylor’s case until he retired from the bureau last year.
Members of 764 often host live online chats so others can watch the self-harm and violence in real time. The further they can push their victims, the more stature and respect they will receive within 764, authorities say.
“Just sick,” McMonigle said.
’10 minutes of murder’
In mid-January 2022, in the days leading up to Martin Luther King Jr. Day, Jay Taylor posted a message to the online platform Discord, saying, “I’m looking for friends, preferably LGBTQ for crochet buddies,” Jay’s father recalled.
At about 1:30 a.m. on the holiday, someone responded to Jay’s message, bringing him into a live chat with several others.
Unbeknownst to his parents, the one-hour time limit they had set on Jay’s devices reset at the stroke of midnight.
Within an hour or so, the others in the group chat began telling Jay he should kill himself. And a Discord user calling himself “White Tiger” online was leading the charge, directing others to push and manipulate Jay, according to Jay’s parents.
At first, Jay kept telling them that he didn’t want to die — that he was feeling good and looking forward to upcoming plans with his family — but they kept pressuring him, his father told ABC News.
“You could see they were … typing, and watching, and encouraging, and even purposely misgendering Jay. All the things you could think of to trigger someone in Jay’s life,” Colby said.
One of them even falsely promised to take their own life too.
A little before 4 a.m., Jay snuck out of his home and walked to the parking lot of a nearby grocery store, where he pointed his phone toward himself, opened up a livestream on Instagram, and then strangled himself to death.
It’s still unclear if Jay had been in contact with any 764 members before that, or if they first made contact with him just before his death.
Were it not for a teenage girl in Australia, more than 7,000 miles away, Jay’s parents would have never known what really happened to their child.
In the hours after Jay’s death, the teen in Australia ended up in another online chat with “White Tiger” and others, who she said were sharing recordings of the suicide and joking about it.
“I couldn’t just do nothing about it,” she told ABC News, requesting that her name not be used out of fear of reprisal.
She said she herself had been tormented by members of 764 and pushed to self-harm.
“I had to do something,” she said.
So she found Jay’s father online and sent him the video of Jay’s death. He watched it in a bathroom.
“Ten minutes of murder,” he said of what he saw.
He gave the video and all of Jay’s devices to a local detective in Gig Harbor. And within months, the FBI took over the case.
After months of what McMonigle described as “painstaking work,” he and his partners uncovered what they believe is the true identity of “White Tiger”: a young German-Iranian medical student from Hamburg, Germany.
ABC News is not using his name due to privacy-related restrictions in Germany.
“White Tiger” referred to himself online as an “e-girl groomer,” targeting young girls who “just wanted love,” convincing them to mutilate themselves, and then coaxing them into finding and manipulating even more victims, according to McMonigle.
He was a “terrible guy,” McMonigle said.
U.S. law doesn’t specifically criminalize using online platforms to coerce victims into harming or killing themselves. But the law in Germany does.
So the FBI agents investigating Jay Taylor’s death handed their case file to German authorities.
After an excruciating two-year wait for the Taylors, police in Hamburg arrested “White Tiger” in June, eventually charging him with murder and more than 200 other counts for the alleged abuse of dozens of victims. He has pleaded not guilty and denied all charges.
‘A blind eye’
Leslie and Colby Taylor say they put much of the blame for what happened to their child — and to so many other children around the world — on Discord and other online platforms.
“Discord is taking a blind eye to the kids doing this,” Leslie Taylor said.
The Taylors said Discord provides parents with only a limited ability to know what their children say and do on the platform, and they were shocked to learn that Discord users can employ outside applications to continually delete their messages on Discord.
“[It’s] the most sinister part,” Colby Taylor said. “They let users literally cover their tracks … Discord allows users to hide evidence of foul play on their system, and that attracts these types of organizations to just fester and grow.”
A Discord spokesperson told ABC News that such tools are not endorsed by Discord.
Still, Colby and Leslie Taylor said Discord also needs to be willing to invest in a bigger army of moderators who can detect harmful content and block malicious actors on its platform. And they suggested that Discord create a “little red button” on its system that would allow concerned users to report suspicious behavior as it’s happening.
Failing to do all that amounts to “negligence,” Colby Taylor said.
“Discord purposely brought a gateway of the dark web into our house,” Leslie Taylor said.
Convinced that Discord’s alleged failure to provide proper safeguards played a major role in their child’s death, the Taylors are now preparing to file a lawsuit against the service, hoping it will pressure the platform to do more. Colby Taylor said they also want to claim some of the profits that Discord makes from what he called “the lack of protections they have in place.”
In a statement to ABC News, a Discord representative said the service is “committed to user safety” and that the “horrific actions of groups like this have no place on Discord or anywhere in society.”
According to a Discord spokesperson, the platform invests “heavily” in specialized teams and newly developed artificial intelligence tools that can “disrupt these networks, remove violative content, and take action against bad actors on our platform.” Discord also said it shares intelligence with other platforms, which can help identify bad actors even before Discord has spotted them.
Discord also said it cooperates with law enforcement, proactively provides tips and other information, and quickly responds to subpoenas.
Those tips have led to many arrests, including that of Bradley Cadenhead, the Texas teen who started 764. And just two weeks ago, Discord announced new tools aimed at giving parents more insight into, and more control over, their children’s accounts.
Spokespersons for Roblox and for Meta, the parent company of Instagram, both said they are making similar efforts and working constantly to protect their users from 764.
A Roblox spokesperson said its policies prohibiting 764-type content “are purposely stricter than other platforms.” As part of its broader efforts to protect kids, Roblox said Tuesday that it is implementing the “age-based chat” restrictions it announced in September, which the company said will help “limit conversations to users with similar ages.”
‘Jay’s law’
Despite the expanding efforts, some say Discord and the other platforms still need to do much more.
“They need to moderate it way better than they have been, which they did start doing, but I don’t think they’re doing enough at all … because a lot is still happening,” said the girl in Australia who informed Jay Taylor’s father about what really happened to Jay.
McMonigle agreed, saying online platforms “need to be held accountable for what is happening on their platforms.”
The Taylors said they hope that going public will also push Congress to do more.
In particular, they hope to convince lawmakers to pass a law that would explicitly make it a federal crime to solicit someone online to kill themselves, which would allow U.S. authorities to charge “White Tiger.”
It’s unclear if he could be charged with other offenses in the United States that are already on the books.
Nevertheless, a law like the one being proposed by the Taylors “would be very helpful” — it could even be called “Jay’s law,” McMonigle said.
McMonigle said he has already spoken with one member of Congress about the idea.
According to Colby and Leslie Taylor, there was a point not too long ago when they thought “White Tiger” would never face any type of justice.
“I had become comfortable with the idea that … that just may be a chapter that we live with, that stays open,” Colby said. “Because I knew we couldn’t charge [him] with murder at a federal level in the U.S.”
But now that “White Tiger” is being prosecuted in Germany, the Taylors have hope for at least a bit of justice.
“Jay’s parents have been incredibly courageous,” McMonigle said. “They took their time of grief. We all needed it … but now they’re ready to do something about it.”
ABC News’ Megan Christie, Pierre Thomas and Juju Chang contributed to this report.