Requests to share nudes? It happens younger than you think.
If you’ve read Thorn’s latest research report, then you know that kids are navigating online grooming and receiving requests for nudes at far younger ages than many people think.
But despite a majority of parents thinking they should talk with their children about sharing nudes before the age of 13, only 1 in 5 parents have done so.
To compound the issue, kids often face shame and blame as they recover from negative digital experiences. This can exacerbate the harm they experience and further isolate kids who are in trouble.
So where does that leave kids and their parents? How can parents gain confidence to have these difficult conversations early and often with their children?
Enter Thorn for Parents – a digital resource hub designed to help parents have earlier, more frequent, and judgment-free conversations with kids about digital safety.
Not a parent? Tell a friend or family member who could benefit from this information!
THORN FOR PARENTS
BE YOUR KID’S SAFETY NET
Kids today face a very different set of challenges. There’s a new landscape where a child’s relationship with technology and normal sexual development overlap, bringing a whole new set of experiences online. And they need your help to navigate it safely.
TAKE THE FIRST STEP
Whether this is your first time talking to your child or you’ve broached a topic before, here are some areas to learn more about and guide conversations.
SEXTING & NUDES
When and how to have conversations about consent and the risks of sharing nudes.
DEVICE ACCESS & MONITORING
What access your child likely has, and things to consider when it comes to monitoring their behavior.
ALL ABOUT THE PLATFORMS
A guide to the places kids interact online — usage, risks, and privacy across the digital landscape.
Brands Suspend Ads From Parts of Twitter Over Child Pornography Concerns
Some major advertisers including Dyson, Mazda, Forbes, and PBS Kids have suspended their marketing campaigns or removed their ads from parts of Twitter because their promotions appeared alongside tweets soliciting child pornography, the companies told Reuters.
DIRECTV and Thoughtworks also told Reuters late on Wednesday they have paused their advertising on Twitter.
Brands ranging from Walt Disney Co., NBCUniversal, and Coca-Cola Co. to a children’s hospital were among more than 30 advertisers that appeared on the profile pages of Twitter accounts peddling links to the exploitative material, according to a Reuters review of accounts identified in new research about child sex abuse online from cybersecurity group Ghost Data.
Some of the tweets included keywords related to “rape” and “teens,” and appeared alongside promoted tweets from corporate advertisers, the Reuters review found. In one example, a promoted tweet for shoe and accessories brand Cole Haan appeared next to a tweet in which a user said they were “trading teen/child” content.
“We’re horrified,” David Maddocks, brand president at Cole Haan, told Reuters after being notified that the company’s ads appeared alongside such tweets. “Either Twitter is going to fix this, or we’ll fix it by any means we can, which includes not buying Twitter ads.”
In another example, a user tweeted searching for content of “Yung girls ONLY, NO Boys,” which was immediately followed by a promoted tweet for Texas-based Scottish Rite Children’s Hospital. Scottish Rite did not return multiple requests for comment.
In a statement, Twitter spokesperson Celeste Carswell said the company “has zero tolerance for child sexual exploitation” and is investing more resources in child safety, including hiring for new positions to write policy and implement solutions.
She added that Twitter is working closely with its advertising clients and partners to investigate and take steps to prevent the situation from happening again.
Twitter’s challenges in identifying child abuse content were first reported in an investigation by tech news site The Verge in late August. The emerging pushback from advertisers that are critical to Twitter’s revenue stream is reported here by Reuters for the first time.
Like all social media platforms, Twitter bans depictions of child sexual exploitation, which are illegal in most countries. But it permits adult content generally.
Twitter declined to comment on the volume of adult content on the platform.
Ghost Data identified more than 500 accounts that openly shared or requested child sexual abuse material over a 20-day period this month. Twitter failed to remove more than 70 percent of the accounts during the study period, according to the group, which shared the findings exclusively with Reuters.
Reuters could not independently confirm the accuracy of Ghost Data’s findings in full, but reviewed dozens of accounts that remained online and were soliciting materials for “13+” and “young looking nudes.”
After Reuters shared a sample of 20 accounts with Twitter last Thursday, the company removed about 300 additional accounts from the network, but more than 100 others still remained on the site the following day, according to Ghost Data and a Reuters review.
On Monday, after Ghost Data furnished the full list of more than 500 accounts, Reuters shared it with Twitter, which reviewed the accounts and permanently suspended them for violating its rules, Twitter’s Carswell said on Tuesday.
In an email to advertisers on Wednesday morning, ahead of the publication of this story, Twitter said it “discovered that ads were running within Profiles that were involved with publicly selling or soliciting child sexual abuse material.”
Andrea Stroppa, the founder of Ghost Data, said the study was an attempt to assess Twitter’s ability to remove the material. He said he personally funded the research after receiving a tip about the topic.
Twitter’s transparency reports on its website show it suspended more than 1 million accounts last year for child sexual exploitation.
It made about 87,000 reports to the National Center for Missing and Exploited Children, a government-funded non-profit that facilitates information sharing with law enforcement, according to that organization’s annual report.
“Twitter needs to fix this problem ASAP, and until they do, we are going to cease any further paid activity on Twitter,” said a spokesperson for Forbes.
“There is no place for this type of content online,” a spokesperson for carmaker Mazda USA said in a statement to Reuters, adding that in response, the company is now prohibiting its ads from appearing on Twitter profile pages.
A Disney spokesperson called the content “reprehensible” and said they are “doubling-down on our efforts to ensure that the digital platforms on which we advertise, and the media buyers we use, strengthen their efforts to prevent such errors from recurring.”
A spokesperson for Coca-Cola, which had a promoted tweet appear on an account tracked by the researchers, said it did not condone the material being associated with its brand and said “any breach of these standards is unacceptable and taken very seriously.”
NBCUniversal said it has asked Twitter to remove the ads associated with the inappropriate content.
Twitter is hardly alone in grappling with moderation failures related to child safety online. Child welfare advocates say the number of known child sexual abuse images has soared from thousands to tens of millions in recent years, as predators have used social networks including Meta’s Facebook and Instagram to groom victims and exchange explicit images.
For the accounts identified by Ghost Data, nearly all the traders of child sexual abuse material marketed the materials on Twitter, then instructed buyers to reach them on messaging services such as Discord and Telegram in order to complete payment and receive the files, which were stored on cloud storage services like New Zealand-based Mega and U.S.-based Dropbox, according to the group’s report.
A Discord spokesperson said the company had banned one server and one user for violating its rules against sharing links or content that sexualize children.
Mega said a link referenced in the Ghost Data report was created in early August and soon after deleted by the user, which it declined to identify. Mega said it permanently closed the user’s account two days later.
Dropbox and Telegram said they use a variety of tools to moderate content but did not provide additional detail on how they would respond to the report.
Still, the reaction from advertisers poses a risk to Twitter’s business, which earns more than 90 percent of its revenue by selling digital advertising placements to brands seeking to market products to the service’s 237 million daily active users.
Twitter is also battling Tesla CEO and billionaire Elon Musk in court, as he attempts to back out of a $44 billion deal to buy the social media company over complaints about the prevalence of spam accounts and their impact on the business.
A team of Twitter employees concluded in a report dated February 2021 that the company needed more investment to identify and remove child exploitation material at scale, noting the company had a backlog of cases to review for possible reporting to law enforcement.
“While the amount of [child sexual exploitation content] has grown exponentially, Twitter’s investment in technologies to detect and manage the growth has not,” according to the report, which was prepared by an internal team to provide an overview about the state of child exploitation material on Twitter and receive legal advice on the proposed strategies.
“Recent reports about Twitter provide an outdated, moment in time glance at just one aspect of our work in this space, and is not an accurate reflection of where we are today,” Carswell said.
The traffickers often use code words such as “cp” for child pornography and are “intentionally as vague as possible,” to avoid detection, according to the internal documents. The more that Twitter cracks down on certain keywords, the more that users are nudged to use obfuscated text, which “tend to be harder for (Twitter) to automate against,” the documents said.
Ghost Data’s Stroppa said that such tricks would complicate efforts to hunt down the materials, but noted that his small team of five researchers, with no access to Twitter’s internal resources, was able to find hundreds of accounts within 20 days.
Twitter did not respond to a request for further comment.
Former Texas Rangers pitcher John Wetteland appeared in a Denton County Court on Monday to respond to charges
DENTON, TX – A Denton County judge on Friday declared a mistrial in the child sexual assault case against John Wetteland, a former Texas Rangers player who is accused of molesting a boy three times more than a decade ago.
The jury told the judge three times that it was split. At one point, the judge said she heard loud arguing coming from the deliberation room.
Wetteland, who testified in his defense during the trial last week, faced three counts of aggravated sexual assault of a child. He played for the Rangers from 1997 to 2000, as well as for the New York Yankees and Seattle Mariners, and is in the Rangers’ Hall of Fame.
At 4:40 p.m., the jury sent its third note to the judge. It said it was deadlocked and some jurors were “unwilling to budge.” The jurors asked how long they were expected to deliberate. Some were concerned about child care.
Judge Lee Ann Breading had pressed the jury to keep trying to reach a verdict. But after questioning the jury about 5 p.m., she declared a mistrial.
Wetteland, 56, faced 25 years to life in prison if convicted. It was unclear Friday whether prosecutors would pursue a second trial. Defense attorneys declined to comment.
Since Tuesday, jurors in the 462nd District Court heard from the accuser, Wetteland and other witnesses.
According to authorities, Wetteland sexually assaulted the child three times between 2004 and 2006, starting when the child was 4 years old. Wetteland pleaded not guilty and said the accuser’s account of sexual abuse is a lie.
According to the accuser’s mother, he first told her in 2016 — when he was 16 — that Wetteland raped him as a child. She said she did not report the allegation to police. In his testimony, the accuser said he did not want to report the abuse and wanted an apology from Wetteland, according to the Denton Record-Chronicle.
The accuser testified on Tuesday. He said he looked up to Wetteland and wanted to please him. The first time Wetteland sexually abused him, he said, he was confused. The abuse impacted him deeply into his teenage years, he testified, causing incontinence, suicidal thoughts and self-harm.
When the boy was 18, his mother testified, she told him to write a letter about the abuse and planned to send it to people connected with Wetteland.
According to prosecutor Lindsey Sheguit, the document was saved on the accuser’s Argyle school Google account, and the school district’s monitoring system flagged it. Employees discovered the letter, the school district’s chief technology officer testified Wednesday, and reported it to the Texas Department of Family and Protective Services.
On cross-examination from Derek Adame, one of Wetteland’s defense attorneys, the technology officer testified that the district could not know who wrote the letter, only that it was written on the accuser’s account. Adame and defense attorney Caroline Simone argued the abuse allegations are not true and were possibly fabricated by a man named Chris, who is not biologically related to the accuser but lived with him when the accuser was a teenager.
Police Arrest 4 In Connection With Abuse Of 2 Boys
OKLAHOMA CITY, OK – The start of school is around the corner for metro students. Oklahoma City police officials said on Wednesday that, with the start of school, officers would see an increase in reported child abuse cases.
One case currently under investigation started last December with a tip from a metro school counselor and resulted in the arrest of four adults this week. They were accused of abusing, or failing to report the abuse of, two elementary-age boys.
A concerned school counselor tipped off police to the alleged abuse. According to a report, a 6-year-old student returned from Thanksgiving break with injuries to his face and his 10-year-old brother did not return to class following the break.
“Both had some pretty substantial injuries that nobody could seem to explain,” said Master Sgt. Gary Knight of the Oklahoma City Police Department.
Investigators found the family living at the Red Roof Inn near I-40 and south Meridian Avenue. Court documents indicated the boys’ mother Krista Cox claimed their injuries were self-inflicted during emotional outbursts. Police learned the brothers were diagnosed with autism and intellectual disabilities.
“It’s bad enough when you’ve got a child that’s unable to defend themselves,” said Knight. “Then you throw in it’s a special needs child that has certain disabilities, and that makes it an even more egregious act.”
Police said there were two other adults living in the hotel rooms besides Cox and her boyfriend Christopher Aucoin.
“As the case evolved, it turned out there were two other people involved,” said Knight. “A parent of the mother of the child and a parent of the boyfriend.”
After combing through multiple Oklahoma Department of Human Services referrals and hospital records for the brothers, investigators determined the injuries were from abuse. They are holding all four adults criminally responsible.
“The children were placed into protective custody,” said Knight. “That’s important to note.”
DHS was contacted for this story to confirm the children were placed in state custody. An agency official said they could not comment on the case due to confidentiality laws.