Sign up for The Brief, The Texas Tribune’s daily newsletter that keeps readers up to speed on the most essential Texas news.
As school districts struggle to control the spread of cyberbullying, pornographic images and online exploitation among their students, Texas lawmakers could consider banning minors from social media, among other sweeping measures, in the upcoming legislative session.
Over the last decade, Texas lawmakers have attempted to slow the spread of social media’s harmful effects by criminalizing cyberbullying and preventing online platforms from collecting data on minors, the latter of which has faced court challenges by social media companies.
While law enforcement and prosecutors have traditionally been responsible for cracking down on these online dangers, a lack of resources in those agencies has meant enforcement has fallen to educators, who already struggle to meet the demands of instruction, let alone stay knowledgeable about all the ways children use the internet.
“Almost every kid comes to school these days, regardless of background, regardless of socioeconomic status, they have some type of smartphone device in their hand. So they will have access to unfettered content most of the time, no matter what we try to do,” said Zeph Capo, president of the Texas American Federation of Teachers.
Lawmakers have suggested several initiatives next session to address the online dangers affecting Texas children, including a bill filed by Rep. Jared Patterson, R-Frisco, that would prohibit minors from creating accounts on social media sites and require age verification for new users. Other options include adding funds to internet crimes units in law enforcement agencies, banning the use of people’s likeness in artificially created pornographic images, and making people aware of the dangers of the internet.
“Social media is the most dangerous thing our kids have legal access to in Texas,” Patterson said in a news release.
While they welcome any efforts to reduce harm to children, school officials and cybercrime investigators say more needs to be done to hold social media companies accountable for enforcement.
“We need these businesses to be responsible business people and throttle some of this tremendously negative content, particularly when it comes to kids,” Capo said. “But, you know, they don’t want to do anything like that.”
Schools are hunting grounds
During a Senate Committee on State Affairs hearing in October, lawmakers listened to a litany of stories about how social media has affected young people in Texas: a middle school girl who developed an eating disorder after watching a TikTok video, a middle school boy addicted to cartoon pornography after his YouTube algorithm took him to a porn site, and a woman who testified to being groomed for sex work in high school as her images were posted on social media applications.
Most of these incidents started at school, where children have frequent access to technology and teachers and administrators are too busy to provide oversight. Because students also know how to circumvent campus firewalls, they are being groomed via social media on school grounds, said Jacquelyn Alutto, president of Houston-based No Trafficking Zone, during the hearing.
“Right now, schools are a hunting ground,” she said.
The Texas Tribune requested interviews with several school districts about online dangers in schools, including the Austin, Round Rock, Katy and Eanes school districts, but they did not respond. The Plano school district declined to be interviewed.
Last year, the American Federation of Teachers and the American Psychological Association, among other national organizations, called out social media platforms for undermining classroom learning, increasing costs for school systems, and being a “root cause” of the nationwide youth mental health crisis. The admonishment came after a report detailed how school districts across the country are experiencing significant burdens as they respond to tech’s predatory and prevalent influence in the classroom.
The same year, in an attempt to hold social media companies more accountable, Gov. Greg Abbott signed into law House Bill 18, known as the Securing Children Online through Parental Empowerment Act. The SCOPE Act requires covered digital service providers to provide minors with certain data protections, prevent minors from accessing harmful content, and give parents tools to manage their child’s use of the service.
It also required school districts to obtain parental consent for most software and social media applications used in the classroom and to look for alternatives to the internet for instruction.
However, many of the family-friendly websites and games that children might use for entertainment are also rife with potential sexual predators who pretend to be children.
“A little boy can be playing Roblox in the cafeteria, and during that lunch break, a trafficker can target him, and he can be sexually groomed or exploited within a few weeks or months,” Alutto said.
And even harder to control is when students share pornographic images of themselves online, a reason why some child welfare groups want social media platforms restricted or outright banned for minors.
“This has also helped human traffickers groom and recruit children,” Alutto said.
Unknown damage
Studies show 95% of youth ages 13 to 17 report using social media, with more than a third saying they use social media “almost constantly.”
Nearly 40% of children ages 8 to 12 use social media, even though most platforms require a minimum age of 13 to sign up, according to a study by the U.S. Surgeon General.
This has created a generation of chronically online children, and the medical community is still unsure of the long-term effects.
Although the SCOPE Act was passed to restrict kids from seeing harmful online content and give parents more control over what their children do online, social media companies have watered it down.
A federal district court judge earlier this year temporarily blocked the part of the law that required platforms to filter out harmful content, ruling that it violated the First Amendment's free speech protections.
Texas Attorney General Ken Paxton announced in October that he was suing TikTok, alleging the company allows its algorithm to harm minors. TikTok denied the state’s allegations, pointing to online information about how parents in certain states, including Texas, can contact TikTok to request that their teen’s account be deleted.
This lawsuit, like dozens of others across the country, is playing out in court, forcing Texas lawmakers to wait and see what more they can do in the upcoming session to hold social media companies accountable.
Australia recently banned children under the age of 16 from social media.
“The state needs to ensure that if technology providers want to do business, they must protect our children, stop the flow of (child pornography and child sexual assault) and report it,” Brent Dupre, director of law enforcement at the Office of the Attorney General of Texas, told The Texas Tribune.
Potential solutions?
Dupre’s department is one of three Internet Crimes Against Children Task Forces in the state, and his agency alone covers 134 counties. His office receives 2,500 cyber tips per month for investigation from the National Center for Missing and Exploited Children, an overwhelming number of cases for an agency with only 11 officers.
The problem is so persistent that Dupre said his office was conducting a live training session with law enforcement officers a few months ago on how to pose in chat rooms as a minor when the trainer noticed a real adult was already trying to solicit their fake minor for sex.
“These proactive investigations aren’t done as frequently as we like because of the sheer caseload that we got,” Dupre said, noting how they work with other law enforcement agencies who are suffering with staff shortages.
Christina Green, chief advancement and external relations officer for Children’s Advocacy Centers of Texas, said her agency serves more than 60,000 child victims yearly, with a majority of these connected to online incidents that happened in school while using social media applications. She said law enforcement agencies as well as hers need more resources to protect children.
“This field is rapidly developing, and the tools needed to continue must also develop,” she said.
Echoing school officials, Dupre said social media companies should enforce more restrictions on what minors can do on their platforms. He said companies should be required to track attempts to upload child pornography and other internet harm and be held accountable for allowing sexually explicit content to stay on their websites.
Dupre suggested lawmakers require that chat and social media companies use artificial intelligence to scan for child pornography and child sexual assault material and block users from sending this kind of material on their platforms.
“To me, children who try to upload self-produced material should automatically have their accounts disabled,” he said. “Many technology providers scan for these photos and videos, which are then quarantined and reported, but not all providers lock out or cancel that user’s end-to-end encryption.”
However, the most essential place to stop cyberbullying, sexual exploitation and other internet-based crimes on minors is at home, Green said.
She suggested teaching children in schools as early as the third grade about online risks and repeating training yearly.
She also wants the same education extended to parents.

“We have been talking to parents about when you drop your kid off at someone’s house, do you know if devices will be used there? It’s like asking if there is a pool in the backyard. These types of questions need to become commonplace,” Green said.