As states rush to enact laws criminalizing abortion-related care in the wake of the Supreme Court’s recent decision overturning Roe v. Wade, health advocates and civil rights groups are warning that school surveillance software can be weaponized against teens who seek reproductive care.
In tens of thousands of schools, every message students send or term they search on their computer is algorithmically monitored by software from companies like Bark, Gaggle, GoGuardian, and Securly. These tools monitor many students even outside of school hours and can send automatic alerts to school administrators, parents, or police when they detect dangerous behavior, which may range from imminent suicide threats to “sexual content.”
By adding a few keywords to a watchlist, civil rights experts say, schools could easily instruct the programs to flag students looking for information about abortions or other newly illegal health care services.
“School admin, staff, parents, or other adults who have access to the admin consoles are able to see everything,” said Daly Barnett, a staff technologist with the Electronic Frontier Foundation. “The way these applications can be deputized to specifically gather information around searches for reproductive health is alarming.”
Nicole, a high school student in Austin who asked that her full name not be used, told The Markup that much of her knowledge about abortion and reproductive health was self-taught online.
“Regardless of whether or not I live in Texas, [the internet] gives me a little bit of access,” she said. Websites that discreetly ship medication are one of the few—albeit still illegal—ways to access abortion in the state, which has not only criminalized abortion but has also enacted a law that allows private citizens to file civil lawsuits against anyone who helps another person end their pregnancy.
Nicole said her school gives a computer to each student who needs or wants one and that each device comes pre-installed with GoGuardian. The software allows teachers to view students’ screens in real time, access their search histories, and monitor their messages.
GoGuardian sends email alerts to administrators when it detects content related to drugs, alcohol, or harassment, according to the Austin Independent School District’s policy. Other alerts, including those for “sexual content,” go directly to the district’s police department.
The policy doesn’t define sexual content, and Anastasia Drabicky, a spokesperson for Austin ISD, said the district was closed for the week so staff could not respond to questions.
Last year, the algorithmic filters used by the student monitoring company Securly were found to be mislabeling sexual health resources as pornography, adult content, and hate speech.
Nicole told The Markup that she hadn’t known GoGuardian was sending alerts to the district’s police.
Digital evidence trails were a concern for reproductive rights advocates even before the Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization.
In 2015, a jury convicted an Indiana woman of feticide through self-induced abortion, based in part on text messages she sent to a friend. Her conviction was eventually overturned on appeal.
In 2017, prosecutors charged an Ohio woman with killing her stillborn baby, using her internet browsing history as evidence. A jury acquitted the woman of the most serious charges.
And in 2018, a Mississippi grand jury indicted a woman for the murder of her stillborn baby after the prosecution presented her Google search history for abortion pills. Those charges were eventually dropped after civil rights groups rallied to the woman’s defense.
But with states now free to enforce punitive abortion laws, the bar for securing convictions in future cases may be lower. Tech companies have been largely silent about how and when they’ll turn over user data to police in abortion cases, adding to fears about who will be given access to data from tools like period trackers.
Kendra Albert, a clinical instructor at Harvard Law School’s Cyberlaw Clinic, said those fears are not baseless but that the greater legal threat is likely to come from surveillance systems that are already in place and can easily be adapted to the new laws—like student monitoring software.
“I’m always more worried about times when you just add more keywords to the system compared to times when you have to build the whole system from scratch,” Albert said. “We have seen, especially in red states, school districts that consider it their mission to police and out students, and there can be school-to-prison-pipeline consequences to that.”
The Markup reached out to popular student monitoring companies Bark, Gaggle, GoGuardian, and Securly with questions about how they monitor sexual health content and share information with law enforcement. Bark was the only company to respond.
“Our AI flags and alerts the content then forwards it to the appropriate contact at the schools,” Adina Kalish, a spokesperson for the company, wrote in an email. “Bark serves to detect and alert only, and cases that require further involvement from law enforcement are handled by the schools directly.”
“Bark does not supply student information to law enforcement,” Kalish added. “Schools handle this directly when an issue of concern is identified.”
Z Zenobia, a student organizer with the nonprofit Unite for Reproductive and Gender Equity (URGE), said young people are experienced at adapting to unsafe online spaces—using coded language and finding other ways to subvert technology.
But they need to know they’re being surveilled in order to do that, and Zenobia said they fear that the sheer ubiquity of surveillance in schools will scare some people away from seeking health information entirely.
“A lot of people don’t know that Plan B doesn’t work for people over 180 pounds,” Zenobia said, referencing the drug’s effective weight limit. “Anyone trying to access that who is being surveilled is basically in danger at this point. The fear is very much there that anything we do has to go through a legal lens.”