Earlier this year, the head of my son’s middle school called me during the day—which is always a bad sign. My son, it turned out, had been Google searching for news about mass shootings, and the administrators were worried.
They were, of course, tracking his Google searches on his school-issued laptop. It was a bit of surveillance that was at once totally routine and also totally extraordinary. During the pandemic, as schools issued more devices to students, we all began to accept a new level of technological surveillance in our homes.
On one hand, it’s the school’s device, so school administrators are understandably concerned about liability for how it’s used. On the other hand, the fact that my son’s curiosity about something that could easily affect him (we were enduring a new wave of school shootings at the time) would trigger an official inquiry marks a real sea change in our societal norms.
When I was growing up, if I had been curious about something like school shootings, I would have read about it in a print newspaper at home or looked up information at my local public library. In either case, my actions would not have been surveilled. The print newspaper, unlike its digital equivalent, cannot report back who is reading what article. And librarians long ago adopted a “bill of rights” declaring that all readers have a right to privacy about what they choose to read.
But in today’s world, the indulgence of a private curiosity feels like a luxury. To understand the impact of all this surveillance on students, I turned to Elizabeth Laird, who leads the Student Privacy Project at the Center for Democracy & Technology and recently co-wrote a report on the harms caused by school surveillance.
Laird previously worked at the Washington, D.C., Office of the State Superintendent of Education, the Louisiana Department of Education, and the Data Quality Campaign, a nonprofit advocacy group.
Our conversation, below, has been edited for brevity and clarity.
Angwin: How has the use of technology in schools changed since the pandemic?
Laird: The role of technology in education has been growing over time, but with the pandemic we saw some big changes. One thing is that the number of students who were receiving devices from their schools increased dramatically. Our research shows that 95 percent of teachers say that their school now gives out devices, and prepandemic, this number was much lower.
We have also seen a rapid increase in student monitoring. Schools are monitoring students’ activities online, especially through the devices that schools are providing. A lot of this is happening in the name of keeping students safe, whether that’s safety in terms of preventing them from harming themselves or preventing them from harming others, which we’ve seen with the resurgence of gun violence in schools. Technology is well established in schools, and in some cases it can help students, but there are certainly uses of technology that are arguably inflicting more harm than good.
Angwin: To me, it doesn’t seem like schools are getting safer, and I haven’t seen any evidence that safety has improved. Is that the case?
Laird: It’s a really challenging space because schools are under more pressure than ever to keep students safe, and parents are demanding that when students are in school, they’re kept physically safe. What’s challenging is that there isn’t a lot of research, especially in the space of data and technology, about the efficacy of these techniques. If there is any research that’s available, it is almost always provided by the technology company and not independently verified. Schools find themselves in a position of being asked to do more and yet are not given the tools or the information to make informed decisions. When you pair that urgency to act with a lack of information, it almost inevitably leads to poor decision-making.
What we’re trying to do is broaden the definition of what it means to keep students safe. Keeping them safe from an act of mass violence is certainly one dimension of safety, but it also means supporting students’ mental health and making sure they feel comfortable expressing themselves and can access resources if they need help. What we’re finding with some of this technology is that it’s having the exact opposite effect.
Angwin: Can you tell us a bit about your research and what you discovered about the connection between monitoring and student safety?
Laird: The question we started with is, what are schools tracking when students are using their devices? And how is that monitoring data being used? We found that student activity monitoring technologies are almost ubiquitous; 89 percent of teachers say that their school uses it. Once we established how widespread it is, we wanted to understand the impact of this technology.
We found that students and parents strongly support this technology if it’s used to keep students safe, meaning that it prevents students from harming themselves or someone else. Yet in practice, it is much more common for student monitoring to be used for disciplinary purposes: 78 percent of teachers said that students have been flagged for discipline as a result of this technology, and 59 percent of students have actually been disciplined because of it.
Angwin: Your report also has some data about how certain groups of students are more likely to be flagged by monitoring technology. Can you talk about this finding?
Laird: The disciplinary action that results from student monitoring falls along racial lines. Black and Hispanic students are more likely to be disciplined as a result of this technology than their White peers. Following our reporting last year, a story came out in Baltimore about how the school system had set up alerts to automatically go to law enforcement after school hours. We wanted to know how widespread this practice is.
This is what shocked us the most: 44 percent of teachers said that they know of a student who had been contacted by law enforcement as a result of this technology, and 37 percent of teachers said that law enforcement receives alerts automatically after school hours. What started as a one-off story from Baltimore turned out, when we looked at the data, to be a much more routine occurrence. When the technology is used in this way, law enforcement becomes much more present in schools; you’re inviting them not just into the school but literally into students’ homes.
We also found that LGBTQ+ students were being disproportionately targeted for action because of these systems: 29 percent of LGBTQ+ students we surveyed said that they or someone they know has been outed because of this technology. We also found that—and we’re still trying to unpack this—LGBTQ+ students were statistically more likely to be disciplined because of this technology, and they were more likely to be contacted by law enforcement. It wasn’t just that they were being outed, but they were also getting in trouble more frequently and were being contacted by law enforcement for criminal investigation.
Angwin: You also found that parents were often unaware that their students’ data was being shared with third parties. Can you talk about that?
Laird: Yes. Before we looked at who has access to student data, we wanted to know who parents and students actually trust with this information. Not surprisingly, they trust those who are closest to the student, such as a counselor or teacher. Conversely, the people they trust the least are those who are furthest removed. That included the vendors themselves, law enforcement, and social services, as well as the actual IT administrators who are on the back end making sure that these programs work.
We also found that one in five parents don’t know that their school uses monitoring technology, so certainly many parents are out of the loop. Beyond this, we found that the IT administrator has access to these programs, and then on the company side, there are often content moderators who look at this information and are the first line of defense when alerts come in. This shows us that there’s a big gap between who parents trust and who actually has the most access to this information. This speaks to the fact that there is a lack of transparency and understanding around the way these tools work, and some parents aren’t well positioned to make informed decisions and recommendations about the technology’s actual safety benefits.
Angwin: What do you think schools should do? Should they stop using this technology?
Laird: We concluded that the way this technology is being used is problematic. Schools are making a choice to share this information with law enforcement, and they could easily choose not to share it with law enforcement and then explore alternatives. Alternatives could include not doing any monitoring after school hours, making sure alerts stay within the school system, or not using them for disciplinary purposes. Similarly, with the way LGBTQ+ students are being disproportionately targeted, companies should be looking at that information and trying to figure out why that is—is it something that’s inherent in the algorithm?
You cannot avoid the chilling effect that this technology will create, which I believe will be detrimental to students’ mental health. Half of the students who knew they were being monitored reported that they don’t feel comfortable expressing their true thoughts and feelings online. This number rises for students with a learning difference or a physical disability.
I want to see schools and the companies that they’re working with take action to minimize these harms. These are not foregone conclusions; these are choices that people are making. If they can’t be addressed, then they shouldn’t be using this technology because the harm that this is causing, especially to students who are already more vulnerable, undermines the whole mission of our education system.
If you have a student in your life who would like some tips on dealing with surveillance, I recommend the Electronic Frontier Foundation’s Surveillance Self-Defense for Students guide or The Markup’s guide to obtaining public records from your school about the technology it’s using and the data it’s collecting.
As always, thanks for reading.
(Additional Hello World research by Eve Zelickson.)