
His students suddenly started getting A’s. Did a Google AI tool go too far?

Some teachers say that AI tools, particularly Google Lens, have made it impossible to enforce academic integrity in the classroom — with potentially harmful long-term effects on students’ learning.

A few months ago, a high school English teacher in Los Angeles Unified noticed something different about his students’ tests. Students who had struggled all semester were suddenly getting A’s. He suspected some were cheating, but he couldn’t figure out how.

Until a student showed him the latest version of Google Lens.

Google has recently made the visual search tool easier to use on the company’s Chrome browser. When users click on an icon in the Google search bar, a moveable bubble pops up. Wherever the bubble is placed, a sidebar appears with an artificial intelligence answer, description, explanation or interpretation of whatever is inside the bubble. For students, it provides an easy way to cheat on digital tests without typing in a prompt, or even leaving the page. All they have to do is click.

“I couldn’t believe it,” said teacher Dustin Stevenson. “It’s hard enough to teach in the age of AI, and now we have to navigate this?”

Keeping up with students’ methods of cheating has always been a cat-and-mouse game for teachers. But some now say that AI tools, particularly Lens, have made it impossible to enforce academic integrity in the classroom — with potentially harmful long-term effects on students’ learning.

‘A terrible idea’

Lens has been around for nearly a decade. It’s the camera technology that scans QR codes or identifies objects in photos. But as AI has evolved, its uses have expanded, and Google has made it more available to users, especially those browsing with Chrome.

During the COVID-19 school closures, most school districts in California gave students Chromebook laptops for remote work; thousands of those laptops were donated by Google. After campuses reopened for in-person learning, schools kept using the Chromebooks, making them an integral part of classroom instruction.

Millions of California’s 5.8 million K-12 students use Chromebooks, by far the most popular laptop in schools.

For William Heuisler, a high school ethnic studies teacher in Los Angeles, that was the first red flag.

“After COVID-19, it was clear that Chromebooks were a terrible idea in my classroom,” Heuisler said. Students used the laptops to play games during class, watch soccer matches and otherwise focus on anything but the lesson plan. 

Then came AI, with its immense potential to enhance education — and facilitate cheating. That’s when Heuisler decided to ditch technology altogether in his classroom and return to the basics: pencil and paper. Tests, homework and in-class assignments are all on paper. The school already bans cell phones.

It’s more work for him, but worth it, he said. 

“We want teenagers to think independently, voice their opinions, learn to think critically,” Heuisler said. “But if we give them a tool that allows them to not develop those skills, I’m not sure we’re actually helping them. Can you get by in life not knowing how to write, how to express yourself? I don’t know, but I hope not.”

AI and cognitive activity

Heuisler is not alone, according to research from the Center for Democracy and Technology. In a recent nationwide survey, the organization found that more than 70% of teachers say that because of AI, they have concerns about whether students’ work is actually their own. Nearly 75% of teachers say they worry students aren’t learning important skills like writing, research and reading comprehension.

The impact on students’ learning appears to be real, according to a recent study by the Massachusetts Institute of Technology. The study, “Your Brain on ChatGPT,” found that students who used AI to write essays showed 55% less cognitive activity than those who didn’t. The essays themselves were also of poorer quality, with more limited ideas, sentence structure and vocabulary than essays written by students who didn’t use AI.

Nonetheless, about 85% of teachers and students use AI in the classroom, the Center for Democracy and Technology found. Teachers use it to organize lesson plans and grade papers, and students use it to do things like research and brainstorming. 

Lack of consistent rules

But rules related to AI use vary widely. The California Department of Education offers extensive guidance on how teachers can use AI in the classroom, but no requirements, even regarding students who use AI to cheat. One video urges teachers not to punish students caught using AI to write an essay. Instead, it encourages teachers to design essay assignments that can’t be easily written by a machine, or to require students to provide their notes and cite AI just as they would any other source.

Even within schools, teachers have different AI rules. Some encourage students to use AI to write essays, while others ban it outright. A recent survey by the RAND research organization found that only 34% of teachers said their school or district had policies related to AI and cheating, and 80% of students said their teachers never provided guidance on how to use AI for schoolwork.

That confusion is the crux of the problem, said Alix Gallagher, a director at Policy Analysis for California Education who has studied AI use in schools. Because there are few clear rules about AI use, students and teachers tend to have “significantly” different views about what constitutes cheating, according to a recent report by the education nonprofit Project Tomorrow.  

“Because adults aren’t clear, it’s actually not surprising that kids aren’t clear,” Gallagher said. “It’s adults’ responsibility to fix that, and if adults don’t get on the same page they will make it harder for kids who actually want to do the ‘right’ thing.”

Districts need to provide high-quality training for teachers and consistent policies for AI use in the classroom, so everyone knows what the rules are and teachers know how to navigate the new technology, she said.

Unsustainable?

In Hillary Freeman’s government class at Piedmont High School near Oakland, AI is all but forbidden. If students use AI to write a paper, they get a zero. Students may use AI only to summarize complex concepts or to write practice questions for a self-assessment, or when Freeman explicitly permits it for a specific task.

She appreciates that AI can sometimes be useful, but she worries that it’s too easy for students to use it as a crutch.

“Reasoning, logic, problem-solving, writing — these are skills that students need,” Freeman said. “I fear that we’re going to have a generation with huge cognitive gaps in critical thinking skills. … It’s really concerning to me. I want their futures to be bright.”

Detecting students’ use of AI is another obstacle, she said. It means spending time digging through version histories of students’ work, or using AI plagiarism screeners, which are sometimes inaccurate and more likely to flag English learners.

“It’s a huge ‘add’ to my job, and it doesn’t seem sustainable,” Freeman said. 

Digital literacy and academic integrity

Google, meanwhile, has no plans so far to remove Lens from its Chrome browser, even on school-issued laptops, although it is continuing to test how easily users can access the tool. It recently paused a “homework help” Lens shortcut button, but it’s unlikely Lens will vanish altogether, Google spokesman Craig Ewer said.

The tech giant encourages students and teachers to learn more about positive and ethical uses of AI and how it can enhance learning. It has also invested more than $40 million in AI literacy for students and teachers over the past few years.

“Students have told us they value tools that help them learn and understand things visually, so we have been running tests offering an easier way to access Lens while browsing,” said Ewer. “We continue to work closely with educators and partners to improve the helpfulness of our tools that support the learning process.”

School administrators also have the option of disabling Google Lens on Chromebooks. 

Los Angeles Unified has decided to keep Lens on its student laptops, at least for now, because the tool has plenty of positive uses that students should have the opportunity to explore, a district spokesperson said.

But the district has instituted some guardrails: the tool is only available to students who have completed a lesson on digital literacy, and students and teachers must comply with the district’s academic integrity and responsible-use-of-technology rules. Those rules include bans on plagiarism and cheating.  

“Los Angeles Unified remains committed to providing students with access to innovative learning tools while maintaining safeguards that promote academic honesty, digital citizenship, and the responsible use of technology,” a district spokesperson said.

This isn’t the district’s first challenge with AI technology. In 2024, Superintendent Alberto Carvalho unveiled a nearly $3 million chatbot called Ed, only to shelve it three months later when the company that built it laid off half its staff.

Meanwhile, Stevenson said Lens vanished from his students’ Chromebooks last week after he alerted the district that some students were using it to cheat.

“It’s encouraging, but it also reveals how haphazard the introduction of AI has been,” Stevenson said. “Teachers and school leaders spend countless hours considering each detail of the learning experience, then Google totally undermines it with the click of a button. This isn’t how education is supposed to work.”
