Students, Parents Figure Out School Is Using AI To Grade Exams And Immediately Game The System
from the teacher-bot dept
With the COVID-19 pandemic still working its way through the United States and many other countries, we’ve finally arrived at the episode of this apocalypse drama where school has resumed (or will be shortly) for our kids. It seems that one useful outcome of the pandemic, if we’re looking for some kind of silver lining, is that it has put on full display just how inept we are as a nation in so many ways. Federal responses, personal behavior, our medical system, and our financial system are all basically getting failing grades at every turn.
Speaking of grades, schools that are now trying to suddenly pull off remote learning for kids are relying on technology to do so. Unfortunately, here too we see that we simply weren’t prepared for this kind of thing. Aside from all of the other complaints you’ve probably heard or uttered yourselves — internet connections are too shitty for all of this, teachers aren’t properly trained for distance learning, the technology being handed out by schools mostly sucks — we can also add to that unfortunate attempts by school districts to get AI to grade exams.
This story begins with a parent seeing her 12-year-old son, Lazare Simmons, fail a virtual exam. Taking an active role, Dana Simmons went on to watch her son complete more tests and assignments on Edgenuity, the remote learning platform the school had set students up on. While watching, it quickly became apparent how the platform was performing its scoring function.
She looked at the correct answers, which Edgenuity revealed at the end. She surmised that Edgenuity’s AI was scanning for specific keywords that it expected to see in students’ answers. And she decided to game it. Now, for every short-answer question, Lazare writes two long sentences followed by a disjointed list of keywords — anything that seems relevant to the question. “The questions are things like… ‘What was the advantage of Constantinople’s location for the power of the Byzantine empire,’” Simmons says. “So you go through, okay, what are the possible keywords that are associated with this? Wealth, caravan, ship, India, China, Middle East, he just threw all of those words in.”
“I wanted to game it because I felt like it was an easy way to get a good grade,” Lazare told The Verge. He usually digs the keywords out of the article or video the question is based on.
And Lazare appears to have been right, as he now gets perfect scores on all of his tests. This is obviously both lazy teaching and lazy technology. Relying on software to grade tests that are essentially short-form essay tests, as opposed to multiple-choice Scantron-style tests, makes zero sense. Human grading is needed.
But the technology is quite lazy as well. How can a platform that is grading exams of this nature not build in a check against proper grammar, for instance? The fact that a student can simply toss in a bunch of disjointed words at the end of an answer, like some kind of keyword metadata, and get away with it is crazy. Especially when Edgenuity informs everyone that it’s supposed to work this way.
According to the website, answers to certain questions receive 0% if they include no keywords, and 100% if they include at least one. Other questions earn a certain percentage based on the number of keywords included.
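The scoring rule described above is simple enough to sketch in a few lines. This is a hypothetical illustration of keyword-matching grading, not Edgenuity's actual code; the function name, keyword lists, and sample answer are all made up for the example:

```python
# Hypothetical sketch of keyword-matching grading as described above.
# Not Edgenuity's actual implementation; names and keywords are illustrative.

def score_answer(answer, keywords, all_or_nothing=True):
    """Score a short answer by checking for expected keywords."""
    text = answer.lower()
    hits = sum(1 for kw in keywords if kw.lower() in text)
    if all_or_nothing:
        # 0% with no keywords, 100% with at least one
        return 100 if hits else 0
    # Otherwise, credit proportional to how many keywords appear
    return round(100 * hits / len(keywords))

# A keyword-stuffed "answer" like Lazare's gets full marks either way
stuffed = "Constantinople sat on trade routes. wealth caravan ship India China"
print(score_answer(stuffed, ["wealth", "caravan", "ship"]))         # 100
print(score_answer(stuffed, ["wealth", "caravan", "ship"], False))  # 100
```

Under either rule, appending a disjointed list of plausible keywords guarantees full credit, which is exactly the exploit the students found.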
Whatever that is, it sure as hell isn’t good education. And while testing practices in education are generally under scrutiny wholesale at the moment, there is little reason to issue tests at all if everyone involved is going to be this lazy about it.
And, to be clear, this is happening all over the place, with students finding more than one way to game the system.
More than 20,000 schools currently use the platform, according to the company's website, including 20 of the country's 25 largest school districts, and two students from different high schools than Lazare's told me they found a similar way to cheat. They often copy the text of their questions and paste it into the answer field, assuming it's likely to contain the relevant keywords. One told me they used the trick all throughout last semester and received full credit "pretty much every time."
Another high school student, who used Edgenuity a few years ago, said he would sometimes try submitting batches of words related to the questions “only when I was completely clueless.” The method worked “more often than not.”
I think it’s fair to say that Edgenuity probably doesn’t get a passing grade for its platform, now widely used thanks to COVID-19.