ChatGPT Cheating Fears Seem Overstated

from the use-the-tech-for-good dept

There have been all sorts of overblown fears and moral panics raised by the availability of new generative AI tools. And one that I keep hearing about, which many people have accepted as obviously true, is that it will damage school education, as kids will just use ChatGPT to do their work.

This fear has always been a bit exaggerated for a variety of reasons, including that, by its very nature, ChatGPT output tends to be “average” at best, and for anything that involves any level of deeper thinking, it becomes obvious pretty quickly that the tool just isn’t that good.

Now, the NY Times has highlighted how some recent Stanford research has called the entire premise into question. It’s based on an ongoing series of anonymous student surveys, run by a school reform nonprofit named Challenge Success, covering a variety of behaviors that might impact students’ education. As the group’s co-founder, Denise Pope, notes, the survey seeks honest answers on things like “the amount of sleep they get, homework pressure, extracurricular activities, family expectations, things like that — and also several questions about different forms of cheating.”

She notes that the latest surveys don’t suggest any mass increase in cheating, and that the number of students who say they’ve cheated has mostly held steady. You could argue that’s not really the point, since the nature of the cheating could be very different even if the number of kids doing it remains the same, but there were some other interesting findings:

But I think it’s important to point out that, in Challenge Success’ most recent survey, students were also asked if and how they felt an AI chatbot like ChatGPT should be allowed for school-related tasks. Many said they thought it should be acceptable for “starter” purposes, like explaining a new concept or generating ideas for a paper. But the vast majority said that using a chatbot to write an entire paper should never be allowed. So this idea that students who’ve never cheated before are going to suddenly run amok and have AI write all of their papers appears unfounded.

And, this is actually encouraging, because that kind of use of ChatGPT… is actually good? It’s how the tool should be used in the real world, too. These tools can be useful starting points or brainstorming aids.

As she later notes, if anything, these tools may be leveling the playing field a bit:

Even before ChatGPT, we could never be sure whether kids were getting help from a parent or tutor or another source on their assignments, and this was not considered cheating. Kids in our focus groups are wondering why they can’t use ChatGPT as another resource to help them write their papers — not to write the whole thing word for word, but to get the kind of help a parent or tutor would offer. We need to help students and educators find ways to discuss the ethics of using this technology and when it is and isn’t useful for student learning.

The NY Times piece also points to a recent Pew survey that pours some cold water on the idea that kids are ramping up their cheating with ChatGPT, and that seems to support the Stanford findings as well. Again, students feel okay using it for things that seem perfectly reasonable, such as researching new topics, but a much smaller percentage find it acceptable for writing.

[Chart: Pew survey results showing that 69% of surveyed teens say it’s acceptable to research new topics with ChatGPT, but only 20% say it’s okay to use it to write essays.]

And of course, we keep hearing about productive and useful ways to use ChatGPT in school. One of my favorites is an idea some teachers have tried: asking students to deliberately have ChatGPT write an essay, then submit three things — the prompt they came up with, the ChatGPT essay itself, and (most importantly) a final version showing the corrections they’d make to the ChatGPT essay.

To me, this seems like a pretty powerful tool for learning when used this way, rather than just freaking out and trying to ban it entirely. I know that years ago, I realized that the best way to truly learn something is to teach it to someone else, as their questions and confusion force you to understand the topic you’re teaching at a much deeper level. If ChatGPT or other AI tools can stand in as the “learner” in those scenarios, it could be a huge boost to better learning.



Comments on “ChatGPT Cheating Fears Seem Overstated”

Ethin Probst (profile) says:

As someone who does use ChatGPT as a learning/brainstorming tool (as well as a tool that helps me refine things like algorithms I come up with in ways I usually might not think of), I definitely agree that students are more likely to use it as a brainstorming/templating tool than a way of cheating. And if people do use it to cheat, it’ll be quite obvious, especially if it needs to do searching with Bing.

MrWilson (profile) says:

There were people railing against the use of thesauri and dictionaries and spell check and grammar check and any number of other tools in the past. Like any tool, these aren’t a replacement for the writer; they can be useful or they can be misused. You have to be a good writer to recognize whether ChatGPT’s output is even useful. A student who doesn’t know better will submit something that “seems” useful, and it should be apparent to an instructor that it didn’t come from the student.

Anonymous Coward says:

  1. It’s still rather new, so people will try the new tool. If the tool gets any better, then great.
  2. If search wasn’t such a dumpster fire, one that’s been raging for 15 years or so and threatening to start a tire fire in the next lot, people wouldn’t be as inclined to use the putatively-better LLM search-and-summary option as much.
  3. Gen Z (and Millennials) use the library more than previous generations, so lol.
Professor Ronny says:

“ChatGPT Cheating Fears Seem Overstated”

I’m a college professor and I want to point out two things about cheating with ChatGPT.

First, if you use well thought out projects, you are less likely to have students cheat using ChatGPT. I have students tour a working factory and compare/contrast what they saw with what we covered in class. I don’t see how ChatGPT could be a major help with a project like that. Of course, these types of projects are not suitable for all courses.

Second, and much more negatively, students just don’t see a lot of what they do as cheating so I’m not so sure a survey on cheating is all that meaningful. I once had a student get the instructor’s manual and use that to turn in homework answers. I caught her because there were errors in the instructor’s manual. I dinged her over it and she swore that what she was doing was NOT cheating.

My own nephew was taking an online accounting class and had his CPA wife take his exams for him. He did not see that as cheating and willingly bragged about it at a family gathering.

Those are just two of many examples. If these students didn’t see their behavior as cheating, they are not going to tell a surveyor that they had cheated because, in their mind, they had not cheated.

Anonymous Coward says:

Re:

Students these days are not going to think much of integrity. And let’s be honest, what are they looking at when they graduate? They’re not looking at a job with a supportive plan of progression. They’re not going to be mentored to develop into the best person they can be. They’re going to be mere stepping stones for CEOs who, in their obsession with self-importance and power and growth at all costs, run organization after organization into the ground.

Students in 2008 saw how the government bailed out banks who fucked it up for the rest of the planet. Students in the pandemic years saw how the only thing companies care about is asses in cubicles and seats amid a desperate desire to go back to the good old days, instead of adapting to the needs of the workplace and meaningful employee engagement. Short term grifts, faking it until you make it, and relying on a network of contacts instead of knowledge is how you make it in the world today.

Of course a student relying on a family member who knows better or a list of correct answers isn’t going to look at it like it’s cheating. Would we expect a CEO to know how to balance their own accounts, instead of relying on someone else to do it? And sure, you could correctly, idealistically argue that a student needs to be able to think for themselves. In a workplace where compliance and obedience are prized above all else, when was the last time anyone gave a shit about what a student thinks?

Arianity says:

Re:

First, if you use well thought out projects, you are less likely to have students cheat using ChatGPT. I have students tour a working factory and compare/contrast what they saw with what we covered in class. I don’t see how ChatGPT could be a major help with a project like that. Of course, these types of projects are not suitable for all courses.

As a former TA in a STEM field, the problem for us was homework. Kids would cheat on the homework, and then fail the exams, because they never built up the skillset from repetition of basic problems. And it was such a large proportion that it would be difficult to simply fail them all.

Some professors would mitigate this by writing their own problems. It wasn’t a silver bullet, but it did at least require them to find a classmate to cheat off of, so it did help.

One of our professors intentionally uploaded a wrong answer (I think it was to Chegg) as a test. I forget the percentage, but well over 30% of the class submitted that incorrect answer (I want to say it was closer to 70%, but my memory is hazy). I was stunned.

weevie833 (profile) says:

Detectability

Having been an instructor for fully online college courses for the past 12 years, I am certain that ChatGPT has already infringed on the integrity of my classes — I can tell. Students just don’t write with the kind of clarity that comes out of ChatGPT, and I use ChatGPT a lot. It’s so obvious!

The real problem will be when students figure out how to make their ChatGPT output sound like their own writing. Until then, I have candid discussions with my students on whether they are using ChatGPT and draw the lines where it is and is not permissible. So far, it’s been collegial, but I admit that it is a hopeless effort.

I agree with the other poster here who said that, when asked, students do not admit cheating because they don’t think using ChatGPT is cheating.

As I tell my colleagues, the challenge for us college educators is to admit when we are beat by this thing, change the forms of assessment we make, and let go of the parts of the instructional narrative we cannot control.

Rocky says:

Re:

As I tell my colleagues, the challenge for us college educators is to admit when we are beat by this thing, change the forms of assessment we make, and let go of the parts of the instructional narrative we cannot control.

If you are uncertain if a student wrote a paper, use a small oral questionnaire about the paper to ascertain if the student knows what they wrote.

Anonymous Coward says:

Re:

The biggest challenge for college educators is going to be providing an education that’s still worth a damn in today’s economy, and in all honesty I don’t think that’s a fight the educators can win on their own.

You can pump an undergraduate full of knowledge and information and skills, and odds are it’s not going to count for much when they get paid below market rate, doing a menial job that’s irrelevant to their degree.

Tertiary education has simply become another checkbox to tick on whether someone is qualified enough to be treated as a functioning human. It’s no more than a stepping stone so kids can jostle against each other to rub the right shoulders. That’s not to say the skills aren’t necessary. They are, but we as a society, economy and corporatocracy continue to devalue their qualifications, abilities and effort.

Students using ChatGPT was a completely logical progression. If all everyone else cared about was getting told the right answer, instead of the students’ opinions and abilities, why expend the effort?
