
TurnItIn: AI

Hello - 

While I recognize that there are merits to using AI to support writing in certain ways, I teach English courses for juniors and seniors that center on composition (word choice, organizational choices, making arguments that should be unique, etc.), so I avoid it in nearly all of my formal writing assignments.

I recognize that the new Clarity program exists (a discussion for another day), I am very aware of how TurnItIn’s AI detection evaluates writing, and I definitely know it isn’t perfect. I understand and appreciate that TurnItIn frames this tool as the start of a conversation rather than, for lack of a better term, a smoking gun pointing to a form of plagiarism that is becoming increasingly common. Where this is leading is that I’ve been having a lot of these conversations lately. I’ve had one or two students who logged 96% or 100%, where there really isn’t much of a conversation to be had. But I’ve also had quite a few who logged the “*,” indicating that AI was detected but not enough for a confident score, and more who have logged between 20 and 50%.

My question: the majority of students I’ve spoken to are shocked that their AI score is as high as it is. I am not naive - I know a few of them might be dishonest. At that point I ask them to tell me ANYTHING they might have done differently, and the most common answer is… they accepted choices recommended by the Grammarly extension or, in other cases, passed the paper through ChatGPT or a similar program and said “check my grammar.” Is this enough to set off the alarms? Can you explain how this works so I can explain it to them? I have bounced around some other forums where this has been discussed, but I figured I’d come straight to the source.

