Why I’m encouraging my students to use Generative AI (e.g., ChatGPT) when writing their assignments
Because I can’t stop it from happening, and they need to learn how to use these new tools.
As a student in the 1970s, my teachers insisted I learn to use a slide rule, even though electronic calculators were increasingly common and inexpensive. I wasn’t very good at using a slide rule (or maybe just sullen at having to use what I viewed as an obsolete technology). My lack of dexterity with a slide rule wasn’t a significant problem in my career, however, because within a few years, slide rule use became obsolete. Indeed, most of my current students don’t even know what a slide rule is. Learning to use a slide rule was a waste of time for my education in two ways: I spent time learning a skill I wouldn’t ever use, and I didn’t spend that time learning new skills I would use.
For those of you from Generation Z who are wondering what a slide rule is: it’s a mechanical analog calculator used for multiplication, division and other calculations. It was invented in the 1600s and was widely used (where we would now use a calculator) until the 1970s, when advancing technology made it obsolete. Honest, ask your grandparents — I’m not making this up. (See picture above.)
As with the transition from slide rules to calculators, we’re currently dealing with a new transition — moving to a world of AI-enhanced writing tools. I will encourage my students to use these new tools for two major reasons:
- I don’t see any practical way to stop them — the genie (so to speak) is out of the bottle. Although I could ask my students to promise, on their honor, not to use this technology, I share Falstaff’s cynicism about honor.
- Even if I could stop my students from using these products, as an educator, I don’t think I should. AI is a tool my students will need to master, and it will only become more sophisticated in the near future, so it’s time for them to start on their learning curve.
I teach public sector management and leadership to graduate students. I generally assign my students actual public sector challenges and ask them to propose solutions in writing, based on the context and frameworks from assigned readings. While I think Generative AI will be helpful to them for some portions of their writing assignments, it will have severe limitations for other parts of their work. The type of material I assign might not have been in the AI’s training set, and/or the AI might not be good at recognizing how it needs to apply complex frameworks to specific situations.
Remember (at least for the moment), Generative AI doesn’t replicate human creativity. It has limited capabilities (e.g., it’s unable to understand context and nuance). Therefore, it can sometimes produce results that are difficult to interpret or obviously aren’t useful. It’s also generally not very good at listing sources, which my students will need to find before they can submit their papers. And significantly, the current versions of Generative AI can sometimes ‘hallucinate’! Its developers have said it is “prone to sometimes write plausible-sounding but incorrect or nonsensical answers.” So, students will sometimes get a plausible-sounding answer — that happens to be wrong!
As with any tool (e.g., a spreadsheet, a calculator, SAS, R, etc.), the students are responsible for the answer(s) they submit. If the AI system produces garbage and it’s submitted, the student will be graded accordingly. Hopefully, my students will learn to work with AI to produce accurate and more sophisticated answers — in less time.
For my classes, AI will generate (at best) a good first draft for certain sections of their assignments, and (at worst) potentially persuasive-sounding BS. Precisely because of the AI’s own limitations, students will need to critically engage with the text it generates. It will be up to the students to redraft the AI’s work to produce a submittable product.
From the students’ perspective, this will involve an iterative process of reviewing the text to see whether it makes sense in the context of the assignment. The students will only be able to do this if they’ve done the class readings. This process of carefully examining the first (and any subsequent) draft produced by the AI will force the students to confront the gaps in their own understanding of the class material. For example, the AI might produce an answer that sounds plausible but is wrong — the students will need to evaluate the text to understand it and rewrite it.
The students will be free to interrogate the AI by giving it additional prompts. This iterative approach (of giving feedback to revise and develop the draft) is a powerful tool for learning. It will hopefully force the students to engage more deeply with their topics by recognizing when the AI is hallucinating, noticing gaps in the AI’s logic, and applying specific readings to their analyses. Because the AI’s responses will likely be too generic for class submissions, the students will have to rewrite the AI-generated text in a way that works within the context of the assignment. Even when they are satisfied with the text the AI generates, the students will still need to intelligently link the text to appropriate sources. In the process of doing so, they may discover new information or realize that the AI missed something important. Again, this interaction with the AI’s draft(s) has the potential to enhance student learning. Overall, this should prompt students to start learning a new skill — how to interact with AI-enhanced writing systems.
Anyway, that’s my plan for dealing with AI in the classroom this semester — wish me and my students good luck!