Why the Humanities Are a Good Defense against AI

While being an adjunct is not a salubrious experience for anyone, it does offer one the opportunity to directly compare very different types of students. For example, there are striking differences in the use of AI at two institutions where I have recently taught. The comparison is, I think, instructive.

At one institution, which we shall call B, I have had a remarkably high rate of inappropriate AI use. In one class, more than 10% of the students submitted AI-generated papers, despite being told clearly not to. (I have no trouble spotting these, by the way. In addition to being vague and a little too pat, they are inevitably at variance with the rest of the student’s work.) Most of my students at B are business and finance majors, and my theory is that they have learned such things as how to be efficient, get ahead, and maximize short-term gains. They have not been encouraged to think about what it means to be the kind of person who takes shortcuts or how their actions may affect other people. And they certainly don’t think about what the value of actually doing the work may be.

At another institution, which we shall call C, I’ve never received an AI-generated paper, nor in fact even the faintest whiff of academic impropriety. My students at C are mainly artists, though there are sometimes architects and engineers in the mix as well, and I believe my conclusions apply to them too. I think it doesn’t even occur to these students to use ChatGPT or its ilk, for the simple reason that they take pride in their work, regardless of what it is. Because they are encouraged to be creative, these students recognize the value of experience. The same goes for reading; they want to read a text in order to get whatever they can out of it. That’s not to say that they always do the reading or that they always put a great deal of effort into their writing, but I think it would horrify them on some level to pass off someone else’s work (even if that someone is a chatbot) as their own. They also tend to be much more troubled by the environmental ramifications of power-hungry AI data centers; they are, in other words, concerned with how their decisions affect other people.

The difference in attitude, I think, boils down to a concern for results (B) versus a concern for experience (C): an emphasis on being done with something versus an emphasis on doing it. The latter is what the humanities teach us. Why read the Iliad? If you have to ask, then you’re missing the point of education altogether.