The A, the I, and the H
Micah Nathan published a thoughtful piece in The Guardian about teaching creative writing to MIT students. The students were not humanities scholars or creative writers; they were at MIT to study math, science, engineering, and related subjects. Creative writing was not their strong suit, and some of them turned to large language models for help, despite being asked not to.
Nathan reports being able to immediately identify the work done by an AI. "I turned to the ostensible authors and told them I knew that AI wrote their stories. I didn’t need AI-detection software to know; I just knew."
This made me wonder about a sort of reverse situation: a class in a STEM subject, where the assignment might be some math or math-adjacent problem. Would it be noticeable if a student turned in an LLM-generated answer? I would argue not. And there are already quite advanced tools available, from powerful calculators to MATLAB and Mathematica. Using such tools in math, science, and engineering classes is standard practice, as far as I know.
What does that say about those areas of study? Are we just more acclimated to those tools? Is the current version of AI just too new to be accepted as a "tool" instead of a "cheat?" What if chatbots had been invented before calculators? Would we by now be more accepting of students using them for assignments?
Also consider the goal of a STEM education versus a humanities curriculum. In light of existing and emerging tools, is a STEM degree not basically aimed at turning out a human who has developed computer-like skills? Why are those skills often touted as more valuable than a humanities degree?
Back in the 1800s (or maybe earlier), educators in the US admired the system used in what was then Prussia. That system has been "...described as a factory-model education, emphasizing discipline, conformity, and loyalty to the state over individual thought." It occurs to me that this is part and parcel of a way of looking at the world that's deeply embedded in economic, particularly capitalistic, thinking. Everything is assigned a "value", and that value is — or becomes — indistinguishable from monetary value. When you hear someone touting "STEM education," they're usually talking about preparing students for regimented work environments, and for salaries, which are taken as yet another measure of a person's "value."
But it seems possible that we're approaching a point where performing regimented tasks, especially those that are math or math-adjacent, will increasingly become the realm of machines. It has long been machines that perform our calculations. Unlike a creative writing teacher, a math teacher would be unable to glance at a student assignment and know it was computer-generated. The solution to an equation has nothing to do with tone or feeling. At the highest levels of mathematics, that may not be entirely true, but those levels lie far beyond the mathematical or scientific skills most of us have, or even aspire to.
Creative arts, though, are uniquely human. A chatbot leaves its mark on a story or illustration, revealing immediately, at least to some, that we're not dealing with an authentic human product. And that raises an interesting question: might a revival of the humanities be an unexpected side effect of the rise of AI? I think I might like that.