OPINION: The AI ate my homework, and my schoolwork too

As a college professor, I get a lot of questions about homework — and lately they have almost all been about how artificial intelligence will change it. After all, if AIs can pass many medical, bar and economics exams, then they can certainly handle high school or college homework.

Homework has long been a staple of the academic experience. How will it evolve as more students master the capabilities of rapidly improving AI systems? Or, to ask a slightly more pointed question: How am I supposed to know whether I am grading the student or the AI?

Big changes are in the offing, but they will arrive slowly. Classroom practices, for better or worse, are among the stickiest of human institutions. A lot of instruction hasn’t changed much for thousands of years, even if modern chalk is better than its ancient precursors.

The main point is that grades will come to mean something different. Traditionally, at least in theory, grades have been a measure of how well a student understands the material. If they got an A in U.S. history, presumably they could identify many of the founders. In the future, an A will mark a kind of conscientiousness: It will mean that, at the very least, they applied their AI consistently to the questions at hand. Whether that counts as “cheating” or “allowed” will depend on the policies of the relevant educational institution, but anti-AI software is not reliable and anti-AI rules cannot be enforced very readily.

“Applied their AI consistently” might sound unimpressive as a certification. But I have known many students over the years who don’t meet even that standard. They may neglect to hand in homework or fail to monitor due dates. They may or may not know the relevant material — often they do not — and it is not at all clear to me that current AI technology will automatically enable them to get good grades.

In other words, an academic system replete with AI is still testing for something, even if it is much less glorious than what we might have hoped for. Over time, grades will come to indicate not so much knowledge of the material as a student’s ability to be organized and prepared.

To be clear, these are good habits to cultivate — and keep in mind that this new system would not be so dissimilar from the status quo. Students have collaborated on homework since homework has existed, including with their parents, whether they have been allowed to or not. The AI eases and speeds this collaborative process. There have never been entirely honest grades, not even in the “good old days.”

Still, there may continue to be a need or wish to test for knowledge of the actual material. This will have to be done in person. Maybe there will be exercises to be completed in the classroom, as my Bloomberg Opinion colleague Adrian Wooldridge has noted, or oral exams in the Oxbridge style.

These adjustments are most likely to take place in fields where results matter in a direct and measurable way, such as physical engineering. They are perhaps less likely in the humanities, where a student who is using AI to fake knowledge probably wouldn’t get very far anyway.

Another kind of adjustment will involve assigned projects, to be created by the student and the AI working in tandem. I taught a class to law students last spring, and one of the requirements was that they collaborate with AI on a research paper — transparently of course, and with an explanation of how the AI was used. On average, students who learned how to work with the AI wrote better papers. This kind of collaborative skill will only become more crucial, and homework will evolve to reward that reality. In this context, using an AI to “cheat” is no longer an issue.

On the downside, collaborative projects with AI are usually less predictable and less “cookie cutter,” and thus harder for instructors to grade. That seems a small price to pay for teaching students this all-important skill.

There’s no doubt that AI will change the nature of not only homework but also instruction. A professor who can recite a lot of information may no longer seem so impressive: Why pay for what an AI can deliver more cheaply? Instead, the focus will shift to what only humans — so far — can provide: inspiration, charisma, mentoring. That, too, would be a change for the better.

Tyler Cowen is a Bloomberg Opinion columnist. He is a professor of economics at George Mason University and writes for the blog Marginal Revolution. He is coauthor of “Talent: How to Identify Energizers, Creatives, and Winners Around the World.”

The views expressed here are the writer’s and are not necessarily endorsed by the Anchorage Daily News, which welcomes a broad range of viewpoints.