The Classical Teaching Institute

Can A Classical Christian School Honestly Use AI?

by Joshua Gibbs

What can your school legitimately use AI to do?

Can you use ChatGPT to write job postings? How about newsletters? How about lesson plans? How about any of the myriad other communications that nobody actually reads?

In the current conversation about the use of AI, I regularly hear proponents and sympathizers say, “AI is a tool,” in an emphatic, deal-with-it tone that suggests finality on the matter. Like a hammer, like a wrench, like a can-opener, AI is something that can be used for good or evil. It has no value in and of itself. Its value is entirely bound up in the intentions of the user.

I use the example of hammer, wrench, and can-opener because the people who argue that AI is “a tool” seem to have such tools in mind. The hammer is an iconic tool, so this might not be entirely unfair, and yet to suggest that AI is “just a tool” is a bit like suggesting that a switchblade is a utensil that can be used while dining, sort of like a spoon, or that fentanyl is a drug in the same way caffeine and Tylenol are drugs.

And yet, the fact that a gun is a tool doesn’t mean every man should have the same unlimited access to guns that he has to can-openers. The fact that a gun is “a tool” doesn’t mean I want eight men with Saudi Arabian passports to bring Glocks on a passenger plane departing LaGuardia this afternoon. They’re all welcome to bring can-openers, though.

That’s really bold rhetoric, though, and it’s really not the direction I want to argue.

I believe a far more basic standard ought to be used when judging the right use of AI. This standard has more to do with manners than philosophy, and it is this: provided you are willing to admit the use of AI, go ahead.

Simply put, if your school uses AI to write a job posting or a newsletter, admit it up front.

If you are comfortable concluding an AI-enhanced posting with a keyboard symbol—perhaps like © or ® or ™—that makes plain the fact AI was used in the preceding text, go for it. However, if you are not comfortable letting applicants at your school know that AI was used in the creation of a job posting, you aren’t free to use it. If you would be embarrassed to see that information get out in the public, it’s off limits. If you’d prefer parents think you wrote last week’s school update by yourself, when in fact you had a lot of help from AI, then you can’t honestly use AI.

The use of certain tools in certain settings is typically regarded as cheating—if not illegal. You can’t fish for lobsters with dynamite. You can’t use tracing paper for the fifth-grade art contest. You can’t use a motorcycle in the Tour de France. You can’t use a radar detector in the state of Virginia. In other instances, the use of certain tools is legal only if it is acknowledged. For example, you can use a laboratory to create fake black truffle flavor for potato chips, but you have to admit it’s fake on the bag. If you sell fake black truffle as though it is real, that’s fraud. If your school is going to use AI to create various documents that go out to parents, students, or potential applicants, be above board and make it known. But be ready for people to speak of AI-created documents and letters the same way they speak of fake black truffle or imitation crab. Fake. Imitation. Cheap.

Furthermore, any technology that can be used to gain leverage now will be used later by the other side to get back whatever ground they have lost. What’s good for the goose is good for the gander. If you use AI to create job postings, expect applicants to use AI to create resumes. If you don’t create a tough policy that governs student use of AI, expect teachers to use it, as well. If you use AI to write your school newsletter, expect families to use AI to read the newsletter—although this possibility (this eventuality) suggests that AI is not so much a tool as a colossal and pointless middle man. Imagine it: the director of communications at a school feeds data points to AI which then creates a long, elegant newsletter for parents to read, and yet parents feed the same newsletter into AI and have it reduced to data points again so it’s quicker to read. Thus, the school creates the illusion of caring about families, families create the illusion of caring what the school says, and yet both sides are essentially at war with one another over whose existence can be more convenient.

It should also be noted that we assume professionals use certain resources and abstain from using others. A carpenter doesn’t hide his use of a hammer, and neither does a cook hide his use of a can-opener. However, if a homemaker dresses up a bakery cake such that it appears she made it herself, and if she accepts compliments on the cake without qualification, she is guilty of deception. Similarly, if a classical Christian school—an institution that promotes itself as a place where thinking and writing are taught well—decides to outsource its thinking and writing to robots, that school no longer deserves to be taken seriously. Can anyone at your school write? If so, pay them to write your copy. If not, tell parents. I’m sure they’d like to know before signing reenrollment forms.