TOP OF MIND

I teach a class at USC, in the Architecture Department, and I’ve learned one thing these past few years: if I don’t keep up with reviewing student assignments, I will be buried. For example, this past Monday, after USC’s spring break, I had 40 assignments to review and comment on. I haven’t finished them yet, and in a few days it will already be next week.

But there’s hope! (If I wanted to cheat.) I could use a chatbot to grade my student assignments. Since they might be writing their assignments with the help of a chatbot, we would then have one chatbot grading the work of another. Could be progress!

Or not.

Actually, USC asked me to put a chatbot policy in my course syllabus. I said no to chatbot use. My argument is that writing is a process of looking deep into your soul and trying like hell to come up with something interesting. If you hand that over to a bot, you are turning away from personal growth.

Many other teachers don’t feel the same way, especially at the university level, where students are getting close to facing a job market that is most certainly scary.

It’s likely that they will need to know how to use AI to enter that market. You might walk into a job interview ready to demonstrate how a bunch of personal-assistant bots can schedule, research, design, and write for you.

Consider that programmers will soon have a new job title: Prompt Engineer. They won’t program much; they’ll craft text prompts for bots to execute. Expect the same for architecture, design, and engineering workers. There will be standout creative people in all of those fields, but the vast majority of workers will be handing their work off to bots.

This has already happened among K-12 teachers, and I’m giving them a failing grade for doing it. As Axios reported:

Writable, which is billed as a time-saving tool for teachers, was purchased last summer by education giant Houghton Mifflin Harcourt, whose materials are used in 90% of K-12 schools. A teacher gives the class a writing assignment — say, “What I did over my summer vacation” — and the students send in their work electronically. ChatGPT offers comments and observations to the teacher, who is supposed to review and tweak them before sending the feedback to the students.

The teachers are supposed to review the ChatGPT notes, but when you fall behind in reviewing papers, as I often do, it would be oh-so-tempting to just let the bot do your thinking for you.

FAKEOUT LOL

ElevenLabs can clone a voice using an audio sample that is just 45 seconds long. Amazing, and easy to abuse. Use case: You answer your phone. Someone you know says they’re in trouble and asks for money. You send it. Turns out your friend or relative’s voice was cloned. They’re fine, but you’ve been scammed.

Voice-cloning technology seems mostly bad, but not completely. A company called Voice Keeper has been banking the voices of people with A.L.S., Parkinson’s, or throat cancer, so that later on they can keep speaking with their own voices using text-to-speech software. A South Korean company has launched a memorial service that lets the dead speak to future generations in their own voices.

There are also trivial, commercial, and just plain weird use cases. A chicken restaurant chain in Ohio cloned College Football Hall of Famer Keith Byars’s voice so he could appear to take customer orders in the drive-through. New York Mayor Eric Adams has sent out campaign robocalls in Mandarin and Yiddish, languages that he doesn’t speak.

Hearing Mayor Adams speak languages he doesn’t know is strange enough, but soon you aren’t going to be able to trust any audio coming out of your phone. One consultant I know has a secret safe word that he has shared only with family, and he uses it to verify that it’s really him on a call. More on this kind of verification, and on audio watermarking, in an upcoming newsletter.

REFERENCES

Teachers Are Embracing ChatGPT-powered Grading

The Terrifying A.I. Scam That Uses Your Loved One’s Voice

The Voice Keeper

DISCLAIMER

I wrote this one fast this week. Typographical errors, poor metaphors, broken grammar, and strained examples are not my fault, because reasons.