HIGHER EDUCATION BUBBLE UPDATE: Writing Instructor, Skeptical of Automated Grading, Pits Machine vs. Machine.
Mr. Perelman’s fundamental problem with essay-grading automatons, he explains, is that they “are not measuring any of the real constructs that have to do with writing.” They cannot read meaning, and they cannot check facts. More to the point, they cannot tell gibberish from lucid writing.
He has spent the past decade finding new ways to make that point, and the Babel Generator is arguably his cleverest stunt to date. Until now, his fight against essay-grading software has followed the classic man-versus-machine trope, with Mr. Perelman criticizing the automatons by appealing to his audience’s sense of irony.
By that measure, the Babel Generator is a triumph, turning the concept of automation into a farce: machines fooling machines for the amusement of human skeptics.
Now, here in the office, Mr. Perelman copies the nonsensical text of the “privateness” essay and opens MY Access!, an online writing-instruction product that uses the same essay-scoring technology the Graduate Management Admission Test employs as a second reader. He pastes the nonsense essay into the answer field and clicks “submit.”
Immediately the score appears on the screen: 5.4 points out of 6, with “advanced” ratings for “focus and meaning” and “language use and style.”
Mr. Perelman sits back in his chair, victorious. “How can these people claim that they are grading human communication?”
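To see why a grader like this can be fooled, here is a toy sketch, in Python, of the general idea. To be clear, this is not how the Babel Generator or MY Access! actually works; the word lists, function names, and scoring weights are all invented for illustration. It shows a scorer that looks only at surface features (word length, vocabulary rarity, essay length) and so rates meaning-free prose higher than plain English.

```python
import random
import statistics

# Toy illustration only: NOT the real BABEL Generator and NOT MY Access!/IntelliMetric.
# The point is that a scorer built purely on surface features cannot tell
# gibberish from lucid writing.

FANCY_WORDS = [
    "privateness", "epistemology", "paradigm", "verisimilitude",
    "assiduous", "hegemony", "dialectic", "ontology", "axiomatic",
    "propensity", "quintessential", "juxtaposition",
]
CONNECTIVES = ["moreover", "consequently", "nevertheless", "furthermore"]


def gibberish_essay(sentences: int = 8) -> str:
    """String polysyllabic words together with connectives; it means nothing."""
    out = []
    for _ in range(sentences):
        words = random.choices(FANCY_WORDS, k=random.randint(8, 14))
        out.append(random.choice(CONNECTIVES).capitalize() + ", the "
                   + " ".join(words) + ".")
    return " ".join(out)


def surface_score(essay: str) -> float:
    """Score 0-6 from surface features only: average word length, share of
    long 'rare' words, and overall length. No meaning, no facts, no coherence."""
    words = [w.strip(".,").lower() for w in essay.split()]
    avg_len = statistics.mean(len(w) for w in words)
    rare_share = sum(1 for w in words if len(w) >= 8) / len(words)
    length_bonus = min(len(words) / 150, 1.0)
    raw = 2.5 * min(avg_len / 7, 1.0) + 2.0 * rare_share + 1.5 * length_bonus
    return round(min(raw, 6.0), 1)


if __name__ == "__main__":
    essay = gibberish_essay()
    plain = ("Clear writing says exactly what it means, "
             "using plain words and short sentences.")
    print("gibberish score:", surface_score(essay), "/ 6")   # typically ~5 of 6
    print("plain-English score:", surface_score(plain), "/ 6")  # much lower
```

Run it and the nonsense essay lands around 5 out of 6 while the plain sentence scores low, which is the same inversion Mr. Perelman demonstrates: reward the features, not the writing.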
The Insta-Daughter faced machine grading in middle school and figured out how to write for the machines, even though she thought the writing they wanted was lousy. Interestingly, when she was in online high school, her essays were graded by real teachers.