DISPATCHES FROM THE EDUCATION APOCALYPSE: Stanford expert on ‘lying and technology’ accused of lying about technology.

In a bizarre twist, a Stanford University expert who studies misinformation appears to have created some of his own, while under oath.

On Nov. 1, Jeff Hancock, a well-known and oft-cited researcher who leads the Bay Area school’s Social Media Lab, filed an expert declaration in a Minnesota court case over the state’s new ban on political deepfakes. Republicans have sued to block the ban, arguing it’s an unconstitutional limit on free speech. Hancock defended the law in his declaration, explaining how artificial intelligence makes it easier to fabricate videos and discussing deepfakes’ psychological impacts. But he seems to have made an ironic mistake.

Hancock cited 15 references in his declaration, mostly research papers related to political deepfakes and their impacts. Two of the 15 sources do not appear to exist. The journals he cites are real, as are some of the two citations’ authors, but journal archives show no sign of either paper. The journal pages Hancock references actually contain different articles. SFGATE was unable to find the cited papers on Google Scholar, either.

Republicans are involved, so naturally, they’re pouncing, according to SFGATE:

The Minnesota litigants pounced on Hancock’s apparent gaffe. Frank Bednarz, an attorney for Republican state Rep. Mary Franson and conservative social media influencer Christopher Kohls, who are suing to block the deepfake ban, argued in a Nov. 16 filing that Hancock’s declaration should be excluded from the judge’s consideration of whether to give a preliminary injunction against the law’s enforcement.

“The citation bears the hallmarks of being an artificial intelligence (AI) ‘hallucination,’ suggesting that at least the citation was generated by a large language model like ChatGPT,” Bednarz wrote. “Plaintiffs do not know how this hallucination wound up in Hancock’s declaration, but it calls the entire document into question, especially when much of the commentary contains no methodology or analytic logic whatsoever.”

I’ll take “What are AI hallucinations and how do you prevent them?” for $500, Alex.
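For what it’s worth, this particular failure mode is machine-checkable: a real journal article leaves a bibliographic record, so a fabricated citation tends to come up empty in indexes like Crossref or Google Scholar. Here is a minimal sketch of that kind of check against Crossref’s public REST API; the `requests` dependency, the crude title-matching heuristic, and the example title are my own assumptions for illustration, not anything from the filing.

```python
import requests  # assumed available; any HTTP client would do


def citation_exists(title: str, author_surname: str) -> bool:
    """Check whether a cited paper turns up in Crossref's public index.

    A hit is not proof the citation is accurate, and a miss is not proof
    of fabrication -- but a miss is a strong cue to verify by hand.
    """
    resp = requests.get(
        "https://api.crossref.org/works",
        params={
            "query.bibliographic": f"{title} {author_surname}",
            "rows": 5,  # only the top few candidates are worth scanning
        },
        timeout=10,
    )
    resp.raise_for_status()
    for item in resp.json()["message"]["items"]:
        # Crossref returns titles as a list; guard against missing/empty ones.
        found = (item.get("title") or [""])[0].lower()
        # Crude match: the cited title should line up with an indexed record.
        if title.lower() in found or found in title.lower():
            return True
    return False


# Hypothetical usage: this title is a placeholder, not one of Hancock's citations.
print(citation_exists("Deepfakes and the epistemic crisis", "Smith"))
```

Running every reference in a declaration through a lookup like this takes seconds, which is rather the point.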
