Cornell University

09/15/2025 | Press release | Distributed by Public on 09/15/2025 08:55

Holocaust testimony is AI litmus test, and it fails

When Cornell historian Jan Burzlaff asked ChatGPT to summarize the testimony of Luisa D., a seven-year-old Holocaust survivor, the AI tool left out the detail that her mother cut her own finger to give her dying child drops of blood - "the faintest trace of moisture" - to keep her alive.

That omission, Burzlaff said, shows why human historians remain indispensable in the age of artificial intelligence.

"This most intimate, daunting moment - unspeakable and unforgettable - was rendered invisible by a model trained to privilege the probable over the profound," Burzlaff writes in "Fragments, Not Prompts: Five Principles for Writing History in the Age of AI."

Published Sept. 11 in the journal Rethinking History, the paper argues that human historians are vital for capturing the emotional and moral complexity behind world events. Burzlaff warns that academics' growing dependence on AI may cause them to fail to recognize true meaning in history. He lays out principles they should follow instead, including emphasizing interpretation over description and rejecting algorithmic ethics.

"If AI falters with Holocaust testimony - the most extreme case of human suffering in modern history - it will distort more subtle histories too," Burzlaff said. "Holocaust testimony is a litmus test for AI, where smoothing and summarization run up against the obligation to preserve fracture, silence and ethical weight."

A recent study by Microsoft ranked historians second on the list of the top 40 occupations that AI threatens. But Burzlaff, a postdoctoral associate in the Jewish Studies Program in the College of Arts and Sciences and an expert on Nazi Germany, says historical writers possess skills that AI currently lacks - especially the ability to capture human suffering.

In his essay, Burzlaff describes his ongoing study using ChatGPT to summarize the testimonies of Holocaust survivors recorded in La Paz, Kraków and Connecticut in 1995. His results show that AI essentially ignored the extent to which these individuals suffered emotionally.

"As tools like ChatGPT increasingly saturate education, research and public discourse, historians must reckon with what these systems can and cannot do," Burzlaff wrote. "They summarize but do not listen, reproduce but do not interpret, and excel at coherence but falter at contradiction.

"The problem we historians now face is not whether AI can recognize meaning, but whether we will continue to do so."

Burzlaff said the idea for the article grew from his course, JWST 3825: The Past and Future of Holocaust Survivor Testimonies, which paired close listening to survivor testimonies with reflective, responsible and collective use of ChatGPT. Watching students confront AI's fluency inspired Burzlaff to develop guidelines for teachers, academics and anyone else writing about history in the modern era, especially those teaching about trauma, genocide and historical injustice. Burzlaff writes that his advice will help historians hold on to the "ethical, intellectual, and stylistic stakes of historical writing."

"At stake is not only the memory of the Holocaust, but how societies everywhere will remember and interpret their pasts in the age of prediction," he writes.

Because the accounts of Holocaust survivors differ according to their individual experiences, and some are difficult to categorize, historians need to embrace this lack of uniformity and the moments of human experience that algorithms cannot anticipate, Burzlaff said.

"If historical writing can be done by a machine, then it was never historical enough."

Linda Glaser is news and media relations manager for the College of Arts and Sciences.
