One of my favorite mantras/quotes from a historian comes from Milton V. Backman. In the preface of his American Religions and the Rise of Mormonism, he states that “to simplify history is to falsify history.” I read this well before my venture into graduate school and the discipline of History, yet it has stayed with me throughout, percolating in my mind over the years as I engage with different historical themes, periods, and discussions. This week's readings dealt with two major debates/discussions: what has been called The Syuzhet Debate of 2015 and The History Manifesto Debates. Both of these debates deal with big data (either directly or tangentially), each caused a stir among historians, and reading through the blog posts and articles for each caused me to reflect on Backman's observation.
The Syuzhet Debate centers on an R package, developed by Matthew Jockers, that is “designed to extract sentiment and plot information from prose.” He applies it to a corpus of novels in an attempt to identify the archetypal plot lines in literature. The package, made available through GitHub, was used by other scholars, who discovered some issues with the way syuzhet operated. Annie Swafford, a digital humanities scholar out of SUNY, wrote a blog post articulating the problems she found in using syuzhet, and thus the debate began. A large part of Swafford's critique focused on the package's oversimplified approach to sentiment in prose: the lack of attention given to modifiers (“I am not happy”) and the extremely limited classification system for words (-1 for negative, 0 for neutral, 1 for positive). If I were to apply Backman's mantra to this debate, the “to simplify history” here is not the content but the tools by which historical analysis is achieved. Syuzhet simplifies the process of identifying positive and negative sentiments both by not addressing the complexity of words and their modifiers and by ignoring the differing magnitudes of emotion that individual words carry.
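Swafford's point about negation is easy to see in a toy example. The sketch below is my own illustration in Python (not code from the syuzhet package, which is written in R, and the tiny lexicon is hypothetical): a bag-of-words scorer that assigns each word -1, 0, or +1 and sums the results cannot tell “I am happy” apart from “I am not happy,” because the modifier scores 0 and the sum never sees it.

```python
# Illustrative sketch only -- not the syuzhet package's actual code.
# Hypothetical toy lexicon; real sentiment lexicons are far larger.
LEXICON = {"happy": 1, "great": 1, "sad": -1, "terrible": -1}

def naive_sentiment(sentence: str) -> int:
    """Sum per-word scores (-1/0/+1), ignoring modifiers and negation."""
    return sum(LEXICON.get(word, 0) for word in sentence.lower().split())

print(naive_sentiment("I am happy"))      # 1
print(naive_sentiment("I am not happy"))  # also 1 -- "not" scores 0
```

Both sentences come out as equally “positive,” which is exactly the kind of simplification-as-falsification that makes the tool, rather than the sources, the weak link in the analysis.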
Just a few months before the Syuzhet Debate of 2015, David Armitage and Jo Guldi published The History Manifesto. Armitage and Guldi proclaim that the world, and the discipline of history, has been plagued in recent decades by short-termism. This focus on brief spans of time does not allow policymakers or historians to fully answer questions about, or understand, long-term historical (or economic, geological, political, etc.) trends. They call on historians to return to the longue durée in their historical research and to take a larger role (or, as they articulate it, to return to their larger role) in policymaking. One avenue by which historians can return to long-form history, in keeping with a world of abundant sources, is the use of big data. The text has caused quite a stir within the discipline, both positive and negative. The AHR Exchange critique by Deborah Cohen and Peter Mandler and the follow-up by Armitage and Guldi were intriguing, while also quite biting in their dialogue with each other. However, the narrative provided by Armitage and Guldi is very clean, with definitive lines leading to each point and position. Reading through, I felt like this was a well-drafted movie plot rather than an historical narrative. Nick Funke, a PhD student at the University of Birmingham, articulated it best in a blog post when he wrote, “I am not at all nostalgic for a mythical time in which the past seemed neat… Humans are complex, contradictory and confusing, that's the grandest narrative I'm willing to subscribe to.” The micro-histories that Armitage and Guldi dismiss are precisely the de-simplifying histories that allow for a better understanding of the complex mess that is our/your/their past. How one could remove that specificity and still maintain the same level of detail and understanding is beyond me. I share their excitement for big data and its use within history, but I do not subscribe to the notion that historians are the gatekeepers to some utopian world (no matter how big my ego gets).