Half a year ago, on November 14, 2015, we participated in a workshop at the Institute of Science and Art in Vienna entitled “Distant Reading and Discourse Analysis”. Our talk, which went under the headline “Distant Reading with Foucault?”, promised to offer some thoughts on “the practice of distant reading” and to ponder potential “operationalisations of Foucauldian discourse analysis”.
A revised version of the talk was published last week as a featured article on the brilliant foucaultblog, and, here comes the caveat, it’s in German: “Fernlesen mit Foucault?” So the raison d’être of this blog post is to give you a short summary of that article.
What we tried to do in the first place was to define what Distant Reading is, or rather what it has been in the 15 years since Moretti coined the term in his essay Conjectures on World Literature. We considered five possible answers. Distant Reading, was it …
- … “a joke”?
- … a polemical term, a buzzword?
- … a computer-based method for the analysis of literature?
- … Moretti’s attempt towards a ‘canon-critical’ large-scale literary historiography?
- … a failed or at least terminated project?
All of these possible answers hold a grain of truth (and are discussed thoroughly in the original German version of our article). What Distant Reading certainly did not provide over the past decade and a half, however, was a reliable methodology. Scholars using the term usually thought it enough to reference Moretti. At the same time, one of Moretti’s main epistemic moves seems to have been the pursuit of analogies, as scholars such as Christopher Prendergast (2005) and Katja Mellmann (2009) have suggested, something Moretti is well aware of himself: he discussed, for one, the problematic analogy between World-Systems Theory and a “World-Literary System” (in More Conjectures).
‘Old’ and ‘New’ Distant Reading
When rereading Moretti for this talk, we were somewhat surprised to find that Distant Reading in its original form has nothing at all to do with the practices of the Digital Humanities. It is hard to find Moretti talking about technological implications: he never mentions standards, protocols, scripts, databases, or any of the nerve-racking little problems you encounter when trying to squeeze meaningful findings out of literary data. There was no freely available corpus, code, or documentation that would have made his theses reproducible.
In Vienna, while strolling up and down Berggasse and climbing the notorious Strudlhofstiege a.k.a. Stiedlhufstroge, we discussed what our quasi-obituary for Old Distant Reading meant for our own research, because at the very same time we were preparing a data-driven poster for the annual Digital Humanities conference of the German-speaking countries, which took place in March 2016. Our poster set out to be a “Distant-Reading Showcase: 200 Years of German-Language Drama at a Glance” (you can find it on figshare).
We wrote about the making of the poster on our DLINA project blog ([Digital] Literary Network Analysis), where we also give examples of what you can “distantly read” when looking at this obscure bulk of network graphs. What matters here is that our own project confronted us with the question: what should a renewed Distant Reading look like? While we shouldn’t cling to the term, we can state that Distant Reading (or Macroanalysis, or whatever you call it) should be reproducible. That includes all the above-mentioned aspects: a freely available corpus, code, documentation, and data. This can cost months of additional work, but we regard it as just as essential as the eventual presentation of the results.
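To make this less abstract, here is a minimal sketch of what reproducibility could look like in practice: a short script that recomputes a single network metric from an openly published edge list. The file name, column names, and the chosen metric are illustrative assumptions for this post, not the actual DLINA data model.

```python
# Minimal sketch: recompute one network metric from a published edge list.
# File name, columns and metric are hypothetical, not the DLINA format.
import csv
import networkx as nx

def density_from_edgelist(path):
    """Build a co-presence network from a CSV edge list and return its density."""
    G = nx.Graph()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            G.add_edge(row["source"], row["target"])
    return nx.density(G)

if __name__ == "__main__":
    # Hypothetical file: one row per pair of characters sharing a scene.
    print(density_from_edgelist("emilia_galotti_edges.csv"))
```

The point is not the metric itself but the workflow: anyone with the corpus, the code, and a few lines of documentation can rerun the analysis and check the result.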
And Now, Foucault
The clarification of what Distant Reading meant or means ate up the better part of our talk, but it was not too late to check in with Foucault. So, could there be a methodologically clean ‘Distant Reading with Foucault’?
It is obvious that traditional, semantically rich concepts resist, almost programmatically, their operationalisation in contexts of quantitative, formalised research. This concerns numerous hermeneutic ideas regarding the ‘deeper’ meaning of a text or text element. However, there is no reason why the Digital Humanities shouldn’t put forward ideas for possible operationalisations. For the most part, this will result in a genuinely different perception of the subject, or, to paraphrase Moretti: in the end, these terms will probably have their place within completely different theories.
This becomes clear when looking at “the elementary unit of discourse”, the ‘statement’ (‘énoncé’). Foucault’s definition proceeds largely by demarcation: he repeatedly stresses that a ‘statement’ is something quite different from a string of characters (which could be located in a corpus on the basis of definable rules). Considering his extensive explanations in The Archaeology of Knowledge, a ‘statement’ is instead, to a large extent, context-relative; its determination, it seems, is less a positivist than a hermeneutical act. ‘Hermeneutical’ insofar as the discourse analyst has to “at least superficially understand the meaning of statements” in order to classify them as such (quoting Philipp Sarasin 2014, p. 66; our translation).
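To illustrate the contrast: locating strings of characters on the basis of definable rules is exactly what a corpus query does, and it is easy to sketch. The search term and the mini-corpus below are invented; the point is that such hits are not yet ‘statements’ in Foucault’s sense, because classifying them would require understanding their meaning in context.

```python
# A deliberately naive corpus query: find character strings matching a rule.
# Corpus and search term are invented for illustration.
import re

corpus = [
    "The madman was confined to the asylum.",
    "Madness, the report claims, is a medical condition.",
    "He ran like a madman to catch the last tram.",
]

# The 'definable rule': any word beginning with "mad", case-insensitive.
pattern = re.compile(r"\bmad\w*", re.IGNORECASE)

for sentence in corpus:
    for match in pattern.finditer(sentence):
        print(f"{match.group():<10} <- {sentence}")
```

The query happily returns all three hits, even though the third one belongs to an entirely different discursive register; deciding which of them count as ‘statements’ of, say, a psychiatric discourse is precisely the hermeneutical step the rule cannot perform.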
In light of the apparently very low operational potential of Foucault’s discourse analysis, we should therefore ask ourselves whether it wouldn’t make more sense to go another way. Instead of thinking about how Foucault’s theoretical design could be operationalised, we could, for starters, follow Moretti’s slogan “Forget programs and visions” (PDF) and look at and discuss the results provided by well-established techniques of text analysis. In fact, a field of discourse analysis relying on corpus-linguistic and lexicometric approaches emerged decades ago; just think of Michel Pêcheux’s pioneering work on an Analyse automatique du discours (1969).
Moretti himself (in a pamphlet from his post-Distant-Reading phase, so to speak) simply applied corpus-linguistic methods to describe something like the transformation of economic discourse, based on an analysis of the World Bank’s annual reports (PDF). Sure, this is not an examination of ‘statements’ in a Foucauldian sense. Instead, Moretti and his co-author Dominique Pestre look for the most frequent words, collocations, and the frequency of certain parts of speech and grammatical constructions; in other words, they look for linguistic, not discourse-analytical ‘units’. But maybe, based on this kind of data, a new (and necessarily different) discourse analysis could be established: a revamped implementation of Foucault’s project which, unlike most corpus-linguistic approaches, could continue what appears to us to be the key signature of Foucault’s “work on the discourses”: research as a practice of critical science.
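For readers unfamiliar with these techniques, here is a rough sketch of the kind of counting involved: most frequent words and simple bigram ‘collocations’ over a toy text. The sample sentences are invented; a real analysis along the lines of Moretti and Pestre’s pamphlet would work on the World Bank reports themselves, with proper tokenisation, lemmatisation, and statistical association measures.

```python
# Toy example of corpus-linguistic counting: word frequencies and naive bigrams.
# The text is invented and stands in for a much larger corpus of reports.
import re
from collections import Counter

text = (
    "The Bank supports sustainable development. "
    "Sustainable growth and sustainable development remain key goals. "
    "The Bank promotes growth."
)

tokens = re.findall(r"[a-z]+", text.lower())

word_freq = Counter(tokens)                      # most frequent words
bigram_freq = Counter(zip(tokens, tokens[1:]))   # naive two-word collocations

print(word_freq.most_common(5))
print(bigram_freq.most_common(5))
```

Counts like these describe linguistic surface patterns; whether and how they can be reinterpreted as evidence about discourses is exactly the open question raised above.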
Concluding Remarks
Given the workshop character of the event, our talk was little more than a first tentative approach to the question raised in the title. Sure enough, the workshop spurred some nice discussions that will certainly be continued. All talks from the workshop will soon appear in a separate ‘issue’ on the foucaultblog, edited by Simon Ganahl and Maurice Erb.