From the Wayback Machine: Do we need a Systems Analysis of the Digital Humanities?

This post was first published at ideasunderground.com on 24 May, 2009. I’ve reproduced it here partly because that blog no longer exists, partly because it’s a lazy-but-efficient way of offering an idea I’ve been mulling over for some time to a new audience, and partly because I’m (sadly, perhaps) still quite taken with it. It fits well with my belief that scholars – especially in a post-Edward Snowden world – need to understand the engineered nature of the virtual machines they use in their work (regardless of whether they want to build digital outputs or not). Achieving a robust level of scholarly self-consciousness in the digital age is a challenge that most people have (I suggest) given up on in the face of technological advance, rather than through methodological choice or epistemological orientation. This has huge implications for the integrity of future scholarship, but it opens up equally fascinating areas for research and analysis.


I’m working with a team of systems analysts at the moment, and it has got me thinking about what kind of ‘business intelligence’ we digital humanists have at our disposal. The humanities have developed organically, in a kind of ‘conversational’ or perhaps dialectical process that remains opaque and resistant to formal analysis, but I’m wondering whether the convergence of our disciplines with technology demands something more than this. I accept that I’m flirting with nonsense here: technology is just a delivery mechanism and needn’t determine future directions in the humanities, so why waste time analysing it? We should just use it to deliver our products. That’s a fair attitude to adopt if you’re mainly focussed on what I’ll call ‘traditional products’ (books, journal articles and so on), but I think it’s a bit short-sighted for those of us interested in pressing the digital humanities towards what they might worthily be.

So what am I suggesting? The idea would be to undertake a formal systems analysis of the engineered sub-structure of our new field, combined with an analysis of the ‘interface’ between traditional humanist outputs and this sub-structure, of the new directions it might make possible, and of the problems it imposes. It would perhaps need to be done at a disciplinary level initially, with those findings combined into a meta-analysis of the entire field once we’d gathered enough information. I am sure, for instance, that an analysis undertaken by an English professor would be quite different to one undertaken by a historian. Sounds like an idea that’s far too heavy to get off the ground, eh?

Ignoring the leaden quality of the idea, though, what do I mean by ‘sub-structure’? Commercial systems analysts would probably suggest it’s a woolly notion (as it may well be), but the aim would be to offer digital humanists a basic overview of the underlying ‘wiring’ of the internet: domain name servers, routers, key data centres and ‘suchlike’. The next layer of analysis would perhaps be to describe governing bodies like ICANN, which coordinate (for want of a better word) the naming and addressing systems that information depends on as it moves across the internet, and to point out their interactions with international law, national governments and, by extension, humanists sitting in their offices producing their digital outputs. Wikipedia offers a decent overview, and there are no doubt very good descriptions available elsewhere, but I’m suggesting that humanists need to do this work themselves as well, and that a collaborative approach would suit such a project. We all know how descriptive analysis tends to track the interests of the inquirer, and I have a feeling humanists could offer some interesting perspectives.
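
To make that ‘wiring’ a little more concrete, here is a minimal sketch (my own illustration, in Python, using nothing beyond the standard library; the hostname is just an example) of the very first dependency any digital output has on this sub-structure: asking the Domain Name System to turn a human-readable name into the numeric addresses that routers actually work with.

    # A small illustration of one strand of the internet's 'wiring': asking the
    # Domain Name System (DNS) to map a human-readable name to IP addresses.
    # The hostname below is only an example; any name you care about would do.
    import socket

    def resolve(hostname):
        """Return the IP addresses the DNS currently associates with a hostname."""
        results = socket.getaddrinfo(hostname, 80, proto=socket.IPPROTO_TCP)
        # Each entry is (family, type, proto, canonname, sockaddr); the address
        # itself is the first element of sockaddr.
        return sorted({sockaddr[0] for *_, sockaddr in results})

    if __name__ == "__main__":
        print(resolve("example.org"))

Every digital edition, archive or blog post we ‘deliver’ passes through a lookup like this one – and then through routers, exchanges and data centres – before it reaches a reader, which is precisely the layer a humanist systems analysis would begin by describing.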

Part of my interest in this topic stems from a 1960 article by Ernest Nagel in Philosophy and Phenomenological Research titled ‘Determinism in History’, where he brilliantly explores the parallels between tightly defined scientific articulations of a ‘system’ and the way that historians conceive historical ‘systems’. The article was written in the heat of the history-as-system debates prompted by Marxist class analyses, which claimed that History was a closed and definable system determined by economic forces. As is well known, the idea that history was determined in this way sparked furious debate and eventually led to an unspoken (if not uniformly accepted) consensus that this couldn’t possibly be the case: it revolted human sensibilities to suggest that our destiny was in the hands of impersonal systemic forces. Nagel’s analysis was brilliant because, although arguing against naive determinism in historical interpretations, he was brave enough to dive right in and argue that, properly conceived, History is indeed systemically determined (just like a chemical reaction, no less) but in an indeterminate way. Drawing on the anomalies observed at very small and very large scales in the quantum world, Nagel noted that, although basically deterministic, historical systems were simply too large (or, in some cases, too small) and too complex for us to identify the deterministic forces at work. The article isn’t particularly well known, but it still stuns me to think that at a time when most mainstream historical theorists were doggedly refusing to acknowledge that History was determined to any degree at all, lest Marxists leverage the gap, Nagel was capable of offering a sophisticated analysis which allowed for what is, after all, a basic truism of all systems – historical or otherwise.

So, to my point: humanists now work in the context of a large, complex system, which determines (à la Nagel) their work and their future to an unknown degree. It’s still very early days in the digital humanities, so it makes sense to engage in a formal systems analysis and start working out the degree to which this system imposes itself on our work. Leaden, I know, but I think potentially fascinating too.

References

Nagel, E., 1960. Determinism in History. Philosophy and Phenomenological Research, XX(3), 291-317.