Facts, Patterns, Methods, Meaning: Public Knowledge Building in the Digital Humanities

Note: This talk was delivered as a keynote address at the University of Wisconsin–Madison as part of the Digital Humanities Plus Art: Going Public symposium on April 17, 2015. I am grateful to the organizers of the symposium for generously hosting me and occasioning these thoughts.

Things have a way of coming full circle, of beginning where they have ended, and so I want to start today with Ralph Waldo Emerson, a man who thought about beginnings and endings, circles and forms. “The eye is the first circle,” he wrote. “The horizon which it forms is the second; and throughout nature this primary figure is repeated without end” (“Circles”).

Circles select and enfold, but also exclude, demarcating a perimeter, an in and an out. “The eye is the first circle”; it is the lens through which those of us lucky enough to have eyesight are able to perceive the world. And yet the eye both makes sight possible and is bounded by a second circle formed by the horizon of our vision, a circle that discloses both constraints and possibilities. We know that the landscape extends beyond the visible horizon, but we are limited by our own perceptions, even as they make possible all that we know. And this figure, this double act of knowing and unknowing, seeing and unseeing, taking in possibilities and limits in the same glance, is the mark of our experience in the world. There is always more to learn, always more outside the reach of our knowledge, always more beyond the edge of our sight.

Emerson teaches us to be humble in the face of such knowledge. “Every action,” he writes, “admits of being outdone. Our life is an apprenticeship to the truth, that around every circle another can be drawn; that there is no end in nature, but every end is a beginning; that there is always another dawn risen on mid-noon, and under every deep a lower deep opens.”

Perhaps it is telling that Emerson’s figure here involves a depth to be plumbed rather than a height to be climbed. For at this moment in the development of the digital humanities, we are pursuing new paths to knowledge, extending the horizons of our abilities with new tools. This is, obviously, not a teleological or progressive journey. We have exciting new tools and provocative new methods, but they are not necessarily leading us to higher truths. We are not marching along straight and ever-improving lines of progress. But we are producing tools that conform to new directions in our thought, and those tools can usefully reset our perspectives, helping us look with new sight on things we thought we understood. They can give us new vantage points and new angles from which we can explore the depths around us. And, of course, even as they make new sights possible, we remember Emerson and note that they foreclose, or at least obscure, others.

Emerson’s aphorisms provide useful reminders both for digital humanists surveying the field and for scholars observing it from its horizons. In today’s talk, I want to think through states of knowing in the digital humanities, situating our practices within larger histories of knowledge production. My talk has three parts:

  1. A discussion of a few approaches to text analysis and their relation to larger perceptions about what DH is and does, and how DH knowledge is produced;
  2. A discussion of some practitioners who are blending these approaches in provocative ways;
  3. A wider view of experimental knowledge in DH, with the suggestion of a new grounding, based in the arts, for future DH public work.

I want to start by discussing current directions of DH research, and in particular to spend some time poking a bit at one of the most influential and vibrant areas of DH — literary text analysis, the type of DH research that most often stands in as a synecdoche for the larger whole of the digital humanities. I do this despite the fact that focusing on literary text analysis risks occluding the many other rich areas of digital humanities work, including geospatial mapping and analysis, data visualization, text encoding and scholarly editing, digital archives and preservation, digital forensics, networked rhetoric, digital pedagogy, advanced processing of image, video, and audio files, and 3D modeling and fabrication, among others. And I should note that I do so even though my own DH work does not center on literary text analysis.

One reason to focus on text analysis today is that when we talk about DH methods and DH work in the public sphere, literary text analysis of large corpora in particular is over-identified with DH, especially in the popular press, but also in the academy. There, we often see large-scale text analysis work clothed in the rhetoric of discovery, with DHers described as daring adventurers scaling the cliffs of computational heights. A 2011 New York Times book review of a Stanford Literary Lab pamphlet described, tongue-in-cheek, Franco Moretti’s supposed attempt to appear “now as literature’s Linnaeus (taxonomizing a vast new trove of data), now as Vesalius (exposing its essential skeleton), now as Galileo (revealing and reordering the universe of books), now as Darwin (seeking ‘a law of literary evolution’)” (Schulz). All that’s missing, it would seem, is mention of an Indiana-Jones-esque beaten fedora.

If literary text mining operates as a kind of DH imaginary in popular discourse around the field, one point I want to make today is that it is an impoverished version of text analysis, or at the very least a one-sided and incomplete one. As a way of complicating that picture, I want to sketch out two prominent areas of research in DH literary text analysis, one premised (not always, but often) upon scientific principles of experimentation that use analysis of large-scale textual corpora to uncover previously unknown, invisible, or under-remarked-upon patterns in texts across broad swaths of time. Known colloquially and collectively through Franco Moretti’s term “distant reading,” Matthew Jockers’s term “macroanalysis,” or Jean-Baptiste Michel and Erez Lieberman Aiden’s term “culturomics,” this approach is predicated on an encounter with texts at scale. As Franco Moretti noted in his essay “The Slaughterhouse of Literature,” describing the move toward distant reading:

Knowing two hundred novels is already difficult. Twenty thousand? How can we do it, what does “knowledge” mean, in this new scenario? One thing for sure: it cannot mean the very close reading of very few texts—secularized theology, really (“canon”!)—that has radiated from the cheerful town of New Haven over the whole field of literary studies. A larger literary history requires other skills: sampling; statistics; work with series, titles, concordances, incipits. (208-209)

This is knowledge work at a new scale, work that requires, as Moretti notes, quantitative tools of analysis.

Opposed to this, though less often discussed, is a different form of DH work, one based not on an empirical search for facts and patterns, but rather on the deliberate mangling of those very facts and patterns, a conscious interference with the computational artifact, a mode of investigation based not on hypothesis and experiment in search of proof but rather on deformance, alteration, randomness, and play. This form of DH research aims to align computational research with humanistic principles, its goal not to unearth facts but to undermine assumptions, laying bare the social, political, historical, computational, and literary constructs that underlie digital texts. And sometimes it simply aims to highlight the profound oddities of digital textuality. This work, carried on for decades by scholar-practitioners such as Jerome McGann, Johanna Drucker, Bethany Nowviskie, Stephen Ramsay, and Mark Sample, has gone by many names: McGann terms it deformative criticism, Drucker and Nowviskie call it speculative computing, and Ramsay calls it algorithmic criticism. Though there are minor differences among these conceptions, they represent as a whole a form of DH that, while well known and respected within DH circles, is not acknowledged frequently enough outside of them, especially in the depictions of DH that we see in the popular press or the caricatures of DH in Twitter flame wars. It is especially unseen, I would suggest, in the academy itself, where scholars hostile to DH work tend to miss the implications of deformative textual analysis, focusing their ire on the side of quantitative literary analysis that seeks most strongly to align itself with scientific practices.

I’ve set up a rough binary here, and it’s one I will complicate in multiple ways. But before I do, I want to walk through some parts of the lineage of each of these areas as a way of grounding today’s conversation.

Digital humanities work in large-scale text analysis of course has roots in longstanding areas of humanities computing and computational linguistics. But it was given profound inspiration in 2005 with the publication of Franco Moretti’s Graphs, Maps, Trees, a text that argued for a new approach to textual analysis called “distant reading,” in which “distance is […] not an obstacle, but a specific form of knowledge” (1). Moretti’s writing at this time has a wonderful, suggestive style, a style imbued with possibility and play, a style full of posed but unanswered questions. The graphs, maps, and trees of his title proposed various models for the life cycles of literary texts; the book contains strong statements about the need for the kind of work it does, but it also resists conclusions and does not overly stockpile evidence in support of its claims. As Moretti himself put it, addressing the “conceptual eclecticism” of his work, “opening new conceptual possibilities seemed more important than justifying them in every detail.” This was a work of scholarship meant to inspire and provoke, not to present proofs.

Eight years later, in 2013, Matthew Jockers, one of Moretti’s colleagues at Stanford who had by then moved on to a professorship at the University of Nebraska, published Macroanalysis: Digital Methods and Literary History, a text that employed a different register to present its claims, beginning with chapter 1, which is titled “Revolution.” In Jockers’s text, we see a hardening of Moretti’s register, a tightening up and sharpening of the meandering suggestiveness that characterized Moretti’s writing. Where Moretti’s slim Graphs, Maps, Trees was elliptical and suggestive, Jockers’s Macroanalysis was more pointed, seeking to marshal strong evidence in support of its claims. In the book, Jockers suggests that literary studies should follow scientific models of evidence, testing, and proof; he writes, “The conclusions we reach as literary scholars are rarely ‘testable’ in the way that scientific conclusions are testable. And the conclusions we reach as literary scholars are rarely ‘repeatable’ in the way that scientific experiments are repeatable” (6). Clearly, this is a problem for Jockers; he argues that literary scholars must engage the “massive digital corpora [that] offer us unprecedented access to the literary record and invite, even demand, a new type of evidence gathering and meaning making” (8). And as he continues, he deploys a remarkable metaphor:

Today’s student of literature must be adept at reading and gathering evidence from individual texts and equally adept at accessing and mining digital-text repositories. And mining here really is the key word in context. Literary scholars must learn to go beyond search. In search, we go after a single nugget, carefully panning in the river of prose. At the risk of giving offense to the environmentalists, what is needed now is the literary equivalent of open-pit mining or hydraulicking . . . the sheer amount of data makes search ineffectual as a means of evidence gathering. Close reading, digital searching, will continue to reveal nuggets, while the deeper veins lie buried beneath the mass of gravel layered above. What are required are methods for aggregating and making sense out of both the nuggets and the tailings. . . . More interesting, more exciting, than panning for nuggets in digital archives is to go beyond the pan and exploit the trommel of computation to process, condense, deform, and analyze the deeper strata from which these nuggets were born, to unearth, for the first time, what these corpora *really* contain. (9-10; emphasis mine)

Even forgiving Jockers some amount of poetic license, this is a really remarkable extended metaphor, one that figures the process of computational literary work as a strip-mining operation that rips out layers of rock and soil to reach the rich mineral strata of meaning below, which are then presumably extracted in systematic fashion until the mine is emptied of value, its natural resources depleted. One doesn’t need to be an environmentalist to be a bit uneasy about such a scenario.

What’s really notable to me here, though, is the immense pressure this passage reveals. And I refer not to the pressure Jockers’s computational drills are exerting on the pastoral literary landscape, but rather to what his rhetoric reveals about the increasing pressure on DH researchers to find, present, and demonstrate results. Clearly, between Moretti’s 2005 preliminary thought experiments and Jockers’s 2013 strip-mining expedition, the ground had shifted.

In his 2010 blog post “Where’s the Beef? Does Digital Humanities Have to Answer Questions?” digital historian Tom Scheinfeldt compares the current moment in the digital humanities to eighteenth-century work in natural philosophy, when experiments with microscopes, air pumps, and electrical machines were, at first, perceived as nothing more than parlor tricks before they were revealed as useful in what we would now call scientific experimentation. Scheinfeldt writes:

Sometimes new tools are built to answer pre-existing questions. Sometimes, as in the case of Hauksbee’s electrical machine, new questions and answers are the byproduct of the creation of new tools. Sometimes it takes a while; in the meantime, tools themselves and the whiz-bang effects they produce must be the focus of scholarly attention.

Eventually digital humanities must make arguments. It has to answer questions. But yet? Like 18th century natural philosophers confronted with a deluge of strange new tools like microscopes, air pumps, and electrical machines, maybe we need time to articulate our digital apparatus, to produce new phenomena that we can neither anticipate nor explain immediately.

One can see what Scheinfeldt describes clearly in Moretti’s work: a sense of wonder, showmanship, and play in the new perspectives that computational methods have uncovered. In Jockers, we see a more precise, scientifically oriented apparatus aimed at testable, repeatable results. Jockers and Moretti are hardly the only DHers exploring large datasets — practitioners such as Ted Underwood, Andrew Goldstone, Andrew Piper, Tanya Clement, Lisa Rhody, and Ben Schmidt, among many others, come to mind, each engaging such work in fascinating ways — but Moretti and Jockers (and their labs) may stand in for a larger group of scholars using similar methods to explore patterns in massive groups of texts.

I’ve said that I would describe two discrete areas of DH literary text analysis work. Having outlined what I would characterize as the area of the field proceeding on proto-scientific assumptions, I would now like to turn to a group of DH thinkers who, while occasionally using similar tools, are focused on forms of computational literary analysis that in many ways take a diametrically opposed path to the digital text by seeking to disrupt and play with the structures of the text.

In “Deformance and Interpretation,” their 1999 piece published in New Literary History, Jerome McGann and Lisa Samuels outline their concept of “deformative criticism,” a hermeneutic approach that, rather than seeking to discover the underlying structure of texts through exposition, seeks to “expose the poem’s possibilities of meaning” through techniques such as reading backward and otherwise altering and rearranging the sequencing of words in a text. “Deformative” moves such as these, McGann and Samuels argue, “reinvestigate the terms in which critical commentary will be undertaken” (116). Many critics working in this vein argue that all interpretative readings are deformative, reformulating texts in the process of interpreting them.
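To make the gesture concrete, here is a minimal sketch of two such deformative moves, reversing a poem’s lines and scrambling the words within them. The function names and the sample text are my own, offered only to illustrate the kind of operation McGann and Samuels describe, not to reproduce their method.

```python
import random

def read_backward(poem: str) -> str:
    """Reverse the order of a poem's lines: reading backward, one deformative move."""
    return "\n".join(reversed(poem.splitlines()))

def scramble_words(poem: str, seed: int = 0) -> str:
    """Randomly rearrange the words within each line, another simple deformance."""
    rng = random.Random(seed)
    deformed = []
    for line in poem.splitlines():
        words = line.split()
        rng.shuffle(words)
        deformed.append(" ".join(words))
    return "\n".join(deformed)

poem = "The eye is the first circle;\nthe horizon which it forms is the second."
print(read_backward(poem))
print(scramble_words(poem))
```

The interest, of course, lies not in the code, which is trivial, but in what the estranged output does to our encounter with the original text.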

In her work, Johanna Drucker has collaborated with Bethany Nowviskie and others to explore what she terms “speculative computing,” which is “driven by a commitment to interpretation-as-deformance in a tradition that has its roots in parody, play, and critical methods such as those of the Situationist International, Oulipo, and the longer tradition of ‘pataphysics with its emphasis on ‘the particular’ over ‘the general’” (>>>>PG#). Drucker goes on to differentiate speculative computing from quantitative processes based on “standard, repeatable, mathematical and logical procedures” by exploring “patacritical” methods, which privilege exceptions to rules and deviations from norms. Speculative computing, according to Drucker, “lets go of the positivist underpinnings of the Anglo-analytic mode of epistemological inquiry,” creating imaginary solutions that suggest generative possibilities rather than answers. Drucker writes:

Humanistic research takes the approach that a thesis is an instrument for exposing what one doesn’t know. The ‘patacritical concept of imaginary solutions isn’t an act of make-believe but an epistemological move, much closer to the making-strange of the early-twentieth century avant-garde. It forces a reconceptualization of premises and parameters, not a reassessment of means and outcomes. (SpecLab 27)

Drucker frames her approach in opposition to the rationalized, positivistic assumptions of the scientific method, embracing instead randomness and play. This is also the approach that Stephen Ramsay takes in his book Reading Machines, where he argues for what he terms “algorithmic criticism.” Ramsay writes that “[text analysis] must endeavor to assist the critic in the unfolding of interpretative possibilities” (Reading Machines 10). Whereas Drucker seeks everywhere to undermine the positivist underpinnings of digital tools, creating not “digital tools in humanities contexts,” but rather “humanities tools in digital contexts” (SpecLab 25), Ramsay argues that “the narrowing constraints of computational logic–the irreducible tendency of the computer toward enumeration, measurement, and verification–is fully compatible” with a criticism that seeks to “employ conjecture . . . in order that the matter might become richer, deeper, and ever more complicated” (16). Because the algorithmic critic navigates the productive constraints of code to create the “deformative machine” from which she draws insights, the “hermeneutics of ‘what is’ becomes mingled with the hermeneutics of ‘how to’” (63).

And Mark Sample, in his “Notes Toward a Deformed Humanities,” proposes deformance, the act of breaking things, as a creative-critical intervention, a way of knowing through breaking. Sample’s projects — which include Hacking the Accident, an Oulipo-inspired version of the edited collection Hacking the Academy, and Disembargo, a project that reveals Sample’s dissertation as it “emerg[es] from a self-imposed six-year embargo, one letter at a time,” as well as a host of Twitter bots that mash together a variety of literary and information sources — all demonstrate an inspired focus on interpretation as performed by creative computational expression.

I’ve discussed two major approaches to literary text analysis today — likely not without some reductive description — but I would like to turn now to the conference theme of “Going Public,” for each of these approaches takes up that theme in a different way, using platforms, methods, and models to foster more open and public DH communication.

Deformative work is often performative: witness Mark Sample’s Twitter bots and generative texts, which operate in real time and interact with the public, at times even forming themselves in response to public speech.

Text-mining scholars, with their focus on exploration, discovery, proof, and tool development, are admirably public in sharing evidence and code. Just a few months ago, we witnessed one of the most fascinating controversies of recent years in DH, as DH scholar Annie Swafford raised questions about Matthew Jockers’s tool Syuzhet. Jockers had set out to build on Kurt Vonnegut’s lecture “The Shapes of Stories,” in which Vonnegut sketched what he described as the basic shapes of a number of essential story plots; following the arc of the main character’s fortunes, he suggested, we could discern a number of basic plot structures used repeatedly in works of fiction, such as “Man in Hole,” “Boy Meets Girl,” and “From Bad to Worse.”

Jockers’s blog post described his use of Syuzhet, a package he wrote for the statistical software R and released publicly on GitHub. Because the code was available and public, Swafford was able to download and experiment with it; she charged that the tool had major faults, and the ensuing discussion led to sharp disagreements about both the tool itself and Jockers’s findings.
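For readers who have not opened the package, the underlying idea can be sketched briefly. What follows is a deliberately simplified analogue in Python rather than R: a toy sentiment lexicon and a moving-average smoother stand in for Syuzhet’s sentiment dictionaries and its Fourier-based low-pass filter (the filter being one focus of Swafford’s critique), so this shows the shape of the method, not Jockers’s actual implementation.

```python
import re

# Toy lexicon; Syuzhet itself ships several real sentiment dictionaries.
POSITIVE = {"love", "joy", "hope", "fortune", "happy"}
NEGATIVE = {"death", "grief", "loss", "fear", "ruin"}

def sentence_sentiments(text):
    """Score each sentence as (# positive words - # negative words)."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    scores = []
    for s in sentences:
        words = re.findall(r"[a-z']+", s.lower())
        scores.append(sum(w in POSITIVE for w in words) -
                      sum(w in NEGATIVE for w in words))
    return scores

def smooth(values, window=3):
    """Moving average: a crude stand-in for Syuzhet's low-pass filtering."""
    half = window // 2
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

story = ("The lovers met with joy and hope. Fortune smiled on them. "
         "Then grief arrived, and fear, and ruin. "
         "At last hope returned, and happy days.")
print(smooth(sentence_sentiments(story)))  # a rough "plot arc" for the story
```

Swafford’s objection, roughly, was that the smoothing step can distort rather than reveal: an aggressive enough filter will produce tidy arcs even from noisy or misleading sentence-level scores.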

Though Jockers wound up backing down from his earlier claims, the episode was fascinating as a moment in which in-progress work was presented, tested, and defended in public. This is of course nothing new in the sciences, but it was a rare occasion on which the reproducibility of DH claims was put to the test.

Having described these two areas of DH literary text analysis, one employing scientific models and seeking reproducible results and the other seeking to undermine the assumptions of the very platforms through which digital texts are constructed, I would like to finally complicate that binary and discuss some DH practitioners who are blending these approaches in fascinating ways.

First, I will turn to the work of Lisa Rhody, whose research on topic modeling of figurative language investigates the very assumptions of the algorithms that topic modeling employs. Topic modeling is a technique used by Jockers and many others to reveal latent patterns in texts; it uses probabilistic algorithms to produce a kind of topic-based guide to language in a text, tracking the play of similar concepts across it. Rhody’s project, as she writes, “illustrates how figurative language resists thematic topic assignments and by doing so, effectively increases the attractiveness of topic modeling as a methodological tool for literary analysis of poetic texts.” Using a tool that was designed to work with texts that contain little or no figurative language, Rhody’s study produces failure, but useful failure; as she writes, “topic modeling as a methodology, particularly in the case of highly-figurative language texts like poetry, can help us to get to new questions and discoveries — not because topic modeling works perfectly, but because poetry causes it to fail in ways that are potentially productive for literary scholars.”
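For readers unfamiliar with the technique, a toy example may help. The sketch below runs scikit-learn’s LDA implementation over four invented mini-documents; Rhody’s own study worked on a real corpus of poetry with different tooling, so this shows only the bare mechanics her critique presumes: documents go in, probabilistic clusters of co-occurring words (“topics”) come out, and nothing in the model registers that a word might be figurative rather than thematic.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Four invented mini-documents; a real study would use many full texts.
docs = [
    "the sea and the ships and the salt wind",
    "ships sail the grey sea under wind and cloud",
    "the garden roses bloom in summer light",
    "light falls on roses in the summer garden",
]

vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)  # rows: documents, columns: topic weights

# Show the most probable words for each inferred topic.
words = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [words[j] for j in topic.argsort()[-4:][::-1]]
    print(f"Topic {i}: {', '.join(top)}")
```

A model of this kind treats “sea” as always meaning the sea; it is precisely that literalism that Rhody’s poetic corpus productively breaks.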

Second, I will highlight the work of Micki Kaufman, a doctoral student in History at the CUNY Graduate Center with whom I’ve had the pleasure of working as she investigates memcons and telcons (memoranda of conversations and telephone-call transcripts) from the Digital National Security Archive’s Kissinger Collection. In her project “Quantifying Kissinger,” Micki has begun to explore some fascinating ways of looking at, visualizing, and even hearing topic models, a mode of inquiry that, I would suggest, foregrounds the subjective, experiential approach championed by Drucker without sacrificing the utility of topic modeling and data visualization as investigative tools. Micki will be presenting this work in January at the 2016 Modern Language Association Convention in Austin, Texas, in a panel called “Weird DH.” I think it’s very promising.

Finally, I want to mention Jeff Binder, a doctoral student in English at the CUNY Graduate Center and another student of mine, who has been working with Collin Jennings, a graduate student at NYU, on the Networked Corpus project, which aims to map topic models onto the texts they model and to compare a topic model of Adam Smith’s Wealth of Nations to the index published with the book. What this project produces, in the end, is a critical reflection on topic modeling itself, using the technique not so much to examine the main body of the text as to explore the alternate system of textual analysis presented by the book’s index.
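The linking move at the heart of such a project can be suggested in miniature. The sketch below is my own illustration, not the project’s code: given per-passage topic weights like those produced by the model above, it attaches each topic to the passage where that topic is most prominent, which is roughly what an index entry does for a term.

```python
def passages_for_topics(doc_topics, passages):
    """For each topic, return the passage in which it is most prominent."""
    n_topics = len(doc_topics[0])
    links = {}
    for t in range(n_topics):
        best = max(range(len(passages)), key=lambda d: doc_topics[d][t])
        links[t] = passages[best]
    return links

# Invented topic weights for three passages of the Wealth of Nations.
doc_topics = [[0.9, 0.1], [0.2, 0.8], [0.5, 0.5]]
passages = ["Of the Division of Labour",
            "Of the Wages of Labour",
            "Of the Rent of Land"]
for topic, passage in passages_for_topics(doc_topics, passages).items():
    print(f"Topic {topic} -> {passage}")
```

Setting the model’s “entries” beside Smith’s own index turns topic modeling back on itself as an object of study, which is the critical reflection the project is after.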

I single out these three practitioners among the many wonderful scholars doing work in this area because their practices, to my mind, unite the two approaches to text analysis that I have described thus far. They use the computational tools of the proto-scientific group, but in self-reflexive ways that embody the approach of deformative criticism, aiming to highlight interpretative complexity and ambiguity.

Johanna Drucker has argued that many digital tools are premised upon systems that make them poor fits for humanities inquiry:

Tools for humanities work have evolved considerably in the last decade, but during that same period a host of protocols for information visualization, data mining, geospatial representation, and other research instruments have been absorbed from disciplines whose epistemological foundations and fundamental values are at odds with, or even hostile to, the humanities. Positivistic, strictly quantitative, mechanistic, reductive and literal, these visualization and processing techniques preclude humanistic methods from their operations because of the very assumptions on which they are designed: that objects of knowledge can be understood as self-identical, self-evident, ahistorical, and autonomous. (“Humanistic Theory”)

Drucker calls for a new phase of digital humanities work, one that embodies a humanities-based approach to technology and interpretation. She writes:

I am trying to call for a next phase of digital humanities that would synthesize method and theory into ways of doing as thinking. . . . The challenge is to shift humanistic study from attention to the effects of technology (from readings of social media, games, narrative, personae, digital texts, images, environments), to a humanistically informed theory of the making of technology (a humanistic computing at the level of design, modeling of information architecture, data types, interface, and protocols). (“Humanistic Theory”)

By turning, in Drucker’s terms, from data to capta, from the presentation of data as transparent indexical fact to an open and explicit acknowledgement of the socially constructed nature of information, and by using new DH tools and methods at times in ways that test the visible and occluded assumptions that structure them, these three junior scholars are moving us toward a new and exciting phase of digital humanities work.

If humanities text-mining work often proceeds according to the scientific method, striving to test hypotheses and create reproducible results, its genealogies lie in the work of natural philosophy and the various microscopes, air pumps, and electrical machines mentioned by Tom Scheinfeldt and described in depth in books like Steven Shapin and Simon Schaffer’s Leviathan and the Air-Pump. DH work, in fact, is often framed in terms of this genealogy, with the current moment being compared to the rise of science and experimentation with new tools.

As but one example, Ted Underwood, in response to the Syuzhet controversy and ensuing discussions about experimental methods, tweeted:

“These are such classic history of science problems. I swear we are literally re-enacting the whole 17th century.” (Underwood)

In the remaining section of this talk, I want to suggest an alternate genealogy for this moment, one that, although it has ties to that same early work of natural philosophy, might help ground digital humanities practice in a new frame. I will return to Emerson for a moment, to his statement that “The eye is the first circle; the horizon which it forms is the second; and throughout nature this primary figure is repeated without end.”

And so I want to explore pre-photographic experimentation with image-making as a way of suggesting a new grounding for DH.

In 1839, Louis Daguerre announced the invention of the daguerreotype to the world, a moment of technical triumph that occluded a larger history of experiment. As the art historian Geoffrey Batchen has shown, the daguerreotype was, at the moment of its announcement, only one of a number of competing photographic technologies. The camera obscura had allowed artists to create replications of the world through a lens for centuries, but no one had been able to *fix* the image on paper, to make it last, to make it permanent. The project to do so was technical, artistic, and hermeneutic: while experimenters attempted to use different methods and materials to fix camera images on paper and metal, they did so with the confidence that the camera was an instrument of truth, a tool that could help human beings see the world from an unbiased, divine perspective. Daguerre himself was a showman, a painter of theatrical dioramas who had become interested in image-making through that realm.

And in fact, the modern negative-positive photograph descended not from the daguerreotype but from what was called the calotype, a picture-making technology developed in Britain by William Henry Fox Talbot. While daguerreotypes were one-of-a-kind positive images that could not be reproduced and that were made on expensive silver-coated copper plates developed over mercury fumes, calotypes were reproducible, negative-positive paper prints. Daguerreotypes, however, produced much finer gradations of tone and detail than the calotype, and as a photographic practice, daguerreotypy grew more quickly in popularity, in part because it produced more detailed images and in part because Talbot restricted the spread of his own technology by enforcing his patent. Daguerre, meanwhile, ceded his process to the French government in exchange for a lifetime pension, though he retained patent rights in Britain; his announcement in 1839 thus marked the release of his image-making technology to the world.

In his 2002 examination of Henry Talbot’s work, “A Philosophical Window,” art historian Geoffrey Batchen notes that the specimens of early photography — the failed experiments, the pictures that were not fixed, the images that were faded and obscured — have been viewed by art historians only as indices of technical progression toward the invention of the photographic camera, rather than as art objects in and of themselves. Looking at Talbot’s pictures critically, and taking Talbot seriously as an artist working with a camera, Batchen finds in him a conscious image-maker whose work should have relevance to us today.


Batchen focuses on one of Talbot’s early photographs, “Latticed Window (with the Camera Obscura), August 1835.” The photograph contains a note: “when first made, the squares of glafs [sic] about 200 in number could be counted, with the help of a lens.”

[Image: William Henry Fox Talbot, “Latticed Window (taken with the Camera Obscura),” 1835.]
Batchen performs a fantastic close reading of this note, highlighting Talbot’s instructions to the viewer: the suggestion that we look at the photograph first from afar, and then up close with the aid of a lens. This set of instructions, claims Batchen,

anticipates, and even insists on, the mobilization of the viewer’s eye, moving it back and forth, up and down, above the image. We are asked to see his picture first with the naked eye and then by means of an optical prosthesis . . . The attempt to improve one’s power of observation by looking through a lens is also a concession that the naked eye alone can no longer be guaranteed to provide the viewer with sufficient knowledge of the thing being looked at. It speaks to the insufficiency of sight, even while making us, through the accompanying shifts of scale and distortions of image that come with magnification, more self-conscious about the physical act of looking. (101-102)

Batchen’s comments here, focusing on scale and perspective and showcasing the disorientations produced by new angles of vision, might remind us of Moretti’s discussions of scale, of the need for a new type of seeing to take account of thousands of texts. And indeed, amazingly, Talbot, the progenitor of the modern photograph, was also tied to early computing. He was a friend of Charles Babbage, whose Difference Engine is often described as the world’s first mechanical computer. Talbot made photographs of machine-made lace and sent them to Babbage, who, Batchen reports, exhibited some of Talbot’s prints in his drawing room, in the vicinity of the Difference Engine, making it likely that early computing and early photography were experienced there together (107).

In his discussion of Talbot’s lattice-window photograph, Batchen notes that Talbot in effect narrativizes our gaze. He writes:

So, for Talbot, the subject of this picture is, first, the activity of our seeing it, and second, the window and latticed panes of glass, not the landscape we can dimly spy through it. He makes us peer closely at the surface of his print, until stopped by the paper fibres themselves and at the framing of his window, but no further. And what do we see there? Is ‘photography’ the white lines or the lilac ground, or is it to be located in the gestalt between them? (102)

And where, we can ask, is DH to be located? Do we even know the foreground and background between which we can locate its gestalt?

This, I think, is exactly the question we need to ask as we consider where DH is moving and where it should go.

One thing we can do is think about DH as the gestalt between gazes — not distant reading, not close reading, but the dizzy shape of self-reflexive movement between them. Though technology plays a part, it is critical reflection on that technology that can account for new, provocative digital humanities approaches.

And so, finally, I return to Emerson. The eye is the first circle, the horizon which it forms is the second. Let us plumb the space between.

Works Cited

Batchen, Geoffrey. “A Philosophical Window.” History of Photography 26.2 (2002): 100-112. Print.

Drucker, Johanna. “Humanities Approaches to Graphical Display.” Digital Humanities Quarterly 5.1 (2011). Web. 19 Apr. 2015.

—–. “Humanistic Theory and Digital Scholarship.” Debates in the Digital Humanities. Ed. Matthew K. Gold. Minneapolis: University of Minnesota Press, 2013. Web. 19 Apr. 2015.

—–. SpecLab: Digital Aesthetics and Projects in Speculative Computing. Chicago: University of Chicago Press, 2009. Print.

Emerson, Ralph Waldo. Essays and English Traits. Vol. 5. New York: P.F. Collier & Son, 1909. Print.

Jockers, Matthew L. Macroanalysis: Digital Methods and Literary History. Urbana: University of Illinois Press, 2013. Print.

—–. “Requiem for a Low Pass Filter.” matthewjockers.net. WordPress. 6 Apr. 2015. Web. 19 Apr. 2015.

—–. “Revealing Sentiment and Plot Arcs with the Syuzhet Package.” matthewjockers.net. WordPress. 2 Feb. 2015. Web. 19 Apr. 2015.

Moretti, Franco. “The Slaughterhouse of Literature.” Modern Language Quarterly 61.1 (2000): 207-228. Print.

Moretti, Franco, and Alberto Piazza. Graphs, Maps, Trees: Abstract Models for Literary History. London: Verso, 2007. Print.

Ramsay, Stephen. Reading Machines: Toward an Algorithmic Criticism. Urbana: University of Illinois Press, 2011. Print.

Rhody, Lisa M. “Topic Modeling and Figurative Language.” Journal of Digital Humanities 2.1 (Winter 2012). Web.

Sample, Mark. “Notes toward a Deformed Humanities.” samplereality.com. 2 May 2012. Web. 19 Apr. 2015.

Samuels, Lisa, and Jerome J. McGann. “Deformance and Interpretation.” New Literary History 30.1 (1999): 25–56. Print.

Schulz, Kathryn. “The Mechanic Muse: What Is Distant Reading?” The New York Times 24 June 2011. NYTimes.com. Web. 19 Apr. 2015.

Swafford, Annie. “Problems with the Syuzhet Package.” annieswafford.wordpress.com. 2 Mar. 2015. Web.

Underwood, Ted. “These are such classic history of science problems. I swear we are literally re-enacting the whole 17th century.” 29 Mar. 2015, 10:41 p.m. Tweet. 19 Apr. 2015.

Acknowledgements: Thanks to Lindsey Albracht for her help in preparing this web edition of the talk.

Comments

  1. Ben Schmidt

    Really great piece; these alternate genealogies matter enormously, because the account of DH as a proto-science forestalls far too much interesting work.

    I hate to copy-edit, but: maybe you meant to write “Andrew Goldstone,” rather than “Andrew Gladstone”?

  2. Matt (post author)

    Thanks so much, Ben, for the kind words and for the copyediting (I’ve fixed the error). I hope we will continue to see a wide range of experimental practices, and that the wideness of that range will be legible in public discourse around DH.

  3. Tom Scheinfeldt

    Great piece, Matt. It has me thinking: As you generously note, I’ve often used proto-scientific analogies, but my practice (to the extent your binary can apply to digital history as well as literature) falls mainly into the alternative lineage you describe. Not sure what that means, but I really appreciate the new framing for the more playful, performance-, arts- and design-based approaches I’ve taken to digital humanities practice in recent years.

    Thanks!

  4. Anke Finger

    Matt, thanks for a wonderful summary and convergence of these ‘trends’. As a member of the play/performance/experimental/avant-garde cohort I’ve often sidelined the hard data and science element – not always to my advantage. As you and Alan Liu (http://liu.english.ucsb.edu/theses-on-the-epistemology-of-the-digital-page/), among others, have pointed out, these are also questions of epistemology. Should we begin to debate DH and cognitive science?
