Response to Critical Infrastructure Studies Panel

The following is a response delivered at the end of the Critical Infrastructure Studies Panel, which took place at the January 2018 Modern Language Association Conference in New York City. Panelists included Tung-Hui Hu, Shannon Mattern, Tara McPherson, and James Smithies. Alan Liu and I co-organized the session.

Susan Leigh Star has made the foundational point that while we often think of infrastructure as a set of mute base layers underlying the systems we use — a group of water pipes, a rack of computer servers, a set of asphalt roads — one person’s invisible infrastructure is another person’s active focus of time, interest, and investment; as she puts it, “for a railroad engineer, the rails are not infrastructure but topic” (380). Our approach to infrastructure must therefore foreground perspective and acknowledge that infrastructure is fundamentally relational and embedded in a net of human activities and concerns.

The papers we’ve just heard make visible a set of embedded human relations around infrastructure in varying ways. Shannon Mattern’s claim that the spaces in which we store and access information mediate our understanding of that information calls attention not just to the shelves, cubbies, cases, and libraries in which we store our work, but also to the human beings designing and accessing those structures. Tung-Hui Hu, by focusing on the affective dimensions of big data, approaches the topic of infrastructure through a perspective that considers the effects it lodges in our bodies, feelings, and minds. Tara McPherson explores how social platforms can act as infrastructures that facilitate or impede the machinations of hate groups. And James Smithies points out that one key role critical infrastructure studies can play is to call attention to the scholarly infrastructures that surround us and how our own research practices intersect with them. 

As we think about what to make of these perspectives and what the implications of critical infrastructure studies might be, we might turn back to Alan Liu’s signal call for this work. In his blog post “Drafts for Against the Cultural Singularity,” taken from an in-progress book, Liu writes that infrastructure offers digital humanities practitioners a key critical possibility, a space within which DHers can “treat infrastructure . . . as a tactical medium that opens the possibility of critical infrastructure studies as a mode of cultural studies.”

For Liu, critical infrastructure studies offers a way for DH practitioners to embrace a critical form of building, one that focuses locally on the creation of scholarly infrastructures in higher education but that can, over time, share the values and practices of the academy with other areas of culture such as “business, law, medicine, government, the media, the creative industries, and NGOs.”

In my brief response today, I want to point out that by pairing critical infrastructure studies with ongoing work in DH and, importantly, with emerging work in the area of critical university studies, as Matt Applegate did the other morning here at the convention and as Erin Glass has been working on in her dissertation on the subject, we have a chance to right our own ship and to enact a form of resistance to capital within higher education that is part of the shift we’d like to see in the larger culture.

To locate this proposition within a concrete set of scholarly infrastructure initiatives, I want to talk about two related projects: the CUNY Academic Commons and the Humanities Commons.

The CUNY Academic Commons is an open-source, faculty-led academic social network established in 2009 for the 24-campus CUNY system. Built on WordPress and BuddyPress, the Commons is used for courses, faculty profiles, publications, CVs, research interest groups, and experiments. It began with no funding, but slowly gained internal funding and now is securely supported on an annual basis by the CUNY Office of Academic Affairs. In 2012, with the help of a grant from the Sloan Foundation, we released the Commons In A Box, a free software project which can be used by any institution to get a Commons site up and running, and next year, with the help of the NEH, we will be releasing the Commons In A Box OpenLab, which will help institutions set up a Commons-based teaching platform.

Soon after we released CBOX, we met with Kathleen Fitzpatrick and her team at the MLA, which soon used CBOX to create first the MLA Commons to link members of the organization, then the Humanities Commons to link members of multiple scholarly organizations, and finally Humanities CORE, an institutional repository tied to the Humanities Commons that helps academics share their scholarship, research data, and syllabi.

When examined through a perspective that combines work in DH, critical infrastructure studies, and critical university studies, these platforms can be seen to have helped establish what Christopher Kelty calls “recursive publics,” having been taken up by the communities that they were built for, and that built them. And we can see that the flourishing of these platforms represents an intervention in the enterprise-level IT purchasing practices that determine much of the technology we use in the academy. Efforts like the CUNY Academic Commons and Humanities Commons may seem in some ways small and homegrown, especially when compared to the large sums of money our universities spend on Elsevier subscriptions, but they can have large knock-on effects. The Humanities Commons, for instance, is slowly but surely helping scholars move away from what I would call the academic vulture economy — the proprietary, for-profit, corporate platforms that monetize the academic content deposited on them. And, in the wake of New York State giving 4 million dollars to the CUNY system to develop zero-cost courses and open educational resources, the CUNY Academic Commons is beginning to displace corporate OER platforms to become a pedagogical infrastructure that CUNY faculty can use to create, share, and teach with OER materials.

Coming back to Susan Leigh Star, then, and foregrounding the embeddedness of human relations around infrastructure, I want to suggest that the call for critical infrastructure studies can ultimately help us mobilize a critically informed resistance to capital and a set of building practices that move the scholarly communications infrastructure of the academy away from corporations and towards the faculty, staff, and students who can build, care for, maintain, and use it.

Works Cited

Kelty, Christopher M. Two Bits: The Cultural Significance of Free Software. Durham: Duke University Press, 2008.

Star, Susan Leigh. “The Ethnography of Infrastructure.” American Behavioral Scientist 43 (1999): 377–391.

Out of Sync: Digital Humanities and the Cloud

Matthew K. Gold

This is the text of a keynote lecture I gave at DH Congress at the University of Sheffield on September 10, 2016. I’m grateful to Michael Pidd, the University of Sheffield Humanities Research Institute, and the conference organizing committee for inviting me to speak.

* * *

In February 1884, John Ruskin delivered an address to the London Institution titled “The Storm-Cloud of the Nineteenth Century.” Ruskin began with a reference to his somewhat ominous title:

Let me first assure my audience that I have no arrière pensée in the title chosen for this lecture. I might, indeed, have meant, and it would have been only too like me to mean, any number of things by such a title;—but, tonight, I mean simply what I have said, and propose to bring to your notice a series of cloud phenomena, which, so far as I can weigh existing evidence, are peculiar to our own times; yet which have not hitherto received any special notice or description from meteorologists.

Ruskin went on, in his lecture, to do just that — to convey his thoughts on clouds based on his sketches and observations of the sky, though at least some audience members and later critics have seen embedded in his remarks a critique of encroaching industrialization. And so, in a talk focused on clouds — which Ruskin elsewhere described with much beauty and care — we see the seeds of a larger political critique.

Were I to “propose to bring to your notice a series of cloud phenomena” that are, in Ruskin’s words, “peculiar to our own times,” we would begin, most likely, not by looking up to the sky, but rather down at our cell phones. We wouldn’t discuss “storm clouds,” or even “clouds,” but rather “THE cloud,” by which we would refer to the distributed set of services, platforms, and networked infrastructures that string trellised connections between our phones, laptops, desktops, and tablets. We would point to the media systems that have caused us to consign our CDs to closets and to sign up for subscription-based music services such as Spotify and Tidal. We would point to Google drives and docs, Twitter hashtags and Facebook feeds, wifi signals, Bluetooth connections, Github repositories, files synced across DropBox, Box, and SpiderOak. Indeed — the sync — the action of connecting to and syncing with the network, of comparing our local files to those on a remote server and updating them to match — might be the signal action of the cloud-based life. We become dependent on, and interdependent with, the network — always incomplete, awaiting sync, ready to be updated. The cloud produces both security and instability, offering back-up services but keeping us always in need of updates. We look to the cloud not to see an alien sky but rather to recover parts of ourselves and to connect or reconnect to our own work.

My aim in this talk is to spend some time thinking with you about the cloud and about what it portends for the digital humanities. But it’s difficult to talk about the cloud without also talking about infrastructure, in part because of the clear ways in which cloud-based services and models are dependent upon physical conduits, things in the world, that belie the cloud’s supposedly abstract, virtual, and ineffable nature. I’d like to draw the DH community’s attention to a set of conversations that are occurring both in DH and outside of it — in the realm of media studies, and in particular, the growing area of critical infrastructure studies. In connecting these conversations, I want to encourage us to think about how DH work relates to or should relate to issues of infrastructure — particularly as these issues involve larger concerns that have been raised in the humanities about DH work around issues of instrumentalism and neoliberalism. My premise is that thinking about DH work within an infrastructural context may allow us both to focus on the work that DHers do so well — reflexive and theoretically informed building, making, and critique — and to build or rebuild that human, social, and scholarly communications infrastructure upon sturdier grounds of social justice.

It’s easy to see that an “infrastructural turn” has been growing over the past year in DH and allied fields. We can see it in evidence in the July 2016 King’s College symposium “Interrogating Infrastructure”; in the recent DH2016 panel on “Creating Feminist Infrastructure in the Digital Humanities”; in DH2016 presentations such as James Smithies’s “Full Stack DH,” which described his project to build a Virtual Research Environment on a Raspberry Pi; and in experiments such as my own team’s DH Box project. In media studies, we see this infrastructural turn in recent publications such as Tung-Hui Hu’s A Prehistory of the Cloud and Nicole Starosielski’s The Undersea Network; in a renewed focus, generally, on the material nature of computers; in the growth of the field of media archaeology; in calls across the academy to pay more attention to the social and political contexts of digital work; and in efforts to recover the diverse histories of early computing. This work encourages us to take account of computational work as enmeshed in the different levels of the “stack,” Benjamin H. Bratton’s term for the set of infrastructures — media infrastructures, data infrastructures, political infrastructures, physical infrastructures, and legal infrastructures — that have accreted over time into an accidental whole through what Bratton calls “planetary scale computing.” As more and more DH work moves to the cloud and becomes dependent on networked infrastructure, thinking about the protocols, dependencies, and inter-dependencies of the Stack can help us fruitfully shape our work as it relates both to allied scholarly disciplines and to larger publics.

What role should the digital humanities play in conversations about infrastructure? What particular insights does work in the field have to contribute to them? And to what extent are DHers already doing the work of infrastructure in the academy broadly, and the humanities more specifically? I will argue in this talk that DHers should engage the Cloud and its associated infrastructures critically, thinking about how the emergence of the Cloud, even as it makes possible new forms of networked connection, also foregrounds multiple risks. It’s my belief, as I’ll detail later on in the talk, that DH should step back and re-consider its use of proprietary social networks, and that it should focus on building alternate forms of scholarly publishing and communication infrastructure that help move us away from proprietary networks where every interaction is always already commodified and where the network effect all too often puts marginalized populations at risk.

In his opening remarks at the “Interrogating Infrastructure” event — an event I did not attend, but which I have at least some sense of thanks to his online notes — Alan Liu positioned the topic of infrastructure as a key future direction for the digital humanities. He argued not just that the topic was well-suited to the field, but that it was one which DH was well-positioned to address. Infrastructure, as the set of social and technological systems undergirding many aspects of networked modern life, for Liu, has the “potential to give us the same general purchase on social complexity that Stuart Hall, Raymond Williams, and others sought when they reached for their all-purpose interpretive and critical word, ‘culture.’”

I think Liu is right that DHers can and should pay increased attention to issues of infrastructure and the effects of that infrastructure on the larger communicative and meaning-making networks of contemporary society. And clearly, much work on infrastructure is already blending scholarship in new media studies, science and technology studies, and the digital humanities. I think here of Matthew Kirschenbaum’s work on forensic materiality and software platforms, Lori Emerson’s work on interfaces, Jentery Sayers’s work on prototyping the past, Jussi Parikka’s work on media archaeology, Simone Browne’s work on surveillance networks and race, and Kari Kraus’s work on speculative design. All of these scholars are already exploring the intersections of infrastructure, platform/material studies, design, and new media.

The past year has been notable within the emerging field of infrastructure studies, as scholars in the fields of new media studies and science and technology studies have published a range of books that put the infrastructure of the Cloud into theoretical and infrastructural contexts. Across four of those books — Nicole Starosielski’s The Undersea Network; Tung-Hui Hu’s A Prehistory of the Cloud; John Durham Peters’s The Marvelous Clouds; and Benjamin H. Bratton’s The Stack — we see a range of approaches:

  • An examination of the physical infrastructure underlying virtual networks. Starosielski examines the undersea cables that continue to carry much internet traffic, while Hu looks at how fiber-optic network infrastructure has been “grafted” onto America’s aging railroad track system. In both cases, we see attention paid to the physical infrastructures of the internet that are often overlooked, if not purposefully hidden.
  • An exploration of how power plays across networked interfaces, infrastructures, and protocols — and particularly how the traditional laws of the nation-state become confused and overwritten across the liminal space of the web. Starosielski looks at cable stations across the Pacific, exploring past and present effects of colonized states; Hu examines what he calls the “sovereignty of data,” exploring how we “invest the cloud’s technology with cultural fantasies about security and participation.” And Bratton delineates, as I’ve noted earlier, the various layers of what he calls “the Stack.” In Bratton’s view, computation itself has become a “global infrastructure [that] contributes to an ungluing and delamination of land, governance, and territory, one from the other” (14).
  • A connection, drawn by Peters through what he calls “infrastructuralism,” of computing technologies and the environment. This involves partly a consideration of the effect of computing technology on the environment — what Bratton calls “the ponderous heaviness of Cloud computing” — and partly, through Peters’s book, a consideration of media as environment, as space and place through which we move.

Across all of these works, we see concerns over issues of power, capital and surveillance; the physical and commercial structures through which the phenomenon we refer to as “the network” is built; and the growing sense in which media and networked infrastructures have become constitutive of much of our experience in the world.

The cloud is blurring lines and connecting us in ways that have reshaped conventional boundaries. For instance, as Bratton considers issues of sovereignty, citizenship, the polis, and the network, he ponders the dividing lines between “citizen” and “user,” between subject and state, wondering whether the network itself provides for new understandings of citizenship. He asks:

What if effective citizenship in a polity were granted not according to categorical juridical identity but as a shifting status derived from any user’s generic relationship to the machine systems that bind that polity to itself? In other words, if the interfaces of the city address everyone as a “user,” then perhaps one’s status as a user is what really counts. The right to address and be addressed by the polity would be understood as some shared and portable relationship to common infrastructure. Properly scaled and codified, this by itself would be a significant (if also accidental) accomplishment of ubiquitous computing. From this perhaps we see less the articulation of citizenship for any one city, enclosed behind its walls, but of a “citizen” (Is that even still the right word?) of the global aggregate urban condition, a “citizen-user” of the vast, discontiguous city that striates Earth, built not only of buildings and roads but also perplexing grids and dense, fast data archipelagos. Could this aggregate “city” wrapping the planet serve as the condition, the grounded legitimate referent, from which another, more plasmic, universal suffrage can be derived and designed? Could this composite city-machine, based on the terms of mobility and immobility, a public ethics of energy and electrons, and unforeseeable manifestations of data sovereignty . . . provide for some kind of ambient homeland? If so, for whom and for what? (10, emphasis added)

The questions seething through Bratton’s book — especially those around citizenship, subjectivity, and participation in the techno-sphere — embody the kinds of questions DHers might ask of infrastructure. As enormous forces of capital and computation engender new networked publics around us, to what extent are those publics built on the grounds of equity and social justice? As DHers participate in these new cloud polities, to what extent are we asking Bratton’s question, “for whom and for what,” as we do our work?

DH has always been wildly various and multivalent, and its practices and methods range widely (some see this as a feature; others, as a bug. Count me on the side of those who appreciate DH’s capacious frame). The increasing prevalence of the cloud in our lives and works offers us a chance to intervene in the systems of media and communication developing around us. We can and should ask where and how DH insight might best contribute to scholarly conversations around infrastructure.

One possibility is the work on large-scale text, sound, and image corpora that many DHers — Franco Moretti, Ted Underwood, Tim Hitchcock, Andrew Piper, Richard Jean So, Mark Algee-Hewitt, Matthew Jockers, Tanya Clement, Lev Manovich, and many others — have been pursuing, often through larger infrastructural platforms such as the HathiTrust. Surely, this work involves issues of the cloud, infrastructure, and culture, and surely it builds on methods central to, and perhaps in some ways unique to, DH work. DHers excel at contributing to and taking advantage of this kind of networked infrastructure for scholarly work — look at how a set of national and international scholarly infrastructure projects, such as DARIAH-EU, Compute Canada, HathiTrust, and Europeana Research, are helping DH researchers do their work at scale and also to participate in larger public conversations.

These platforms, and this type of infrastructural work, are important. And they may be the answer for DH as it thinks about cloud-based infrastructure. But — aside from the fact that, as I have argued elsewhere, large-scale text mining too often stands in the public mind as a synecdoche of what DH is and should be — this kind of infrastructural work is sometimes hampered by the complex rights issues that attend cultural heritage materials, and these platforms often have a somewhat problematic relationship to access, offering member institutions one set or quality of resources, and offering the public another. Such platforms often embrace a stance of political neutrality that may be inadequate to the increasing complexity of the cloud. And so — perhaps for those reasons, and perhaps because of the direction of my own work — I’d like to consider other possibilities for DH in the cloud, as well.

Earlier, I noted the action of the “sync” — the point where the user connects to a cloud-based service such as DropBox, Gmail, iTunes, or Google Docs to upload and download files — as the quintessential act of the cloud. As DHers, we know and recognize that these systems do much more than update files — they check us in with that vast global network of users, update terms of agreement, and provide companies with a chance to flag illegal downloads. The sync is as much an act of corporate surveillance as it is an act of routine file maintenance.

As DHers, we know this and to some degree accept this in the same way that we know and accept our participation in proprietary networks like Twitter. It seems at times a cost of living in a cloud-based world.

But when we think about what DH is and what it can be, and of how it might relate to the cloud, we might consider that DHers are, among academics, perhaps best suited to reshape the nature of academic research itself. This work — often described as scholarly communication — has focused on the creation of new publishing interfaces and platforms; on the extension of humanities work to include alternatives to text-based argument; on the use of social media and blogging platforms to share in-process scholarship in public ways; on the consideration of collaborative work in the humanities; and on a reconsideration of what scholarship itself is and should be.

Perhaps the great work of DH is to envision alternate infrastructures for technical and scholarly work that help divorce us from systems of entrenched capital, that help move us away from our shared dependence on the set of proprietary service platforms — Twitter, Facebook, Github, Slack — that have dominated scholarly communication in the humanities (and digital humanities) to date, and to recognize this shared dependence as such. Perhaps a central mission of DH is to build alternate infrastructures that are premised upon social and political understandings of the cloud, as articulated at least in part through scholarship in new media and science and technology studies.

This kind of work could help us address one of the most ironic gestures we see in current critiques of DH: harsh, outraged attacks on the supposed “neoliberalism” of DH, delivered by scholars through commercial proprietary platforms like Facebook and Twitter, or through online publication venues that use clickbait-y headlines to capture page views in the attention economy. It’s hard to see how the platform of delivery of those critiques does not detract at least a bit from their bite.

And yet if we think we are immune from this problem ourselves, we are wrong — this is an issue that affects not just new-media scholars or conservative humanists; it is undeniably present in the digital humanities community, as well. Yes, we have Humanist-L, DH Q&A, personal academic blogs, and multiple scholarly journals that we use to share work in the field. Yes, we are building new venues for open-access publishing such as The Open Library of the Humanities. Yes, we are building out institutional and inter-institutional methods of conversation and connection such as MLA Commons and MLA CORE, not to mention institution-specific repositories.

But DHers also participate actively and enthusiastically in Facebook, Twitter, Instagram, and Slack, among others. Twitter has, for many, become the de facto meeting ground of the field. And there is an undeniable good here: a strong DH presence on these platforms has enabled DHers to share their work with larger publics. But these platforms also represent a missed opportunity for scholarly communication and a regrettable participation in the larger systems of capital accumulation that DH could potentially resist.

In “The Scandal of Digital Humanities,” a response to “Neoliberal Tools (and Archives): A Political History of Digital Humanities” (published by Daniel Allington, Sarah Brouillette, and David Golumbia in the L.A. Review of Books), Brian Greenspan argues that digital humanities work is fundamentally aligned against the “strictly economic logic” of neoliberalism; he notes that much DH work resists the “pressure to commercialize” and in fact involves “either detourning commercial tools and products for scholarly purposes, or building Open Access archives, databases and platforms.” Greenspan remarks sardonically that this is why so many DH projects are “so often broken, unworking or unfinished, and far from anything ‘immediately usable by industry.’”

DH work, as Alan Liu and others have argued, presents to the academy a mode of engagement between the humanities and computational methods and tools that is self-reflexive and empowering. Building on that notion, Greenspan argues that:

DH involves close scrutiny of the affordances and constraints that govern most scholarly work today, whether they’re technical (relating to media, networks, platforms, interfaces, codes and databases), social (involving collaboration, authorial capital, copyright and IP, censorship and firewalls, viral memes, the idea of “the book,” audiences, literacies and competencies), or labour-related (emphasizing the often hidden work of students, librarians and archivists, programmers, techies, RAs, TAs and alt-ac workers).

“If anything,” Greenspan notes, “DH is guilty of making all too visible the dirty gears that drive the scholarly machine, along with the mechanic’s maintenance bill. . . . And that’s precisely its unique and urgent potential: by providing the possibility of apprehending these mechanisms fully, DH takes the first steps toward a genuinely materialist and radical critique of scholarship in the 21st century.”

To the extent that this critique can be baked into the building that digital humanists do — and I do think that that is one of the key aims of DH as a field and practice, particularly in the age of the cloud — Greenspan helps us see DH’s potential contribution to questions of infrastructure. Digital humanities work can indeed help us reposition the infrastructure of scholarship away from the formations we now have in place and towards a more purposeful and more resistant digital humanism, one grounded not just in non-commercial practices, but in anti-commercial practices. In this way, the strength of DH, its ability to peek into the black box of technological platforms, can help the academy as it faces the onslaught of techno-capital from all sides. The need for this kind of work is urgent, as the drumbeat of constant reductions in state funding, certainly felt here in England but also in the U.S., forces institutions to adopt austerity measures of various kinds.

To some degree, DH is already doing this kind of work — and I don’t want to erase the important contributions of these projects by failing to mention them. For example, we can look at a range of representative projects — Zotero, which has provided an effective alternative to costly bibliographic software; Omeka, which has created an easy way to present cultural heritage objects; Mukurtu, which is designed specifically to take account of diverse cultural attitudes towards the sharing of heritage materials; Scalar, which encourages multi-modal and non-linear argument; Domain of One’s Own, which helps students familiarize themselves with hosting infrastructure and take some measure of control over their online persona; and a few of the projects I have been involved in — Manifold Scholarship, which is creating a new open-source platform for interactive scholarly monographs; Commons In A Box, which provides a free platform for academic social networks; DH Box, which opens DH computing software to communities without technical DH infrastructure; and Social Paper, which is planting seeds that may one day help us move away from Google Docs. Across all of these projects are the beginnings of an infrastructure for shared scholarly work that offers alternatives to commercial environments and platforms.

And yet, as Miriam Posner has noted, “the digital humanities [still] actually borrows a lot of its infrastructure . . . from models developed for business applications.” For many, the mere fact that DH involves the kind of technical training that may be very much in line with marketplace demands is evidence of its complicity with the forces of neoliberalism in the academy. How can we ensure that the infrastructure DH builds is self-reflexive infrastructure for scholarly practice and communication; that its builders ask themselves Bratton’s question — for whom and for what — at every turn; and that it foregrounds humanistic research questions and resists the persistent encroachment of capital into higher education?

I don’t have answers, but I can suggest starting points:

  • We need a re-articulation of DH technical practice as an essentially reflexive endeavor. DHers tend to approach technological systems by seeking to understand them, to historicize them, to unpack the computational and ethical logics that structure them. This gives DHers a good starting place for building out more ethical tools for scholarly communication. But we need to make this case more powerfully to the public.
  • We need open and robust conversations about inclusive practices. As recent years have shown, DHers need to pay careful attention to the make-up of their own projects and conferences, seeking to counter the forces of structural racism and gender bias. We might move this conversation forward by consciously seeking to expand our project teams and ensure that our projects engage issues of diversity and difference.
  • We should expand our notions of what we mean by infrastructure, following Jacqueline Wernimont’s argument at DH2016 in her talk “Feminist Infrastructure as Metamorphic Infrastructure.” There, Jacque described a concept of feminist infrastructure that commits to people, that is built upon relational accountability, that embeds ideals of collaboration, collectivity, and care, and that, as Jacque notes about FemTechNet’s charter, foregrounds pedagogies that are anti-racist, queer, decolonizing, trans-feminist, and focused on civil rights.
  • We should continue to build infrastructures and infrastructural conversations that encourage the growth of global DH; Alex Gil’s minimal computing is a wonderful example of this, in that it is an infrastructural philosophy and set of technological platforms — such as Ed, his Jekyll theme designed to produce minimal textual editions. The Gacaca Community Justice archive that we heard about from Marilyn Deegan on Thursday is another wonderful example of this.
  • We should speak more about, and continue to think through, the kind of education and training that many of us provide for our colleagues and students at our universities, and situate that work within the context of critical pedagogy, ensuring that when we teach our students, we do so by emphasizing humanities values. Our students need to use DH methods to explore and explicate ambiguity rather than to flatten it. I think we do this already, but our academic colleagues sometimes miss this point.
  • We should take seriously the proposition put forward by Geoffrey Rockwell and Stephen Ramsay that for digital projects to be taken seriously, they have to make arguments. And to the extent that they can make arguments in their conception and function, they will help explain what DH is and can be.

And that, I think, is the challenge for DH infrastructure: it needs to make an argument, and it needs to make an argument through its projects, as Tara McPherson argued in 2010. Many of the projects I mentioned above do just that — think of Alex Gil’s minimal computing, of the way it embeds an argument about access and infrastructure into its codebase. Think of James Smithies’s attempt to build a virtual research environment on an inexpensive Raspberry Pi. Consider the DH Box team’s efforts to make DH tools available to institutions that don’t have reliable networked infrastructure. Consider how Commons In A Box offers academics an alternative to Facebook, and how it has been used by scholarly associations such as the Modern Language Association to build out alternatives to corporate platforms.

There are limits, of course. DH Box, though it is available as free software, currently runs on Amazon Web Services. Domain of One’s Own is similarly a project that is ultimately based on commercial web hosting space. As Tung-Hui Hu reminds us, network infrastructure is often literally laid on top of older commercial infrastructure. It’s hard to live completely off the commercial grid, to live on the bare wires of the network – especially if we want to be involved in larger public conversations. The cloud calls to us to sync with it, and that call is hard to resist.

And there are other challenges. Free software communities, at least those in the US with which I am most familiar, are dominated by white males and are not always welcoming to women and minorities (something that I think and hope is changing through organizations such as PyLadies and Black Girls Code).

And the work is painful. We are using Twitter, and not Diaspora, for a reason. The slick, seductive surfaces, the smooth user interfaces of commercial social media platforms are not just hard to resist — they are where the conversations are happening. Removing ourselves from those platforms would cost DHers exposure and, were more academics to follow, would risk moving academic discourse even farther from the public sphere than it already is.

But as Eben Moglen pointed out in his talk “Freedom in the Cloud” — the talk that inspired a group of NYU undergraduates to create the decentralized social network Diaspora — when we use Gmail, it comes with the “free” service of semantic email analysis by Google, for Google; when we get free email and document storage, we get a “free” service that is “spying all the time”; and location-based tweets may be used to squash protest. We know this – everyone knows this – but we could do more to combat the force of the commercial cloud.

DH can and will be useful to the humanities and to the academy. But it has the opportunity to consider what the next generation of scholarly communication platforms is and can be. It has the opportunity, and perhaps the responsibility, to approach questions of infrastructure with political and social contexts in mind — to consider, for instance, how its infrastructure can be modeled, to use language from Elizabeth Losh, Jacqueline Wernimont, Laura Wexler, and Hong-An Wu, upon feminist values, embracing “genuinely messy, heterogeneous, and contentious pluralism” in its design. Or, to return to the cloud, to offer us new visions of what it means to sync with the cloud and with the world. DH can and perhaps should be a primary force for resisting the entrance of capital into the ecosystem of educational institutions, by insisting upon critical engagements with commercial technologies. We can and must interrupt the sync.

Resisting the smooth services of the corporate web — building tools, platforms, and communities that embrace core humanities values of discourse, dialogue, inclusivity, and intellectual exchange — perhaps represents another side of what Miriam Posner has called the “radical, unrealized potential of the digital humanities.”

Were we to engage in that work — and I think we are already doing it, just not as purposefully and mindfully as we might — we would in fact have made a significant contribution to the world and would perhaps help dissipate the storm clouds of our times.

* * *

I’m grateful to Lauren F. Klein, Kari Kraus, and Brian Croxall for their comments on an earlier draft of this paper.

Facts, Patterns, Methods, Meaning: Public Knowledge Building in the Digital Humanities

Note: This talk was delivered as a keynote address at the University of Wisconsin – Madison as part of the Digital Humanities Plus Art: Going Public symposium on April 17, 2015. I am grateful to the organizers of the symposium for generously hosting me and occasioning these thoughts.


Things have a way of coming full circle, of beginning where they have ended, and so I want to start today with Ralph Waldo Emerson, a man who thought about beginnings and endings, circles and forms. “The eye is the first circle,” he wrote. “The horizon which it forms is the second; and throughout nature this primary figure is repeated without end” (“Circles”).

Circles select and enfold, but also exclude, demarcating a perimeter, an in and an out. “The eye is the first circle”; it is the lens through which those of us lucky enough to have eyesight are able to perceive the world. And yet the eye both makes sight possible and is bounded by a second circle formed by the horizon of our vision, a circle that discloses both constraints and possibilities. We know that the landscape extends beyond the visible horizon, but we are limited by our own perceptions, even as they make possible all that we know. And this figure, this double act of knowing and unknowing, seeing and unseeing, taking in possibilities and limits in the same glance, is the mark of our experience in the world. There is always more to learn, always more outside the reach of our knowledge, always more beyond the edge of our sight.

Emerson teaches us to be humble in the face of such knowledge. “Every action,” he writes, “admits of being outdone. Our life is an apprenticeship to the truth, that around every circle another can be drawn; that there is no end in nature, but every end is a beginning; that there is always another dawn risen on mid-noon, and under every deep a lower deep opens.”

Perhaps it is telling that Emerson’s figure here involves a depth to be plumbed rather than a height to be climbed. For at this moment in the development of the digital humanities, we are pursuing new paths to knowledge, extending the horizons of our abilities with new tools. This is, obviously, not a teleological or progressive journey. We have exciting new tools and provocative new methods, but they are not necessarily leading us to higher truths. We are not marching along straight and ever-improving lines of progress. But we are producing tools that conform to new directions in our thought, and those tools can usefully reset our perspectives, helping us look with new sight on things we thought we understood. They can give us new vantage points and new angles from which we can explore the depths around us. And, of course, even as they make new sights possible, we remember Emerson and note that they foreclose, or at least obscure, others.

Emerson’s aphorisms provide useful reminders both for digital humanists surveying the field and for scholars observing it from its horizons. In today’s talk, I want to think through states of knowing in the digital humanities, situating our practices within larger histories of knowledge production. My talk has three parts:

  1. A discussion of a few approaches to text analysis and their relation to larger perceptions about what DH is and does, and how DH knowledge is produced;
  2. A discussion of some practitioners who are blending these approaches in provocative ways;
  3. A wider view of experimental knowledge in DH, with the suggestion of a new grounding, based in the arts, for future DH public work.

I want to start by discussing current directions of DH research, and in particular to spend some time poking a bit at one of the most influential and vibrant areas of DH — literary text analysis, the type of DH research that most often stands in as a synecdoche for the larger whole of the digital humanities. I do this despite the fact that focusing on literary text analysis risks occluding the many other rich areas of digital humanities work, including geospatial mapping and analysis, data visualization, text encoding and scholarly editing, digital archives and preservation, digital forensics, networked rhetoric, digital pedagogy, advanced processing of image, video, and audio files, and 3D modeling and fabrication, among others. And I should note that I do this despite the fact that my own DH work does not center on the area of literary text analysis.

One reason to focus on text analysis today is that when we talk about DH methods and DH work in the public sphere, literary text analysis of large corpora in particular is over-identified with DH, especially in the popular press, but also in the academy. There, we often see large-scale text analysis work clothed in the rhetoric of discovery, with DHers described as daring adventurers scaling the cliffs of computational heights. A 2011 New York Times book review of a Stanford Literary Lab pamphlet described, tongue-in-cheek, Franco Moretti’s supposed attempt to appear “now as literature’s Linnaeus (taxonomizing a vast new trove of data), now as Vesalius (exposing its essential skeleton), now as Galileo (revealing and reordering the universe of books), now as Darwin (seeking ‘a law of literary evolution’)” (Schulz). All that’s missing, it would seem, is mention of an Indiana-Jones-esque beaten fedora.

If literary text mining operates as a kind of DH imaginary in popular discourse around the field, one point I want to make today is that it is an impoverished version of text analysis, or at the very least a one-sided and incomplete one. As a way of complicating that picture, I want to sketch out two prominent areas of research in DH literary text analysis, one premised (not always, but often) upon scientific principles of experimentation, using analysis of large-scale textual corpora to uncover previously unknown, invisible, or under-remarked-upon patterns in texts across broad swaths of time. Known colloquially and collectively through Franco Moretti’s term “distant reading,” Matthew Jockers’s term “macroanalysis,” or Jean-Baptiste Michel and Erez Lieberman Aiden’s term “culturomics,” this approach is predicated on an encounter with texts at scale. As Moretti noted in his essay “The Slaughterhouse of Literature,” describing the move towards distant reading:

Knowing two hundred novels is already difficult. Twenty thousand? How can we do it, what does “knowledge” mean, in this new scenario? One thing for sure: it cannot mean the very close reading of very few texts—secularized theology, really (“canon”!)—that has radiated from the cheerful town of New Haven over the whole field of literary studies. A larger literary history requires other skills: sampling; statistics; work with series, titles, concordances, incipits. (208-209)

This is knowledge work at a new scale, work that requires, as Moretti notes, quantitative tools of analysis.

Opposed to this, though less often discussed, is a different form of DH work, one based not on an empirical search for facts and patterns, but rather on the deliberate mangling of those very facts and patterns, a conscious interference with the computational artifact, a mode of investigation based not on hypothesis and experiment in search of proof but rather on deformance, alteration, randomness, and play. This form of DH research aims to align computational research with humanistic principles with a goal not of unearthing facts, but rather of undermining assumptions, laying bare the social, political, historical, computational, and literary constructs that underlie digital texts. And sometimes, it simply aims to highlight the profound oddities of digital textuality. This work, which has been carried on for decades by scholar-practitioners such as Jerome McGann, Johanna Drucker, Bethany Nowviskie, Stephen Ramsay, and Mark Sample, has been called by many names — McGann terms it deformative criticism, Johanna Drucker and Bethany Nowviskie call it speculative computing, and Steve Ramsay calls it “algorithmic criticism” — and though there are minor differences among these conceptions, they represent as a whole a form of DH that, while well-known and respected within DH circles, is not acknowledged frequently enough outside of them, especially in the depictions of DH that we see in the popular press or the caricatures we see of DH in Twitter flame wars. It is especially unseen, I would suggest, in the academy itself, where scholars hostile to DH work tend to miss the implications of deformative textual analysis, focusing their ire on that side of quantitative literary analysis that seeks most strongly to align itself with scientific practices.

I’ve set up a rough binary here, and it’s one I will complicate in multiple ways. But before I do, I want to walk through some parts of the lineage of each of these areas as a way of grounding today’s conversation.

Digital humanities work in large-scale text analysis of course has roots in longstanding areas of humanities computing and computational linguistics. But it was given profound inspiration in 2005 with the publication of Franco Moretti’s Graphs, Maps, Trees, a text that argued for a new approach to textual analysis called “distant reading,” in which “distance is […] not an obstacle, but a specific form of knowledge” (1). Moretti’s work of this period has a wonderful, suggestive style, a style imbued with possibility and play, a style full of posed but unanswered questions. The graphs, maps, and trees of his title proposed various models for the life cycles of literary texts; the book contains strong statements about the need for the kind of work it does, but it also resists conclusions and does not overly stockpile evidence in support of its claims. As Moretti himself put it, addressing the “conceptual eclecticism” of his work, “opening new conceptual possibilities seemed more important than justifying them in every detail.” This was a work of scholarship meant to inspire and provoke, not to present proofs.

Eight years later, in 2013, Matthew Jockers, one of Moretti’s colleagues at Stanford who had by then moved on to a professorship at the University of Nebraska, published Macroanalysis: Digital Methods and Literary History, a text that employed a different register to present its claims, beginning with chapter 1, which is titled “Revolution.” In Jockers’s text, we see a hardening of Moretti’s register, a tightening up and sharpening of the meandering suggestiveness that characterized Moretti’s writing. Where Moretti’s slim Graphs, Maps, Trees was elliptical and suggestive, Jockers’s Macroanalysis was more pointed, seeking to marshal strong evidence in support of its claims. In the book, Jockers suggests that literary studies should follow scientific models of evidence, testing, and proof; he writes, “The conclusions we reach as literary scholars are rarely ‘testable’ in the way that scientific conclusions are testable. And the conclusions we reach as literary scholars are rarely ‘repeatable’ in the way that scientific experiments are repeatable” (6). Clearly, this is a problem for Jockers; he argues that literary scholars must engage the “massive digital corpora [that] offer us unprecedented access to the literary record and invite, even demand, a new type of evidence gathering and meaning making” (8). And as he continues, he deploys a remarkable metaphor:

Today’s student of literature must be adept at reading and gathering evidence from individual texts and equally adept at accessing and mining digital-text repositories. And mining here really is the key word in context. Literary scholars must learn to go beyond search. In search, we go after a single nugget, carefully panning in the river of prose. At the risk of giving offense to the environmentalists, what is needed now is the literary equivalent of open-pit mining or hydraulicking. . . . the sheer amount of data makes search ineffectual as a means of evidence gathering. Close reading, digital searching, will continue to reveal nuggets, while the deeper veins lie buried beneath the mass of gravel layered above. What are required are methods for aggregating and making sense out of both the nuggets and the tailings. . . . More interesting, more exciting, than panning for nuggets in digital archives is to go beyond the pan and exploit the trommel of computation to process, condense, deform, and analyze the deeper strata from which these nuggets were born, to unearth, for the first time, what these corpora *really* contain. (9-10; emphasis mine)

Even forgiving Jockers some amount of poetic license, this is a really remarkable extended metaphor, one that figures the process of computational literary work as a strip-mining operation that rips out layers of rock and soil to reach the rich mineral strata of meaning below, which are then presumably extracted in systematic fashion until the mine is emptied of value, its natural resources depleted. One doesn’t need to be an environmentalist to be a bit uneasy about such a scenario.

What’s really notable to me here, though, is the immense pressure this passage reveals. And I refer not to the pressure Jockers’s computational drills are exerting on the pastoral literary landscape, but rather to what his rhetoric reveals about the increasing pressure on DH researchers to find, present, and demonstrate results. Clearly, between Moretti’s 2005 preliminary thought experiments and Jockers’s 2013 strip-mining expedition, the ground had shifted.

In his 2010 blog post “Where’s the Beef? Does Digital Humanities Have to Answer Questions?” digital historian Tom Scheinfeldt compares the current moment in the digital humanities to eighteenth-century work in natural philosophy, when experiments with microscopes, air pumps, and electrical machines were, at first, perceived as nothing more than parlor tricks before they were revealed as useful in what we would now call scientific experimentation. Scheinfeldt writes:

Sometimes new tools are built to answer pre-existing questions. Sometimes, as in the case of Hauksbee’s electrical machine, new questions and answers are the byproduct of the creation of new tools. Sometimes it takes a while; in the meantime, tools themselves and the whiz-bang effects they produce must be the focus of scholarly attention.

Eventually digital humanities must make arguments. It has to answer questions. But yet? Like 18th century natural philosophers confronted with a deluge of strange new tools like microscopes, air pumps, and electrical machines, maybe we need time to articulate our digital apparatus, to produce new phenomena that we can neither anticipate nor explain immediately.

One can see what Scheinfeldt describes clearly in Moretti’s work: a sense of wonder, showmanship, and play in the new perspectives that computational methods have uncovered. In Jockers, we see a more focused, precise, scientifically oriented apparatus focused on testable, repeatable results. Jockers and Moretti are hardly the only DHers exploring large datasets — practitioners such as Ted Underwood, Andrew Goldstone, Andrew Piper, Tanya Clement, Lisa Rhody, and Ben Schmidt, among many others, come to mind, each engaging such work in fascinating ways — but Moretti and Jockers (and their labs) may stand in for a larger group of scholars using similar methods to explore patterns in massive groups of texts.

I’ve said that I would describe two discrete areas of DH literary text analysis work. Having outlined what I would characterize as the area of the field proceeding on proto-scientific assumptions, I would now like to turn to a group of DH thinkers who, while occasionally using similar tools, are focused on forms of computational literary analysis that in many ways take a diametrically opposed path to the digital text by seeking to disrupt and play with the structures of the text.

In their 1999 piece published in New Literary History, Jerome McGann and Lisa Samuels outline their concept of “deformative criticism,” a hermeneutic approach to digital textuality that, rather than seeking to discover the underlying structure of texts through exposition, seeks to “expose the poem’s possibilities of meaning” through techniques such as reading backward and otherwise altering and rearranging the sequencing of words in a text. “Deformative” moves such as these, McGann and Samuels argue, “reinvestigate the terms in which critical commentary will be undertaken” (116). Many critics working in this vein argue that all interpretative readings are deformative, reformulating texts in the process of interpreting them.
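The “reading backward” move is concrete enough to sketch in code. The following is my own toy illustration in Python, not McGann and Samuels’s apparatus; the function names and the sample line are purely illustrative:

```python
def read_backward(text: str) -> str:
    """Deform a text by reversing the order of its words --
    the 'reading backward' move described by McGann and Samuels."""
    return " ".join(reversed(text.split()))

def isolate(text: str, keep) -> str:
    """Another deformance in the same spirit: strip a text down to
    only the words that satisfy some predicate, exposing a skeleton."""
    return " ".join(word for word in text.split() if keep(word))

# A deformed reading rearranges rather than explicates:
line = "Because I could not stop for Death"
print(read_backward(line))  # → "Death for stop not could I Because"
```

The point of such a routine is not to recover a hidden meaning but to estrange the text, reinvestigating, as McGann and Samuels put it, “the terms in which critical commentary will be undertaken.”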

In her work, Johanna Drucker has collaborated with Bethany Nowviskie and others to explore what she terms “speculative computing,” which is “driven by a commitment to interpretation-as-deformance in a tradition that has its roots in parody, play, and critical methods such as those of the Situationist International, Oulipo, and the longer tradition of ‘pataphysics with its emphasis on ‘the particular’ over ‘the general'” (>>>>PG#). Drucker goes on to differentiate speculative computing from quantitative processes based on “standard, repeatable, mathematical and logical procedures” by exploring “patacritical” methods, which privilege exceptions to rules and deviations from norms. Speculative computing, according to Drucker, “lets go of the positivist underpinnings of the Anglo-analytic mode of epistemological inquiry,” creating imaginary solutions that suggest generative possibilities rather than answers. Drucker writes:

Humanistic research takes the approach that a thesis is an instrument for exposing what one doesn’t know. The ‘patacritical concept of imaginary solutions isn’t an act of make-believe but an epistemological move, much closer to the making-strange of the early-twentieth century avant-garde. It forces a reconceptualization of premises and parameters, not a reassessment of means and outcomes. (SpecLab 27)

Drucker frames her approach in opposition to the rationalized, positivistic assumptions of the scientific method, embracing instead randomness and play. This is also the approach that Stephen Ramsay takes in his book Reading Machines, arguing for what he terms “algorithmic criticism.” Ramsay writes that “[text analysis] must endeavor to assist the critic in the unfolding of interpretative possibilities” (10). Whereas Drucker seeks everywhere to undermine the positivist underpinnings of digital tools, creating not “digital tools in humanities contexts,” but rather “humanities tools in digital contexts” (SpecLab 25), Ramsay argues that “the narrowing constraints of computational logic–the irreducible tendency of the computer toward enumeration, measurement, and verification–is fully compatible” with a criticism that seeks to “employ conjecture . . . in order that the matter might become richer, deeper, and ever more complicated” (16). Because the algorithmic critic navigates the productive constraints of code to create the “deformative machine” from which she draws insights, the “hermeneutics of ‘what is’ becomes mingled with the hermeneutics of ‘how to’” (63).

And Mark Sample, in his “Notes Toward a Deformed Humanities,” proposes the act of deformance, of breaking things, as a creative-critical intervention, one that is premised on breaking things as a way of knowing. Sample’s projects — which include Hacking the Accident, an Oulipo-inspired version of the edited collection Hacking the Academy, and Disembargo, a project that reveals Sample’s dissertation as it “emerg[es] from a self-imposed six-year embargo, one letter at a time,” as well as a host of Twitter bots that mash together a variety of literary and information sources — all demonstrate an inspired focus on interpretation as performed by creative computational expression.

I’ve discussed two major approaches to literary text analysis today — likely not without some reductive description — but I would like to turn now to the conference theme of “Going Public,” as each of these approaches takes up that theme in different ways, using platforms, methods, and models to foster more open and public DH communication.

Deformative work is often public in its very performance – witness Mark Sample’s Twitter bots and generative texts, which operate in real time, interact with the public, and at times even form themselves in response to public speech.

Text mining scholars, with their focus on exploration, discovery, proof, and tool development, are admirably public in sharing evidence and code; just a few months ago, we witnessed one of the most fascinating controversies of recent years in DH, as DH scholar Annie Swafford raised questions about Matthew Jockers’s tool Syuzhet. Jockers had set out to build on Kurt Vonnegut’s lecture “The Shapes of Stories”; there, Vonnegut sketched what he described as the basic shapes of a number of essential story plots; following the arc of the main character’s fortunes, he suggested, we could discern a number of basic plot structures used repeatedly in various works of fiction, such as “Man in Hole,” “Boy Meets Girl,” and “From Bad to Worse.”

Jockers’s blog post described his use of Syuzhet, a package he wrote for the statistical software R, and which he also released publicly on Github. Because the code was available and public, Swafford was able to download it and experiment with it; she charged that the tool had major faults, and the ensuing discussion led to some sharp disagreements about the tool itself and Jockers’s findings.
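Syuzhet itself is an R package, but the general technique it implements can be sketched briefly: score each sentence against a sentiment lexicon, then smooth the resulting series into a plot arc. The following Python sketch is my own toy version, with a hypothetical six-word lexicon; the real package uses full sentiment lexicons and several smoothing methods, and it was precisely the choice of smoothing that Swafford questioned:

```python
# Toy sentiment lexicon -- purely illustrative, not Syuzhet's.
LEXICON = {"love": 1, "joy": 1, "hope": 1,
           "death": -1, "grief": -1, "fear": -1}

def sentence_score(sentence: str) -> int:
    # Sum the lexicon values of each word (punctuation stripped).
    return sum(LEXICON.get(word.strip(".,!?;").lower(), 0)
               for word in sentence.split())

def plot_arc(sentences, window=3):
    # Score each sentence, then smooth with a trailing moving
    # average to expose the story's overall emotional trajectory.
    scores = [sentence_score(s) for s in sentences]
    arc = []
    for i in range(len(scores)):
        chunk = scores[max(0, i - window + 1): i + 1]
        arc.append(sum(chunk) / len(chunk))
    return arc
```

Fed the sentences of a “Man in Hole” story, such an arc should dip and then recover; the disagreement between Swafford and Jockers turned in part on whether aggressive smoothing of this kind produces genuine plot shapes or artifacts of the method.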

Though Jockers wound up backing down from his earlier claims, the episode was fascinating as a moment in which in-progress work was presented, tested, and defended. This is of course nothing new in the sciences, but it was a moment in which the reproducibility of claims in DH was tested.

Having described these two areas of DH literary text analysis, one employing scientific models and seeking reproducible results and the other seeking to undermine the assumptions of the very platforms through which digital texts are constructed, I would like to finally complicate that binary and discuss some DH practitioners who are blending these approaches in fascinating ways.

First, I will turn to the work of Lisa Rhody, whose work on the topic modeling of figurative language aims to investigate the very assumptions of the algorithms used in topic modeling. Topic modeling is a technique employed by Jockers and many others to reveal latent patterns in texts; it uses probabilistic algorithms to display a kind of topic-based guide to language in the text, tracking the play of similar concepts across it. Rhody’s project, as she writes, “illustrates how figurative language resists thematic topic assignments and by doing so, effectively increases the attractiveness of topic modeling as a methodological tool for literary analysis of poetic texts.” Using a tool that was designed to work with texts that contain little or no figurative language, Rhody’s study produces failure, but useful failure; as she writes, “topic modeling as a methodology, particularly in the case of highly-figurative language texts like poetry, can help us to get to new questions and discoveries — not because topic modeling works perfectly, but because poetry causes it to fail in ways that are potentially productive for literary scholars.”

Second, I will highlight the work of Micki Kaufman, a doctoral student in History at the CUNY Graduate Center with whom I’ve had the pleasure of working as she investigates memcons and telcons from the Digital National Security Archive’s Kissinger Collection. In her project “Quantifying Kissinger,” Micki has begun to explore some fascinating ways of looking at, visualizing, and even hearing topic models, a mode of inquiry that, I would suggest, foregrounds the subjective experiential approach championed by Drucker without sacrificing the utility of topic modeling and data visualization as investigative tools. Micki will be presenting on this work in January at the 2016 Modern Language Association Convention in Austin, Texas in a panel called “Weird DH.” I think it’s very promising.

Finally, I want to mention Jeff Binder, another student of mine — a doctoral student in English at the CUNY Graduate Center — whose work with Collin Jennings, a graduate student at NYU, on the Networked Corpus project aims to map topic models onto the texts they model, and to compare topic models of Adam Smith’s Wealth of Nations to the index published with the book. What this project produces, in the end, is a critical reflection on topic modeling itself, using it not necessarily to examine the main body of the text but rather to explore the alternate system of textual analysis presented by the book’s index.

I single out these three practitioners among the many wonderful scholars doing work in this area because their practices, to my mind, unite the two approaches to text analysis that I have described thus far. They use the computational tools of the proto-scientific group but in self-reflexive ways that embody the approach of deformative criticism, aiming to highlight interpretative complexity and ambiguity.

Johanna Drucker has argued that many digital tools are premised upon systems that make them poor fits for humanities inquiry:

Tools for humanities work have evolved considerably in the last decade, but during that same period a host of protocols for information visualization, data mining, geospatial representation, and other research instruments have been absorbed from disciplines whose epistemological foundations and fundamental values are at odds with, or even hostile to, the humanities. Positivistic, strictly quantitative, mechanistic, reductive and literal, these visualization and processing techniques preclude humanistic methods from their operations because of the very assumptions on which they are designed: that objects of knowledge can be understood as self-identical, self-evident, ahistorical, and autonomous. (“Humanistic Theory”)

Drucker calls for a new phase of digital humanities work, one that embodies a humanities-based approach to technology and interpretation. She writes:

I am trying to call for a next phase of digital humanities that would synthesize method and theory into ways of doing as thinking. . . . The challenge is to shift humanistic study from attention to the effects of technology (from readings of social media, games, narrative, personae, digital texts, images, environments), to a humanistically informed theory of the making of technology (a humanistic computing at the level of design, modeling of information architecture, data types, interface, and protocols). (“Humanistic Theory”)

By turning, in Drucker’s terms, from data to capta — from the presentation of data as transparent, indexical fact to an open and explicit acknowledgement of the socially constructed nature of information — and by using new DH tools and methods at times in ways that test the visible and occluded assumptions that structure them, these three junior scholars are moving us into a new and exciting phase of digital humanities work.

If humanities text-mining work often proceeds according to the scientific method, striving to test hypotheses and create reproducible results, its genealogies lie in the work of natural philosophy and the various microscopes, air pumps, and electrical machines mentioned by Tom Scheinfeldt and described in depth in books like Steven Shapin and Simon Schaffer’s Leviathan and the Air-Pump. DH work, in fact, is often framed in terms of this genealogy, with the current moment being compared to the rise of science and experimentation with new tools.

As but one example, Ted Underwood, in response to the Syuzhet controversy and ensuing discussions about experimental methods, tweeted:

“These are such classic history of science problems. I swear we are literally re-enacting the whole 17th century.”

In the remaining section of this talk, I want to suggest an alternate genealogy for this moment, one that, although it has ties to that same early work of natural philosophy, might help ground digital humanities practice in a new frame. I will return to Emerson for a moment, to his statement that “The eye is the first circle; the horizon which it forms is the second; and throughout nature this primary figure is repeated without end.”

And so I want to explore pre-photographic experimentation with image-making as a way of suggesting new grounding for DH.

In 1839, Louis Daguerre announced the invention of the daguerreotype to the world, a moment of technical triumph that occluded a larger history of experiment. As the art historian Geoffrey Batchen has shown, the daguerreotype was only one of a number of competing photographic technologies at that moment. The camera obscura had allowed artists to create replications of the world through a lens for centuries, but no one had been able to *fix* the image on paper, to make it last, to make it permanent. The project to do so was technical, artistic, and hermeneutic: while experimenters attempted to use different methods and materials to fix camera images on paper and metal, they did so with the confidence that the camera was an instrument of truth, a tool that could help human beings see the world from an unbiased, divine perspective. Daguerre himself was a showman, a painter of theatrical dioramas who had become interested in image-making through that realm.

And in fact, the modern negative-positive photograph descended not from the daguerreotype but from what was called the calotype, a picture-making technology developed in Britain by William Henry Fox Talbot. While daguerreotypes were one-of-a-kind positive images that could not be reproduced, made on expensive silvered copper plates and developed over mercury fumes, calotypes were reproducible, negative-positive paper prints. Daguerreotypes, however, produced much finer gradations of tone and detail than calotypes. As a photographic practice, daguerreotypy grew more quickly in popularity in part because it produced more detailed images and in part because Talbot restricted the spread of his technology by enforcing his patent rights. Daguerre, meanwhile, ceded his patent to the French government in exchange for a lifetime pension, and his announcement in 1839 marked the release of his image-making technology to the world — though he held on to his patent rights in Britain.

In his 2002 examination of Henry Talbot’s work, “A Philosophical Window,” art historian Geoffrey Batchen notes that the specimens of early photography — the failed experiments, the pictures that were not fixed, the images that were faded and obscured — have been viewed by art historians only as indices of technical progression towards the invention of the photographic camera, rather than as art objects in and of themselves. Looking at Talbot’s pictures critically, and taking Talbot seriously as an artist working with a camera, Batchen finds in him a conscious image-maker whose work should have relevance to us today.




Batchen focuses on one of Talbot’s early photographs, “Latticed Window (with the Camera Obscura) August 1835.” The photograph contains a note: “when first made, the squares of glafs [sic] about 200 in number could be counted, with the help of a lens.”

Batchen performs a fantastic close reading of this note, highlighting Talbot’s instructions to the viewer, the suggestion that the viewer of the photograph look at it first from afar, and then up close with the aid of a lens. This set of instructions, claims Batchen,

anticipates, and even insists on, the mobilization of the viewer’s eye, moving it back and forth, up and down, above the image. We are asked to see his picture first with the naked eye and then by means of an optical prosthesis . . . The attempt to improve one’s power of observation by looking through a lens is also a concession that the naked eye alone can no longer be guaranteed to provide the viewer with sufficient knowledge of the thing being looked at. It speaks to the insufficiency of sight, even while making us, through the accompanying shifts of scale and distortions of image that come with magnification, more self-conscious about the physical act of looking. (101-102)

Batchen’s comments here, focusing on scale and perspective and showcasing the disorientations produced by new angles of vision, might remind us of Moretti’s discussions of scale, of the need for a new type of seeing to take account of thousands of texts. And indeed, amazingly, Talbot, the progenitor of the modern photograph, was also tied to early computing. He was a friend of Charles Babbage’s, whose Difference Engine is often described as the world’s first mechanical computer. Talbot made photographs of machine-made lace and sent them to Babbage, who, Batchen reports, exhibited some of the prints in his drawing room, in the vicinity of the Difference Engine, making it likely that early computing and early photography were experienced together there (107).

In his discussion of Talbot’s latticed-window photograph, Batchen notes that Talbot indeed narrativizes our gaze. He writes:

So, for Talbot, the subject of this picture is, first, the activity of our seeing it, and second, the window and latticed panes of glass, not the landscape we can dimly spy through it. He makes us peer closely at the surface of his print, until stopped by the paper fibres themselves and at the framing of his window, but no further. And what do we see there? Is ‘photography’ the white lines or the lilac ground, or is it to be located in the gestalt between them? (102)

And where, we can ask, is DH to be located? Do we even know the foreground and background between which we can locate its gestalt?

This, I think, is exactly the question we need to ask as we consider where DH is moving and where it should go.

One thing we can do is think about DH as the gestalt between gazes — not distant reading, not close reading, but the dizzy shape of self-reflexive movement between them. Though technology plays a part, it is critical reflection on that technology that can account for new, provocative digital humanities approaches.

And so, finally, I return to Emerson. The eye is the first circle, the horizon which it forms is the second. Let us plumb the space between.


Works Cited

Batchen, Geoffrey. “A Philosophical Window.” History of Photography 26.2 (2002): 100-112. Print.

Drucker, Johanna. “Humanities Approaches to Graphical Display.” Digital Humanities Quarterly 5.1 (2011). Web. 19 Apr. 2015.

—–. “Humanistic Theory and Digital Scholarship.” Debates in the Digital Humanities. Ed. Matthew K. Gold. Minneapolis: University of Minnesota Press, 2013. Web. 19 Apr. 2015.

—–. SpecLab: Digital Aesthetics and Projects in Speculative Computing. Chicago: University of Chicago Press, 2009. Print.

Emerson, Ralph Waldo. Essays and English Traits. Vol. 5. New York: P.F. Collier & Son, 1909. Print.

Jockers, Matthew L. Macroanalysis: Digital Methods and Literary History. Urbana: University of Illinois Press, 2013. Print.

—–. “Requiem for a Low Pass Filter.” WordPress. 6 Apr. 2015. Web. 19 Apr. 2015.

—–. “Revealing Sentiment and Plot Arcs with the Syuzhet Package.” WordPress. 2 Feb. 2015. Web.

Moretti, Franco. “The Slaughterhouse of Literature.” Modern Language Quarterly 61.1 (2000): 207-228. Print.

Moretti, Franco, and Alberto Piazza. Graphs, Maps, Trees: Abstract Models for Literary History. London: Verso, 2007. Print.

Ramsay, Stephen. Reading Machines: Toward an Algorithmic Criticism. Urbana: University of Illinois Press, 2011. Print.

Rhody, Lisa M. “Topic Modeling and Figurative Language.” Journal of Digital Humanities 2.1 (Winter 2012). Web.

Sample, Mark. “Notes toward a Deformed Humanities.” 2 May 2012. Web. 19 Apr. 2015.

Samuels, Lisa, and Jerome J. McGann. “Deformance and Interpretation.” New Literary History 30.1 (1999): 25–56. Print.

Schulz, Kathryn. “The Mechanic Muse: What Is Distant Reading?” The New York Times 24 June 2011. Web. 19 Apr. 2015.

Swafford, Annie. “Problems with the Syuzhet Package.” 2 Mar. 2015. Web.

Underwood, Ted. “These are such classic history of science problems. I swear we are literally re-enacting the whole 17th century.” 29 March 2015, 10:41 p.m. Tweet. 19 Apr. 2015.


Acknowledgements: Thanks to Lindsey Albracht for her help in preparing this web edition of the talk.

Beyond the PDF: Experiments in Open-Access Scholarly Publishing (#MLA13 CFP)

As open-access scholarly publishing matures and movements such as the Elsevier boycott continue to grow, OA texts have begun to move beyond the simple (but crucial!) principle of openness towards an ideal of interactivity. This special session will explore innovative examples of open-access scholarly publishing that showcase new types of social, interactive, mixed-media texts. Particularly welcome is discussion of OA texts that incorporate new strategies of open peer review, community-based publication, socially networked reading and writing, altmetric analytics, and open-source publishing platforms, particularly as they inform or relate to print-bound editions of the same texts. Also welcome are critiques of the accessibility of interactive OA texts from the standpoint of universal design.

This roundtable aims for relatively short presentations of 5-7 minutes that will showcase a range of projects.

Interested participants should send 250-word abstracts and a CV to Matthew K. Gold by March 20, 2012.

Whose Revolution? Towards a More Equitable Digital Humanities

What follows is the text of a talk I gave at the 2012 MLA as part of the Debates in the Digital Humanities panel, which grew out of the just-published book of the same name (more about that in a forthcoming post). Many thanks to my fellow panelists Liz Losh, Jeff Rice, and Jentery Sayers. Thanks, too, to everyone who contributed to the active twitter backchannel for the panel and to Lee Skallerup for archiving it. Finally, I’m grateful to Jason Rhody for his helpful responses to a draft version of this presentation.

“Whose Revolution? Towards a More Equitable Digital Humanities”

The digital humanities – be it a field, a set of methodologies, a movement, a community, a singular or plural descriptor, a state of mind, or just a convenient label for a set of digital tools and practices that have helped us shift the way we perform research, teaching, and service – have arrived on the academic scene amidst immense amounts of hype. I’m sure you’re sick of hearing that hype, so I won’t rehearse it now except to say that the coverage of DH in the popular academic press sometimes seems to imply that the field has both the power and the responsibility to save the academy. Indeed, to many observers, the most notable thing about DH is the hype that has attended its arrival  — and I believe that one of my fellow panelists, Jeff Rice, will be proposing a more pointed synonym for “hype” during his presentation.

It’s worthwhile to point out that it’s harder than you’d think to find inflated claims of self-importance in the actual scholarly discourse of the field. The digital humanists I know tend to carefully couch their claims within prudently defined frames of analysis. Inflated claims, in fact, can be found most easily in responses to the field by non-specialists, who routinely and actively read the overblown rhetoric of revolution into more carefully grounded arguments. Such attempts to construct a straw-man version of DH get in the way of honest discussions about the ways in which DH might accurately be said to alter existing academic paradigms.

Some of those possibilities were articulated recently in a cluster of articles in Profession on evaluating digital scholarship, edited by Susan Schreibman, Laura Mandell, and Stephen Olsen. The articles describe many of the challenges that DH projects present to traditional practices of academic review, including the difficulty of evaluating collaborative work, the possibility that digital tools might constitute research in and of themselves, the unconventional nature of multimodal criticism, the evolution of open forms of peer-review, and the emergence of the kind of “middle-state” publishing that presents academic discourse in a form that lies somewhere between blog posts and journal articles. Then, too, the much-discussed role of “alt-ac” scholars, or “alternative academics,” is helping to reshape our notions of the institutional roles from which scholarly work emerges. Each of these new forms of activity presents a unique challenge to existing models of professional norms in the academy, many of them in ways that may qualify as revolutionary.

And yet, amid this talk of revolution, it seems worthwhile to consider not just what academic values and practices are being reshaped by DH, but also what values and practices are being preserved by it. To what extent, we might ask, is the digital humanities in fact not upending the norms of the academy, but rather simply translating existing academic values into the digital age without transmogrifying them? In what senses does the digital humanities preserve the social and economic status quo of the academy even as it claims to reshape it?

A group of scholars – from both within and outside of the field – have assembled answers to some of those questions in a volume that I have recently edited for the University of Minnesota Press titled Debates in the Digital Humanities. In that book, contributors critique the digital humanities for a series of faults: not only paying inadequate attention to race, class, gender, and sexuality, but in some cases explicitly seeking to elide cultural issues from the frame of analysis; reinforcing the traditional academic valuation of research over teaching; and allowing the seductions of information visualization to paper over differences in material contexts.

These are all valid concerns, ones with which we would do well to grapple as the field evolves. But there is another matter of concern that we have only just begun to address, one that has to do with the material practices of the digital humanities – just who is doing DH work and where, and the extent to which the field is truly open to the entire range of institutions that make up the academic ecosystem. I want to suggest what perhaps is obvious: that at least in its early phases, the digital humanities has tended to be concentrated at research-intensive universities, at institutions that are well-endowed with both the financial and the human resources necessary to conduct digital humanities projects. Such institutions typically are sizeable enough to support digital humanities centers, which crucially house the developers, designers, project managers, and support staffs needed to complete DH projects. And the ability of large, well-endowed schools to win major grant competitions helps them continue to win major grant competitions, thus perpetuating unequal and inequitable academic structures.

At stake in this inequitable distribution of digital humanities funding is the real possibility that the current wave of enthusiastic DH work will touch only the highest and most prominent towers of the academy, leaving the kinds of less prestigious academic institutions that in fact make up the greatest part of the academic landscape relatively untouched.

As digital humanists, the questions we need to think about are these: what can digital humanities mean for cash-poor colleges with underserved student populations that have neither the staffing nor the expertise to complete DH projects on their own? What responsibilities do funders have to attempt to achieve a more equitable distribution of funding? Most importantly, what is the digital humanities missing when its professional discourse does not include the voices of the institutionally subaltern? How might the inclusion of students, faculty, and staff at such institutions alter the nature of discourse in DH, of the kinds of questions we ask and the kinds of answers we accept? What new kinds of collaborative structures might we build to begin to make DH more inclusive and more equitable?

As I’ll discuss later, DH Centers and funding agencies are well aware of these issues and working actively on these problems – there are developments underway that may help ameliorate the issues I’m going to describe today. But in order to help us think through those problems, and in an effort to provoke and give momentum to that conversation, I’d like to look at a few pieces of evidence to see whether there is, in fact, an uneven distribution of the digital humanities work that is weighted towards resource-rich institutions.

Case #1: Digital Humanities Centers

Here is a short list of some of the most active digital humanities centers in the U.S.:

The benefits that digital humanities centers bring to institutions seeking funding from granting agencies should be obvious. DH Centers provide not just the infrastructural technology, but also the staffing and expertise needed to complete resource-intensive DH projects.

There are two other important points to make here, ones that may not be apparent to DHers working inside DH Centers. The first is that DH Centers provide physical spaces that may not be available at cash-poor institutions, especially urban ones. Basic elements that many people take for granted at Research 1 institutions, such as stable wifi or sufficient electrical wiring to power computer servers, may be missing at smaller institutions. Then, too, such physical spaces provide the crucial sort of personal networking that is just as important as infrastructural connection. Second, we must recognize that grants create immense amounts of paperwork, and that potential DHers working at underserved institutions might not only have to complete the technical and intellectual work involved in a DH project and publish analyses of those projects to have them count for tenure and promotion, but might also have to handle an increased administrative role in the bargain.

[At this point in the talk, I noted that most existing DH Centers did not spring fully-formed from their universities, but instead were cobbled together over a number of years through the hard and sustained work of their progenitors.]

Case Study #2: Distribution of Grants

Recently, the NEH Office of Digital Humanities conducted a study of its Start-Up grants program, an exciting venture that differs from traditional NEH grant programs in that instead of providing large sums of money to a small number of recipients, it aims to provide smaller starter grants of $25,000 to $50,000 to a wider range of projects. The program allows the ODH to operate in a venture-capitalist fashion, accepting the possibility of failure as it explicitly seeks high-risk, high-reward projects.

The study (PDF), which tracked NEH Digital Humanities Start-Up Grants from 2007 to 2010, shows us how often members of different types of institutions applied for grants. Here is the graphic for universities:

What we see in this graph is a very real concentration of applications from universities that are Master’s level and above. The numbers, roughly, are:

Master’s/Doctoral: 575

BA or Assoc.: 80

Now, those numbers aren’t horrible, and I suspect that they have improved in recent years. Additionally, we should note that many non-university organizations applied for NEH funding. Here is a breakdown of those numbers from the NEH:

What we see here, in fact, is a pretty impressive array of institutional applications for funding – certainly, this is something to build on.

And here are updated numbers of NEH SUG awards actually made – and I thank Jason Rhody, Brett Bobley, and Jennifer Serventi of the NEH ODH for their help in providing these numbers:

Now, there are a few caveats to be made here — only the home institution of the grant is shown, so collaborative efforts are not necessarily represented. Also, university libraries are mostly lumped under their respective university/college type.

Still, we can see pretty clearly here that an overwhelming number of grants have gone to institutions at the Master’s level and above. And we should be especially concerned that community colleges, which make up a large share of the institutions of higher education in our country, appear to have had limited involvement in the digital humanities “revolution.”

New Models/New Solutions

Having identified a problem in DH, I’d like to turn now towards some possible solutions and close by discussing some important and hopeful signs of a more equitable future for digital humanities work.

One of the fun things about proposing a conference paper in April and then giving the paper in January is that a lot can happen in eight months, especially in the digital humanities. And here, I’m happy to report on several new and/or newish initiatives that have begun to address some of the issues I’ve raised today. I’m going to run through them fairly quickly in the hope that many of you are already familiar with them (though I’d certainly be happy to expand on them during the Q&A):

  • DH Commons

This new initiative seeks to create a large-scale DH community resource that matches newcomers who have ideas for DH projects with experts in the field who can either help with the work itself or serve in an advisory capacity. The project, which is now affiliated with CenterNet, an international organization of digital-humanities centers, promises to do much to spread the wealth of DH expertise. The site has just been launched at this convention and should prove to be an important community-building resource for the field.

  • DH Questions and Answers

Like DH Commons, DH Questions and Answers, which was created by the Association for Computers and the Humanities, offers a way for newcomers to DH to ask many types of questions and have them answered by longstanding members of the field – thus building, in the process, a lasting knowledge resource for DH.

  • THATCamps

These small, self-organized digital-humanities unconferences have been spreading across the country, bringing DH methodologies and questions into a wide variety of settings. Two upcoming THATCamps that promise to expand the purview of the field are THATCamp HBCU and THATCamp Caribbean. Both of these events were organized explicitly with the intent of addressing some of the issues I’ve been raising today.

  • The Growth of DH Scholarly Associations

All of these organizations are actively drawing newcomers into the field. ACH created the above-mentioned DH Questions and Answers. NITLE has done excellent public work that is enabling members of small liberal-arts colleges to be competitive for DH grants. CenterNet is well-positioned to act as an organizational mentor for other institutions.

These kinds of virtual, regional, and multi-institutional support networks are key, as they allow scholars with limited resources on their own campuses to create cross-institutional networks of infrastructure and support.

  • Continued Commitment to Open-Access Publications, Open-Source Tools, and Open APIs

The DH community has embraced open-access publication, a commitment that has run, in recent years, from Schreibman, Siemens, and Unsworth’s Companion to Digital Humanities through Dan Cohen and Tom Scheinfeldt’s Hacking the Academy to Kathleen Fitzpatrick’s Planned Obsolescence to Bethany Nowviskie’s alt-academy to my own Debates in the Digital Humanities, which will be available in an open-access edition later this spring. Having these texts out on the web removes an important barrier that might have prevented scholars, staff, and students at cash-poor institutions from fully exploring DH work.

Relatedly, the fact that many major DH tools – and here the list is too long to mention specific tools – are released on an open-source basis means that scholars working at institutions without DH Centers don’t have to start from scratch. It’s especially crucial that the NEH Office of Digital Humanities states in its proposal guidelines that “NEH views the use of open-source software as a key component in the broad distribution of exemplary digital scholarship in the humanities.”

  • DH Training Institutes

These institutes provide key opportunities for DH outreach to academics with a range of DH skills.

I’d like to close by offering five key ideas to build on as we seek to expand the digital humanities beyond elite research-intensive institutions:

  • Actively perform DH-related outreach at underserved institutions
  • Ask funding agencies to make partnerships and outreach with underserved peer institutions a recommended or required practice
  • Continue to build out virtual/consortial infrastructure
  • Build on projects that already highlight cross-institutional partnerships [here I mentioned my own “Looking for Whitman” project]
  • Study collaborative practices [here I mentioned the importance of connecting to colleagues in writing studies]

While none of these ideas will solve these problems alone, together they may help us arrive at a more widely distributed version of DH, one that will enable a more diverse set of stakeholders to take active roles in the field. And as any software engineer can tell you, the more eyes you have on a problem, the more likely you are to find and fix bugs in the system. So, let’s ensure that the social, political, and economic structures of our field are as open as our code.

Photo credit: “Abstract #1” by boooooooomblastandruin

DH and Comp/Rhet: What We Share and What We Miss When We Share

What follows is the text of a short talk I gave at the 2012 MLA as part of the session Composing New Partnerships in the Digital Humanities. Many thanks to session organizer Catherine Prendergast, my fellow panelists, and everyone who took part in the discussion in person or through twitter.

Like my fellow panelists, I joined this session because I’d like to see an increased level of communication and collaboration between digital humanists and writing-studies scholars. There is much to be gained from the kinds of partnerships that such collaborations might foster, and much for members of both fields to learn from one another. I suspect that most people in this room today agree upon that much.

So, why haven’t such partnerships flourished? What issues, misconceptions, lapses, and tensions are preventing us from working together more closely?

A shared history of marginalization

Both comp/rhet scholars and digital humanities scholars have existed at the margins of traditional disciplinary formations in ways that have shaped their perspectives. Writing studies has a history of being perceived as the service wing of English departments. Beyond heavy course loads, the field is sometimes seen as more applied than theoretical – this despite the fact that writing studies has expanded into areas as diverse as complexity theory, ecocriticism, and object-oriented rhetoric.

The digital humanities, meanwhile, arose out of comparably humble origins. After years of inhabiting the corners of literature departments, doing the kinds of work, such as scholarly editing, that existed on the margins of the discipline, humanities computing scholars emerged, blinking and a bit disoriented, into the spotlight as digital humanists. Now the subject of breathless articles in the popular academic press and the recipients of high-profile research grants, DHers have found their status suddenly elevated. One need only look at the soul-searching blog posts that followed Bill Pannapacker’s suggestion at the last MLA that DH had created a cliquish star system to see a community still coming to terms with its new position.

I bring up these points not to reopen old wounds, but rather to point out that they have a common source: a shared focus on the sometimes unglamorous, hands-on activities such as writing, coding, teaching, and building. This commonality is important, and it’s something, well, to build on, not least of all because we face a common problem as we attempt to help our colleagues understand the work we do.

Given what we share, it’s surprising to me that so many writing-studies scholars seem to misunderstand what DH is about. Recent discussions of the digital humanities on the tech-rhet listserv, one of the primary nodes of communication among tech-minded writing-studies scholars, show that many members of the comp/rhet community see DH as a field largely focused on digitization projects, scholarly editions, and literary archives. Not only is this a limited and somewhat distorted view of DH, it’s also one that is especially likely to alienate writing-studies scholars, emphasizing as it does the DH work done within the very traditional literary boundaries that were used to marginalize comp/rhet in previous decades.

This understanding of DH misses some key elements of this emerging field:

  1. Its collaborative nature, which is also central to comp/rhet teaching and research;
  2. The significant number of digital humanists who, like me, focus their work not on scholarly editions and textual mark-up, but rather on networked platforms for scholarly communication and networked open-source pedagogy;
  3. The fact that the digital humanities are open in a fundamental way, both through open-access scholarship and through open-source tool building;
  4. The fact that DH, too, has what Bethany Nowviskie has called an “eternal September” – a constantly refreshed group of newbies who emerge and ask the same sorts of basic questions that have been asked and answered before. We need to respond to such questions not by becoming frustrated that newcomers have missed citations to older work – work that may indeed lie outside their home disciplines – but rather by demonstrating how and why that past work remains relevant in the present moment;
  5. The enormous interest within the digital humanities right now in networked pedagogy, a key area of shared concern in which we should be collaborating;
  6. The fact that DH is interdisciplinary and multi-faceted. To understand it primarily as the province of digital literary scholars is to miss the full range of the digital humanities, which involves stakeholders from disciplines such as history, archaeology, classical studies, and, yes, English, as well as librarians, archivists, museum professionals, developers, designers, and project managers.

In this sense, I’d like to recall a recent blog post by University of Illinois scholar Ted Underwood, who argued that DH is “a rubric under which a bunch of different projects have gathered — from new media studies to text mining to the open-access movement — linked mainly by the fact that they are responding to related kinds of fluidity: rapid changes in representation, communication, and analysis that open up detours around some familiar institutions.”

To respond to DH work by reasserting the disciplinary boundaries of those “familiar institutions,” as I believe some writing-studies scholars are doing, is to miss an opportunity for the kinds of shared endeavors that are demanded by our moment.

So, let’s begin by looking towards scholars who have begun to bridge these two fields and think about the ways in which they are moving us forward. I’m thinking here of hybrid comp-rhet/DH scholars like Alex Reid, Jentery Sayers, Jamie “Skye” Bianco, Kathie Gossett, Liz Losh, William Hart-Davidson, and Jim Ridolfo, all of whom are finding ways to blend work in these fields.

I’d like to close with some words from Matt Kirschenbaum, who reminds us, in his seminal piece, “What Is Digital Humanities and What’s It Doing in English Departments?,” that “digital humanities is also a social undertaking.” Matt, I think, is saying that DH is not just a series of quantitative methodologies for crunching texts or a bunch of TEI markup tags, but rather a community that is in a continual act of becoming. We all need to do a better job of ensuring that our communities are open and of communicating more clearly with one another. This session, I hope, is a start.

An Update

I’m excited to announce that I’ll be joining the CUNY Graduate Center this Fall as Advisor to the Provost for Master’s Programs and Digital Initiatives. My charge there will involve working with the Provost and Associate Provosts to promote and strengthen existing Master’s Programs and to develop new degree programs. I’ll also be collaborating on a variety of digital initiatives with many members of the GC community. It’s an exciting opportunity and I’m looking forward to the work that lies ahead.

While I will continue to teach at City Tech as I take on this new role, I regret to say that I will be unable to continue serving as PI on the U.S. Department of Education “Living Lab” grant. That project has gotten off to a fast and productive start, thanks to the extremely hard work of the entire grant team. In our first year, we’ve had an initial cohort of faculty members participate in a newly designed General Education seminar; we have built the first iteration of the City Tech OpenLab, a socially networked, community-based platform for teaching, learning, and sharing that is currently in soft launch; we established the Brooklyn Waterfront Research Center, which has already become part of NYC’s long-term vision for its waterfront; and we have laid the groundwork for numerous other projects that are currently in the pipeline. I am grateful to be leaving the grant in the very capable hands of my friend and colleague Maura Smale, who will be assisted by our excellent Project Coordinator Charlie Edwards and a wonderful team of colleagues. I wish them the very best as they continue the work that we have begun together, and I look forward to remaining involved in the project as it moves forward.

Interview with Bob Stein Now Published in Kairos

I’m happy to report that my interview with Bob Stein (computer pioneer, as Wikipedia disambiguates him), titled “Becoming Book-Like: Bob Stein and the Future of the Book,” is now available in the new issue of Kairos: A Journal of Rhetoric, Technology, and Pedagogy.

The title of the interview comes from the following snippet of our conversation (Bob is speaking about a realization he had in 1981 about the future of the book):

The “aha” moment I had was that adding a microprocessor to the mix meant that producer-driven media, like movies and television, were going to be transformed into user-driven media. For me, the crucial thing — and this happened in the process of writing the paper for Britannica — was when I wrestled with the question of “what’s a book?” and “what happens when we make it electronic?” I realized that everything was going to become book-like in the sense of being user-driven and that the ways in which a user interacts with content becomes an important part of her experience.

I love the way that Bob upends conventional wisdom by defining the book as an active, user-driven medium and the way he foresees digital media becoming more, and not less, “book-like” in the future. “Becoming book-like” also points to the many ways in which new media remediate old media.

The interview is presented in CommentPress, a wonderful theme for WordPress developed by Bob’s Institute for the Future of the Book that allows readers to attach comments to specific paragraphs of text. I encourage you to visit the journal and leave your responses in the comments.

On Reading Like a Hawk

Robert D. Richardson, Jr.’s Emerson: The Mind on Fire (1995) is one of my favorite biographies, and not just because I had the good fortune as an undergraduate to study with the author while he was writing the book. In his careful, moving study of Emerson’s life, Richardson charts the intellectual growth of one of America’s finest thinkers with a novelist’s eye for detail and a scholar’s knowledge of historical context, and he does it all in short, elliptical chapters that echo Emerson’s own aphoristic sentences.

One of my favorite subtexts of the biography is Richardson’s interest in Emerson’s reading and writing practices. Both of the following passages from the biography speak to Emerson’s omnivorous consumption of books and his methods for working through them:

Passage 1 (from Chapter 11: Pray Without Ceasing):

Coleridge notes that there are four kinds of readers: the hourglass, the sponge, the jelly bag, and the Golconda. In the first everything that runs in runs right out again. The sponge gives out all it took in, only a little dirtier. The jelly bag keeps only the refuse. The Golconda runs everything through a sieve and keeps only the diamonds. Emerson was not a systematic reader, but he had a genius for skimming and a comprehensive system for taking notes. Most of the time he was the pure Golconda, what miners call a high-grader, working his way rapidly through vast mines of material and pocketing the richest bits. (67)

Emerson, it appears, was digging into data before his time.

Passage 2 (from Chapter 28: A Theory of Animated Nature):

Goethe’s greatest gifts to Emerson were two. First was the master idea that education, development, self-consciousness, and self-expression are the purposes of life; second was the open, outward-facing working method of sympathetic appropriation and creative recombination of the world’s materials.

There is an important corollary to the axiom of appropriate appropriation. Along with Emerson’s freedom to take whatever struck him went the equally important obligation to ignore what did not. Emerson read widely and advised others to do so, but he was insistent about the dangers of being overwhelmed and overinfluenced by one’s reading. “Do not attempt to be a great reader,” he told a young Williams College student named Charles Woodbury. “Read for facts and not by the bookful.” He thought one should “learn to divine books, to feel those that you want without wasting much time on them.” It is only worthwhile concentrating on what is excellent and for that “often a chapter is enough.” He encouraged browsing and skipping. “The glance reveals what the gaze obscures. Somewhere the author has hidden his message. Find it, and skip the paragraphs that do not talk to you.”

What Emerson was really recommending was a form of speed-reading and the heightened attention that goes with speed-reading. When pressed by the young Woodbury, Emerson gave details:

“Learn how to tell from the beginnings of the chapters and from the glimpses of sentences whether you need to read them entirely through. So turn page after page, keeping the writer’s thoughts before you, but not tarrying with him, until he has brought you the thing you are in search of. But recollect, you only read to start your own team.”

The last point is crucial. Reading was not an end in itself for Emerson. He read like a hawk sliding on the wind over a marsh, alert for what he could use. He read to nourish and to stimulate his own thought, and he carried this so far as to recommend that one stop reading if one finds oneself becoming engrossed. “Reading long at one time anything, no matter how it fascinates, destroys thought,” he told Woodbury. “Do not permit this. Stop if you find yourself becoming absorbed, at even the first paragraph.” (173-174)

These passages speak, in surprising ways, to current debates about digital media. As is often the case, practices popularly understood to be effects of digital media have histories that predate the digital (David Crystal makes this point in Txtng: The Gr8 Db8, as does Cathy Davidson in her blog post The Digital Nation Writes Back). Perhaps we might reclaim Emerson as the high priest of continuous partial attention, the ultimate historical rejoinder to the claims of Nicholas Carr and Sherry Turkle.

As Richardson points out, browsing and skimming were, for Emerson, not so much ways of avoiding the hard work of reading deeply as they were methodologies for jump-starting his own writing processes. It’s good practice to remember that there are many possible paths towards wisdom, and that some of them are more direct than others.

Update: Here is a related post by Chris Kelty: How to read a (good) book in one hour.

Clearing Space on the SD Card of a Nexus One Android Phone

So what if Google has discontinued the Nexus One, closed its N1 web store, and released newer Nexus phones to market? None of that fazes me. I love my Nexus One for the pleasant heft of its metal body and the smooth contours of its rounded corners, its glowing white button and its removable back cover. It’s not for nothing that Wired deemed it “sexy.”

Still, the N1 can frustrate even its adoring owners at times. I ran into just that situation the other day when I tried to use the camera on the phone. An alert notification informed me that I had only 3MB of space left on my 4GB SD card; I would have to lower the quality of the photos I was taking or stop taking them altogether.

This came as a surprise, since I had recently transferred all of my existing photos and videos from my phone to my computer. With that material off of the phone, what could possibly be taking up so much room?

A little bit of googling produced only marginally helpful advice, so I’d like to explain how I found my way back to a nearly empty SD card. In the end, it turned out that an extra step was needed to truly remove those old files from the phone. In the hope that it might be helpful for other N1/Android owners, here is how I cleared additional space on my SD Card:

— Check Settings > SD card & phone storage to see how much free space you have
— Connect N1 to a computer and transfer all photos and videos from the DCIM/camera folder
— Delete all photos and videos from the DCIM/camera folder
— Disconnect N1 from computer
— Download the ASTRO file manager or another file management app from the Android Market. This will allow you to browse the folders on your Android phone from the phone interface itself.
— Open Astro and go to .Trashes
— Delete all files in .Trashes
— Go to Settings > SD card & phone storage to confirm that your SD card now has empty space.

And that’s it — upon completing the above steps, I had 3.69 GB of free space on the card. No need to delete applications or clear caches, as others suggest. Just clear your .Trashes folder, and you should be good to go.
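If you’d like to confirm that a hidden folder like .Trashes really is the culprit before deleting anything, a small script can total up the bytes it holds. Here is a minimal Python sketch that sums the sizes of all files under a directory; it assumes the phone’s SD card is mounted on your computer as a USB drive, and the mount path in the usage example is hypothetical and will vary by system.

```python
import os

def dir_size(path):
    """Return the total size in bytes of all files under `path`.

    Walks the directory tree recursively, skipping anything that is
    not a regular file (e.g. broken symlinks).
    """
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            full_path = os.path.join(root, name)
            if os.path.isfile(full_path):
                total += os.path.getsize(full_path)
    return total
```

With the card mounted, something like `dir_size("/media/NEXUS/.Trashes")` (adjust the path for your system) will tell you how much space the trash folder is consuming before you empty it.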