CIRCA:Digital Humanities and the Great Project: Why should we operationalize everything?






Digital humanities connect research materials (whether analog or digital) with computational methods, a connection that depends on decisions made at every stage of a project's design. Its projects comprise three basic components:

 1.  Materials: a collection of files or digital assets, for example, images, text, media files or 3-D models.
 2.  Processing: involves the conversion of materials, using computational methods and tools.
 3.  Presentation: the display of results within an online (for example, WordPress sites and blogs) or offline (for example, publications or reports) user experience. Aesthetics, including background colour, font choice, and graphic style, are factors that affect the receptiveness of a project.

The fundamental activities that take the materials through processing and into the presentation phase are mediation/remediation, datafication/modelling, processing/analytics, presentation/display, and sustainability/preservation. Together, these activities make for the operationalization of materials.
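As an illustration, the three components above can be sketched as a minimal pipeline. Every name, file, and method in this sketch is a hypothetical stand-in, not drawn from any cited project:

```python
# Minimal sketch of the materials -> processing -> presentation pipeline.
# All names and texts here are illustrative assumptions, not from any cited project.

def gather_materials():
    """Materials: a small collection of digital assets (here, plain text files)."""
    return {"letter_01.txt": "Dear friend, the valley is quiet today.",
            "letter_02.txt": "The war has changed everything here."}

def process(materials):
    """Processing: convert materials with a computational method (word counts)."""
    return {name: len(text.split()) for name, text in materials.items()}

def present(results):
    """Presentation: display results as a simple plain-text report."""
    return "\n".join(f"{name}: {count} words" for name, count in sorted(results.items()))

report = present(process(gather_materials()))
print(report)
```

Each stage could, of course, be swapped out: the same processing step might feed an online exhibit rather than a printed report, which is precisely why presentation decisions shape a project's reception.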

The digital humanities have been engaged in a great project. Alvarado (2019) explains that in the early days of the field, back when it was called humanities computing, the project involved the retrieval and remediation of the vast collection of primary sources that had accumulated in our libraries and museums, in particular the textual sources that form the foundation of two fields that define, along with philosophy, the core of the humanities: literature and history. The signature offering of this project was the digital collection, exemplified by projects such as Ayers' Valley of the Shadow, which would evolve into what Unsworth and Palmer called the "research collection" and what others would label, with some degree of inaccuracy, the "archive."

Operationalization, as defined by Moretti in his 2013 pamphlet "Operationalizing," is a data science term of art that refers to a specific technique of representing information for machine use. Although the term derives from the natural sciences, where it refers to the practice of defining the observable and measurable indices of a phenomenon (or its "proxies") so that it can be studied experimentally and quantitatively, Moretti broadens the concept to include the practice of translating a received concept or theory (typically about society, culture, or the mind) into machine-operable form.
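As a concrete sketch of what such a translation can look like (the mini-play and the measure chosen here are invented stand-ins, not Moretti's actual data or code), one might operationalize a character's prominence in a drama as that character's share of all dialogue words:

```python
# Hedged sketch: operationalizing a character's prominence as a share of
# total dialogue words. The dialogue below is an invented stand-in, loosely
# evoking the Antigone/Creon opposition discussed by Hegel and Moretti.

dialogue = [
    ("Antigone", "I will bury my brother whatever the city commands"),
    ("Creon", "The law of the city stands above any one family"),
    ("Antigone", "The laws of the gods are older than yours"),
]

# Count words spoken by each character.
totals = {}
for speaker, line in dialogue:
    totals[speaker] = totals.get(speaker, 0) + len(line.split())

# Express each character's prominence as a proportion of all dialogue words.
grand_total = sum(totals.values())
character_space = {speaker: count / grand_total for speaker, count in totals.items()}
```

The choice of proxy is exactly where the "rationalization effect" discussed below enters: counting words, counting speeches, or counting scenes would each encode a different, debatable theory of what prominence means.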


Alvarado (2019) goes on to say that practically everything that distinguished the field prior to its rebranding as digital humanities can be traced back to this project:

 1.  The work of text encoding; 
 2.  The concern for textual models and formal grammars (a side effect of and motivation for encoding in SGML and XML); 
 3.  A parallel but less intense focus on image digitization; 
 4.  The desire to develop effective digital critical editions; 
 5.  The inclusion of librarians and academic faculty under the same umbrella; 
 6.  The eventual development of tools like Zotero, Omeka, and Neatline; 
 7.  The interest in digital forensics (the need for which became apparent to those actually building these archives); and so forth. 

The digital humanities now have the potential to embark on a new and exciting project: the embrace of operationalization as a type of profound repair. But why should we operationalize everything?


This digital humanities debate is very important because, on the one hand, we want to operationalize everything for the following reasons:

 1. To increase the longevity of information. Archived information that has been transformed into a digital format may last longer, though this is not 
    guaranteed: on one hand, digitization protects books and artifacts from the physical destruction that results from unforeseen circumstances; on the 
    other hand, the software that stores those books or artifacts digitally could reach end-of-life, making the information hard to retrieve.
 2. To improve the accessibility of information. It is easier to access information in its digitized format.
 3. To be inclusive of the big tent, synthetic of theoretical traditions and new research agendas.
 4. To be critical of emerging forms of digital culture and, perhaps above all, both backwardly compatible with our great work in building thematic 
    research collections and forwardly compatible with our engagement with data science and our generous vision of public humanities.
 5. The best part is that the process of operationalization provides a new scope to the data, leading not only to data remediation but also to idea 
    remediation! Moretti highlights the significant opportunities that result from the task of transforming a discursively constructed notion into 
    machine-readable code. To demonstrate, he translates Hegel's theory of tragic opposition, which describes the process by which equally valid human 
    values clash, and observes that the work of operationalization itself can cause us to rethink the original theory in new ways, even if in retrospect 
    we may imagine arriving at the new perspective through other means. For these thinkers, operationalization produces a rationalization effect, a 
    disruption of tacit knowledge caused by the computer's representational demand for explicit, discrete, and often reductive categories, which frequently 
    requires one to refine held ideas into a clear and distinct form. Along the way, lively philosophical questions, long hidden in the foundations of an 
    idea, are reopened for debate, since the coded representation of the original idea is never the only one possible but inevitably demands choosing among 
    alternatives.

On the other hand, we ask: is operationalizing everything really the answer? Is it really the best choice? Some disadvantages include:

 1. The operationalization of data requires energy-intensive tasks that release carbon emissions into the air, which is unhealthy for our climate. It 
    should be noted that even the most effective recycling is an energy-intensive industrial operation. Server farms, often known as "the cloud," consist 
    of rows upon rows of stacked servers connected by fibre optic, Ethernet, and other lines, which require enormous quantities of water and power to 
    function and to keep the facilities from overheating. This is hazardous to human health and adds to environmental damage. According to a 2015 Wall 
    Street Journal report, a mid-sized data centre consumes nearly as much water as 100 acres of almond trees, three average hospitals, or more than two 
    eighteen-hole golf courses (Deibert, 2020).
 2. There is also the significant risk of operationalization: the selection and amplification of ideas whose only qualification for inclusion in an argument 
    is their ease of representation via digital means (Alvarado, 2019).
 3. The transfer of knowledge to succeeding generations is therefore at stake in the digital humanist's engagement with operationalization. Which concepts 
    and ontologies will be taught, and which will be forgotten? For operationalization is, whether conducted by a digital humanist or a data scientist, a 
    selective transducer of concepts and theories, an evolutionary conduit through which certain ideas will live and others will die. As humanists, we 
    should not accept the facile premise that the most readily operationalized ideas are the greatest ideas, but rather engage in an open and critical 
    examination of operationalization as a form of argument, even as we use this form to test and explore a big theory (Alvarado, 2019).
 4. Last but not least is the danger of losing vital parts of the information in a text or artifact during the process of conversion (Rockwell, 2021).


This issue has raised important questions within the digital humanities community about the process and the importance of operationalization, and the benefits outlined above are hard to overstate. One of my colleagues suggested during class that the data centres where electronic materials are stored could be moved to regions with cooler climates and fewer people, to reduce pollution and increase the sustainability of the process.

To avoid bias in deciding which materials or artifacts to operationalize and when, we propose introducing uniform criteria guidelines for the different categories of materials to be operationalized (for example, books and journals, artifacts, maps). All candidate materials would be evaluated against these criteria, and decisions made about which ideas and ontologies will be taught and which will not.


 1.  Should we operationalize everything?
 2.  Would you say that the advantages of operationalization outweigh its drawbacks?
 3.  Can a possible solution to the identified drawbacks of operationalization be suggested?
 4.  Weighing in on this debate, what would you say is a possible way forward? How do we balance things?


Alvarado, R. C. (2019). Digital humanities and the great project: Why we should operationalize everything and study those who are doing so now. In M. K. Gold & L. F. Klein (Eds.), Debates in the Digital Humanities 2019. University of Minnesota Press. Retrieved November 23, 2021.

Deibert, R. (2020). Reset: Reclaiming the internet for civil society [E-reader version]. House of Anansi Press.

Fitzgerald, D. (2015, June 24). Data centers and hidden water use. The Wall Street Journal.

Drucker, J. (2021). The Digital Humanities Coursebook: An Introduction to Digital Methods for Research and Scholarship. Routledge.

Moretti, F. (2013). "Operationalizing": Or, the function of measurement in modern literary theory. Stanford Literary Lab, Pamphlet 6.
