Digital Humanities Seminar
The Digital Humanities Seminar, co-sponsored by the Hall Center for the Humanities, provides a forum for sharing and discussing new digitally enabled humanities research, with a specific focus on what digital humanities tools and practices can do for a range of humanistic research.
Unless otherwise noted, all sessions beginning in October 2017 take place from 3:00 to 4:30pm at the Hall Center for the Humanities.
Current (Fall 2017-Spring 2019) Seminar directors are Peter Grund (English) and Elspeth Healey (KU Libraries).
Please scroll down for a list of all seminars, 2011 - present, with abstracts and video recordings for most of them.
Monday, January 29, 2018
Postcolonial Imperatives in the Digital Age
Dhanashree Thorat, IDRH, University of Kansas
Monday, February 19, 2018
The Digital Natives Are Not Restless, They Are Not Even Native
Bonnie Lynn-Sherow, Department of History, and Executive Director of the Chapman Center for Rural Studies, Kansas State University
Founded in 2008 as an undergraduate history research lab investigating rural life, the Chapman Center for Rural Studies embarked on a voyage of virtual discovery with a 2009 NEH Digital Humanities Planning grant. This launched a steep learning curve in the Center that recently culminated in a 2018 NEH grant for humanities outreach to small museums and historical societies. In 2017, the CCRS was promoted from a departmental initiative to a college-wide interdisciplinary center of excellence. We have now come full circle, from beginners to mentors. For us, Digital Humanities is the big tent containing individual disciplinary rooms: of skilled technicians, creative practitioners/critics, and end users. We are unashamed end users, neither creating nor influencing the tools we employ. We are consumers and facilitators of digital tools selected to meet our primary mission of undergraduate education through engagement. I will review our decade-long experiment with different digital tools, students' use of and response to those tools, and the outcomes, some successful, others less so, of our experiences. Understanding what makes historians distinctive in their use of digital methods, separating digital history from digital humanities, separating history from digital history, is our ongoing conversation.
Wednesday, April 4, 2018
Collecting, Annotating, and Analyzing a Second Language Acquisition Corpus
Nina Vyatkina, Department of Germanic Languages and Literatures, University of Kansas
Learner corpora are digital databases of texts produced by second language (L2) learners, used in second language acquisition and language teaching research. Since their emergence in the 1990s, learner corpora have provided researchers with rich samples of learner language produced in real-life contexts that can be analyzed with various automated tools, used to test research hypotheses, and drawn on to complement findings from experimental research. In my talk, I will present KANDEL - the Kansas Developmental Learner corpus - an open-access collection of L2 German writing samples produced by several cohorts of KU learners over four semesters of instructed language study. I will describe how this corpus was collected, annotated, and analyzed in order to investigate learner linguistic development from beginning to intermediate L2 proficiency levels. KANDEL is unique due to its longitudinal nature, data collected at dense intervals, and annotations for multiple language, learner, and task variables. I will also discuss best annotation practices that can be applied not only to learner corpora but to any custom-made textual database collected for Digital Humanities or Social Sciences purposes.
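As a loose illustration of what multi-layer corpus annotation makes queryable, the sketch below stores tokens with part-of-speech and error tags and computes simple measures over them. The tag names, metadata fields, and sample sentence are invented for the example and do not reproduce KANDEL's actual annotation scheme:

```python
# Hypothetical sketch: multi-layer annotation of one learner-corpus sample.
# Tags, metadata fields, and the sample text are illustrative placeholders.

from collections import Counter

# Each token carries several independent annotation layers.
sample = {
    "meta": {"learner": "L042", "semester": 2, "task": "essay"},
    "tokens": [
        {"form": "Ich",    "pos": "PRON", "error": None},
        {"form": "habe",   "pos": "AUX",  "error": None},
        {"form": "gegeht", "pos": "VERB", "error": "verb-form"},  # target: "gegangen"
        {"form": "nach",   "pos": "ADP",  "error": None},
        {"form": "Hause",  "pos": "NOUN", "error": None},
    ],
}

def error_rate(text, error_type):
    """Share of tokens bearing a given error annotation."""
    toks = text["tokens"]
    return sum(1 for t in toks if t["error"] == error_type) / len(toks)

def pos_profile(text):
    """Part-of-speech distribution, one simple developmental measure."""
    return Counter(t["pos"] for t in text["tokens"])

print(error_rate(sample, "verb-form"))  # 0.2
print(pos_profile(sample)["VERB"])      # 1
```

Because each layer (form, POS, error) is independent, the same query functions work across cohorts and semesters once many such samples are collected.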
Monday, April 16, 2018
Black Humanity and the Digital Frontier: The Black Book Interactive Project
Maryemma Graham and Arnab Chakraborty, Department of English, University of Kansas
Monday, September 18, 2017
HathiTrust Research Center: Strategic approaches to opening research opportunities on closed data
J. Stephen Downie, HathiTrust Research Center
Friday, October 6, 2017
Spenser and Historical Stylometrics: Medieval and Early Modern
Anupam Basu, Department of English, Washington University in St. Louis
Monday, November 20, 2017
The WWI Immigrant Poetry Digital Humanities Project
Professor Lorie Vanchena (presenting on behalf of Lorie Vanchena and Andrew Crist)
Department of Germanic Languages and Literatures, University of Kansas
Wednesday, December 6
Touching on Archival Absence: Early Modern Women and the Digital Humanities
Whitney Sperrazza, Postdoctoral Researcher, Hall Center for the Humanities, University of Kansas
Wednesday, January 25
Mapping Testimonies of Children Who Survived the 1994 Genocide in Rwanda
Musa Wakhungu Olaka, KU Libraries
Monday, February 20
Reconstructing Moses Grandy’s World: The Interplay of GIS With Enslaved Narratives
Christy Hyman, University of Nebraska
Wednesday, March 15
De-centering Centers of Digital Humanities
Brian Rosenblum, KU Libraries & Phil Stinson, Dept. of Classics
In this seminar Brian Rosenblum and Phil Stinson will discuss the role(s) of Digital Humanities centers, along with current trends in digital humanities practices and ongoing challenges and debates within the field and here at KU. We will then lead a discussion about the current state of DH infrastructure and practice at KU.
Monday, April 17
A Portrait of Venice: a ‘Virtually’ Digital Exhibition
Kristin Huffman Lanzoni, Duke University
Global Medical Humanities and the Horizons of Digital Health Innovation
Kathryn A. Rhine, Associate Professor, Anthropology, University of Kansas
New Technologies and Old Things: Why Study Ancient Sculpture and Monuments in 3D?
Philip Sapirstein, Assistant Professor, Department of Art and Art History, University of Nebraska, Lincoln
Digital Humanities Pedagogy in Theory and Practice
Jessica DeSpain, Associate Professor, Department of English Language and Literature; Co-Director of IRIS Center, Southern Illinois University, Edwardsville
Critical Protocols in Indigenous Game Design
Joshua Miner, Assistant Professor, Film and Media Studies, University of Kansas
Head-and-Shoulder Hunting in the Americas: Walter Freeman and the Visual Culture of Lobotomy
Miriam Posner, Coordinator, Digital Humanities Program University of California, Los Angeles
Between 1936 and 1967, Walter Freeman, a prominent neurologist, lobotomized as many as 3,500 Americans. Freeman was also an obsessive photographer, taking patients’ photographs before their operations and tracking them down years — even decades — later. In this presentation, Miriam Posner details her efforts to understand why Freeman was so devoted to this practice, using computer-assisted image-mining and -analysis techniques to show how these images fit into the larger visual culture of 20th-century psychiatry.
Trusting Data: A Technologist’s Perspective
Perry Alexander, AT&T Foundation Distinguished Professor, EECS Department, Director, Information and Telecommunication Technology Center, University of Kansas
Digital Encryption, or, Messages Against the Medium
Andrew Lison, Post-doctoral Fellow, Hall Center for the Humanities, University of Kansas
Recovering the Past through Digital History
Jennifer Guiliano, Assistant Professor, Department of History, Indiana-Purdue University Indianapolis
Monday, August 31
When a Project Demands to be Digital: Reflections by Reluctant DHers
Laura Mielke (English) and Marty Baldwin (English)
In this presentation, a faculty member and a graduate student from the English Department will recount their experience of beginning work on a traditional (i.e., print) edition of a nineteenth-century text only to realize the necessity of transporting the project into the digital environment. In discussing the specific, bizarre historical and material contexts of their particular project, the presenters will reflect on: the sources of reluctance to work in DH; the process of identifying and acquiring necessary DH skills and tools; the discovery of DH support networks; the problems that arise in converting a print editorial project into a digital one; and, most important, the impact of DH scholarship on collaboration between graduate students and faculty. This presentation aims to spur conversation about how scholars who are hesitant to enter the DH world might do so for practical reasons, and happily survive.
Monday, September 21
What Is Digital Humanities and What’s It Doing in the Classroom? Toward a Digital Pedagogy
Pamella Lach (KU Libraries)
Incorporating digital humanities into the classroom, while rewarding, can be difficult and messy—for instructors and students alike. In this talk, Lach will share her experiences experimenting with DH in the classroom. She will discuss a range of attempts along a pedagogic spectrum, from undergraduate blogging about digital objects to graduate students implementing self-designed digital projects. Her talk will address some of the challenges of adopting a digital approach in the classroom, and gesture towards some best practices. This talk is especially geared to those curious about or interested in integrating DH into their teaching.
Monday, October 19
3DGIS for Discourse, Analysis, and Interpretations of Ancient Maya Architecture and Landscapes
Heather Richards-Rissetto, Anthropology, University of Nebraska-Lincoln
Archaeological projects increasingly acquire and create 3D data of objects, buildings, and even landscapes; however, it is still a challenge to make these data accessible to researchers and cultural heritage managers and to link these models to geo-referenced data sets for visualization and analysis. To address this issue, the MayaArch3D Project (www.mayaarch3d.org) is developing a 3D WebGIS, called QueryArch3D, that allows 3D models and GIS to “talk to each other” for studies of architecture and landscapes, in this case the eighth-century Maya kingdom of Copan, Honduras. In this talk, I will discuss how we are using 3D WebGIS to develop new methods for exploring the visibility or inter-visibility of monuments and buildings to or from common pathways that inhabitants of different social quarters may have taken while moving through the city of Copan. I will also present an affiliated project, MayaCityBuilder, recently begun at the University of Nebraska-Lincoln, which uses procedural modeling, the rapid prototyping of 3D models from a set of rules, to allow efficient, low-cost creation of alternative ancient Maya landscapes in order to foster discourse, analysis, and interpretation.
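The visibility analyses described here rest on line-of-sight computations over terrain data. The toy below shows the core idea on an invented elevation grid; it is a schematic stand-in, not QueryArch3D code, and the observer height and terrain values are arbitrary:

```python
# Illustrative grid-based line-of-sight test, the kind of computation
# underlying visibility and inter-visibility analyses in a GIS.
# Terrain values and observer height are invented for the example.

def line_of_sight(elev, a, b, eye=1.6):
    """True if cell b is visible from cell a over the elevation grid.

    elev: 2D list of terrain heights; a, b: (row, col) cells;
    eye: observer height above the terrain at a.
    """
    (r0, c0), (r1, c1) = a, b
    h0 = elev[r0][c0] + eye
    h1 = elev[r1][c1]
    steps = max(abs(r1 - r0), abs(c1 - c0))
    for i in range(1, steps):
        t = i / steps
        r = round(r0 + t * (r1 - r0))
        c = round(c0 + t * (c1 - c0))
        # Height of the straight sight line at this fraction of the way.
        sight = h0 + t * (h1 - h0)
        if elev[r][c] > sight:   # terrain blocks the ray
            return False
    return True

terrain = [
    [10, 10, 10, 10],
    [10, 25, 10, 10],   # a ridge in the middle
    [10, 10, 10, 10],
]
print(line_of_sight(terrain, (0, 0), (2, 2)))  # → False (the ridge blocks the ray)
print(line_of_sight(terrain, (0, 0), (0, 3)))  # → True (flat ground along the top row)
```

Inter-visibility of two monuments is then just this test run in both directions, and a viewshed is the test repeated from one cell to every other cell.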
Monday, November 16
Lab-based Tinkering and Open-ended Play in the Era of the Posthumanities
Lori Emerson, English, Media Archaeology Lab, University of Colorado, Boulder
Nearly all digital media labs are conceived of as places for experimental research using the most up-to-date, cutting-edge tools available. The Media Archaeology Lab (MAL), however, is a place for hands-on, cross-disciplinary experimental research and teaching using still-functioning but obsolete tools, software, hardware, and platforms from the past. What the MAL does best is provide direct access to defining moments in the history of computing and digital literature. The MAL is also a kind of thinking device: providing access to the utterly unique, material specificity of these computers, their interfaces, platforms, and software makes it possible to defamiliarize, or make visible for critique, the invisible interfaces and platforms of the present. This approach to media of the present via media of the past aligns the lab with the vibrant field of “media archaeology.” In her talk, Lori Emerson will discuss the history and philosophy of the Media Archaeology Lab along with how her current research projects, “Other Networks” and “The Lab Book: Situated Practices in Media Studies,” are part and parcel of the lab.
Wednesday, January 28
Mapping the Complexity of Landscape and Law: Capturing the Elusive History of U.S. Homesteading
Sara Gregg (History) and Rhonda Houser (Libraries)
This presentation will trace the process of discovery and exploration of a set of historical maps chronicling the process of land distribution in the United States, as well as the future stages of research on the history of the U.S. Homestead Acts. These investigations of historical maps demonstrate the potential to create an entirely new environmental understanding of the effect of federal land law on the landscape of the American West by employing new technology and formerly unmined cartographic and statistical materials. In introducing this project, the researchers will reflect upon the challenges and opportunities posed by collaborative research, as well as the power of the spatial humanities to transform our understanding of land policy.
Wednesday, February 25
Saeculum: Approaching (Ancient Roman) Culture Through Game Design
David Fredrick, Classical Studies, University of Arkansas
This talk outlines the use of the Unity game engine for classical studies research and teaching, using three examples. The first is a development of Unity as a lecture presentation platform (3D Powerpoint), using an analysis of the distribution and meaning of representations of Hermaphroditus in Pompeian houses. The second and third review the development of game-based online courses in classical mythology and Roman civilization—what is working and what is not, and the value of building this curriculum with in-house student developers, despite the risks.
Wednesday, March 25
Random Borges | Infinite E-Lit: A Look from Hispanic Legacies
Élika Ortega, IDRH
In this presentation, part of the interinstitutional collaborative project Hispanic Legacies in Electronic Literature, Élika Ortega proposes a juxtaposition between Argentinian writer Jorge Luis Borges’ imagined figures of infinity, such as The Library of Babel, The Aleph, and The Book of Sand, and contemporary examples of Electronic Literature (E-Lit) that analogously and literally enact endlessness in reading and writing. Media figures of infinity (as Ortega terms the conceptual and structural strategies writers use to create infinites) underscore the tensions between the life span of artworks, the machines and code that materialize them, and the people reading them. Furthermore, the spatial-temporal dimensions of infinite E-Lit works call into question the role of readers and archivists dealing with literary works whose end will never be seen but which are likely to stop working or become obsolete and inaccessible.
Wednesday, April 22
Literary Geography of the Twentieth Century: Computational and Statistical Models
Matthew Wilkens, English, Notre Dame
Computational methods allow literary scholars to test their claims against a much larger and more diverse body of texts than would otherwise be possible. Recent examples include work on the evolution of poetic diction in the nineteenth century, on comparative social networks in American and Asian modernism, and on urban space in several centuries of British fiction. But there has been very little such research on contemporary literature, where problems of scale are most acute. This talk presents new computational work on neoliberalism and the literary geography of the twentieth century. To shed light on the extent to which fiction today is shaped by the logic of late capitalism, it assesses the relationship between the century’s significant changes in economic output and the shifting distribution of geographic attention in 10,000 American novels published between 1880 and 1990, finding a surprising – and growing – degree of geographic conservatism in postwar US fiction. This result calls into question the widespread critical assumption that neoliberal ideology demands an increasingly close alignment between market functions and aesthetic production.
Wednesday, August 27
The Case for Close Textual Attention in the Age of Text Glut
Amanda Gailey, University of Nebraska-Lincoln, English
This talk will discuss how the field of literary studies should preserve the scholarly and pedagogical value of close reading even as the digital humanities and the culture at large increasingly prioritize big data. I will discuss some of the blind spots in big data approaches to literature in order to show the continued importance of smaller-scale digital studies, drawing on examples from The Walt Whitman Archive, The Tar Baby and the Tomahawk, and Scholarly Editing, which I use in my research and teaching. I will also talk about how coursework on digital editing can be very effective in teaching students to be careful readers and writers.
Monday, September 29
The first part of ‘text analysis’ is ‘text’: Applying digital methods to an under-documented language
Matt Menzenski, Slavic Languages & Literatures
The digitization and curation of large bodies of text has inspired and encouraged new methods of research into language and literature, but only into those languages for which such corpora have been established. What sort of strategies are available to a researcher wishing to apply these research methods to a language which is not yet represented in a digitized collection? Is construction of a text corpus a feasible task for a researcher more interested in human languages than in programming languages? This talk provides a case study of the creation of a small text corpus for Tohono O’odham (an endangered language of the Southwest), the use of that corpus to investigate questions about the way that verbs are used in narratives, and more broadly, the sometimes unexpected ways in which the development of a text corpus can influence the research process.
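One payoff of even a small hand-built corpus is that questions such as how often verbs occur in each narrative become one-line queries. The sketch below uses an invented corpus format and placeholder tokens, not actual Tohono O'odham data:

```python
# Minimal sketch of the kind of query a small, custom-built corpus makes
# possible: relative frequency of tagged verb forms per narrative.
# The corpus format and tags are invented placeholders.

from collections import Counter

# Each narrative is a list of (token, tag) pairs.
corpus = {
    "narrative-1": [("tok1", "N"), ("tok2", "V"), ("tok3", "V"), ("tok4", "N")],
    "narrative-2": [("tok5", "V"), ("tok6", "N")],
}

def tag_frequency(corpus, tag):
    """Relative frequency of a tag within each text."""
    out = {}
    for name, tokens in corpus.items():
        counts = Counter(t for _, t in tokens)
        out[name] = counts[tag] / len(tokens)
    return out

print(tag_frequency(corpus, "V"))
# {'narrative-1': 0.5, 'narrative-2': 0.5}
```

The point of the sketch is the division of labor it implies: once the slow, careful work of tokenizing and tagging is done, comparative questions about verb usage across narratives reduce to short scripts like this.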
Wednesday, October 22
Virtual Reality on Stage
Mark Reaney, Department of Theatre
For almost 20 years, KU has been a world leader in the field of digitally mediated theatre production. Starting in 1995, KU’s University Theatre has mounted a series of increasingly complex productions in which real-time computer-generated graphics, or “Virtual Reality,” have been used as the scenic medium. In this talk, Prof. Mark Reaney will discuss the artistic philosophy underpinning this body of work and present an overview of the nine VR/Theatre productions mounted at KU.
Wednesday, November 19
Ghosts and Our Machines: Digital Scholarship for Religious Studies
Christopher Cantwell, University of Missouri-Kansas City, History
Ancient Chinese scribal hands: recording, searching and visualizing scribal practices in the fifth-century BC Wenxian Covenant Texts
Crispin Williams, KU East Asian Studies
The Spatial Self: Location-Based Identity on Social Media
Germaine Halegoua, KU Film and Media Studies
Literary History in Conversation with Computer Science
Ted Underwood, University of Illinois
The term “digital humanities” tends to stage contemporary developments in the humanities as a confrontation, not with specific ideas or disciplines, but with digital technology itself. That’s part of the logic of the term’s success, but for good or ill, this talk will aim at a narrower, socially concrete topic. I’m interested, not in the web or in computers as such, but in the human beings who study computer science. To the extent that humanists discuss CS at all, we tend to imagine it as a narrowly instrumental discourse. And there’s some truth to that: a large part of what I want to do is show off some neat tricks computer scientists have invented that turn out to be useful for the humanities (and especially for literary history). I’ll focus on topic modeling (which casts new light on the history of humanistic disciplines), and on supervised learning algorithms (which provide an interestingly flexible way to approach the history of genre). But I also want to suggest that the conversation between computer scientists and humanists needn’t be purely instrumental, or fully contained in “tools” that we borrow from CS. In some ways computer science is a surprisingly flexible hermeneutic discourse, and humanists may have more in common with it than we imagine.
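The supervised-learning approach to genre history can be caricatured in a few lines: learn word-frequency profiles from labeled texts, then assign unseen texts to the nearest profile. This nearest-centroid toy, with invented text fragments, stands in for the regularized models used in actual research:

```python
# Toy supervised genre classification: nearest centroid over word
# frequencies. The "novels" here are invented fragments; real work on
# the history of genre uses far larger texts and stronger models.

from collections import Counter
import math

def vectorize(text, vocab):
    """Relative word frequencies over a fixed vocabulary."""
    counts = Counter(text.split())
    total = sum(counts.values())
    return [counts[w] / total for w in vocab]

def centroid(vectors):
    """Mean vector of a set of equal-length vectors."""
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def classify(text, centroids, vocab):
    """Assign a text to the genre with the nearest centroid."""
    v = vectorize(text, vocab)
    return min(centroids, key=lambda g: math.dist(v, centroids[g]))

train = {
    "gothic":    ["the castle was dark and the ghost wailed",
                  "a dark tomb beneath the ruined castle"],
    "detective": ["the detective examined the clue at the scene",
                  "a clue led the detective to the culprit"],
}
vocab = sorted({w for texts in train.values() for t in texts for w in t.split()})
centroids = {g: centroid([vectorize(t, vocab) for t in texts])
             for g, texts in train.items()}
print(classify("the ghost haunted the dark castle", centroids, vocab))  # → gothic
```

The interpretive move Underwood describes starts where this toy stops: once a model can recognize a genre, its errors and its confidence over time become historical evidence about how stable that genre really is.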
Inside a 13th Century Water Clock: Multi-Disciplinary Teaching Across Computing and the Humanities
Paul Fishwick, UT- Dallas, Computer Science
Can the connections between the humanities and computer science include arts and humanities informing computer science? We are familiar with the idea that computer science results in technologies, and that these are then used as tools by artists and humanists. Going in the other direction is also possible, where deep concepts in computing are covered through cultural artifacts. We will include practical examples of this approach, including al Jazari’s water clock. These examples create new possibilities for humanist-computer science collaborations, and they also suggest that computer science can be viewed as empirically-driven rather than existing purely as an “artificial science.”
Editing Walt Whitman’s Marginalia Today: Digital Humanities Methods at the Edge
Matt Cohen, UT-Austin, English
This talk is about methodology in the humanities. It begins with a discussion of the most basic practice of humanities research: note-taking. Annotations, marginalia, all of the methods of sifting, highlighting, and gathering: these are the substrate of our larger claims and discoveries. Such is the case even when we are working with “big data,” topic modeling, natural language processing, and other automated techniques for what Franco Moretti has called “distant reading.” The talk then reflects on the claims for methodology in and as what is being called the digital humanities. These observations emerge at the junction of two occasions. The first is a project to digitize the poet Walt Whitman’s annotations and marginalia, his personal metadata on his reading. This NEH-funded project is at the end of its first phase, and will be published later this year for free access at the Walt Whitman Archive. The second spur is the active conversation about the digital humanities as a methodological crucible or fountain; both the tenor and the content of that conversation are occasions for considering the status of method in the humanities.
The Light Commodity of Words: Digitizing the Material Book
Jonathan Lamb, English
This talk will explore the issues that arise when material objects are converted into digital form. Although such questions have received much attention from a throng of scholars, librarians, and computer scientists, I wish to address them from the perspective of a ‘domain scientist’ of literary and textual culture—a user perspective, as it were. Taking as my subject the Early English Books Online (EEBO) database, along with EEBO’s newish Text Creation Partnership (TCP) full-text tool, I will argue that book digitization makes possible (and necessary) a material-digital dialectic. Such a dialectic, which privileges neither material nor digital artifacts but allows a researcher to jump like a spark between the two poles, becomes especially visible and productive when studying books from the early modern period (1500-1700). Each domain (i.e., the digital and material) offers medium-specific forms of resistance and friction, and each therefore provokes new insight with respect to the other. The talk will feature a multitude of examples and only light theorization; no prior knowledge of EEBO or EEBO-TCP is required.
Poems on the Page: Reading the Visual Codes of Victorian Books
Natalie Houston, University of Houston
The digitization of nineteenth-century texts offers us the opportunity of asking new research questions that could transform our historical understanding of Victorian culture. My research explores how we can use computational tools with large sets of digitized texts to gain a broader sociological understanding of poetry’s circulation, consumption, and function within Victorian culture.
All printed texts simultaneously convey meaning through both linguistic and graphic signs. Printed poems, for instance, are typically framed by the white space created by line endings, creating a distinctive visual signal of the genre on the printed page. In Victorian books of poetry, rhymed lines were frequently indented the same distance from the left margin to visually indicate the poem’s form and structure. Rhyme is thus simultaneously a linguistic, poetic, and graphic feature of many Victorian books. Most scholarly digital archives recognize the value of this graphical meaning and provide users with page images as well as OCR text, but most tools for large-scale computational analysis focus only on the linguistic content of texts.
In this talk, I will discuss how the visual aspects of printed texts contribute to their cultural significance; how computational analysis can facilitate the identification of unique or representative items, historical trends, and comparisons not accessible to the human eye across large document collections; and present some initial research findings from the current development of VisualPage, a prototype software application for the large-scale identification and analysis of the graphical elements of digitized printed books.
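One concrete measurement of the kind VisualPage targets is the left indentation of each printed line, which, as noted above, can encode rhyme scheme. The sketch below computes per-line indentation from a toy binarized page array (1 = ink, 0 = white space); it is illustrative only and not VisualPage code:

```python
# Hedged sketch of one graphical measurement a VisualPage-style analysis
# might make: the left indentation of each printed line. The "page" is a
# toy 2D array, not a real scanned image.

def line_indents(page):
    """Column index of the first inked cell in each row, or None for blank rows."""
    indents = []
    for row in page:
        inked = [c for c, px in enumerate(row) if px]
        indents.append(inked[0] if inked else None)
    return indents

# Alternating indentation, as in a rhymed a-b-a-b stanza.
page = [
    [0, 1, 1, 1, 0, 0],
    [0, 0, 0, 1, 1, 0],
    [0, 1, 1, 0, 0, 0],
    [0, 0, 0, 1, 1, 1],
]
print(line_indents(page))  # [1, 3, 1, 3]
```

Run across thousands of digitized pages, a pattern like [1, 3, 1, 3] becomes a machine-detectable visual signature of rhymed verse, exactly the sort of graphic code that text-only analysis discards.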
Bio: Natalie M. Houston is an Associate Professor of English at the University of Houston. Her research on Victorian poetry and print culture has appeared in journals such as Victorian Studies, Victorian Poetry, and the Yale Journal of Criticism. She is the Project Director for VisualPage, an NEH-funded project to develop a software application to identify and analyze visual features in digitized printed books and she is currently writing a monograph entitled Reading Victorian Poetry Digitally. She is also a Project Co-Director and Technical Director for the Periodical Poetry Index, a research database of citations to English-language poems published in nineteenth-century periodicals. She contributes regular columns on productivity, pedagogy, and technology to the ProfHacker blog hosted by the Chronicle of Higher Education.
Revising Ekphrasis: Using Topic Modeling to Tell the Sister Arts’ Story
Lisa Rhody, University of Maryland
For the past 20 years, the story of ekphrasis—poetry to, for, and about the visual arts—has been told as a long-standing, gendered contest between rival media, fraught with political, cultural, and religious anxieties. Although skeptical of the necessity of gendered rivalry as a principle of ekphrastic creation, literary scholars have struggled to present a compelling alternative model that sufficiently accounts for the genre’s representational complexity.
This talk begins by asking if computational methods might offer new insights into the canon and tradition of ekphrasic poetry and suggests how topic modeling—one form of computational text analysis—might begin to refocus the aperture of our critical lens on the genre’s conventions.
Oriented toward the non-expert, this presentation will assume no prior knowledge of topic modeling or social network analysis. I will provide a gentle introduction that builds toward an understanding of the potential uses for topic modeling and network analysis as a means for exploring large collections of poetic texts.
Poetic collections, dense and rich with figurative language, require revising how we as humanists interpret topic modeling results. Therefore, this presentation will also address how changes in interpretation affect the questions we might ask and the assumptions we can make about “topics” generated by latent Dirichlet allocation (LDA)—one type of topic modeling algorithm.
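For readers curious about the mechanics behind LDA, the following compact collapsed Gibbs sampler shows how topic assignments are iteratively resampled. Real projects would use a library such as MALLET or gensim; this toy only demonstrates the algorithm, and its corpus is an invented stand-in, not ekphrastic poetry:

```python
# Compact, illustrative collapsed Gibbs sampler for LDA. Toy scale only;
# the four "documents" are invented stand-ins for real poetic texts.

import random
from collections import defaultdict

def lda_gibbs(docs, n_topics, iters=200, alpha=0.1, beta=0.01, seed=0):
    rng = random.Random(seed)
    V = len({w for d in docs for w in d})
    # Random initial topic for every token, plus the count tables.
    z = [[rng.randrange(n_topics) for _ in d] for d in docs]
    ndk = [[0] * n_topics for _ in docs]          # doc-topic counts
    nkw = [defaultdict(int) for _ in range(n_topics)]  # topic-word counts
    nk = [0] * n_topics                            # topic totals
    for di, d in enumerate(docs):
        for wi, w in enumerate(d):
            k = z[di][wi]
            ndk[di][k] += 1; nkw[k][w] += 1; nk[k] += 1
    for _ in range(iters):
        for di, d in enumerate(docs):
            for wi, w in enumerate(d):
                k = z[di][wi]
                ndk[di][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                # Resample in proportion to the standard LDA weights.
                weights = [(ndk[di][t] + alpha) * (nkw[t][w] + beta) / (nk[t] + V * beta)
                           for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights)[0]
                z[di][wi] = k
                ndk[di][k] += 1; nkw[k][w] += 1; nk[k] += 1
    # Top three words per topic, by count.
    tops = []
    for t in range(n_topics):
        ranked = sorted((w for w, c in nkw[t].items() if c > 0),
                        key=nkw[t].get, reverse=True)
        tops.append(ranked[:3])
    return tops

docs = [
    ["painting", "canvas", "color", "painting"],
    ["canvas", "color", "painting", "frame"],
    ["stanza", "rhyme", "meter", "stanza"],
    ["rhyme", "meter", "stanza", "line"],
]
for topic in lda_gibbs(docs, 2):
    print(topic)
```

The interpretive caveat Rhody raises applies even here: the sampler returns word lists, not "topics"; it is the reader who decides whether a list coheres, and figurative language makes that decision far less mechanical for poetry than for prose.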
Rhody is Research Assistant Professor of History in the Roy Rosenzweig Center for History and New Media at George Mason University. She is an editor for the Journal of Digital Humanities and project manager for the Institute of Museum and Library Services’ (IMLS) signature conference WebWise. In 2012, she was the recipient of a Maryland Institute for Technology in the Humanities (MITH) Winnemore Dissertation Fellowship. Her work has appeared in the Journal of Digital Humanities and ProfHacker, and she is co-author with Wendy Hui Kyong Chun of a forthcoming article in Differences, “Working the Digital Humanities: Uncovering Shadows between the Dark and the Light.”
All Quiet on the [Virtual] Front: Traversing the Digital/Physical Divide
Ben Rosenthal, Visual Art
Benjamin Rosenthal’s work centers around the strategies of how we perform—the systems of control we set in place, and the ways we negotiate our psychological, tangible, and virtual positions. Benjamin will talk about the trajectory of his creative work and his current ongoing research in both digital and non-digital forms. He will address how he, as an artist, engages with the nature of the digital landscape—negotiating the boundaries between physical and mediated experience.
Benjamin received his B.F.A. in Art (Electronic and Time-Based Media) from Carnegie Mellon University in 2006 and his M.F.A. in Art Studio from the University of California, Davis in 2011. While his formal training is primarily in film/video and related forms, his practice extends into performance, animation, web-based interactive work, installation, drawing, and sound.
Genealogy, Bureaucracy, Correspondence, Transportation: Networks and Network Analysis Methods in Humanities Scholarship
Elijah Meeks, Digital Humanities Specialist, Stanford University
Network analysis in the sciences and social sciences typically focuses on citation, social, communication, logical, and neurological networks, and a broad set of methods and research has developed along those lines. Network analysis in the humanities, however, has recently grown in visibility and popularity, and focuses instead on similar but distinct forms that have their own methodological concerns. The role of evidence and agency, for instance, distinguishes them from traditional big-data and API-driven research on telecommunications and social networking services. This talk will focus on four distinct humanities network types: genealogical networks of British cultural elites and their families, correspondence networks from the Republic of Letters, transportation networks of Imperial Rome, and bureaucratic networks from medieval China. The application and adaptation of established network analysis methods will be demonstrated, along with an exploration of methodological problems and techniques for addressing them in humanities network analysis.
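As a minimal illustration of adapting a standard network measure to humanities material, the sketch below computes degree centrality over a toy correspondence network. The letters and correspondents are invented; actual Republic of Letters research draws on archival metadata at far larger scale:

```python
# Degree centrality on a toy correspondence network. Every edge is one
# letter; the names and letters are invented for illustration.

from collections import Counter

# Each tuple is one letter: (sender, recipient).
letters = [
    ("Voltaire", "d'Alembert"),
    ("Voltaire", "Frederick II"),
    ("d'Alembert", "Voltaire"),
    ("Rousseau", "Voltaire"),
]

def degree_centrality(edges):
    """Letters sent or received per correspondent, normalized by the
    number of other correspondents (n - 1)."""
    deg = Counter()
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(deg)
    return {p: d / (n - 1) for p, d in deg.items()}

cent = degree_centrality(letters)
print(max(cent, key=cent.get))  # → Voltaire
```

The humanities-specific wrinkle Meeks points to starts exactly here: each of these edges is a surviving letter, so the measure reflects archival preservation as much as historical sociability, and interpreting it requires reasoning about the evidence behind every edge.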
Digital Humanities and Electronic Texts: Proliferation, Interoperation, Illumination
Brian Pytlik Zillig, Center for Digital Research in the Humanities, University of Nebraska
Communication and Collaboration: Digging into Digging into Data
Anne D. Hedeman, Judith Harris Murphy Distinguished Professor of Art History, & Heather Tennison, MLS and MA in Art History, University of Illinois
This talk is about finding beauty in what we would normally describe as mundane experience and then finding a way to make art from it. Since my expertise is in working with sound and music, the focus of the talk will be on the phenomena of sound and audio. I will argue that in our everyday lives we are literally surrounded by interesting things, beautiful things, and even profound things, but that we mostly ignore the enriching possibilities inherent in engaging with these “objects.” Beauty can be found anywhere, but in order to ‘see’ it or ‘hear’ it we need to be receptive. This talk will examine the Buddhist notion of the senses and then identify means by which we can create gateways to the experience of beauty in everyday experience. The particular focus will be on how to use digital processes that extend, alter, or confound our senses to “elevate” the mundane.
Recovering the Recovered Text: Diversity, Canon Building, and Digital Studies
Amy Earhart -English, Texas A&M
This paper examines the state of the current digital humanities canon, provides a historical overview of the decline of early digitally recovered texts, literature designed to expand the literary canon, and offers suggestions for ways the field might work toward expansion of the digital canon. My research shows that a subfield of early literary digitization work, mostly projects unassociated with humanities computing/digital humanities, sought to negate the canon bias found within print and envisaged digital literary scholarship as a tool to reinsert women, queers, and people of color into the canon. The DIY sites built during this period were labors of love that allowed scholars to self-publish materials found buried in difficult-to-access library archives or dusty journal editions. The early wave of small recovery projects has slowed and, even more troubling, the extant digital projects have begun to disappear. If we lose a large volume of texts from the expanded canon, we will be returning to a new critical canon that is incompatible with current understandings of literature. In addition, the turn to increased standardization (TEI) and big data troubles our efforts at small-scale recovery: DIY scholars outside the DH community have difficulty gaining access to the technical skills such projects require, leading to a decline in small-scale digital recovery projects. These impoverished literary data sets in turn hamper digital humanities efforts to experiment with visualization and data mining techniques. If Matt Kirschenbaum is correct that preservation is not a technical problem but a social problem, then it follows that the digital humanities community should be able to address the lack of diversity in the digital canon by attending to the acquisition and preservation of particular types of texts. We need a renewed effort in digitizing texts that occurs in tandem with experimental approaches in data mining, visualization, and geospatial representation.
This paper offers several possible ways of addressing this troubling problem.
Playing without Power in Videogames
Mark Sample, English, George Mason University
Players and scholars alike have characterized videogames as fantasies about unlimited power. In this talk I explore how some videogames have rejected the core mechanic of “leveling up”—in which the player’s character grows increasingly powerful—and have instead emphasized the vulnerability of the game’s protagonist. Such games test the limits of playing the powerless and the doomed in videogames, allowing us to explore the outer edges of our empathy and our imagination.
Joining Geographic Information Systems (GIS) and Humanistic Research: The Traditional Karez Water System in Southern Afghanistan
Phil Stinson, Classics, University of Kansas
Emerging Opportunities for Visual Analytics in the Digital Humanities
Chris Weaver, School of Computer Science, University of Oklahoma
Research is a complex process of exploration and analysis that encompasses observation, collection, interpretation, discourse, and collaboration. That the digital humanities community aims to marry human and computational capabilities puts it squarely in the vanguard of emerging methodologies. As a growing methodological subdiscipline of the information sciences, visual analytics seeks to facilitate the research process by augmenting innate human visual and cognitive capabilities with interactive computational tools. The commonalities and potential for exchange between the digital humanities and visual analytics are conspicuous.
Useful but specialized applications of visual analysis now exist in numerous domains that tackle complex, voluminous information sources; well-represented domains include intelligence analysis, emergency response, business logistics, finance, and epidemiology. However, there is as yet little support for an open-ended, user-driven process of broad and deep digital engagement in which data processing, graphical depiction, and human interaction adapt to evolving research needs and goals, particularly in examinations of idiosyncrasy. In this talk, Chris Weaver will offer a vision of humanities scholarship infused with highly interactive, visual, computational facilities for interpretation and discourse. He will also present concrete progress on developing methods, techniques, and tools in support of that vision.
Bio: Chris Weaver is an Assistant Professor in the School of Computer Science and Associate Director of the Center for Spatial Analysis at the University of Oklahoma. He holds a B.S. in Chemistry and Mathematics from Michigan State University and an M.S. and Ph.D. in Computer Science from the University of Wisconsin-Madison. He was a post-doctoral Research Associate with the GeoVISTA Center in the Department of Geography at Penn State, where he helped to found the North-East Visualization and Analytics Center. His research in information visualization and visual analytics focuses broadly on synthesis of highly interactive visual interfaces for exploring and analyzing heterogeneous multidimensional data sets.
Patterns in the transmission of cultural texts: the case of medieval miscellany manuscripts
David Birnbaum, Department of Slavic Languages and Literatures, University of Pittsburgh
Medieval Slavic miscellanies are a type of free-form encyclopedia: compilations of texts of various genres from various sources. Compilation in medieval literary culture was as much a creative act as the authoring of entirely new texts, and the producers of new manuscripts were free to draw on all available sources, creating compilations whose originality was not constrained by any attempt to reproduce earlier compilations literally or faithfully. The variation that occurs in miscellany manuscripts is nonetheless surprisingly constrained; it is almost unheard of for two miscellanies to correspond perfectly in their contents, but there are frequent partial correspondences that cannot be explained by genre, subject matter, or any other organizing principle, and that are inconsistent with a hypothesis that scribes compiled manuscripts without explicit constraint, even if that is what they thought they were doing. This presentation describes some of the patterns of agreement that emerge from comparing the structure of miscellany manuscripts, leading to the conclusion that despite the scribe’s complete freedom to choose his texts, the contents of miscellany manuscripts were nonetheless severely constrained by the tradition, and in specific ways.
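The kind of structural comparison described above can be sketched computationally as pairwise overlap between manuscripts' content lists, for instance with the Jaccard coefficient. A minimal illustration (manuscript sigla and text titles here are invented, not drawn from the project's actual corpus):

```python
# Sketch: measuring partial correspondence between miscellany contents
# as set overlap. All manuscript names and titles are hypothetical.
from itertools import combinations

def jaccard(a, b):
    """Overlap of two manuscripts' contents: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

mss = {
    "MS-A": ["Vita Constantini", "Hexameron", "Paschal homily"],
    "MS-B": ["Vita Constantini", "Hexameron", "Chronicle excerpt"],
    "MS-C": ["Chronicle excerpt", "Physiologus"],
}

# Compare every pair of manuscripts; scores near but below 1.0
# would flag the "partial correspondences" discussed above.
for m1, m2 in combinations(sorted(mss), 2):
    print(m1, m2, round(jaccard(mss[m1], mss[m2]), 2))
```

A real study would of course also need to normalize titles across orthographic variants and weigh the order of texts, not just their presence.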
Bio: David J Birnbaum is chair of the Slavic Languages and Literatures department at the University of Pittsburgh. His specialties include humanities computing (particularly the computer processing of medieval Slavic manuscripts), Slavic linguistics (particularly diachronic and synchronic phonology and morphology), and medieval Slavic texts.
Writing Practices in Early New England: An Electronic Database Tool for Charting the Recorders of the Salem Witch Trials
Peter Grund, Department of English
In my presentation, I will report on my ongoing collaborative project (with Margo Burns and Matti Peikola) on the recorders who took down the roughly 1,000 documents from the Salem witch trials. Our goal is to produce an online tool that identifies as many of the approximately 250 recorders as possible and provides biographical data as well as data on their scribal practices. At the same time, the plan is to make the tool available to other researchers for use in charting handwriting and scribes in other historical contexts. The talk outlines the principles of the work, demonstrates the preliminary setup of the tool, and discusses some future avenues for the project. Among other things, I show how collaborative research is crucial in a project of this kind, and how the electronic format of this scribal tool allows us to approach age-old questions not only about the Salem trials but also about writing practices, scribal copying, and literacy.
Bio: Peter J. Grund is Assistant Professor of English Language Studies at the University of Kansas. His research interests include English historical linguistics, early American English, corpus linguistics, electronic editing, and the vernacularization of science. He is co-author of the recent Testifying to Language and Life in Early Modern England, including a CD containing An Electronic Text Edition of Depositions 1560–1760 (ETED) (John Benjamins, 2011), and co-editor of Records of the Salem Witch-Hunt (CUP, 2009).
‘Grounds more relative than this’: towards Semantic Computation in Digital Literary Studies
Patrick Flor, Department of English and Department of Computer Science
In recently published research in Digital Literary Studies (DLS), more and more projects are moving beyond the well-established model in which the computer is used to obtain descriptive statistics about textual parameters (such as sentence lengths and token frequencies), which the researcher then interprets to mean something about the text’s literary style, subject, or authorship. Researchers are beginning to explore tools and resources from computational linguistics such as part-of-speech taggers, semantic role labelers, topic modelers, and deeply tagged texts/corpora, which allow them to ask new kinds of questions about literary texts.
In this seminar, I will explore the nature and extent of this incipient change in DLS through, appropriately, a diachronic analysis of a corpus of research-paper abstracts. I will then describe my own literary research with such computational resources, which is primarily concerned with creating and adapting software tools for literary interpretation. Among other things, I hope to show that this trend in the field can be broadly characterized as an increasing involvement of the computer in the interpretive aspects of literary research; that this involvement is proceeding (naturally) first via lexical and sentential semantics; and that a possible next step is software tools that extract limited semantic models as they proceed through a text and make classification and processing decisions based on these models and their correspondence with later parts of the text.
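The "well-established model" that the abstract above contrasts with newer semantic approaches can be sketched in a few lines: descriptive statistics such as sentence lengths and token frequencies, computed from raw text with the standard library alone (the sample text is an arbitrary illustration):

```python
# Sketch of the descriptive-statistics baseline in digital literary
# studies: sentence lengths and token frequencies from raw text.
import re
from collections import Counter

text = ("The quality of mercy is not strained. "
        "It droppeth as the gentle rain from heaven.")

# Naive sentence split on terminal punctuation, then word tokenization.
sentences = [s for s in re.split(r"[.!?]\s*", text) if s]
tokens = re.findall(r"[a-z']+", text.lower())

avg_sentence_len = sum(len(re.findall(r"[a-z']+", s.lower()))
                       for s in sentences) / len(sentences)
freq = Counter(tokens)

print(f"sentences: {len(sentences)}, avg length: {avg_sentence_len:.1f} tokens")
print("most common:", freq.most_common(3))
```

The semantic tools named in the abstract (part-of-speech taggers, semantic role labelers, topic modelers) operate on richer linguistic annotations than these surface counts, which is precisely the shift the talk describes.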
Longitudinal Language Learning: How the Digital Humanities Can Expand Your Research
Nina Vyatkina, Germanic Languages and Literature
Social Networks as a Tool for Visualizing Linguistic Data in Greek Tragedy
Jeff Rydberg-Cox, English, UMKC
This paper will describe ongoing work to study the use of social network diagrams as a tool to explore the language of Greek tragedy. In this project, we are constructing social networks for each surviving Greek tragedy by Aeschylus, Sophocles, and Euripides. These diagrams are augmented with linguistic data associated with each character in the plays, thereby allowing users to more easily access and understand complex linguistic data associated with each of these characters. For background, see Rydberg-Cox, J. “[Social Networks and the Language of Greek Tragedy](https://letterpress.uchicago.edu/index.php/jdhcs/article/view/86),” Journal of the Chicago Colloquium on Digital Humanities and Computer Science 1 (July 2011).
Bio: Jeff Rydberg-Cox is a professor in the English Department, director of the Classical and Ancient Studies program, and affiliated faculty in the Computer Science Department at the University of Missouri-Kansas City. His research focuses on statistical approaches to Ancient Greek and Latin texts. He is the author of two books, including Digital Libraries and the Challenges of Digital Humanities and a student commentary on selected speeches by Lysias. He regularly teaches courses on classical mythology and representations of the ancient world in film.
Digital Cartography and Collaboration in Maine: Transcultural Map Design with the Penobscot Nation
Margaret Pearce, Geography
Indigenous place names delineate political territories, establish ancestral ties, locate and interrelate knowledges about environmental resources, demarcate travel routes and conditions, reenact transformer tales, encode climate change and climate adaptation strategies, and track the movement of communities during seasonal cycles. They are themselves manifestations of traditional cartographies, stored in the landscape and animated through engagements with that landscape. Maps are perceived to be necessary devices for the representation of these names, yet any translation of Indigenous to Western cartographies is challenging, and the maps that result from place name remappings are often inadequate in their expression of the meanings and functions of the names. This dilemma is strongly felt in the current explosion of interest in the digital mapping and dissemination of Indigenous place names, work made newly urgent by the time pressures of language loss and climate change, and spurred into action by the technological capabilities of digital dissemination.
This presentation explores the methodological and design challenges inherent to cartographic translations of place name landscapes through the example of my collaboration with the Penobscot Nation Cultural & Historic Preservation Department to map the Wabanaki place names of Penobscot territory. I will focus on how and why we are combining both manual and digital mapping tools in our collaboration, whether as mode of inquiry or means of visual expression.
Textual Behavior in the Human Male: Computational Text Analysis and Gender
Stephen Ramsay, English, University of Nebraska, Lincoln
Does digital humanities represent an attempt to “scientize” humanistic inquiry? Some would welcome such a move, and much work in digital humanities is closely aligned with practices and methodologies usually associated with the sciences. This talk places digital humanities within the context of a broader history involving the rise of the social sciences in the twentieth century, and suggests ways that we can think about computation and computational work without abandoning methodologies unique to humanistic study.
Bio: Stephen Ramsay specializes in computational text analysis and visualization. He teaches courses in both theater history and digital humanities. He has lectured widely on subjects related to critical theory and software design in digital humanities, and serves as a member of the Executive Council of the Association for Computers in the Humanities and as a member of the Computer Studies in Language and Literature committee of the Modern Language Association. Ramsay is the author of the book *Reading Machines: Toward an Algorithmic Criticism* forthcoming from the University of Illinois Press.