Discussing Digital Humanities 2009

This thread has been set up as a place to discuss and blog about the Digital Humanities conference at the University of Maryland (22-25 June 2009). There will be over 300 delegates and many exciting presentations and discussions, and I hope we can capture and disseminate at least some of that through this forum. The thread is open to everyone and I would like to invite you all to share your thoughts about this important conference.

DH2009: Media & Memories

The conference organisers have put together some materials and links to further information:

The official conference photo album is here:

http://picasaweb.google.com/mkirschenbaum/DigitalHumanities2...

The Flickr pool is here:

http://www.flickr.com/photos/tags/dh09/

Thanks to your generosity and good humor, we collected more than two dozen lightning interviews. You can see them here:

http://www.youtube.com/user/dh09conf

A partial archive of the massive Twitter traffic generated by the conference (we estimate around 2500 "tweets") is available here:

http://hashtags.org/tag/DH09

Low-fidelity audio of the two keynotes, by Lev Manovich and Christine Borgman, is available here and here:

http://terpconnect.umd.edu/~donahrm/temp/Lev%20M..wma

http://terpconnect.umd.edu/~donahrm/temp/Borgman.wma

They may be useful for those wishing to recall some specific point from the talks.

Re: Discussing Digital Humanities 2009

Only one session to go, in Neil Fraistat's words 'the last dance'. We're in the final plenary, a gathering of the great and the good from European and North American funding bodies, who are here to tell us about 'Funding the Digital Humanities'. The panel includes representatives of the NEH, the Mellon Foundation, SSHRC, NSF, DFG, IMLS, and the AHRC. They are each going to speak for up to seven minutes on their particular programmes.

We're kicking off with Brett Bobley from the Office of Digital Humanities at the NEH, who starts by saying 'we give grants'. He talks primarily about start-up grants (SUGs) and international programmes. They are concerned with sustainability, open access, etc.

Second speaker is Helen Cullyer from Mellon: 'What digital humanists need is integrated solutions'. She mentioned Bamboo, and also more specific disciplinary/interdisciplinary projects. The project she used as an example is the 'Integrating Digital Papyrology' project, which they funded. Why did she give this as an example? 1. Driven by scholarly needs; 2. Incorporates existing resources into an interoperable environment; 3. Develops reusable software; 4. Takes a standards-based approach; 5. Takes a staged approach to building a digital environment; 6. Is developed and maintained by a number of funding agencies and institutions, rather than just the Mellon. This is a very clear statement of which criteria tick Mellon's boxes.

Rachel Frick is up next from the Institute of Museum & Library Services, the major funder for US museums and libraries. She is talking about National Leadership Grants.

Murielle Gagnon follows from SSHRC in Canada. She shows a slide of thematic research priorities for funding: Image, Text, Sound and Technology; Environment and Sustainability; North; Aboriginal Research. She spoke about the importance of funding bodies working together, giving the Digging into Data programme as an example. The deadline for full applications is 15 July 2009.

Next up is Stephen M. Griffin from the NSF, talking somewhat vaguely about the Stanford Encyclopedia of Philosophy as well as some other projects they've funded. It's a little difficult to see the relevance of this for practical funding considerations, but that may just be my exhaustion.

The penultimate speaker is Christoph Kummel from the DFG, who, among other points, raised the concern about money being spent on big projects that don't actually get used by the community. His big worry is that there is no master plan and no organization, so 'there is a major problem about infrastructure'.

And finally Shearer West from the AHRC. The AHRC has only 2.8% of Research Council money in the UK! One new initiative related to DH is a cross-council initiative with the EPSRC to fund the 'digital economy'. There are a number of programmes, including Beyond Text, and some future plans (for a research centre on copyright). The AHRC is also very keen on interdisciplinarity and open innovation, and on making sure the humanities get a proper voice in these discussions.

Neil Fraistat summed up the session (and the conference) with a comment about how our field is international but our funding structures are national. He is encouraged to see more interaction between the bodies represented here.

Thank you and goodnight.

Interlude: Google Books presentation

It is lunchtime now. However, instead of sitting in the sun, about 200 members of the digital humanities crowd are squeezing into a dark room to listen to Jon Orwant giving a presentation about Google Books. Jon touched upon various issues, so this will be a somewhat random list of comments and facts:

Jon mentions some of the recent legal issues regarding Google Books, especially in relation to fair use and the options authors and copyright holders have for opting in or out. Google plan to offer institutional subscriptions to their book collection (currently 7 million volumes, in future probably more like 30 million), so that, for instance, libraries could get access to millions of books. Now on to copyright: while it is complicated enough to deal with US copyright law (the case of Mickey Mouse and the problem of knowing when an author dies), other countries make it even more difficult: in France, copyright gets extended if an author died 'for France' (in World War 2, for instance) - so Google have to find out not only when an author died but also the context of the death.
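
To make the French example concrete, here is a minimal sketch of how such a term calculation might look, assuming the standard French term of 70 years post mortem and the 30-year extension for authors declared 'Mort pour la France'. The real rules (wartime prorogations, transitional cases) are messier, so treat this as illustrative only:

    def french_copyright_expiry(death_year, died_for_france=False):
        """Rough estimate of when a French author's works enter the public domain.

        Assumes 70 years post mortem auctoris plus the 30-year extension for
        authors officially declared 'Mort pour la France'. Real cases also
        involve wartime extensions and transitional rules.
        """
        term = 70 + (30 if died_for_france else 0)
        # Terms run to the end of the calendar year, hence the +1.
        return death_year + term + 1

    # A hypothetical author who died in 1944 'for France':
    print(french_copyright_expiry(1944, died_for_france=True))  # 2045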

There are about 2 trillion words in the Google Books corpus. Google want to give every library in the US access to all out-of-print books in their collection. Jon would like to know what researchers want to search for in 30 million books and he is very keen on getting more input. With regard to the OCR quality, Jon said it was 'mediocre but getting better'. Reaching 100% OCR accuracy is impossible, but a flexible search algorithm can still increase search accuracy.
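
To illustrate what such error-tolerant searching might involve (my own sketch, not anything Jon described), a query term can be matched against OCR output within a small similarity tolerance, so that common misrecognitions - 'v' for 'y', 'rn' for 'm' - still produce hits:

    from difflib import SequenceMatcher

    def fuzzy_find(term, text, threshold=0.8):
        """Return word positions in OCR'd text that approximately match `term`.

        Uses a simple similarity ratio; a production system would use proper
        edit-distance indexes and confusion statistics for common OCR errors.
        """
        hits = []
        for i, word in enumerate(text.split()):
            score = SequenceMatcher(None, term.lower(), word.lower()).ratio()
            if score >= threshold:
                hits.append((i, word, round(score, 2)))
        return hits

    # 'library' misrecognised by the OCR engine as 'librarv' still matches:
    print(fuzzy_find("library", "a digital librarv of scanned books"))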

What does Google mean by 'making a corpus available'? Jon said that all the 'public domain stuff' would probably be made available completely, but it was more difficult to say for copyrighted materials (access to those would probably happen through an API). Google cannot make all their metadata on the books available, as they license some of their metadata. It would be very difficult to allow users to correct metadata (a typo in the title, for instance), as there would be a large number of fake submissions and fraud; submissions would need manual review - and it seems more cost-effective to improve the general metadata sources.

Re: Discussing Digital Humanities 2009

Day three in the digital humanities house. Session two begins with Louisa Connors on 'Complementary critical traditions and Elizabeth Cary's Tragedy of Mariam'. She's arguing for the value of computational stylistics to, and its overlap with, more traditional literary studies. Critics have been dismissive of the worth of computational stylistics. Her study is based on 60 tragedies published between 1580 and 1641. She argues that computational stylistics can help to contextualize plays within historical and literary traditions.

Next is Elizabeth Anne Swanstrom on 'Terminal Hopscotch: navigating networked space in Talan Memmott's Lexia to Perplexia'. This is about reading and the effect of the internet and the digital world on how we read. The paper was a close reading of a work described as 'a rich and complex exploration of the relationship between human consciousness and network phenomenology. Alluding to traditions ranging from ancient Greek and Egyptian myth to postmodern literary theory, using a creole of human language and code, Lexia is a work in which the functioning and malfunctioning of the interface itself carries as much meaning as the words and images that compose the text'. She has some very interesting things to say about how reading online breaks down but also reinforces traditional means of reading.

Finally we have Ed Finn on 'Cultural Capital in the Digital Era: Mapping the Success of Thomas Pynchon'. Using a wide range of citations - professional and popular reviews, MLA citations, WorldCat, Amazon statistics etc. - he is looking at the way in which literary reputation is built. He makes some entertaining comparisons between professional reviews from the year Gravity's Rainbow was published and Amazon reviews from 1996 onward, and interestingly exploits the ways in which Amazon recommends 'related' works to its customers to demonstrate Pynchon's 'cultural capital'.

Teaching Digital Humanities

The first paper in my second session is entitled 'Delivering a Humanities Computing Module at Undergraduate Level: A Case Study' (John Gerard Keating, Aja Teehan, Thomas Byrne). The authors discuss the development of a module on Humanities Computing that was introduced in 2008/9 at the National University of Ireland Maynooth. The module was designed with two main outcomes in mind: 1) Students should be able to design and implement a digital artefact that helps to solve a humanities research problem; 2) Students should demonstrate that they can make an informed decision about software and approaches they might use in future projects. The focus on delivering a resource (a repository holding textual and image data) at the end of the module was described as the major difference from the Digital Humanities MA at CCH.

The second paper - 'Digital History Across the Curriculum' (Amanda French, Peter J. Wosh) - deals with a programmatic attempt to incorporate digital humanities into a teaching programme, the Archives and Public History graduate program in the History department at New York University. Amanda French, who was hired as a consultant to work on the programme, stated: 'All humanities are digital, but they are not necessarily digitally literate.' Amanda elaborated on this by saying that there is a lack of comfort with the digital and not much experience with teaching it. A survey she conducted to inform the programme development showed that students are very keen on developing digital skills (especially in relation to the development of web resources). Amanda and Peter outlined the context of the programme and its development: http://digitalhistorycurriculum.org/
To add yet another of my comments: I think it is crucial to embed digital humanities in teaching, making it an integral part of the main business of universities. It would be great to see more programmes like the one at NYU.

In the discussion, Amanda just noted that this session is the only one in the conference programme that deals with pedagogy...

Re: Teaching Digital Humanities

This is an incredibly important area for discussion and development. While we develop projects and tools for scholarship, our students are also the beneficiaries and budding scholars themselves. I've been working with Project Bamboo to incorporate pedagogy into their Digital Humanities consortium structure. After all, my large MA-granting public institution is not at all interested in advancing Digital Humanities without the student element. Besides, taking these projects into the classroom creates an entirely new level of collaboration. In addition, these students are our users. I'm happy to see this panel represented at DH09.

Re: Teaching Digital Humanities

Simon Mahony

I am unfortunately not at DH2009 but share Amanda's view. Teaching and pedagogy always take a back seat at conferences, which seem to favour 'project report' type papers.

The problem might be one of submitting individual proposals on teaching-related matters; those with an interest in pedagogy in the digital field might do better by putting together joint panel proposals beforehand.

Regarding the first of these two papers, 'Delivering a Humanities Computing Module at Undergraduate Level: A Case Study': I was very pleased to note the similarity in wording of the content (if not the actual content, which was not openly available at NUI last time I looked) with the CCH undergraduate modules 'Fundamentals of the Digital Humanities' (no longer offered) and 'Introduction to Digital Humanities'. I hope that if this paper results in a publication, some reference might be made to this.

Libraries, bibliographies and collaboration

Another day in the life of Digital Humanities. My first session of the day is dedicated to libraries, bibliographies and collaboration.

Vika Zafrin opens with a paper on the 'Library as Agent of [Re]Contextualization' (co-authors Jack Ammerman, Garth Green). Vika's starting point is the observation that digitisation can lead to a de-contextualisation of information - something that can be useful, as data can be seen in new, unexpected contexts, but also problematic, when the original context is lost in the process. She argues that developing the library as a space for discussion, in both physical and virtual space, can create a dialogue to contextualise information. Vika now gives several examples of how this could be approached, looking at recent projects at Boston University. On the physical side, there are efforts to transform the library from a place of silent study into a collaborative space, equipped with multi-media resources to facilitate dialogue. On the virtual side, Vika argues for the use of open source software and open standards to create an online architecture that supports collaboration.

The second paper addresses the issue of 'Library Collaboration with Large Digital Humanities Projects' (William A. Kretzschmar, Jr., William G. Potter). The main argument here is that, partly because of ongoing technical development, the environment for digital humanities projects is changing so fast that resources can only be kept sustainable through collaboration with libraries. Projects will come to an end, staff will move away, funding will run out - and no one is left to look after preservation or further development, even with very large projects. The authors also argue that even research computing services cannot guarantee sustainability, because these services rely on continuous funding that projects cannot afford. Using the Linguistic Atlas Project as a case study, the paper outlines how collaboration with a library can work, also looking at technical aspects of preservation and how to plan for the takeover of the resource by the library.
From my own experience I would second the argument for collaboration with libraries. Actually, I am still surprised that this does not happen more often in the Anglo-Saxon world - all the digital humanities projects I was involved in in Germany in the mid to late 90s collaborated with libraries on preservation, and this was seen as the normal, almost expected (by funders) way of doing it.

The third paper is 'Supporting the Creation of Academic Bibliographies by Communities through Social Collaboration' (Hamed Alhoori, Omar Álvarez, Miguel Muñiz, Richard Furuta, Eduardo Urbina). The authors argue that online collaboration (social software) can support and reduce the costs of creating a scholarly bibliography by benefiting from the 'wisdom of the crowds'. The paper then shows how the Drupal CMS was customised to set up a collaborative bibliography with tagging, commenting and basic multilingual support - and how other systems such as CiteULike and social bookmarking can be used in this context.
As this is the second paper I have seen that deals with a collaborative bibliography system similar to the one we are using here on arts-humanities.net, I now think maybe I should have put in a paper about this too; but then the technology is neither very advanced nor particularly interesting, whereas the social arrangement is the crucial part, and I would really like to hear more about this. Interestingly, the paper now moves on to exactly that, outlining an approach to moderation based on reputation in a social network (measured, for instance, by how often a user tags, how many contributions they make and how these contributions are ranked). The project team then compared 'closed' systems such as WorldCat and the MLA International Bibliography against more open systems such as CiteULike and the CIBO system that the paper is based on. The conclusion was that social software can increase the quality, quantity and use of a bibliography. These results are somewhat premature, partly because only a small group of users was engaged in the tests. I would like to see more research into how social software can support academia in similar contexts.
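
As a toy illustration of the kind of reputation measure described above (my own sketch; the paper's actual formula was not given), a user's moderation weight might combine tagging activity, contribution counts and how those contributions are ranked by peers:

    def reputation(num_tags, num_contributions, avg_ranking, max_ranking=5):
        """Toy reputation score for a collaborative bibliography user.

        Combines activity (tags, contributions) with quality (peer ranking).
        The weights are arbitrary placeholders, not taken from the paper.
        """
        activity = num_tags + 2 * num_contributions  # contributions count double
        quality = avg_ranking / max_ranking          # normalise to 0..1
        return activity * quality

    # A prolific but poorly-ranked user vs. a modest, well-ranked one:
    print(reputation(num_tags=200, num_contributions=50, avg_ranking=2.0))  # 120.0
    print(reputation(num_tags=30, num_contributions=20, avg_ranking=4.5))   # 63.0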

Re: Discussing Digital Humanities 2009

One more session for day two. The first paper, by Brian L. Pytlik Zillig, is 'Embedded text analysis'. The author is trying to address the problem of how you keep the text available and readable onscreen while still being able to access tools for text analysis, and is developing a tool for embedding the analysis in the text.

The second paper, by Federico Meschini, has the fanciful but engaging title 'Should an electronic edition walk, swim, or shake its tailfeathers'. He talks about a number of tools for doing electronic critical editions, but overall is interested in the possibilities that 'dynamic and open publications' offer. There are significant challenges in the different priorities that interoperability and preservation call for. This is a very detailed discussion of the ins and outs of electronic editions, but I'm not sure what the conclusions are, and, judging from the questions, it seems I'm not alone. When pressed, Meschini said that 'we don't have to take for granted what an electronic text is' and that the digital text world is changing so fast that an electronic text will be completely different ten years from now.

The added paper not mentioned in the programme is by Peter Lewis, talking about the Center for Geographic Analysis at Harvard and a project called AfricaMap. He argues that 'space is agnostic', which makes maps a good collaboration tool, as they can be a way of organizing information in many fields; he then demonstrated some interesting areas in which maps can be useful research tools. Finally he demonstrated africamap.harvard.edu - a very nifty system for bringing together a lot of diverse data about Africa.

Geospatial infrastructure and tools

My last session of the day deals with geospatial information systems or, to be more precise, several initiatives/projects associated with the Scholars' Lab, University of Virginia (Bethany Nowviskie, Joseph F. Gilbert, Kelly Johnston, Christopher Gist, Adam Soroka).

The presentations started with a few general remarks on GIS, emphasising that GIS can be seen as a visualisation tool, but more importantly as an analysis tool. Humanities GIS was seen as special in that there are often problems with ambiguity and uncertainty - for instance, uncertainty about place names, fuzzy borders and boundaries, or ambiguity because of different calendars. However, this was seen as a challenge rather than a barrier, as ambiguity and uncertainty often relate to the most interesting questions in the humanities.

This introduction was followed by a more detailed discussion of the Geospatial Data Portal (http://gis.lib.virginia.edu/) that gives access to collections of geospatial data at the UVA Library and around the world. The development of the portal was discussed almost as a case study in the use of open source software and standards, especially those coming out of the OGC (Open Geospatial Consortium) effort to standardize spatial services and applications (have a look at the website of the Open Source Geospatial Foundation for further information). It was argued that these developments have made it easier to develop tools for the creation, storage, management, delivery and display of spatial data, leading to a movement away from more expensive proprietary web applications towards lighter tools that can interact. The more technical, infrastructure-related part of the session was balanced by a few examples of how GIS can be used in research, and by a discussion of the Google Maps marker as an example of the potential difficulty for spatial analysis: it was argued that the symbolic language used by prevalent tools is 'too divorced' from what we talk about when we talk about the world. This is an important point, I think, that we should keep in mind when developing and using any kind of visualisation (I would argue it affects GIS as much as anything else).
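
For readers unfamiliar with what these OGC standards buy you in practice: a standards-compliant server can be queried by any compliant client with a few lines of code. Here is a minimal sketch using the OWSLib Python library; the endpoint URL and layer name are hypothetical:

    from owslib.wms import WebMapService

    # Connect to a hypothetical OGC Web Map Service endpoint.
    wms = WebMapService("http://gis.example.edu/wms", version="1.1.1")

    # Any compliant client can discover the layers the server offers...
    print(list(wms.contents))

    # ...and request a rendered map image for a bounding box (here, Virginia).
    img = wms.getmap(
        layers=["historical_maps"],       # hypothetical layer name
        srs="EPSG:4326",                  # plain lat/lon coordinates
        bbox=(-83.7, 36.5, -75.2, 39.5),  # (min lon, min lat, max lon, max lat)
        size=(600, 400),
        format="image/png",
    )
    with open("virginia.png", "wb") as f:
        f.write(img.read())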

Re: Discussing Digital Humanities 2009

Back in the saddle at DH for the second day of the conference. This panel is on literature and computing. First paper: David L. Hoover of NYU on 'Modes of Composition in Henry James: Dictation, Style and What Maisie Knew'. His interest is in late vs early James. The argument has been made that a profound stylistic change occurs because James changed from pen and ink to dictation, and this is widely accepted. Hoover uses stylistic analysis tools to test this conclusion. He makes the claim that the style changes gradually, rather than abruptly - his computational methodology measures word frequency, punctuation and sentence length. His claim that 'Nothing you do can make [the radical change] work' provides an interesting corrective to the long-held truisms. The discussion revolved around the ways in which you might treat such analyses differently.

The second talk was by Charles Travis on 'Patrick Kavanagh's Poetic Wordscapes: GIS, Literature and Ireland, 1922-1949'. The focus of his work is the visualization of a wordscape of the Irish poet using GIS. He's producing an online 'Digital Literary Atlas of Ireland, 1922-1949'. The use of qualitative GIS seems innovative and produces some very interesting visualizations of a day in the life of a farmer-poet who moves to Dublin and becomes an urban poet. His time-space aquaria are beautiful and fascinating visualizations of the daily journey of the poet through the fields of rural Ireland and subsequently the streets of Dublin. His work really caught the imagination of the room, and his was one of the best talks I've seen so far.

The final paper was Tomoji Tabata speaking on 'More about gentlemen in Dickens'. This is all about collocations and what they can tell us about how Dickens portrays gentlemen in his works. His analysis is highly mathematical, using formulae to do correspondence analysis. He concludes that the collocates of 'gentleman' tend to have a fictional function in Dickens's work.
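
For those curious what measures like Hoover's look like in practice, here is a minimal sketch of the three features mentioned (word frequency, punctuation and sentence length) computed over a snippet of text - my illustration of the general technique, not Hoover's actual tooling:

    import re
    from collections import Counter

    def stylometric_features(text):
        """Compute three simple stylometric measures for a text sample."""
        words = re.findall(r"[A-Za-z']+", text.lower())
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        punctuation = re.findall(r"[,;:\-]", text)
        return {
            "most_frequent_words": Counter(words).most_common(5),
            "punctuation_per_word": len(punctuation) / len(words),
            "avg_sentence_length": len(words) / len(sentences),
        }

    sample = ("She was a child; and, being a child, she knew. "
              "What Maisie knew, however, was another matter.")
    print(stylometric_features(sample))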

Annotation, manuscripts and spatial information

The second session of this morning relates to manuscripts, annotation and spatial information. Charles van den Heuvel started with a presentation on MAPS, the Manuscript Annotation and Presentation System, a project that looks at linking formal ontologies with social tagging to (re-)construct relationships between manuscript maps and contextual documents. This is an example of a Web 2.0 approach to connecting different sources (manuscript maps) through geospatial annotation - with a lot of the work done by a user community (mostly genealogists in this case). The project has conducted research into annotation practices and usability testing of the tool with focus groups to ensure that this kind of work can be done in a Web 2.0 context.

The second paper is about 'Manuscript Annotation in Space and Time' (Erica Fretwell). Erica's project is looking into digital representations of flipbooks (documents that are annotated and then layered on top of each other). Using TEI P5 XML and a set of software tools, the project allows for tagging, displaying and searching of static documents that mix print, manuscript and visual images - documents such as printed texts or images bearing handwritten annotations. This will allow, for instance, 'spatial' searching and representation of documents (such as marginalia) to help follow the 'temporality' of a document's development.
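
To give a flavour of what encoding and querying such annotations might look like: TEI P5 provides a <note> element whose place attribute records where on the page an annotation sits. The fragment and the query below are my own illustrative sketch, not the project's actual encoding:

    import xml.etree.ElementTree as ET

    TEI = "{http://www.tei-c.org/ns/1.0}"

    # An illustrative TEI P5 fragment: a printed paragraph carrying a
    # handwritten marginal note (the encoding choices are hypothetical).
    doc = ET.fromstring(
        '<TEI xmlns="http://www.tei-c.org/ns/1.0"><text><body>'
        '<p>The printed text of the page.'
        '<note place="margin" hand="#annotator1">a reader\'s objection</note>'
        '</p></body></text></TEI>'
    )

    # 'Spatial' search: pull out every annotation placed in the margin.
    for note in doc.iter(TEI + "note"):
        if note.get("place") == "margin":
            print(note.get("hand"), "->", note.text)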

Bibliography Management

The next presentation is on 'BiblioMS: A Collaborative, Large-Scale Bibliography Management System', a flexible bibliographical system designed for collaborative bibliographies for digital humanities projects (with workflows etc.). A key aspect of this software is that it can be embedded into other websites. While this is very useful, it also seems to me that there are quite a few systems out there that provide similar functionality - Drupal, the CMS that we use, has a bibliography module that can be customised with workflows, shared editing etc. (have a look at http://www.arts-humanities.net/bibliography). Overall, the paper seemed a little too focussed on details of the system - this is to be expected with a project at an early stage, but there may have been too much on details such as setting titles of references in bold vs. italic.

Amateur resource creation (museums)

Wednesday morning - Melissa Terras is speaking about 'Resource Creation Via Amateur Digitisation'. Melissa has done research on 'amateur' museums on the Internet (about 100 of them), ranging from the 'nutters' to amateurs who have developed websites to professional standards. Melissa found that blogs and Flickr are by far the most popular platforms for hosting online museums. While this may seem unimportant, it is actually critical to understand how these museums differ from some of the professional sites. Melissa demonstrated this with a comparison to the UK Visual Arts Data Service - VADS has mostly static image pages that do not allow visitors to engage with the collection in the way that a blog does. Melissa also found that quite a few of the amateurs (or enthusiasts, as she prefers to call them) demonstrate an intuitive use of metadata and are very good at outreach, especially as platforms such as Flickr seem to be much more successful at attracting visitors than institutional websites. Melissa concluded that ephemera and popular culture are much better served by the user community, in terms of quantity of materials, usability and outreach.

This is my favourite paper so far.

Re: Discussing Digital Humanities 2009

Next panel, and I'm really looking forward to this one - Blogger Grrrrrls: Feminist Practices, New Media and Knowledge Production. Rather than papers on the outcomes of disciplinary research that utilizes computational methods, the focus of this panel is the media itself and its users, looking at the role of bloggers in feminist practice.

Carolyn Guertin started the proceedings with a review of the revolutionary possibilities of blogging, but had some interesting things to say about the limitations of blogs - revolutionary bloggers need 'media tactics'. She then gave some very good examples of how gender activists have used new media in transgressive ways and to support some of the world's most vulnerable women.

The next talk, 'Are social media virtual worlds? getting at cognitive sensations', given by Katie King, was somewhat heavy on Second Life jargon, as well as being informed by the work of cyberfeminists such as Donna Haraway. She had some interesting things to say about how intellectual social networks are fostered in Second Life, and argued that the use of Second Life for these kinds of activities, as well as 'play' there, can make us rethink the meaning of the digital humanities.

Marilee Lindemann showed us 'Roxie's World', her blog on 'sex, gender and other vectors of gender', which is written through the persona of a 15-year-old Wire-haired Fox Terrier. This has led to an analysis of what blogging can tell us about the act of criticism. How do blogs change previous forms of criticism? The blogs presented included http://roxies-world.blogspot.com/ and http://grrrlingitinsl.blogspot.com/.

Does scholarly publication have a future?

The session I am attending now addresses several issues of publishing, especially around the crisis of academic publishing. Kathleen Fitzpatrick started out with a brief analysis of the current situation (rising costs of publishing, university presses going out of business, a trend to push university presses to work on a cost-recovery basis). She argues that academic publishing needs to be funded as part of infrastructure provision within universities, not on a cost-recovery basis. If the university press wants to survive, it has to transform into a service unit within the university, to become part of a publishing strategy and be seen as central to the university's future. This is an argument for open access, but also for a transformation of the press into a unit not too dissimilar to computing services - in fact, Kathleen suggested that repositories and university presses should be merged, making the former more attractive and the latter more viable. Content, Kathleen argued, should not and perhaps cannot be monetised. After all, in academia all content is user-generated - and if academic publishing in the humanities were commercially successful, it would not need the university press.

Re: Does scholarly publication have a future?

The next presentation deals with 'Platform Models for Scholarly Journal Publishing' (Sarah Toton). Sarah mentioned several platforms for publishing journals, for instance: the Public Knowledge Project (http://pkp.sfu.ca/) and its Open Journal Systems (OJS, http://pkp.sfu.ca/?q=ojs); and the DPubS Digital Publishing System (http://dpubs.org/). DPubS was mentioned as being a little more robust, but also more difficult to set up and customise.

Southern Spaces (http://www.southernspaces.org/), the journal Sarah is involved with, has made the decision to switch to the CMS Drupal (http://drupal.org), for the following reasons: ease of set-up; extensibility of platform; large development community; e-journal module. The flexibility of Drupal meant that some customisation was necessary and Sarah is outlining the Southern Spaces approach.

Re: Discussing Digital Humanities 2009

Listening to Julia Flanders's paper on 'Dissent and Collaboration', in which she discusses how the customization, limitation, and extension of collaborative encoding standards (using the example of TEI) formalizes essential academic dissent. TEI provides an excellent mechanism for this by expressing such customization (and therefore dissent) openly and clearly in the ODD language. Will comment on final conclusions and discussion later...
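
To make concrete how an ODD customization can 'express dissent': an ODD file can, among other things, remove elements of the TEI scheme that a project rejects, and that decision is recorded in machine-readable form. Below, a hypothetical customization deleting two elements is read with Python to report what the project has rejected (the fragment is my own illustration):

    import xml.etree.ElementTree as ET

    TEI = "{http://www.tei-c.org/ns/1.0}"

    # A hypothetical ODD customization: this project deletes two TEI elements
    # it considers inappropriate for its materials.
    odd = ET.fromstring(
        '<schemaSpec xmlns="http://www.tei-c.org/ns/1.0" ident="myProject" start="TEI">'
        '<moduleRef key="core"/>'
        '<elementSpec ident="sic" mode="delete"/>'
        '<elementSpec ident="corr" mode="delete"/>'
        '</schemaSpec>'
    )

    # The 'dissent' is machine-readable: list every element this project removed.
    removed = [e.get("ident") for e in odd.iter(TEI + "elementSpec")
               if e.get("mode") == "delete"]
    print("Elements this customization rejects:", removed)  # ['sic', 'corr']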

Re: Discussing Digital Humanities 2009

I think I get this, and I would like to hear more about it, Gabriel. Dissent is an important part of any deliberative process and, in my mind, should be a part of the design of any collaborative system. The opposite is Taylorism, and I think we are near the wrong end of the 20th century for that.

Re: Discussing Digital Humanities 2009

So the conclusion, I guess, was an appeal for more standards, tools, implementations, and hacks to better record, and then to take advantage of and analyze, the patterns of collaboration and dissent that are reflected in customizations of our community standards. I'm not sure I can picture what such an implementation would look like, but maybe someone has more concrete suggestions?

Re: Discussing Digital Humanities 2009

I am looking forward to that. I really wanted to attend that session, but I have a love-hate relationship with academic publishing (since we failed to set up a digital university press in Munich in 2000), so this is what I am listening to now...

Re: Discussing Digital Humanities 2009

The next paper in this session is on computational stylistics of Ibsen's plays, and had some interesting material about the tools the project is using to do the analysis. This one too elicited a wide range of questions about the methodology and the research decisions taken in the analysis. There seems to be great interest in issues such as how spaces are treated in character counts. By now it will be clear to anyone reading my comments that I'm completely out of my depth when it comes to computational stylistics. I'd welcome some comments from those more in the know!

Re: Discussing Digital Humanities 2009

I am now in a panel on computational stylistic analysis. The first paper is about the researcher's analysis of the stylistic evolution of a number of poets, including Longfellow, Edgar Allan Poe and others. The researcher has traced syntactic features through these poets' productive lifespans, plotted the texts, and developed mathematical formulas for the analysis of poetry. These statistical techniques make it possible to draw large generalizations about trends such as the poets' use of subordinate clauses. Not being a poetry scholar myself, I'm not the person to comment on the meaning of all this, but the paper engendered some very lively discussion.
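
As a rough illustration of this kind of trend analysis (my sketch, not the paper's actual method): one can count a syntactic feature in each dated text and fit a simple least-squares line across a poet's career:

    import re

    def subordinate_clause_rate(text):
        """Crude proxy: subordinating conjunctions per 100 words."""
        words = re.findall(r"[A-Za-z']+", text.lower())
        markers = {"because", "although", "while", "since", "whereas", "if", "when"}
        return 100 * sum(w in markers for w in words) / len(words)

    # Hypothetical (year, rate) points across a poet's career; in practice each
    # point would come from subordinate_clause_rate() over a dated text.
    points = [(1835, 2.1), (1840, 2.6), (1845, 3.0), (1849, 3.4)]

    # Ordinary least-squares slope of rate against year.
    n = len(points)
    sx = sum(x for x, _ in points); sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points); sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    print(f"trend: {slope:+.4f} clauses/100 words per year")  # positive = rising use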

Infrastructure

Next session: Supporting the Digital Humanities: putting the jigsaw together (Martin Wynne, Steven Krauwer, Seth Denbo, Chad Kainz, Neil Fraistat). This panel addressed the question of the infrastructure needed to support the digital humanities and arts. Presenters came from the following projects, which all focus on infrastructure: CLARIN, DARIAH, Project Bamboo and centerNet - the first two funded by the European Commission, the third by Mellon, and centerNet an initiative supported solely by its members.

The following key issues of infrastructure provision were identified:

  • technology has to map to the way researchers work, not the other way around
  • infrastructure needs to be planned with sustainability in mind
  • infrastructure development is not just a technological task
  • it is crucial to build on what is already there, to leverage existing infrastructure
  • infrastructure development must be informed by lessons learned from the community
  • infrastructure needs to be broad, inclusive and international
  • resource providers have to overcome insularity and their silo nature
  • user engagement means involving the community from the start

In his presentation on DARIAH, Seth mentioned a project currently undertaken by the Digital Curation Unit at the Athena Research Centre in Athens that tries to develop process models for arts and humanities digital research. These models will not just take into account the digital resources, tools and methods, but the interaction between the digital and the non-digital or more traditional processes researchers employ. This is obviously something I am very interested in as it links to work undertaken by CeRch (previously AHDS) to map methods used in digital research and resource creation: http://www.arts-humanities.net/ictguides/methods

Re: Discussing Digital Humanities 2009

I'm also currently in the panel Torsten just commented on, called Preserving Virtual Worlds: Models & Community (maybe we should coordinate our coverage), which so far is primarily about the preservation (the clue is in the title) of some very old computer games - Adventure (the first adventure game), Doom, and various titles popular with gaming enthusiasts that were released on platforms like the Commodore 64 and the Atari 2600. The panel members are mostly information science professionals, so they are interested in the ins and outs of preservation at a highly detailed level. Some lip service has been paid to the cultural relevance of these games, but the interest seems to be primarily at the level of metadata and FRBR reference models. The issues of preservation are, of course, important, and I'm not one to argue that computer games are any less important as products of culture than other types of 'high' or 'low' culture. However, I would really love to hear more about what that value is. As this is a humanities conference, I want to know what the research questions, or potential research questions, are that drive the work forward. Further presentations discuss the relationships within the gaming community - between professional developers, fans and gamers, and the preservation experts. Here again, though, the emphasis is on the issues of preservation from the standpoint of archiving and library science.

Re: Discussing Digital Humanities 2009

Since I am on the LIS side, I'm probably not the best person to answer, but I think some of the work that Matt Kirschenbaum has done recently (see his book 'Mechanisms') as well as folks like Katherine Hayles point to some interesting issues in the relationships of physical media and software affordances with the creative process. And I think the increasing appropriation/reuse of artistic works in the emerging new media landscape (e.g., machinima, animutation) provides a wealth of opportunities to look at how processes of artistic/cultural production are changing.

Games, preservation

I agree with Seth's comment on the last panel - the first presentations especially were quite technical. It might have been beneficial to have a paper on actual research on computer games to show why preserving games is important for research (if you are interested in this, have a look at the website of the Centre for Computer Games Research http://game.itu.dk/ or the international journal of computer game research, Game Studies http://gamestudies.org/). Still, the issue of preservation needs to be addressed if we want this kind of research to go on.

Preserving digital games

Digital preservation is a key issue, and not only for the Digital Humanities. The first session this morning, Preserving Virtual Worlds: Models & Community, deals with preservation issues as they relate to computer games. The presentations gave interesting case studies in the difficulties of preserving software: intellectual property rights; companies (and their documentation) going bust; modifications by the user community; preserving the code (preservation for developers) or the game/workflow (preservation for users); use of proprietary software, etc. Preserving games is also interesting because the gaming community takes this very seriously, enabling collaboration between preservation specialists/archivists and the public.

Of course, you may wonder why we should bother with preserving games at all. Whether you like them or not, computer games have become an important part of our culture, and preserving them is key to studying it. If we do not take this task seriously, as one of the presenters remarked, we risk losing an important part of our cultural heritage.

Vitality of Digital Humanities

With over 300 delegates, this year's is one of the best-attended DH conferences. Several of the speakers yesterday, among them host Neil Fraistat and Harold Short, mentioned this as one of many examples of the 'tremendous vitality' (Neil) of digital humanities. Julia Flanders called the field very 'vibrant' and Neil even remarked: 'The Digital Humanities - this is our time.' Harold emphasised that interdisciplinary collaboration was key to this and that the Digital Humanities approach would open up many new opportunities here.

What do you think? Are we mainly profiting from the fact that the world gets more digital every day (and DH more visible as a result)? Will digital become so normal that, once everyone does it, there is no longer a specific Digital Humanities? Or are we indeed a vibrant community at the forefront of new ways of researching, answering new questions that are relevant for our time?

It will be interesting to see what the conference has to say about it.

Keynote: Lev Manovich

Yesterday evening, the Digital Humanities conference kicked off with a keynote speech by Lev Manovich. Lev spoke about the analysis and visualisation of large amounts of cultural data: cultural analytics. His starting point was the observation that we are experiencing an explosive growth of cultural content on the web, partly driven by the digitisation programmes of museums and libraries, but also by recent trends in social media. Lev argued that, until recently, we had to choose between a shallow analysis of large numbers of cultural objects and a deep analysis of a smaller number. With improved computational power and new analytical means it is, as Lev argued, no longer necessary to choose between these approaches - we can do both, if we want.

Together with his colleague Jeremy Douglass, Lev demonstrated how various visualisation tools and methods could be used to that purpose. He also cautioned the audience that there is an important difference between analysing (just) cultural outputs and also analysing the context in which they emerged. The latter is much harder to achieve, but is obviously of great importance, especially from a (digital) humanities perspective.
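
To give a concrete sense of the kind of analysis Lev demonstrated (this sketch is mine, not his actual software): a simple cultural-analytics pipeline might compute a visual feature such as mean brightness for every image in a collection and relate it to time:

    from pathlib import Path
    import numpy as np
    from PIL import Image

    def mean_brightness(path):
        """Average pixel brightness of an image, 0 (black) to 255 (white)."""
        img = Image.open(path).convert("L")  # convert to greyscale
        return float(np.asarray(img).mean())

    # Hypothetical collection: image files named by year, e.g. '1923_cover.jpg'.
    features = [(int(p.name[:4]), mean_brightness(p))
                for p in sorted(Path("collection").glob("*.jpg"))]

    # Each (year, brightness) pair could now feed a scatter plot showing how
    # the visual style of the collection changes over time.
    for year, brightness in features:
        print(year, round(brightness, 1))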

Because of time constraints, Lev did not really have a chance to demonstrate in more detail what kind of answers to research questions visualisation could give, but he gave a great overview of tools and approaches to dealing with cultural data (and culture, of course) - from Hamlet to web comics and design.
