JISC Beginner's Guide to Digital Preservation

…creating a pragmatic guide to digital preservation for those working on JISC projects


Alliance for Permanent Access Conference

Posted by Marieke Guy on 16th November 2011

Last week (8th – 9th November) I attended the Alliance for Permanent Access (APA) annual conference in London. The APA aims to develop a shared vision and framework for a sustainable organisational infrastructure for permanent access to scientific information.

The event was held at British Medical Association House, a fantastic setting. It was a really interesting conference which provided a chance to hear about lots of great digital preservation projects.

There were a lot of really interesting plenaries so I’ve summarised a few of my personal favourites:

Digital Preservation What Why Which When With? – Prof. Keith Jeffery, Chair of APA Executive Board.

Unfortunately the European Commissioner Neelie Kroes couldn’t make it, so Keith, outgoing chair of the Alliance, gave the keynote instead. Keith reflected on the history of digital preservation, starting with the legendary story of the BBC Domesday Project and the CAMiLEON project that rescued it. Keith talked about the importance of keeping digital resources accessible, understandable and easy to find. He gave an overview of some of the value judgements that need to be made, the standards (OAIS) and best practice (looking at projects like PARSE.Insight and APARSEN). Keith also emphasised the role of the APA in this area, pulling together digital preservation research.

ODE Project – Dr Salvatore Mele, CERN

Salvatore Mele introduced the Opportunities for Data Exchange (ODE) project, which is about sharing data stories. Currently there are lots of incentives for research but not for preservation, and the transition from science to e-Science has resulted in a data deluge that needs serious attention! Salvatore talked about the impossible triangle of reuse, (open) access and preservation – each leans heavily on the others. ODE has considered both carrot and stick approaches; these have some value (for example, the carrot of sharing big data creates incentives for research rather than preservation) but are not enough on their own. Mele explained that where there is no stick and no carrot we may have to work one by one with researchers to encourage sharing. ODE offers a way to reduce the friction in research data management through awareness raising. The ODE Project booklet Ten Tales of Drivers & Barriers in Data Sharing is definitely worth a read.

Mr Mark Dayer, Consultant Cardiologist, Taunton & Somerset NHS Trust

It was really refreshing to hear the view of an outsider. Mark Dayer is not involved in digital preservation; he is a consultant cardiologist – he operates on hearts. Mark gave an incredibly open and entertaining presentation on the state of play in the National Health Service (NHS). He began by giving some background for the non-UK residents in the audience: “The NHS is a beloved institution that no political party dare dismantle” – or at least it used to be. Unfortunately the NHS and IT have made for grim headlines in the recent past: the NHS has enormous quantities of data and an enormous number of diverse systems working locally and in unconnected ways. Many people are still working with paper-based systems. Not only this, but the NHS needs to make £20 billion of savings. Mark explained how an increasing number of systems (120 different clinical systems in use in one area) and bad IT planning have added to the problem. Other issues such as data security add to the mix: the ‘Spine’ personal records system should hold over 50 million records but only has 5 million so far.

After the disaster story Mark moved on to the small successes that have started to happen. He explained that they are starting to build data centres, use the cloud (e.g. Chelsea and Westminster Hospital) and use integration engines (which gives an idea of the number of data standards in play). He talked about the systems and standards involved, including CDA, HL7, ICD-10 (a classification system), OPCS and SNOMED CT, and about the new N3 VPN. Mark concluded by saying that it wasn’t just about the right software, but about the right hardware too, and that you need to bring people with you, all the way.

Dr Martha Anderson, Director of the NDIIPP, US Library of Congress, Networks as evolving infrastructure for digital preservation

Martha Anderson started off by showing us a picture of the biggest spider web ever seen. She explained that the old African proverb “when spiders unite they can take down a lion” applies here. Almost a dozen spider families were involved in the creation of this web; the spider population had exploded due to wet conditions. Martha applied this analogy to digital preservation networks, telling us that our networks will evolve if the conditions are right. The National Digital Information Infrastructure and Preservation Program (NDIIPP) was created to help create networks between people to undertake preservation – communities working together as bilateral and multilateral alliances.
Many different institutions are now involved in digital preservation and in developing alliances across communities. A good example is the Blue Ribbon Task Force, which cut across sectors including finance, science, aerospace and higher education. Other sectors have much to offer us; for example, Martha has learnt about video metadata annotation from Major League Baseball! The Data-PASS network gives a picture of what such networks are doing. Martha concluded that it is all about setting up and supporting social and local interaction to build networks – finding common stories. She felt that if there is no local benefit from the work then it cannot be sustained and will not last beyond the funding. Martha observed that, interestingly, groups of institutions will act in the public interest, whereas institutions on their own act in their own interest. Networks are beneficial to all.

UK Government views, Nigel Hickson, Head of EU and International ICT Policy, DCMS

Nigel Hickson was there to talk about the government’s responsibility for the digital infrastructure, which includes the take-up of broadband and copyright issues. Nigel began by singing the praises of the Riding the Wave report, released in 2010 by the High Level Expert Group on Scientific Data. He talked about the importance of having a framework and a holistic approach. For many, broadband is an economic driver; mobile data continues to be a disruptive element (doubling every year) and all this spells game change for the public sector. The problem is that mobile data is increasing; the solution is a spectrum ‘auction’ to increase capacity. The current UK approach is that the market should lead and that competition is vital. Britain’s superfast broadband strategy has £530 million to spend by 2015 and potential for an extra £300 million before 2017. Projects require match funding from the private sector. The government also wants things to be digital by default, with the option of doing them offline if necessary. Other key priorities are a rights management infrastructure and the proposal on orphan works.

Nigel also outlined the European Digital Agenda, where broadband is again a critical element. The key European targets are basic broadband for 100% of citizens by 2013; by 2020, 50% of households should have subscriptions of 100 Mbit/s or above.

The report A Surfboard for Riding the Wave, produced by Knowledge Exchange, builds on the 2010 report. It presents an overview of the present situation with regard to research data in Denmark, Germany, the Netherlands and the United Kingdom, and offers broad outlines of a possible action programme for the four countries in realising the envisaged collaborative data infrastructure.

Posted in Conference, Events