Three Library Science Dissertations

Prof. dr. Frank Huysmans is professor by special appointment of library science at the University of Amsterdam. His chair is funded by the National Library of the Netherlands (KB). He blogs regularly on his website warekennis.nl, and recently he discussed three Dutch dissertations on library science. We are happy to reblog his post (originally in Dutch) below.

Three library dissertations in eight weeks

Sometimes nothing seems to happen for a year. Or even longer. PhD candidates plod on, quietly working away at their magnum opus. Then suddenly everything is finished, and within a short time you get three of these volumes to digest. That sounds like a chore, and it is, but it is also a feast.


“Towards Open Science” – Liber conference, 24-26 June in London

“Towards Open Science” was the theme of the Liber conference, and it sounds like a new buzzword after years of “Open Access”. But where Open Access focuses on the end results – the articles in journals – Open Science aims at transparency of the entire scientific process. Libraries can play a major role here, according to Dr Jean-Claude Burgelman, head of the unit Science Policy and Foresight of the European Commission (see also http://scienceintransition.eu/). “Libraries are the knowledge brokers for Open Science”, as he put it in his lecture. Open Science is a major change in thinking (a paradigm shift), but it also offers opportunities: more transparency, more value for money and a better relationship between science and society. Open Science is high on the agenda of the European Commission, and we will notice that in the Netherlands when we hold the presidency of the EU next year. Sir Mark Walport (Chief Scientific Adviser to HM Government in the UK) likewise saw electronic publishing as the “second Gutenberg”, because it offers so many opportunities: better indexing, better findability and ‘evolving’ articles, in which both positive and negative feedback is incorporated into new versions (which makes me wonder: how do we preserve all that?). The impact of science can grow if more is shared through Open Science.

European publishers in Berlin

Barbara Sierman

The 10th Academic Publishing in Europe (APE) conference took place in Berlin last week under the motto “Web25: The Road Ahead – exploring the Future of Scholarly Communication and Academic Publishing”. For the KB this is an opportunity to keep up with developments in the publishing world. After all, publishers supply us with the materials that we preserve – for the long term.

The European Commission, Celina Ramjoué told us, attaches great importance to moving from Open Access (which Jan Velterop in his lecture rightly characterised as a means rather than an end in itself) towards Open Science, defined as:

The transformation, opening up and democratization of science and research and innovation through ICT, with the objectives of making science more efficient, transparent and interdisciplinary, of changing the interaction between science and society, and of enabling broader societal impact and innovation.

Scientific results should be widely accessible to anyone who is interested. But there is still work to be done before we get there. Phil Archer of W3C argued for a more careful use of metadata standards (see schema.org). Metadata that is open to multiple interpretations leads to poor search results. In particular, linking relevant information across the web becomes difficult when a computer cannot interpret the metadata unambiguously (does a name refer to a particular person, to their work, to a place, etc.?). Publishers have an important role to play here. But this is not the only obstacle. The costs of publishing are high, and monographs (common in the humanities and social sciences) suffer from this in particular. That is a risk for scholarly publishing in those domains.
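
As a purely illustrative aside (my example, not Archer's): schema.org lets a publisher state unambiguously what a name refers to. The snippet below builds such a description as JSON-LD in Python; the title and name are chosen for the example, and the authority-file link is a placeholder.

    import json

    # Invented example of schema.org metadata expressed as JSON-LD.
    # The explicit types and the "sameAs" link tell a machine that
    # "Multatuli" is a person (not a place or a work) and which person.
    record = {
        "@context": "https://schema.org",
        "@type": "Book",
        "name": "Max Havelaar",
        "author": {
            "@type": "Person",
            "name": "Multatuli",
            # Placeholder authority-file link; a real record would point
            # to the author's actual VIAF or Wikidata identifier.
            "sameAs": "http://viaf.org/viaf/000000000",
        },
        "inLanguage": "nl",
    }

    print(json.dumps(record, indent=2))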

The pros and cons of the (six) Creative Commons licences were discussed at length. Funders often prescribe which licence a researcher must choose when an article is published at the funder's expense. But does this infringe the researcher's freedom to perhaps still make some commercial gain of their own? And if the article may not be exploited commercially, who pays for the infrastructure that keeps it sustainably accessible? The speaker was evidently not yet aware of the activities of national libraries!

Open Access also creates considerable administrative problems for publishers and research institutions alike (fee structures, settlement per author, institutional discounts, processing costs, etc.), and speakers analysed from various angles how Open Access could be implemented as efficiently as possible. And what added value does the publisher still offer under Open Access: is the peer review method still up to date, and does it deliver sufficient quality?

The impact of research is partly measured by citations, but it was quite revealing to hear that 50% of scholarly output is never read (except by its author) and that 90% is never even cited. Surely we can do better. Semantic search facilities and the linking of information are seen as a solution. Microsoft, for its part, is trying to make scientific publications more accessible by building innovative solutions into Bing and Office products. Jan Velterop also argued for optimal dissemination of research results through innovative (semantic) search methods, in order to avoid what he called “lamppost research” (you only see what falls within your field of view). Open Access in itself is thus not enough; it must be accompanied by methods that make the results more accessible. Publishers [and libraries] have an important role to play there. Only then can we speak of Open Science.

For those who want to know more about this conference: videos and summaries of all presentations will shortly be published at http://www.ape2015.eu/

Reflections on iPRES 2014

Originally posted on: http://digitalpreservation.nl/seeds/reflections-on-ipres-2014/

Perseverance Hotel, Brunswick Street, Melbourne

Recently, after 22 hours of flying, I attended the iPRES 2014 conference in Melbourne, which was an awesome experience. How often does one have the chance to discuss the profession of digital preservation without needing to explain the obvious? Meeting 200 colleagues, gathered in the beautiful State Library of Victoria, was an excellent opportunity to exchange ideas. And because there is no top ten of buzzwords and no record of what was discussed during the breaks, lunches and dinners, I will summarize some highlights that inspired me.

First of all: everyone is worried about the fact that we constantly need to defend ourselves. Even in large organisations with a mandate to preserve digital material, higher management repeatedly needs to be persuaded to think about the consequences of this mandate. What is wrong? I personally wonder whether we use the right language – or, as I said in the panel on Friday, maybe we lack the skills to frame digital preservation properly. How do you describe the abstract concept of long-term preservation to the uninitiated? Our message is often too complicated: it speaks of problems and high costs without sketching the clear benefits, and it does not appeal to the imagination. Many colleagues agreed that this is a threat to our profession.

The second topic was involving other disciplines in digital preservation, such as industry and science. Mixing ideas from different disciplines might lead to new ideas and innovations. It is important to convince these disciplines of the importance of digital preservation for their business, their research or simply their sustainability; our knowledge can help them get up to speed, while they can help us take the next step forward. For this we need to develop a language understood by all parties. (I will ignore the fact that the week started with a discussion about whether it is digital preservation, digital curation, data science or whatever other term you could use – in my view this discussion distracts us from the main problem.)

But we all expected great benefits from more collaboration among the current preservationists, although some aspects need to be taken into consideration. Collaboration without a clear benefit for the participants is doomed to fail. Projects may lead to good results, but they require organisation and funding, and the chances for such projects are slim in the current economic climate. On a smaller scale, however, the magic of the conference atmosphere led to new initiatives between individuals: to exchange information, to do something together, or to really start working on an old idea.

The final buzzword of the conference was the DPTR, the Digital Preservation Technical Registry: an initiative to develop a file format registry with a new data model, filled with information from existing registries. It is currently a proposal under Horizon 2020, based on work done by the National Library and Archives of New Zealand and the National Library of Australia. There were mixed feelings about this initiative. On the one hand, all agreed that the current registries fail for various reasons and that a good format registry is an absolute must. But what are the lessons learned from the previous, not so successful registries? Were those lessons incorporated into this new proposal? And is this proposal not another example of technology first and use cases later?
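
To make the registry idea a little more concrete, here is a purely hypothetical sketch (my own invention, not the DPTR data model) of what a minimal format registry record could look like, in Python:

    from dataclasses import dataclass, field

    @dataclass
    class FormatRecord:
        """One entry in a hypothetical file format registry."""
        puid: str                     # persistent identifier (PRONOM-style)
        name: str                     # human-readable format name
        mime_type: str
        extensions: list[str] = field(default_factory=list)
        signatures: list[bytes] = field(default_factory=list)  # magic bytes

        def matches(self, data: bytes) -> bool:
            """Naive identification: does the file start with a known signature?"""
            return any(data.startswith(sig) for sig in self.signatures)

    # Example entry; the signature is the well-known PDF magic number,
    # but the identifier is made up for this sketch.
    pdf = FormatRecord(
        puid="fmt/example-pdf",
        name="Portable Document Format",
        mime_type="application/pdf",
        extensions=[".pdf"],
        signatures=[b"%PDF-"],
    )

    print(pdf.matches(b"%PDF-1.7 ..."))  # True

A real registry would of course add provenance, versioning and risk information – precisely the parts where the existing registries diverge.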

It was the presentations – which will be published soon – that gave us new food for thought, but I'm convinced that the discussions during the breaks really gave us renewed energy to proceed with the challenge of digital preservation, even though it took another 20 hours of flying before I was back home!

Europeana – the case for funding: tweet #AllezCulture

A drastic cut was made in the budget for the Connecting Europe Facility (CEF): from 9 billion to 1 billion euros. This will hit Europeana, the infrastructure supporting Europe's free digital library, museum and archive, very hard. Europeana is now being asked to make the case for funding under the revised guidelines for the CEF, issued on 28 May 2013, and it will face severe competition for the available funding from other digital service infrastructures such as e-Justice, e-Health and Safer Internet. All good causes in their own right, but the wonderful digital culture infrastructure built over the last decade will soon get squashed if we do not speak out now! So here goes:

Here is a summary of the three arguments for funding:

1. Europeana supports economic growth.
Some impact indicators:

  • To date, 770 businesses, entrepreneurs, and educational and cultural organisations are exploring ways of including Europeana information in their offerings (websites, apps, games, etc.) through our API (a sketch of what such an API call looks like follows this list). See examples such as inventingeurope.eu and http://www.zenlan.com/collage/europeana.
  • Digital heritage creates jobs – in Hungary, for example, over 1,000 graduates are now involved in digitising heritage that will feed into Europeana. Historypin in the UK predicts it will double in size with the availability of more open digital cultural heritage.
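
As a rough illustration of such an API call (my sketch; it assumes Europeana's public Search API v2 endpoint and a registered API key, neither of which is described in this post), a query might look like this in Python:

    import requests

    # Hypothetical query against Europeana's Search API (v2): openly
    # licensed items about Rembrandt. The endpoint and parameters are
    # assumptions based on the public API documentation; replace
    # "YOUR_API_KEY" with a key registered at Europeana.
    response = requests.get(
        "https://www.europeana.eu/api/v2/search.json",
        params={
            "wskey": "YOUR_API_KEY",
            "query": "Rembrandt",
            "reusability": "open",  # only results that are free to re-use
            "rows": 5,
        },
        timeout=10,
    )
    response.raise_for_status()

    for item in response.json().get("items", []):
        print(item.get("title"), "-", item.get("guid"))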

2. Europeana connects Europe. 

‘People often speak about closing the digital divide and opening up culture to new audiences but very few can claim such a big contribution to those efforts as Europeana’s shift to cultural commons.’ – Neelie Kroes, Vice-President of the European Commission

3. Europeana makes Europe’s culture available for everyone.

In 2012, all 20 million Europeana records were released under a Creative Commons Zero (CC0) public domain dedication, making them available for re-use both commercially and non-commercially. Europeana’s CC0 release is a ‘coup d’état’ that ‘will help to establish a precedent for other galleries, libraries, archives and museums to follow – which will in turn help to bring us that bit closer to a joined-up digital commons of cultural content that everyone is free to use and enjoy.’ – Jonathan Gray, Open Knowledge Foundation

For those unaware of Europeana – here is what they do: 

Europeana has been transformative in opening up data and access to cultural heritage and now leads the world in accessible digital culture that will fuel Europe's digital economy. Through Europeana today, anyone can explore 27 million digitised objects including books, paintings, films and audio.

Europeana is a catalyst for change for cultural heritage

– Because they make cultural heritage accessible online.

– Because they have standardised the data of over 2,200 organisations, covering all European countries and 29 European languages.

– Because they provide creative industries and business start-ups with rich, interoperable material, complete with copyright information.

– And because they ensure that every citizen, whether young or old, privileged or deprived, can be a digital citizen.

So please support Europeana by tweeting, blogging, facebooking or using whatever other medium you like, with the hashtag #AllezCulture!

Re:publica 2013: In/side/out

From 6-8 May the Re:publica conference was held in Berlin – one of the largest international conferences in the field of blogging, social media and our society in the digital age. In numbers alone the event is already quite impressive: with over 5,000 attendees, 300 speakers, 3 TB of video footage (95 hours of sessions, talks and panel discussions) and around 27,600 social media activities, it was hard to keep up. Even before the conference started, the Twitter stream of #rp13 exploded with tweets!

Photos by Gregor Fischer (https://www.flickr.com/photos/re-publica/)

But it was not only the size of the conference that was impressive: there was also a large number of interesting speakers and talks. Unfortunately my German was not sufficient to follow every detail, so I mainly attended the sessions in English. This made for an easier choice, as there were often around 10 parallel sessions and workshops. Videos of most presentations are online – a good overview of all sessions is available here: http://michaelkreil.github.io/republicavideos/.

Several talks focused on the differences in internet access and internet censorship worldwide. During the session ‘403 Forbidden: A Hands-on Experience of the Iranian Internet’ you could gain insight into how the Iranian internet is censored by participating in a quiz. Several websites and newspaper images were shown, and you had to guess which sites are blocked and which photos are manipulated. The results were often surprising (for example, the sites of the KKK and Absolut Vodka are blocked, but the sites of Smirnoff Vodka and the American Nazi Party are not), showing how internet filtering in Iran is both arbitrary and targeted, designed to make you feel insecure online.

#403forbidden: Iranian photo manipulation of Michelle Obama’s dress

Another interesting talk in this area was ‘Internet Geographies: Data Shadows and Digital Divisions of Labour’ (unfortunately the video of this talk is not online yet). Mark Graham of the Oxford Internet Institute showed how, despite the initial hope that the internet would offer everyone around the world the opportunity to share their knowledge, there are significant concentrations of knowledge production. For example, Europe has just over 10% of the world's population, but produces nearly 60% of all Wikipedia articles; there are even more Wikipedia articles written about Antarctica than about any single country in South America or Africa. He illustrated these numbers with great visualisations, most of which you can find through this blog: http://www.zerogeography.net/2011/09/geographies-of-worlds-knowledge.html

In ‘Investigation 2.0’ Stephanie Hankey and Marek Tuszynski of Tactical Tech discussed how individual hackers and people working with data visualisation nowadays team up with journalists to make underexposed data visible and present it in a way that gives us more insight into complex social and political issues. A good example is the visualisation of drone strikes in Pakistan since 2004: http://drones.pitchinteractive.com.

Data visualisation can also be done with food, which the three women behind the website http://bindersfullofburgers.tumblr.com call ‘data cuisine’ (or: how to get juicy data from spreadsheets). They visualise data that is usually thought of as boring and dry, such as election results, in creative and appealing ways. In Binders full of Burgers they displayed the 2012 US election results with burgers and fries, while they used waffles, candy and fruit to show the local Berlin elections (http://wahlwaffeln.tumblr.com/). They also made their own talk more appealing by handing out cookies for audience participation :)

Then there was a good overview presentation by Joris Pekel of the Open Knowledge Foundation on Open Data & Culture – Creating the Cultural Commons. After presenting some impressive numbers on the online records, media files and metadata available from Europeana, DPLA and Wikimedia Commons, he focused on the possibilities of enabling the public to connect and contextualise open data in the future through initiatives such as OpenGLAM. His talk is also available on Slideshare at http://t.co/bpqBEB28sb.

A very direct way of promoting open data, transparency and hacker culture was presented by Daniela B. Silva in Hacker culture on the road. In 2011, a Brazilian hacker community came up with the plan of starting a hacker bus: they crowdfunded the budget to buy an old school bus and started travelling around Brazil, organising local events promoting government transparency and hacker activism. Volunteer hackers, digital activists, lawyers, artists and journalists joined the bus, listened to the needs of the people they met, and helped them develop answers with the technology available, from local street-mapping applications and apps to new local legislation.

Finally, one of the most fascinating talks was that of Neil Harbisson and Moon Ribas on Life with extra senses – How to become a cyborg. They showed ways of extending and creating new senses and perceptions by applying technology to the human body, as well as the artistic projects they have created based on these sensory extensions. For example, due to a medical condition Neil Harbisson sees only in black and white, so he developed an electronic eye that transforms the colours around him into sound, allowing him to experience colour in a different way. Together they have established the Cyborg Foundation to promote their activities and help more people become cyborgs, because they believe all humans should have the right to extend their senses and perceptions. Interestingly, they see this as a way for people to get closer to nature rather than alienating themselves from it, since many animals have very different sensory perceptions than we do.
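
Purely to illustrate the principle (this is my invented mapping, not the algorithm of Harbisson's electronic eye): translating colour into sound can be as simple as scaling a colour's hue onto an audible frequency range, as in this Python sketch:

    import colorsys

    def hue_to_frequency(r: int, g: int, b: int,
                         lo: float = 120.0, hi: float = 1200.0) -> float:
        """Map an RGB colour to a tone between lo and hi Hz, via its hue."""
        hue, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)  # hue in 0..1
        return lo + hue * (hi - lo)

    # Pure red, green and blue come out as three distinct pitches.
    for name, rgb in [("red", (255, 0, 0)), ("green", (0, 255, 0)), ("blue", (0, 0, 255))]:
        print(f"{name:5s} -> {hue_to_frequency(*rgb):6.1f} Hz")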

Neil Harbisson and Moon Ribas: when Neil scans Moon’s dress with his Eyeborg, he will hear the song ‘Moon River’ (Photo by Tony Sojka, https://www.flickr.com/photos/re-publica/)

All this is just a fraction of everything that was presented at Re:publica: at http://michaelkreil.github.io/republicavideos/ you can find videos of nearly every talk – and if you speak German, you're even luckier!

Performance measurement – a state-of-the-art review

Author: Henk Voorbij

The Handboek Informatiewetenschap (Handbook of Information Science) contains about 150 contributions from experts on various subjects, such as budgeting, collection management, open access, website archiving, digitisation of heritage collections, information retrieval software and gaming in the library. The Handbook is published as a loose-leaf paper version and as an online version (http://www.iwabase.nl/iwa-base/). Unfortunately, it is not well known.

Articles are updated every ten years. Recently, I was asked to update the article on performance indicators, originally published in 2000 and authored by Peter te Boekhorst. My contribution starts with definitions of core concepts (for example: what is the difference between statistics and performance indicators?). It continues with general guidelines that may be helpful to libraries aiming to develop their own instrument for performance measurement. Among these are models such as the classical systems approach (input – throughput – output – outcomes) and the Balanced Scorecard, as well as international standards such as those provided by ISO and IFLA. In the Netherlands there are well-developed benchmark instruments for university libraries and for libraries of universities of applied sciences. I have been involved in the development of these systems and the analysis of their data for many years, and I describe my experiences in depth, in order to give an example of the caveats and benefits of performance measurement. The last three chapters address potential additions to traditional performance indicators: user surveys, web statistics and outcomes.
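
To make that first distinction concrete (the numbers below are invented, not taken from the Handbook): a statistic is a raw count, while a performance indicator puts counts in relation to each other so they can be compared across years or across libraries. In Python:

    # Statistics: raw counts as a library might collect them (invented values).
    loans_per_year = 250_000
    registered_users = 40_000
    collection_size = 1_200_000

    # Performance indicators: statistics related to one another.
    loans_per_user = loans_per_year / registered_users
    collection_turnover = loans_per_year / collection_size

    print(f"Loans per registered user: {loans_per_user:.1f}")       # 6.2
    print(f"Collection turnover rate:  {collection_turnover:.2%}")  # 20.83%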

Updating an earlier version offers an excellent opportunity to chart the progress in the field. Two things struck me most. One is the rapid rise of the concept of ‘Key Performance Indicators’. There is no agreement in the literature on what this concept actually means. Some use it loosely and make no genuine distinction between performance indicators and key performance indicators. Others have very pronounced ideas about its meaning: there should be no more than ten KPIs, they should be measured daily or weekly, they should be coupled to critical success factors rather than strategic goals, and they should be understandable to a fourteen-year-old child. The other is the growing interest in outcomes, the popularity of LibQUAL+ as an instrument for measuring user satisfaction, and the upsurge of new technologies such as web analytics. I can't wait to see the 2023 version of the article.

MOOCs in the Netherlands by SURF Academy

The SURF Academy, a programme set up to encourage knowledge exchange between higher education institutions in the Netherlands, organised a seminar on MOOCs (Massive Open Online Courses) on 26 February. Several Dutch institutions have started MOOCs on various platforms and subjects, so SURF's special interest group Open Educational Resources (OER) thought it was time to share experiences and open up the discussion for institutions that wish to jump on this fast-moving train.

As the National Library of the Netherlands, the Koninklijke Bibliotheek does not normally provide education, but we do work together with the Dutch universities (of applied sciences) and we are happy to share knowledge with our colleagues and users. Also, as one of the founding members of the IMPACT Centre of Competence in text digitisation, we were asked to think about how we can best share the knowledge gathered in the four-year research project IMPACT. Perhaps a MOOC would be a good idea?

The afternoon had an ambitious programme, filled with experiences and interesting observations. I thought the most interesting parts of the afternoon were the presentations by the universities that are currently working with MOOCs in the Netherlands: Leiden University, presented by Marja Verstelle; the University of Amsterdam, presented by Frank Benneker; and Willem van Valkenburg on the work Delft University of Technology (TU Delft) is doing with its MOOC.

It is interesting to see the different choices each institution made for its own implementation of a MOOC. Leiden chose to work with Coursera and TU Delft joined edX, while Amsterdam built its own platform (‘forever beta’) with a private partner, in only two months and for just 20,000 euros. Each had its own reasons for these choices, such as flexibility (Amsterdam), openness (Delft) or ease of use (Leiden). Amsterdam is the only university whose MOOC has already started – with great success (4,800 participants in the first week); Leiden plans to start in May 2013 and Delft follows in September.

Another interesting presentation was the one by Timo Kos, from both Khan Academy and Capgemini Consulting. He shared the results of two projects he did on OER, including MOOCs. He showed that MOOCs are not a technical hype, because they use no new technologies but merely combine existing ones for a new purpose. MOOCs can, however, be considered a disruptive innovation – although, as he said in the panel discussion at the end of the day, we do not have to fear that real-life universities will be pushed out by MOOCs.

All in all, I thought it was a very educational day with lots of food for thought. Most presentations are unfortunately in Dutch, but they can be found on the website of the SURF Academy, where you will also find the videos made during the seminar. The English presentations have been embedded or linked to in this post.

Some of the questions and insights I took home with me:

  • Leiden and Amsterdam chose to create shorter videos for their MOOCs, while Delft will record regular classes. When do you choose which approach?
  • Do you want to use a platform of your own or will you sign up with one of the existing ones? (Examples: Coursera, EdX, Udacity, canvas.net)
  • Coursera reportedly takes 80-90% of the money made in a MOOC and sells its users' data to third parties. (I have to admit I did not fact-check this one!)
  • Do you want to get involved in the world of MOOCs as a non-top-50 university or even as a non-educational institute? The BL will do so, by joining FutureLearn.
  • PR for your MOOC is very important, especially if you use your own platform. However, getting a news item on the Dutch eight o'clock news will probably mean that one server is not enough for the first class.
  • The success of a MOOC also depends on the reputation of your institution.
  • Do students feel they are studying at an institute/university or at, say, Coursera?
  • Using a MOOC towards your own degree is possible if you take the exam at/with a certified testing centre, such as Pearson or ProctorU.
  • If you plan to go into online education, when do you consider it a MOOC and when is it simply an online course?

Looking back: publications in 2012

2012 was quite a busy year for the people working at the Research Department of the KB, National Library of the Netherlands. The large-scale project IMPACT was concluded in the summer of 2012, followed by the start of the IMPACT Centre of Competence in Digitisation. Work continued in ongoing research projects such as SCAPE and APARSEN (both in the area of digital preservation) and in the newly started Europeana Newspapers project. In addition, articles were written on research areas such as copyright and benchmarking for libraries, and a website was launched for the Atlas of Digital Damages. Last but not least, our library and two people from our Research Department were featured in an item on the Euronews TV channel.

List of publications in 2012 by the KB Research Department: