The "Shaping Access!" conference this year was held, for the sixth time overall, at the Museum für Gegenwartskunst [Museum for Contemporary Art] in the Hamburger Bahnhof (Berlin). Here, we report on the course of the conference and on what were, in our view, the most contentious points of discussion. We have laid out our position on these issues elsewhere.
The conference was organized in cooperation between leading German cultural and commemorative institutions, among them the Bundesarchiv [Federal Archive], the Stiftung Preußischer Kulturbesitz [Prussian Cultural Heritage Foundation], the Deutsche Digitale Bibliothek [German Digital Library] and the Deutsche Nationalbibliothek [German National Library], the Jüdische Museen [Jewish Museums] of Frankfurt and Berlin, the ZKM Karlsruhe, and Wikimedia Deutschland. The conference was to address two major complexes of problems that particularly affect cultural institutions worldwide: the question of secure long-term storage and archiving of cultural goods of various kinds on the one hand – that is, the question of sustainability – and the reform of copyright law on the other. It is these two aspects that create major impediments to cultural mediation in the wake of ubiquitous, ongoing digitalization. "Digitalization", as a keyword, signifies a multi-disciplinary mega-project in which a great number of specialists must be brought together – from art and cultural historians to administration and IT experts through to open-source activists. This brings about serious financial, legal, and organizational difficulties above all. Fundamentally, cultural institutions have to deal with a structural paradox: they must respond to the general demand for "sustainability" with solutions for long-term archiving that can only be developed in limited, temporary projects. Within this frame, after each project stage, discontinuity rather than sustainability looms: financing problems due to uncertain follow-up funding, expertise and work progress lost as project teams are dissolved again, and barely any chance to establish long-term collaborations between institutions. The copyright aspect of archival work alone demands intense investments of staff and time from most institutions, entailing severe funding gaps.
The Reiss-Engelhorn-Museen (Mannheim) provide a rather vivid practical example of the problems created by copyright issues: the REM are owned by a private foundation, of which the municipality of Mannheim holds a 15% share. In turn, private owners possess the majority of the objects in the foundation's custody, with a different contract applying in each case. For any publication involving one of the objects in private possession, special permits or legal opinions must be obtained and profit shares negotiated. The legal status of many other objects, here and elsewhere, is completely uncertain and must be researched and verified in every single case, for every single object.
It is in light of these constraints that the demand for a reform of copyright law is put forth, and under which cultural agents seem forced to unify as an interest group. At the same time, they create a political fault line. The conflict may be summed up by its key concepts: "collective licensing" and the "digital single market". Roughly speaking, collective licensing would be a legal instrument conceding collective rights of use for particular classes of objects, so that their utilization would only have to be negotiated once. The EU digital strategy seems to focus its attention on a different aspect. Jörgen Gren, representative of the EU Commission, explained the main points of the agenda for creating a digital single market as follows: free access could, in the understanding of the respective EU committees, hardly mean no-cost access. Priority rested with combatting piracy, the compensation of publishers, liability for (dead) links, and the rights of subsequent use in the case of out-of-print publications. The aim would have to be to make copyright more transparent in order to strengthen the bargaining position of producers of cultural content. If legal obstacles in the way of commodification were thus removed, giving producers a realistic chance of obtaining appropriate prices for their work, it might encourage them to start publishing digitally. So while the EU digitalization strategy seemed to focus primarily on the possibilities of commodification, signs of political paralysis were showing elsewhere. Undersecretary Matthias Schmid, head of the copyright department at the Federal Ministry of Justice, recommended, in light of the complexity of the matter, testing for legal and political needs in concrete cases at hand, using the example of project work and the specific practical obstacles that may arise. Otherwise, he argued, it was hardly possible to arrive at an understanding of what a functional reform would actually have to look like.
Some participants took from this position a subtle call to proactively create a fait accompli in the legal twilight zone – an approach that must, of course, be officially disavowed again immediately. Instead of taking constructive steps, the political side thus seemed rather to reaffirm the status quo.
Problem-solving approaches, then, were provided only by those projects that actually tackle the given tasks practically. Here, a third complex of problems – hardly less demanding – comes to light: digitalization challenges conservation work first of all in its technical dimension.
Paradoxical as it may seem, this implies that requirements rise not least in the analog dimension: larger and more specialized staff are needed; both the devices and the corresponding know-how are needed to keep rapidly aging technologies available – particularly in the case of video and film; and along with storage media, the platforms and players they run on become dated, so that these must be conserved, too. The ZKM Karlsruhe provides a perfect example of this type of problem. Analog storage, or literal warehousing, is a primary and heavy expense in archival work. And there are oddities in this: some archives today are reverting to copying digital information onto microfilm, as it is said to be extremely long-lived. Such measures, however, are at the same time due to problems of technological obsolescence – that is, digital technologies becoming outdated rapidly and in part arbitrarily – so that it also becomes necessary to develop protections against the scenario of information monopolies held by large IT corporations.
Not least, technical difficulties include problems of terminology. These concern standards in programming as well as conventions in information processing. The lack of consistency in this area – in terms of terminology, computing, and infrastructure – becomes most drastically apparent in the setup of databases, a central instrument for providing open access: depending on the institution, archiving may be based on completely different systems of order. The terminology applied follows a broad spectrum of technical vocabularies, classifications, and taxonomies. In order for these not only to be exchangeable among experts and scientific institutions but also to be open for public use, pioneering projects are working on the compilation of thesauri tailored to different fields of academic research. These thesauri are supposed to serve as common ground for communication, while at the same time they need to be – for reasons of openness and sustainability – learnable languages. In many cases, besides the corresponding departments at larger institutions, open-source and community-based initiatives also take part in the development of future standards, contributing to the structuring of data sets, the development of open programming languages, and the review and interpretation of source material. Organizationally, but also in terms of ethos, Wikipedia, along with its sister project Wikidata, functions as a fixed star in multiple regards.
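To make the idea of a thesaurus as a "learnable language" a little more concrete, here is a minimal sketch of the kind of broader/narrower term hierarchy that vocabulary standards (SKOS-style concept schemes, for instance) formalize. The class and all example terms are invented for illustration and do not reflect any particular institution's system:

```python
# Hypothetical sketch of a thesaurus with broader/narrower
# term relations, the basic structure behind controlled
# vocabularies used to align catalogs across institutions.

class Thesaurus:
    def __init__(self):
        # maps each term to the set of its directly broader terms
        self.broader = {}

    def add(self, term, broader_term=None):
        """Register a term, optionally under a broader term."""
        self.broader.setdefault(term, set())
        if broader_term:
            self.broader.setdefault(broader_term, set())
            self.broader[term].add(broader_term)

    def ancestors(self, term):
        """Return all broader terms, followed transitively."""
        seen, stack = set(), list(self.broader.get(term, ()))
        while stack:
            t = stack.pop()
            if t not in seen:
                seen.add(t)
                stack.extend(self.broader.get(t, ()))
        return seen

# Invented example terms:
t = Thesaurus()
t.add("woodcut", "print")
t.add("print", "graphic art")
print(t.ancestors("woodcut"))  # {'print', 'graphic art'}
```

A query for "woodcut" can thus be broadened to "print" or "graphic art" automatically, which is what allows differently cataloged collections to answer the same public search.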
The true "punch line" of the conference is already hinted at in these examples; it lies in the question of who will actually be conserving, curating, imparting, and using the collected works of cultural heritage. Among the few approaches based on a broad consensus, participatory models feature most prominently. Almost all phases and almost all fields of conservation and cultural mediation – from technical infrastructure to information processing to the substantive analysis of source material – increasingly involve public groups, topic-centered communities, free initiatives, experts, and interested laypeople alike. In the service of public education, there seems to be a great willingness to cede interpretational sovereignty on the one hand and, on the other, to hold a public discussion on which objects of cultural heritage actually define it, without predetermining them. Digitalization then appears – all the problems it causes aside – as a large-scale democratic project, which may just as well succeed as fail.