Kara Van Malssen (NYU) and Karen Cariani (WGBH) represent the PDPTV project at IASA 2009, the 40th Annual Conference of the International Association of Sound and Audiovisual Archives (IASA)
20-25 September 2009 in Athens, Greece

by Kara Van Malssen

the author (left) and Karen Cariani (WGBH Media Library and Archives) in front of our poster on sustainability issues for public broadcasting preservation

From 20-25 September 2009, the 40th annual conference of the International Association of Sound and Audiovisual Archives was held at the Megaro Mousikis (Athens Concert Hall) in Athens, Greece. The theme: “Towards a new kind of archive? The digital philosophy in audiovisual archives.” After a full week of committee meetings, presentations, and tutorials that provided perspectives from all corners of the globe, the conference really did begin to arrive at a shape for the new kind of archive: one with good digital collection management, a clear sense of ethical responsibility, wide accessibility, and users at the top of the priority list. In a shift from other conferences in this field in recent years, where the focus has been on the overwhelming need to migrate physical media to digital and on the various issues that come along with that, the focus is now on how best to take advantage of the opportunities that digital archiving offers. The future is perhaps not as bleak as we might have thought a few years back, but it shouldn’t come as a surprise that to succeed in the digital era a few changes will be required.

The keynote talk, delivered by Edwin van Huis (Cultural Entrepreneur, former Director of Netherlands Institute for Sound and Vision, and former president of FIAT/IFTA), began on this note with a question to the audience: “What is a good archive?” And the answers, both from the attendees and the speaker himself, weren’t just about having good storage and good cataloging. These things are not necessarily the number one concern in an age where users are able to access (seemingly) all the content they want, on the web, at any time. The key is for archives to maintain relevance. A good archive is one that is connected to users, providing many points of access, and allowing for different relations and meanings to become attached to content. A good archive is one that knows and works with its users, and changes according to their expectations. Because if you don’t keep up with them, they’ll go somewhere else.

To me, there were three overarching themes throughout the conference: the convergence of traditional domains, aggregation of content, and collaborations; new approaches to metadata; and ethical, legal, and moral issues of online access.

Convergence, Aggregation, Collaboration

Convergence is an important topic for heritage institutions in the digital age. This issue was the subject of a panel early in the week, and the questions and concerns raised there continued to resound throughout the conference. Many speakers noted that users don’t necessarily care whether something is a video, film, photo, or text — they just care that it relates to a subject. Where, then, does that leave museums, libraries, and archives? What unique function do they have? And what role, then, does the audiovisual archivist play?

In the digital world, the distinction between libraries, archives, and museums as sources of original content is disappearing. Furthermore, the traditional hierarchy that places special collections like original print manuscripts above television and other mass media doesn’t really hold anymore. The newly launched uber-content aggregation network, Europeana, provides a case in point. Europeana offers a federated search across digital collections from Europe’s archives, libraries, and museums. Although Europeana allows you to refine your search by language, country, date, provider, and media type, a simple search will give you a potentially enormous range of results from all over the continent, in all media types. At first glance, you can’t even tell where the content came from, and it doesn’t really matter. Suddenly a scan of someone’s diary and a video of that person being interviewed on a newscast feel, in a way, the same: it’s all content about that person, it’s all digital. New connections can be made between pieces of content from different places as never before, allowing revolutions in research. It is certain to change the way people find and use material from the traditional keepers of heritage.

Other content aggregation portals were presented, including the European Film Gateway, another EU-funded initiative that aims to develop an online portal to nearly 800,000 films and related objects from across Europe. Similarly, the recently completed VideoActive project (yes, also EU-funded) collects television programs from broadcast archives across Europe. Both are powerful tools for finding (in these cases, media-specific) content and information from varied and disparate institutions.

These concerns and celebrations of convergence as a result of content aggregation portals were focused on access and user interaction. What about preservation? Is there a role for a networked preservation effort in Europe as well? The answer is a loud and clear yes, and the new PrestoPRIME project is a consortium that will serve that precise function. Initiated in 2009, PrestoPRIME is the successor to the important PrestoSpace project, which ended in 2008. The group of Europe’s leading AV archives will research and develop solutions for long-term digital preservation of AV media, and will deliver a range of tools and services. They are also creating a networked competence center, which will be a vendor-neutral information, resource, and advisory organization.

Indeed, Europe is leading audiovisual archives into new digital territory, one that fully exploits the advantages of digital platforms and networked solutions. The rest of the world should keep a close eye on these ambitious projects, especially Europeana and PrestoPRIME, as they develop in the coming months and years. While other countries and regions might not have the support that will allow them to replicate these mammoth efforts, there will certainly be outcomes and lessons from the Europeans that we can all use.

There are interesting (albeit smaller) projects happening outside of Europe that are using collaborative models to build great archives. One of these is the Alan Lomax Archive / Association for Cultural Equity, which is now a completely digital archive. Users are provided with a few different entry points for searching and browsing recordings and associated metadata, including a GeoArchive. The Association for Cultural Equity continues to foster its mission through collaborations with the archives in the regions where Lomax made recordings of the local musicians. By repatriating recordings to the original communities, the ACE builds and strengthens its network and expands its reach.

The Naad Media Collective is another fascinating cooperative endeavor to collect and archive sounds and images that are becoming extinct in a rapidly developing India. Members of the collective record endangered sounds and images, and share them through peer-to-peer networks. The recordings are available through their website. While not a preservation project per se, the group is collecting some wonderful sounds, such as the crackling of bamboo trees bending against each other in the wind, and a snake breathing.

New approaches to metadata

I heard a few very provocative, related questions during the five-day event concerning metadata:
Are metadata standards and structures like FRBR a thing of the past?
Will Google enable free text searching of everything?
Should we be looking at linked data instead?

There were a few presentations that seemed to address these issues, in particular the talk given by Sam Coppens called “Semantic Bricks for Performing Arts Archiving and Dissemination.” This was a report on the PokuMOn (Performing Arts Multimedia Dissemination) project, which seeks to create a decentralized archive for performing arts organizations in Belgium. The goal was to find a common metadata model, but combine it with each organization’s own model. The problem was that so many metadata models were being used that it was not practical to map them all. Their solution was to store the metadata records as data, and use a descriptive layer to search over all the records and display simplified results in Dublin Core. Users were then able to link through to the original record and see the detailed metadata. Their other strategy was to use the Open Archives Initiative Object Reuse and Exchange (OAI-ORE) protocol for the description and exchange of aggregations of web resources in RDF. The records can then be published as linked open data, allowing machine-readable interpretations of metadata. By automatically generating RDF and linking data to DBpedia and GeoNames, they were able to enhance the datasets using these sources. Mr. Coppens recommended the OpenCalais toolkit to automatically create rich semantic metadata. I’m looking into it now.
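
To make the linked open data idea a little more concrete, here is a minimal sketch in Python (using the rdflib library) of how a single catalog record might be exposed as Dublin Core RDF and linked out to DBpedia and GeoNames. The record values, URIs, and field choices are my own invented illustration, not anything from the PokuMOn project itself.

    # Minimal sketch: describe one catalog record in Dublin Core and link it to
    # shared identifiers (DBpedia, GeoNames) so it can join the linked data cloud.
    # All values and URIs below are hypothetical illustrations.
    from rdflib import Graph, Literal, URIRef
    from rdflib.namespace import DC, DCTERMS

    g = Graph()
    g.bind("dc", DC)
    g.bind("dcterms", DCTERMS)

    # Hypothetical identifier for one performing-arts recording in a local catalog
    record = URIRef("http://example.org/archive/records/1234")

    g.add((record, DC.title, Literal("Rehearsal recording, contemporary dance piece")))
    g.add((record, DC.creator, Literal("Example Dance Company")))
    g.add((record, DC.date, Literal("2008-05-17")))
    g.add((record, DC.type, Literal("MovingImage")))

    # Links to shared vocabularies are what allow enrichment from outside sources
    g.add((record, DCTERMS.subject, URIRef("http://dbpedia.org/resource/Contemporary_dance")))
    g.add((record, DCTERMS.spatial, URIRef("http://sws.geonames.org/2800866/")))  # a GeoNames place URI

    print(g.serialize(format="turtle"))

Published this way, the simplified Dublin Core view can sit on top of whatever detailed model each organization keeps internally, which is roughly the layered approach Mr. Coppens described.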

Another speaker on the same panel, Michael Fingerhut from Ircam – Centre Pompidou in France, pointed out that while most people reach the Musique Contemporaine website in the middle of a catalog record via Google search results, and the site relies on this for traffic, Google isn’t actually very good at indexing large databases. Sounds to me like another reason to publish catalog records as linked data, and let semantic search engines help bring users to archive websites.

I heard at least three institutions report that traffic to their websites went up by a large percentage once they put links to their sites on relevant Wikipedia pages; for the National Film and Sound Archive of Australia, Wikipedia is now the #1 source of traffic.

So I suppose the answer to the above questions so far is — there’s an important role for all three.

The Library of Congress’s Carl Fleischhauer also gave an interesting presentation on the US Federal Agencies AV Digitization Working Group, a consortium of government agencies working to create standard guidelines for digitization. One of the important differences between their work and existing guidelines, such as IASA TC-04, is the emphasis on embedded metadata, that is, certain elements of descriptive information encoded into the digital files themselves. This is important in a digital world, where a piece of content can easily become separated from its source, context, and catalog. If supported by software and web tools, embedded metadata will enable users to know some basic information about the content, like where it came from and what uses are allowed. This is already happening in the world of still images, with standards like EXIF, IPTC, and XMP, and software manufacturers are supporting this shift by making it easy for users to both read and write metadata. But uptake is slow with digital moving image and sound. Perhaps with the US Government pushing on vendors, we might see some changes in this area.
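
As a small illustration of the basic idea (just a sketch of embedding, not the Working Group’s own specification; the file name and values are invented), descriptive metadata can be written directly into a media file so that it travels with the content. In Python, the mutagen library can do this for a format like FLAC:

    # Sketch of embedding descriptive metadata directly in a media file so the
    # information travels with the content even when it is separated from its
    # catalog. File name and field values are hypothetical; this is not the
    # Federal Agencies Working Group's specification.
    from mutagen.flac import FLAC

    audio = FLAC("interview_master.flac")  # hypothetical preservation file

    # Vorbis comment fields stored inside the FLAC file itself
    audio["title"] = "Oral history interview with a station engineer"
    audio["artist"] = "Example Public Television Archive"
    audio["date"] = "2009-09-21"
    audio["copyright"] = "Rights held by the originating station; contact the archive before reuse"

    audio.save()

    # Read the embedded fields back out of the file
    print(FLAC("interview_master.flac").pprint())

The same principle applies to other containers (the bext chunk in Broadcast Wave files, or XMP for video): a file found out of context can still say where it came from and how it may be used.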

Ethical, legal and moral issues for online access

Quite a large portion of the conference was devoted to issues of ethics and intellectual property rights. And rightfully so: now that archives have so much digital content that they can make available online, how should they best protect the rights of the creators while at the same time providing access to users? A number of presentations offered guidelines and best practices for online archival content.

In her presentation “Guidelines for the Reproduction and Sale of Digital Heritage,” during a panel on Ethics and Archival Practice, Diane Thram, Director of the International Library of African Music (South Africa), reported on guidelines that were developed during a seminar recently held at her institution. The group’s conclusions included: applying ethics to the use of digital heritage (and looking at relevant ethics statements from professional associations and organizations), respecting substantiated objections to online access, providing open access to low-resolution watermarked excerpts so that they cannot be abused, and always respecting the rights of the performer.

As another example, both Brigitte Vézina from WIPO, and Janet Topp Fargion of the British Library Sound Archive, discussed the Legal and Ethical Usage disclaimer on the British Library’s website that warns users not to infringe on the rights of indigenous and local communities. Much of WIPO’s effort to protect Traditional Cultural Expressions (TCEs) arose after the case of the music group Deep Forest, who obtained recordings from an ethnomusicology archive and remixed them without acknowledgement of the source or compensation to communities that were the original performers and creators.

In their tutorial on “Online Audiovisual Collections: Legal vs Moral Rights” (created by Shubha Chaudhuri and Anthony Seeger (UCLA), and delivered by Mr. Seeger with the help of a fine troupe of archivist-actors role-playing scenarios), the authors again touched on the issue of TCEs and the rights of the creators of folklore. This is a big concern because while folklore often doesn’t fall under copyright, the creators have rights that fall into murky territory. They noted that under new WIPO agreements, indigenous peoples may be getting more rights to their intangible heritage. The material, the creators, and the rights must be carefully considered before posting things online. In some places, such as Australia and the south Pacific, communities have very strict rules about who can have access to certain types of knowledge — some songs are only for men, some ceremonies are only for women, etc. Consult as many people as possible before posting AV media online, and be prepared for the possibility of removing it if contested. The Documentation and Archiving the Performing Arts website of the American Institute of Indian Studies, Archives and Research Centre for Ethnomusicology has some good information on this topic, including forms to help performers understand their rights.

A few other things worth checking out:

The 2nd edition of IASA-TC04, Guidelines on the Production and Preservation of Digital Audio Objects, has been published. This is THE definitive guide on digitization and preservation of audio, now much improved.

VIDI-Video: European research consortium, developing a semantic search engine for video and a “1000 element thesaurus for automatically detecting instances of semantic concepts in the audio-visual content.” Aimed at improving indexing and retrieval practices of broadcast archives. Very cool.

Spectaclesdumonde.fr – Portal of traditional and world music from France, with nice geo-interface.

We Know It Project – from Athens and their Visual Image Retrieval and Localization Tool

And of course there is the lovely poster presentation we gave on Strategies for Sustainable Preservation of Born Digital Public Television (30″ W x 45″ H).

