Designing and deploying a preservation-compliant media asset management system for television production
Published by Michelle Michalos, April 27th, 2011
WNET’s report on designing and deploying a preservation-compliant media asset management system for television production is now available.
This report describes the design and implementation of a functioning video production media asset management (MAM) system built specifically to facilitate the preservation of digital public television programs and associated content at public television station WNET in New York.
The implementation of a preservation-compliant production MAM at WNET followed on the completion of the NDIIPP Preserving Digital Public Television project (PDPTV), a six-year partnership between public television stations WNET and WGBH, the Public Broadcasting Service (PBS), and New York University. PDPTV sought to address challenges and develop solutions for the preservation of born-digital public television content, primarily by developing a prototype digital repository at New York University. After several years of testing, and many lessons learned, repository staff concluded that without the adoption of standard file formats and the consistent collection of uniform metadata at public television stations, the repository would not be able to scale to serve the wider public broadcasting community. The processing required to ingest diverse file formats and inconsistent and haphazard metadata records that did not conform to any standard was simply unsustainable. The project concluded that, “this is one of the biggest problems that public broadcasting will have to solve in order to design a successful long-term digital repository.” 
US public television production has traditionally been performed by numerous fiercely independent production units, sometimes several within a single station, using a variety of working methods, tools, and formats. Archiving in the analog era was typically an afterthought, performed locally on an ad hoc basis by only a handful of dedicated stations. The result was that databases were homegrown, catalog records incomplete, and archives contained a multitude of tape and film formats.
As public broadcasters moved to file-based workflows and environments in the digital era, these practices proved not only challenging for preservation, but also extremely inefficient for day-to-day production and distribution operations. WNET recognized the irreconcilable shortcomings of traditional production working methods as they moved to file-based workflows. They also saw the opportunities that the digital transition provided to streamline creation, distribution, and archiving procedures for all business units within the organization. New tools, protocols, and policies would be needed, including a robust and versatile Media Asset Management solution.
WNET’s MAM was designed to adhere to a set of core principles that address the dual requirements of production and preservation. These include the systematic collection of metadata throughout production, management of a limited number of standard file formats, tools and technical integrations to enable the flow of media and metadata from inception to archiving, use of the PBCore metadata standard, safe and efficient internal storage of content, and automated exports of rich packages for distribution and delivery to a preservation repository. As can be seen throughout the System Description in this report, every aspect of the MAM’s design has been carefully thought through to ensure effective and reliable production, distribution, and preservation.
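The systematic, standards-based metadata collection described above can be made concrete with a small example. The sketch below (Python, standard library only) builds a minimal PBCore-style descriptive record. The identifier source, field values, and the tiny set of elements shown are hypothetical; a real MAM export would carry far more fields, including instantiation-level technical metadata.

```python
import xml.etree.ElementTree as ET

# PBCore XML namespace; elements below follow PBCore naming conventions.
PBCORE_NS = "http://www.pbcore.org/PBCore/PBCoreNamespace.html"
ET.register_namespace("", PBCORE_NS)

def q(tag):
    # Qualify a tag name with the PBCore namespace.
    return f"{{{PBCORE_NS}}}{tag}"

def make_pbcore_record(identifier, title, description):
    """Build a minimal PBCore-style descriptive record as an XML string."""
    doc = ET.Element(q("pbcoreDescriptionDocument"))
    ident = ET.SubElement(doc, q("pbcoreIdentifier"), source="WNET MAM")
    ident.text = identifier
    ET.SubElement(doc, q("pbcoreTitle")).text = title
    ET.SubElement(doc, q("pbcoreDescription")).text = description
    return ET.tostring(doc, encoding="unicode")

xml = make_pbcore_record("WNET-0001", "Example Program",
                         "A sample episode description.")
print(xml)
```

In the workflow described in the report, records like this would be generated automatically at export time rather than typed by hand, which is what keeps the metadata uniform across production units.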
The selection, development, and deployment of an appropriate MAM has not been without challenges. Changing the institutional culture and working methods of the entire organization is an ongoing process. Yet the new system has already helped WNET to create more rich and standardized preservation packages while creating efficiencies in the production process. As the MAM is used for more projects — including new production projects and the American Archive Content Inventory Project — more and more born-digital content will be preserved for future audiences.
This report was written and edited by Jonathan Marmor (WNET), Kara Van Malssen (AudioVisual Preservation Solutions), and Daniel Goldman (WNET).
We’d love to hear your comments. Please send any feedback to MarmorJ [at] thirteen [dot] org.
 “Repository Design Report With Attached Metadata Plan” by Joe Pawletko, 19 March 2010, p. 2.
Final Report: Preserving Digital Public Television
Published by Brian Santalone, July 6th, 2010
After seven years of researching, testing, developing, analyzing, promoting and sharing, it is time to close out the Library of Congress-funded NDIIPP project Preserving Digital Public Television. Here is our Final Report.
The report covers the major areas of our work:
- Project Goals, Structure and Organization
- Activities: Selection and Appraisal
- Inventory of At-Risk Materials
- Metadata and Related Topics
- File Formats and Packages
- Repository Design
- Intellectual Property and Copyright Issues
- Corollary Content Test Activity
- Outreach to the Public Broadcasting Community
- Impact and Contributions
This project was enormously successful. We produced a significant body of reports; published articles in key journals and other publications; and made popular presentations at dozens of conferences, symposia, and special events in the U.S., Canada, and abroad. Much to our surprise, this project emerged as a respected leader nationally and internationally in approaching technology issues relating to preserving digital video.
Most importantly, by promoting the importance of digital preservation to public broadcasting, we were instrumental in helping to create the American Archive, a new initiative at the Corporation for Public Broadcasting, which is its first genuine investment in long-term preservation and access of U.S. public radio and television programming.
Since Preserving Digital Public Television began, broadcasting has shed its analog systems and moved completely into a digital universe. This project has been able to impress on the public television system the message that digital preservation is not an optional “add-on” cost, but a requirement for any future use of the materials. In this, the project has been instrumental in transforming an attitude of indifference to one that acknowledges the value of properly managing our collective archival holdings.
We are extremely proud of the quality and scope of the activities performed with NDIIPP support. What we did accomplish was much more than we planned, and we had a much greater impact than we could have imagined. As such, Preserving Digital Public Television more than exceeded the project goals and well surpassed the expectations of the Library of Congress as an original NDIIPP project.
* * * * *
I want to thank all the terrific members of the project team, especially my colleagues at WNET Ken Devine, Jonathan Marmor and Winter Shanck; Dr. Howard Besser and Kara van Malssen at New York University; Mary Ide, Karen Cariani and Dave MacCarn at WGBH; and Bea Morse, Glenn Clatworthy and Irene Taylor at PBS. Without them and the participation of many others, this project would not have been possible. And because this was an extremely friendly and accommodating group, it was an easy project to coordinate and an awful lot of fun!
I must also thank the NDIIPP program of the Office of Strategic Initiatives at the Library of Congress, in particular Associate Librarian Laura Campbell; Martha Anderson, Director of Program Management; and our wonderful Program Officer Carl Fleishhauer. First, for having the vision to create the NDIIPP program, and then for showing their confidence in our proposal by giving us the first NDIIPP award made to a non-academic institution. This took a leap of faith on both our parts. We are very proud of the significant accomplishments that were made possible through this generous support, and of being a strong partner who contributed to the overall success of the entire NDIIPP venture.
On behalf of the public broadcasting system, we want to express our gratitude to the Library.
And personally, I am very grateful for the opportunity to contribute to such an important and exciting venture and meet so many outstanding people in the field. We hope that through the American Archive and a host of other preservation activities now underway, the relationship between public broadcasting and the Library will continue to flourish and grow into the (digital!) future.
Nan Rubin, Project Director
PDPTV Intellectual Property and Copyright Issues report released
Published by Nan Rubin, April 22nd, 2010
“Intellectual Property and Copyright Issues Relating to the Preservation and Future Accessibility of Digital Public Television Programs,” is now available. It was written and edited by Kathleen Maguire (NYU), Nan Rubin (WNET), and Kara Van Malssen (NYU) with the PDPTV Copyright Working Group.
This report explores the particular set of copyright and intellectual property (IP) issues that govern – and often limit – the use of digital public television programs when the broadcast permissions have expired. It also explores the connected question of how the existing copyright laws restrict the ability of archives to adopt functional practices for digital preservation.
We’d love to hear your comments on this paper. Please send any feedback to RubinN [at] wnet [dot] org.
PDPTV Report: Repository Design
Published by Nan Rubin, March 23rd, 2010
After 5 years of work, we are very pleased to publish our Repository Design Report, which describes the architecture and design of the PDPTV model preservation repository!
Developed by the Digital Library Technology Services team at New York University, the repository provided an important testbed and prototype for digital preservation of complex video files and their associated metadata. Thanks to the Library of Congress, we know that this work is going to advance the design and creation of “trusted” digital repositories for public broadcasting and related media content.
The report describes:
- The design of the preservation environment;
- The technologies used to support preservation functions;
- The creation of Archival Information Packages for managing complex video files;
- The standards used to support aggregation of disparate sources of metadata, including METS, PBCore, and PREMIS;
- And the process used to determine the needs of repository users in order to design output requirements.
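To illustrate how the standards listed above fit together in an Archival Information Package, here is a hedged sketch (Python, standard library only) of a skeletal METS wrapper: descriptive metadata such as PBCore would sit in a dmdSec, preservation metadata (PREMIS) in an amdSec, and the video files themselves in a fileSec. The section IDs and the object identifier are invented for illustration, and a real METS document would also populate the structMap and the metadata sections themselves.

```python
import xml.etree.ElementTree as ET

METS_NS = "http://www.loc.gov/METS/"
ET.register_namespace("mets", METS_NS)

def m(tag):
    # Qualify a tag name with the METS namespace.
    return f"{{{METS_NS}}}{tag}"

def build_aip_mets(object_id):
    """Skeleton METS wrapper for one Archival Information Package."""
    mets = ET.Element(m("mets"), OBJID=object_id)
    # Descriptive metadata section: would embed or point to a PBCore record.
    ET.SubElement(mets, m("dmdSec"), ID="DMD1")
    # Administrative metadata section: would hold PREMIS objects/events/agents.
    ET.SubElement(mets, m("amdSec"), ID="AMD1")
    # File section: lists the actual video files in the package.
    file_sec = ET.SubElement(mets, m("fileSec"))
    grp = ET.SubElement(file_sec, m("fileGrp"), USE="preservation master")
    ET.SubElement(grp, m("file"), ID="FILE1")
    # Structural map: required by METS; ties files to the logical object.
    ET.SubElement(mets, m("structMap"))
    return ET.tostring(mets, encoding="unicode")

print(build_aip_mets("pdptv-aip-0001"))
```

The design choice worth noting is that METS acts only as the wrapper: the disparate metadata sources keep their own schemas inside its sections, which is what makes aggregation across standards workable.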
This report was written by NYU Software Systems Architect Joseph Pawletko, with contributions by Nan Rubin (WNET) and Kara Van Malssen (NYU). Thanks to all PDPTV team members and fellow travelers for their input and collaboration, without which we could not have produced this caliber of work. Also thanks to the NDIIPP program at the Library of Congress for their consistent support and confidence in our project.
We welcome your comments. Please send any questions or feedback to Nan Rubin at rubinn [at] thirteen [dot] org, and/or Joe Pawletko at jgp [at] nyu [dot] edu.
PDPTV Sustainability report now available!
Published by Nan Rubin, February 19th, 2010
Strategies for Sustainable Preservation of Born Digital Public Television is a thorough investigation into the technical, operational, and economic requirements for sustainable preservation of file-based television production. It includes a case study of the costs associated with the PDPTV prototype preservation repository at New York University.
This comprehensive study was researched, written, and edited by Yvonne Ng (NYU), Nan Rubin (WNET), and Kara Van Malssen (NYU), with generous contributions from the project’s Sustainability Working Group and numerous outside readers and commentators. Thanks to everyone who helped!
We hope you find this report useful, and we welcome your feedback. Please contact Nan Rubin (rubinn at thirteen dot org) and/or Kara Van Malssen (kvm211 at nyu dot edu) with comments or questions.
News for the New Year
Published by Nan Rubin, January 13th, 2010
First of all, Happy New Year from the PDPTV gang!
There has been a lot of activity around archives and public broadcasting over the past few months that we thought we’d share with you.
1. In November, the Association of Moving Image Archivists (AMIA) held their annual conference in downtown St. Louis, Missouri. Not only is the AMIA conference one of the most fun events (drunken trivia, sexy experimental animation, Julia Child’s blowtorch french onion soup… need I say more?) on the calendar for moving image archivists in North America and abroad, but it is also an incredible gathering of professionals who present their latest research, tools, solutions, and practices. This year there were a number of sessions that focused on the preservation of public broadcasting. We encourage you to look at the program to see the full listing of presentations.
- PDPTV partner WGBH presented on the new ways that their Media Library and Archives is providing access to their collections, as well as how their Open Vault is exploring repository sustainability.
- WGBH Media Library and Archives Director Karen Cariani also chaired a session on “Digital Durability and Durable Access: PrestoPRIME and Netherlands Institute for Sound and Vision solutions” that showcased new solutions for digital preservation of broadcast content coming out of Europe.
- Jack Brighton, Director of New Media and Innovation at WILL Public Media chaired a session titled “The Problem of Open Media,” which included engaging and thought-provoking presentations on open access to AV archival content by Peter Kaufman (Intelligent Television), Rick Prelinger (Prelinger Library and Archives), and Karl Fogel (QuestionCopyright.org).
- There was a report on the American Archive Pilot Project and the lessons learned by digitizing and collecting archival public broadcasting content from 25 television and radio stations around the country.
2. Speaking of the American Archive Pilot Project… The project manager, Oregon Public Broadcasting (OPB), plans to wrap things up around February of this year. PDPTV partner WNET/Thirteen is (as I type this) putting the finishing touches on sending in their 50 hours of civil rights related video and metadata content to OPB for the project. We’re sure that there’s going to be loads of incredible media collected for this pilot (we’ve even read about it in the papers!) and we certainly hope there will be a public showcase or summary of the project.
3. The American Archive has hired its first Executive Director! Matthew White started the job in the beginning of 2010. PDPTV Project Director Nan Rubin has already started filling him in on the great work that our project has been doing over the past 5 years. We’re excited to see what shape the AA takes over the coming year under his leadership.
4. CPB has just released an RFP for an American Archive Content Inventory Project Manager. Details can be found here.
5. Finally, the NDIIPP Preserving Digital Public Television Project (that’s us) will be wrapping up this year. As new and exciting things are happening with the emergence of the American Archive, we know that there will be great opportunities for public broadcasting archives in the near future. We’re happy to see that the importance of preserving and providing access to our nation’s public broadcasting heritage has become a priority for so many people.
We will be publishing a number of reports in the early part of 2010, on topics including intellectual property, repository sustainability, and the design of the PDPTV prototype preservation repository at NYU. By mid-2010 you will be able to find all of the reports created throughout the 5 years of this project on this site.
– Kara Van Malssen
Dispatch from the 2009 IASA Annual Conference
Published by Nan Rubin, October 27th, 2009
40th Annual Conference of the International Association of Sound and Audiovisual Archives (IASA)
20-25 September 2009 in Athens, Greece
by Kara Van Malssen
the author (left) and Karen Cariani (WGBH Media Library and Archives) in front of our poster on sustainability issues for public broadcasting preservation
From 20-25 September 2009, the 40th annual conference of the International Association of Sound and Audiovisual Archives was held at the Megaro Mousikis (Athens Concert Hall) in Athens, Greece. The theme: “Towards a new kind of archive? The digital philosophy in audiovisual archives.” After a full week of committee meetings, presentations, and tutorials offering perspectives from all corners of the globe, the conference really did begin to arrive at a shape for the new kind of archive: one with good digital collection management, a clear sense of ethical responsibility, and wide accessibility, and one that puts users at the top of the priority list. In a shift from other conferences in this field in recent years, where the focus had been on the overwhelming need to migrate physical media to digital and the various issues that come with it, the focus is now on how best to take advantage of the opportunities that digital archiving offers. The future is perhaps not so bleak as we might have thought a few years back, but it shouldn’t come as a surprise that a few changes will be required to succeed in the digital era.
The keynote talk, delivered by Edwin van Huis (Cultural Entrepreneur, former Director of Netherlands Institute for Sound and Vision, and former president of FIAT/IFTA), began on this note with a question to the audience: “What is a good archive?” And the answers, both from the attendees and the speaker himself, weren’t just about having good storage and good cataloging. These things are not necessarily a number one concern in an age where users are able to access (seemingly) all the content they want, on the web, at any time. The key is for archives to maintain relevance. A good archive is one that is connected to users, providing many points of access, and allowing for different relations and meanings to become attached to content. A good archive is one that knows and works with its users, and changes according to their expectations. Because if you don’t keep up with them, they’ll go somewhere else.
To me, there were three overarching themes throughout the conference: the convergence of traditional domains, aggregation of content, and collaborations; new approaches to metadata; and ethical, legal, and moral issues of online access.
Convergence, Aggregation, Collaboration
Convergence is an important topic for heritage institutions in the digital age. This issue was the subject of a panel early in the week, and the questions and concerns raised there continued to resound throughout the conference. Many speakers noted that users don’t necessarily care whether something is a video, film, photo, or text — they just care that it relates to a subject. Where then does that leave museums, libraries, and archives? What unique function do they have? And what role then, does the audiovisual archivist play?
In the digital world, the distinction between libraries, archives, and museums as sources of original content is disappearing. Furthermore, the traditional hierarchy that placed special collections like original print manuscripts above television and other mass media doesn’t really hold anymore. The newly launched uber-content aggregation network, Europeana, provides a case in point. Europeana offers a federated search across digital collections from Europe’s archives, libraries, and museums. Although Europeana allows you to refine your search by language, country, date, provider, and media type, a simple search will give you a potentially enormous range of results from all over the continent, in all media types. At first glance, you can’t even tell where the content came from, and it doesn’t really matter. Suddenly a scan of someone’s diary and a video of that person being interviewed on a newscast feel in a way the same: it’s all content about that person, it’s all digital. New connections can be made between pieces of content from different places as never before, allowing revolutions in research. It is certain to change the way people find and use material from the traditional keepers of heritage.
Other content aggregation portals were presented, including the European Film Gateway, another EU-funded initiative that aims to develop an online portal to nearly 800,000 films and related objects from across Europe. Similarly, the recently completed VideoActive (yes, also EU-funded) project collects television programs from broadcast archives across Europe. Both are powerful tools for finding (in these cases, media-specific) content and information from varied and disparate institutions.
These concerns and celebrations of convergence as a result of content aggregation portals were focused on access and user interaction. What about preservation? Is there a role for a networked preservation effort in Europe as well? The answer is a loud and clear yes, and the new PrestoPRIME project is a consortium that will serve that precise function. Initiated in 2009, PrestoPRIME is the successor of the important PrestoSpace project that ended in 2008. The group of Europe’s leading AV archives will research and develop solutions for long-term digital preservation of AV media, and will deliver a range of tools and services. They are also creating a networked competence center, which will be a vendor-neutral information, resource, and advisory organization.
Indeed, Europe is leading audiovisual archives into new digital territory, one that fully exploits the advantages of digital platforms and networked solutions. The rest of the world should keep a close eye on these ambitious projects, especially Europeana and PrestoPRIME, as they develop in the coming months and years. While other countries and regions might not have the support that will allow them to replicate these mammoth efforts, there will certainly be outcomes and lessons from the Europeans that we can all use.
There are interesting (albeit smaller) projects happening outside of Europe that are using collaborative models to build great archives. One of these is the Alan Lomax Archive / Association for Cultural Equity, which is now a completely digital archive. Users are provided with a few different entry points for searching and browsing recordings and associated metadata, including a GeoArchive. The Association for Cultural Equity continues to foster its mission through collaborations with the archives in the regions where Lomax made recordings of the local musicians. By repatriating recordings to the original communities, the ACE builds and strengthens its network and expands its reach.
The Naad Media Collective is another fascinating cooperative endeavor to collect and archive sounds and images that are becoming extinct in a rapidly developing India. Members of the collective record endangered sounds and images, and share them through peer-to-peer networks. The recordings are available through their website. While not a preservation project per se, the group is collecting some wonderful sounds, such as the crackling of bamboo trees bending against each other in the wind, and a snake breathing.
New approaches to metadata
I heard a few very provocative, related questions concerning metadata during the five-day event:
Are metadata standards and structures like FRBR a thing of the past?
Will Google enable free text searching of everything?
Should we be looking at linked data instead?
There were a few presentations that seemed to address these issues, in particular the talk given by Sam Coppens called “Semantic Bricks for Performing Arts Archiving and Dissemination.” This was a report on the PokuMOn (Performing Arts Multimedia Dissemination) project, which seeks to create a decentralized archive for performing arts organizations in Belgium. The goal was to find a common metadata model but combine it with each organization’s own model. The problem was that so many metadata models were in use that it was not practical to map them all. Their solution was to store the metadata records as data and use a descriptive layer to search over all the records, displaying limited results in Dublin Core. Users were then able to link through to the original record and see the full details. Their other strategy was to use the Open Archives Initiative Object Reuse and Exchange (OAI-ORE) protocol for the description and exchange of aggregations of web resources in RDF. The records can then be published as linked open data, allowing machine-readable interpretations of the metadata. By automatically generating RDF and linking the data to DBPedia and GeoNames, they were able to enhance the datasets using these sources. Mr. Coppens recommended the OpenCalais toolkit for automatically creating rich semantic metadata. I’m looking into it now.
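The PokuMOn strategy of keeping each native record intact while searching over a thin Dublin Core layer can be sketched in a few lines. The field names and the two source models below are invented for illustration; the point is that each model only needs one small crosswalk into the shared descriptive layer.

```python
# Each source archive keeps its records in its own model; a per-model
# crosswalk maps just the fields needed for cross-collection search
# into a thin Dublin Core view. Model and field names are hypothetical.
CROSSWALKS = {
    "modelA": {"titel": "title", "maker": "creator", "jaar": "date"},
    "modelB": {"work_name": "title", "author": "creator", "year": "date"},
}

def to_dublin_core(record, model):
    """Project a native record onto a limited Dublin Core view."""
    mapping = CROSSWALKS[model]
    return {dc: record[src] for src, dc in mapping.items() if src in record}

rec = {"titel": "Voorstelling X", "maker": "Gezelschap Y", "jaar": "2008"}
print(to_dublin_core(rec, "modelA"))
# -> {'title': 'Voorstelling X', 'creator': 'Gezelschap Y', 'date': '2008'}
```

Because the full native record is stored unchanged, nothing is lost in the projection; the Dublin Core view exists only to make search across heterogeneous collections tractable.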
Another speaker on the same panel, Michael Fingerhut from Ircam – Centre Pompidou in France, pointed out that while most people reach their Musique Contemporaine website in the middle of a catalog record via Google search results, and the site relies on that traffic, Google isn’t actually very good at indexing large databases. Sounds to me like another reason to publish catalog records as linked data, and let semantic search engines help bring users to archive websites.
I heard at least 3 institutions report that the traffic to their websites went up by a large percentage once they put links to their sites on relevant Wikipedia pages, and that for the National Film and Sound Archive of Australia, Wikipedia is now their #1 source of traffic.
So I suppose the answer to the above questions so far is — there’s an important role for all three.
Library of Congress’s Carl Fleischhauer also gave an interesting presentation on the US Federal Agencies AV Digitization Working Group, a consortium of government agencies working to create standard guidelines for digitization. One of the important differences between their work and existing guidelines, such as IASA TC-04, is that they emphasize the need for embedded metadata, that is, certain elements of descriptive information encoded into the digital files themselves. This is important in a digital world, where a piece of content can easily become separated from its source, context, and catalog. If supported by software and web tools, embedded metadata will enable users to know some basic information about the content, like where it came from and what uses are allowed. This is already happening in the world of still images, with standards like EXIF, IPTC, and XMP, and software manufacturers are supporting this shift by making it easy for users to both read and write metadata. But uptake has been slow for digital moving image and sound. Perhaps with the US Government pushing on vendors, we might see some changes in this area.
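As a rough illustration of the embedded-metadata idea, the sketch below (Python, standard library only) generates a minimal XMP-style packet (XMP serializes metadata as RDF/XML) carrying a title and a rights statement, the kind of basic information a file should keep even when separated from its catalog. This is a simplified sketch, not the full XMP specification: a real packet has required wrapper elements, and would typically be embedded in the media file or written as a sidecar by tooling rather than built by hand.

```python
import xml.etree.ElementTree as ET

# XMP packets carry metadata as RDF/XML; Dublin Core supplies the
# descriptive vocabulary. Field values here are invented examples.
RDF_NS = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
DC_NS = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("rdf", RDF_NS)
ET.register_namespace("dc", DC_NS)

def make_xmp_packet(title, rights):
    """Build a minimal XMP-style RDF/XML fragment as a string."""
    rdf = ET.Element(f"{{{RDF_NS}}}RDF")
    desc = ET.SubElement(rdf, f"{{{RDF_NS}}}Description")
    ET.SubElement(desc, f"{{{DC_NS}}}title").text = title
    ET.SubElement(desc, f"{{{DC_NS}}}rights").text = rights
    return ET.tostring(rdf, encoding="unicode")

packet = make_xmp_packet("Example Program", "For educational use only")
print(packet)
```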
Ethical, legal and moral issues for online access
Quite a large portion of the conference was devoted to issues of ethics and intellectual property rights. And rightfully so: now that archives have so much digital content that they can make available online, how should they best protect the rights of the creators while at the same time providing access to users? A number of presentations offered guidelines and best practices for online archival content.
In her presentation “Guidelines for the Reproduction and Sale of Digital Heritage” during a panel on Ethics and Archival Practice, Diane Thram, Director of the International Library of African Music (South Africa), reported on guidelines that were developed during a seminar recently held at her institution. The group’s conclusions included: applying ethics to the use of digital heritage (and looking at relevant ethics statements from professional associations and organizations), respecting substantiated objections to online access, providing open access to low-resolution watermarked excerpts so that they cannot be abused, and always respecting the rights of the performer.
As another example, both Brigitte Vézina from WIPO, and Janet Topp Fargion of the British Library Sound Archive, discussed the Legal and Ethical Usage disclaimer on the British Library’s website that warns users not to infringe on the rights of indigenous and local communities. Much of WIPO’s effort to protect Traditional Cultural Expressions (TCEs) arose after the case of the music group Deep Forest, who obtained recordings from an ethnomusicology archive and remixed them without acknowledgement of the source or compensation to communities that were the original performers and creators.
In their tutorial on “Online Audiovisual Collections: Legal vs Moral Rights” (created by Shubha Chaudhuri and Anthony Seeger (UCLA), and delivered by Mr. Seeger with the help of a fine troupe of archivist-actors role-playing scenarios), the authors again touch on the issue of TCEs and the rights of the creators of folklore. This is a big concern because while folklore often doesn’t fall under copyright, the creators have rights that fall into murky territory. They note that under new WIPO agreements, indigenous peoples may be getting more rights to their intangible heritage. The material, the creators, and the rights must all be carefully considered before posting things online. In some places, such as Australia and the south Pacific, communities have very strict rules about who can have access to certain types of knowledge — some songs are only for men, some ceremonies are only for women, etc. Consult as many people as possible before posting AV media online, and be prepared for the possibility of removing it if contested. The Documentation and Archiving the Performing Arts website of the American Institute of Indian Studies, Archives and Research Centre for Ethnomusicology has some good information on this topic, including forms to help performers understand their rights.
A few other things worth checking out:
The 2nd Edition of IASA-TC04, Guidelines on the Production and Preservation of Digital Audio Objects, has been published. This is THE definitive guide on digitization and preservation of audio, now much improved.
VIDI-Video: European research consortium, developing a semantic search engine for video and a “1000 element thesaurus for automatically detecting instances of semantic concepts in the audio-visual content.” Aimed at improving indexing and retrieval practices of broadcast archives. Very cool.
Spectaclesdumonde.fr – Portal of traditional and world music from France, with nice geo-interface.
We Know It Project – from Athens, with their Visual Image Retrieval and Localization Tool
And of course there is the lovely poster presentation we gave on Strategies for Sustainable Preservation of Born Digital Public Television (30″ W x 45″ H).
Analyzing Sustainability
Published by Nan Rubin, December 23rd, 2008
One of the final projects of Preserving Digital Public Television is an assessment of sustainability for television archives in the public broadcasting system.
While our final report is still in process, we’ve found a number of issues related to sustainability that are not unique to moving images and television, but which seem likely to reflect much broader concerns over time. Among them:
- Rights management. Television and moving images involve more rights holders than most other types of digital material, and the looming issue is the enormous cost of locating, negotiating and paying for huge collections of underlying rights materials incorporated into thousands of local and national productions. Even the Library of Congress “identified copyright as a potentially serious impediment to the preservation of important digital collections and recognized that solving certain copyright issues was crucial to achieving long-term preservation of important digital content.”
- Economics. The tendency in television archives is to see the collection primarily as a potential source of income. Yet there is both monetary and non-monetary value to our collections, especially when measured against our mission of promoting education. With no existing commitment within public broadcasting to fund preservation (at least right now), we are scrutinizing our existing funding streams and operating models for potential new ways of generating financial support.
- Metadata. The possibilities for describing the contents of television broadcasts are still evolving, and questions remain about how best to do so.
- Preservation quality files. Format complexity, lossy compression, and a wide gap between preservation and access copies all raise quality concerns. Future migration of archived works will involve not only moving from tape to disk to other physical media, but also from one image format to another. The preference to preserve the highest quality image, the potential for loss, the relatively smaller size and costs of storing compressed files vs. uncompressed, and the need to make works available in many different viewing formats are difficult issues for archivists to resolve.
- Scale. Even moderately sized collections of moving images require petabytes of storage. And while per-unit storage costs will continue to drop, we project that long-term operating costs will rise, driven by the need to maintain personnel, refresh the holdings, and keep the lights on.
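The scale point above can be made concrete with a toy projection. This is a sketch with hypothetical numbers (the prices, collection size, and decline rates are all assumptions for illustration, not project data): storage prices fall steeply year over year, but fixed operating costs such as personnel grow with inflation, so over time operations come to dominate the total.

```python
# Illustrative cost model (all numbers are hypothetical assumptions):
# storage gets cheaper each year, but fixed operating costs grow,
# so long-term totals are dominated by operations, not disks.
def annual_cost(year, petabytes=2.0,
                storage_cost_per_pb=150_000,  # year-0 $/PB/year (assumed)
                storage_decline=0.25,         # ~25%/yr price drop (assumed)
                operating_cost=300_000,       # personnel, power, migration (assumed)
                operating_growth=0.03):       # ~3%/yr growth (assumed)
    storage = petabytes * storage_cost_per_pb * (1 - storage_decline) ** year
    operations = operating_cost * (1 + operating_growth) ** year
    return storage + operations

for year in (0, 5, 10):
    print(f"year {year:2d}: ${annual_cost(year):,.0f}")
```

Under these assumed numbers, the storage component shrinks to a small fraction of the total within a decade, while the operating component keeps the overall annual cost from falling much at all.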
In early 2009, we’ll be publishing our full report on sustainability. In the meantime, you can get a sense of what we are thinking from the resources page. And for more background, the Interim Report of the Blue Ribbon Task Force is available for your reading pleasure.
Web Crawl Update
Published by Nan Rubin, December 19th, 2008
As part of our digital preservation initiative, we have saved copies of the majority of websites related to the public television system in 2007 – more than 300 websites of stations, program productions, and related organizations.
Working with the Internet Archive, we will be transferring our 5 terabytes of data to the Library of Congress in early 2009. You can read more about this work on the project page.
Marcia Brooks on PBCore
Published by Nan Rubin, December 20th, 2007
Current, “the newspaper about public TV and radio,” has a very nice two-page article about PBCore by Marcia Brooks, who helped develop the proposal for CPB funding of the PBCore project and has directed the project at WGBH for most of the last six years. Some of the main points:
- Almost six years ago CPB had the foresight to fund the development of a metadata standard for the multimedia, multiplatform present and future of public broadcasting.
- Frontline and The NewsHour with Jim Lehrer are using PBCore as the basis of the Frontline/NewsHour video database and are using a modified version to let web users click for “related video.”
- If you go to the National Educational Telecommunications Association Conference next month, check out the PBCore session Thursday morning, Jan. 24, about PBCore in three stations’ real-world workflows. There are many more uses of PBCore in the field, including more documented case examples on PBCore.org.
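For readers unfamiliar with what a PBCore record actually contains, here is a rough sketch of a minimal description record built with Python's standard library. The element names follow the PBCore schema, but the identifier value and descriptive text are hypothetical; consult PBCore.org for the authoritative structure and required fields before building real records.

```python
import xml.etree.ElementTree as ET

# Minimal sketch of a PBCore-style description record. Element names
# follow the PBCore schema; the identifier and text values below are
# hypothetical examples, not real catalog data.
NS = "http://www.pbcore.org/PBCore/PBCoreNamespace.html"
ET.register_namespace("", NS)

doc = ET.Element(f"{{{NS}}}pbcoreDescriptionDocument")
ET.SubElement(doc, f"{{{NS}}}pbcoreAssetType").text = "Program"
ET.SubElement(doc, f"{{{NS}}}pbcoreIdentifier",
              source="WGBH").text = "demo-0001"  # hypothetical ID
ET.SubElement(doc, f"{{{NS}}}pbcoreTitle").text = "Sample Episode Title"
ET.SubElement(doc, f"{{{NS}}}pbcoreDescription").text = (
    "Illustrative record showing how a station might describe a program."
)

print(ET.tostring(doc, encoding="unicode"))
```

Because every element lives in the PBCore namespace, records like this can be exchanged between stations and validated against the shared schema, which is exactly the consistency the repository work found lacking in ad hoc metadata.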
The Preserving Digital Public Television Project is part of the National Digital Information Infrastructure and Preservation Program of the Library of Congress.