

Electronic Media Group

AIC Annual Meeting 2004
Hilton Portland and Executive Tower
Portland, Oregon
June 9-14, 2004

Presentation Abstracts

Sunday, June 13
2 - 5:30 p.m.
and
Monday, June 14
10 a.m. - 5 p.m.

RELATED LINKS:
Special Sessions Program Information
Other EMG Events in 2004: Workshop, Luncheon, Reception & Business Meeting

Electronic Cafe International: Aging Records from Technology-based Artistic Activities
Howard Besser
Professor and Director, Moving Image Archiving and Preservation Program, Tisch School of the Arts, New York University

In this presentation, Howard Besser will discuss the InterPARES 2 case study of the Electronic Cafe International (ECI). Since the 1970s, the ECI has produced hundreds of telecollaborative arts projects that pose major preservation challenges. Its collections hold more than 3,000 hours of video, optical disks, audio recordings, computer back-up media, additional electromagnetic storage media, equipment, text, drawings, paper documents, photos and other types of images—all documenting hundreds of artists engaged in telecollaborative works. Besser will discuss the case study, as well as suggest new ways of thinking about the conservation of this type of material.
READ THE FULL PAPER: Besser-EMG2004.pdf
RETURN TO SCHEDULE

Recording the Recoding: The Documentary Strategy
Alain Depocas
Director, Centre for Research and Documentation, Daniel Langlois Foundation (Canada)

Through its different programs, the Daniel Langlois Foundation supports various projects in art, science and technology. Many of these projects lead to the production of technologically complex artworks. The conservation of these artworks, and of many others produced over the last three decades, represents a major preservation challenge. The Daniel Langlois Foundation is actively involved in dealing with this issue.

Documentation is at the center of any preservation strategy for new media and digital art. Improving efforts to preserve new media artworks is insufficient without the support of structured documentation about both the works and the context in which they evolve. In fact, given the volatility of many new media and digital art projects, this documentation may often be the only remaining trace of a work. Through its Centre for Research and Documentation (CR+D), the Foundation is developing a collection that seeks to document the history, artworks and practices associated with the electronic media arts. The CR+D is also developing specialized tools to search this collection and is making them available on its Web site.

The Foundation is also involved in preservation issues through its Researchers in Residence program. Since the launch of the program, two important preservation specialists have worked with the Foundation: Jeff Rothenberg, well known for his research in emulation, and Mona Jimenez, a pioneer in video preservation. During her residency, Jimenez researched the cataloging of instruments, electronic and digital machines and other prototypes developed by or for artists. She also put together a prototype database for this purpose. At the CR+D, she examined the archives of Steina and Woody Vasulka with a focus on instrumentation by the two artists and in particular two instruments, the Rutt/Etra Scan Processor and the Sandin Image Processor. Her research aims to create methods for documenting and describing technological objects designed for contemporary art practices, a field still in its infancy.

The Daniel Langlois Foundation is also involved in an important preservation project, the Variable Media Network, in partnership with the Guggenheim Museum. The project aims to describe works independently of the media used to create, store and present them. Rather than listing a work's physical components, the variable media approach seeks to capture the work's behavioral characteristics and intrinsic effects. It is up to artists to describe their work using these characteristics and effects, which tomorrow's curators and conservators must respect and reproduce. In this way, artists themselves define the limits of the interventions later carried out on their work. During the two years of the partnership, the Fellowship in Variable Media Preservation at the Guggenheim was established; this program has enabled its first recipient, Caitlin Jones, to test the model in a real museum context. “Seeing Double: Emulation in Theory and Practice”, a 2004 exhibition at the Guggenheim Museum in New York, serves to test the approach by presenting several artworks in their initial states accompanied by emulated versions.
RETURN TO SCHEDULE

Preserving Authentic Electronic Art Over the Long-term: The InterPARES 2 Project
Luciana Duranti
Professor and Chair, the Master of Archival Studies Program, School of Library, Archival and Information Studies, University of British Columbia (Canada)

This presentation will introduce the research conducted by InterPARES 2 in order to provide the context for the talks by Howard Besser and Sally Hubbard. InterPARES 2 is a project that was initiated in 2002 and is expected to be completed in 2006. It builds upon the findings of InterPARES 1, the purpose of which was to develop the theoretical and methodological knowledge essential to the long-term preservation of authentic records created and/or maintained in digital form. That project focused on preserving the authenticity of records created and/or maintained in databases and document management systems in the course of administrative activities. In addition to conceptual findings, it produced requirements for authenticity, methodologies of appraisal and preservation, and an intellectual framework for the development of policies, strategies and standards for the long-term preservation of the authenticity of electronic records. InterPARES 2 focuses on records produced in experiential, dynamic and interactive digital environments in the course of artistic, scientific and e-government activities. Digital art is therefore a primary focus of the ongoing research, which involves specialists from the visual and performing arts, computer scientists, intellectual property rights experts and other scholars from related fields.
READ THE FULL PAPER: Duranti-EMG2004.pdf
RETURN TO SCHEDULE

Audio Reconstruction of Mechanically-Recorded Sound by Digital Processing of Metrological Data
Vitaliy Fadeyev
Post-doctoral Researcher, Lawrence Berkeley National Laboratory
and
Carl Haber
Senior Scientist, Lawrence Berkeley National Laboratory
with Zachary Radding, Lawrence Berkeley National Laboratory; Christian Maul, TaiCaan Technologies Ltd. (UK); John McBride, School of Engineering Science, University of Southampton (UK); Mitchell Golden, Jun Group, Inc.

For much of recorded sound history, audio information was stored on mechanical media, such as a phonograph disc record or cylinder, through undulations of the surface structure (grooves). The groove shape and position can be reconstructed without mechanical contact using precision optical metrology tools. The surface map thus obtained can be digitally processed to remove noise artifacts due to debris, damage and wear, and to convert the groove positional information into audio data. The viability of this approach was recently demonstrated on a 78 rpm shellac disc using two-dimensional image capture and analysis methods; further developments are reported here. A three-dimensional reconstruction of mechanically recorded sound has also recently been completed: the surface of the source material, a celluloid cylinder, was scanned using confocal microscopy, resulting in a faithful playback of the recorded information. These results are discussed. The approach holds promise for the reconstruction of valuable historical recordings, using full surface information to improve the sound fidelity, and eventually as a means of automated mass preservation. Fast processing is required for the latter application. Methods to accelerate the scan rates and make these techniques practical for use at working archives are discussed.
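The final processing step described above, converting groove positional information into audio data, can be sketched in miniature. The sketch below is illustrative only; the function names and the simple 3-point median filter are assumptions, not the LBNL team's actual software. For a lateral-cut record it treats the audio signal as proportional to stylus velocity, that is, the time derivative of the cleaned groove displacement.

```python
# Hedged sketch: from a measured groove-position profile to an audio signal.
# Assumes a lateral-cut recording, where the audio amplitude is proportional
# to stylus *velocity* (the time derivative of groove displacement).

def median3(samples):
    """3-point median filter: suppresses isolated spikes caused by debris or damage."""
    out = list(samples)
    for i in range(1, len(samples) - 1):
        out[i] = sorted(samples[i - 1:i + 2])[1]
    return out

def groove_to_audio(positions, sample_rate):
    """Differentiate cleaned groove displacement into a velocity (audio) signal."""
    cleaned = median3(positions)
    # Central finite difference approximates d(position)/dt.
    return [(cleaned[i + 1] - cleaned[i - 1]) * sample_rate / 2.0
            for i in range(1, len(cleaned) - 1)]
```

A linear displacement ramp, for instance, yields a constant output, while an isolated debris spike is removed by the median filter before differentiation ever sees it.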
READ THE FULL PAPER: Fadeyev-EMG2004.pdf
RETURN TO SCHEDULE

The Danube Exodus
Sally Hubbard
Digital Projects Manager, Getty Research Institute

This presentation will discuss preservation issues highlighted by The Danube Exodus, a work by Hungarian artist Péter Forgács (founder of the Private Film & Photo Archives Foundation in Budapest) that is the subject of an InterPARES 2 case study. The Danube Exodus is built on archival/historical analog pieces, including found footage, photographs, early eighteenth-century maps of the Danube River and drawings of the region from the special collections of the Research Library at the Getty Research Institute. This material was used to develop a multimedia interactive piece that has had at least two manifestations: as a gallery installation and as a Web site. The piece was the result of a collaboration between Forgács, the Labyrinth Project at the University of Southern California's Annenberg Center for Communication and the Getty Research Institute, and study of it raises issues such as the management of hybrid collections and the authenticity and granularity of dispersed works with multiple manifestations.
READ THE FULL PAPER: Hubbard-EMG2004.pdf
RETURN TO SCHEDULE

Artist Instrumentation Database Project
Mona Jimenez
Assistant Professor, Moving Image Archiving and Preservation Program, Tisch School of the Arts, New York University

The study of electronic instruments and devices designed and used by twentieth-century artists is in its infancy. More information must be gained from artists and designers to ensure an understanding of this period by future generations. While there have been several important projects, most notably the “Pioneers of Electronic Arts” exhibition at Ars Electronica in 1992, the topic is largely unexplored. The location of primary materials about artist instrumentation is not widely known.

In addition, those addressing technology-based installation art have been challenged to describe the devices used in these works; detailed description is essential to conservation efforts. Without appropriate documentation, we cannot determine how best to preserve the devices themselves, or whether they can be reconstructed. The project addresses a first step in this process: developing a structure for the collection of data about the devices.

The immediate goal of the Artist Instrumentation Database Project was to create a prototype database for cataloging artist instruments and devices. For test data, research was conducted on two machines developed in the 1970s: the Sandin Image Processor, developed by Dan Sandin, and the Rutt/Etra Scan Processor, developed by Steve Rutt and Bill Etra. This research informed the structure of the database, which was built using FileMaker Pro software. To develop the database, samples of existing databases were collected and methods of description were studied, drawing on standards used in electronic arts practice, archival management, object conservation and media conservation.

The Image Processor (IP) and the Rutt/Etra represent two themes or directions that engaged designers and artists of the period, who used the devices to create new works by manipulating video and audio signals. The IP, in Stephen Beck’s terminology, is a “camera image processor”: the video signal is sent through a series of processing modules that allow the color and black-and-white components of the image to be altered and recombined. The Rutt/Etra falls into Beck’s category of “scan modulation/rescan” devices: the machine electronically interferes with the television display, affecting the scanning process through control voltages. (More information on the Sandin and the Rutt/Etra can be found by searching the Experimental Television Center Web site.)

The project was undertaken through a Researcher in Residence grant from the Daniel Langlois Foundation for Art, Science and Technology. The database will be available in spring 2004 on the Daniel Langlois Foundation Web site. We are exploring ways to distribute the database template, to obtain feedback from users in order to continue improving the structure, and to link data where appropriate.

The database supports a long-term goal of locating and documenting artist instruments/devices held in private and public collections to allow for the study of these devices by researchers, educators, artists, writers, preservationists and others. The accessibility of the information will lead to a more comprehensive history of the devices and unique forms of artistic practice, and will ultimately aid in their preservation.
RETURN TO SCHEDULE

Seeing Double: Emulation in Theory and Practice
Caitlin Jones, Project Research Assistant: Variable Media Network, Solomon R. Guggenheim Museum

Conservators have explored many traditional and experimental strategies for dealing with works of an ephemeral nature. Among these strategies is a way to replicate obsolete or unavailable materials or hardware: emulation. To emulate a work is to devise a way of imitating the original look of a piece by completely different means. The term can be applied generally to a refabrication of an artwork’s components, but it also has a specific meaning in the context of digital media, where emulation offers a powerful technique for running software written for an out-of-date computer on a contemporary one.

As part of a larger program called the Variable Media Network, the Guggenheim Museum, in collaboration with the Daniel Langlois Foundation for Art, Science and Technology, has investigated a series of case studies to formulate creative strategies for endangered works. One work chosen to test emulation is Grahame Weinbren and Roberta Friedman’s video piece “The Erl King” (1982-85). Heralded as one of the first works of interactive video art, “The Erl King” invites the viewer to control the work’s narrative structure through the use of a touch-screen monitor. Due to its unique combination of obsolete hardware (both off-the-shelf and custom-made) and artist-written software, “The Erl King” presented itself as an ideal candidate for hardware emulation. This artwork, however, proved once again that there is no “miracle cure” for the conservation of electronic artworks. In conjunction with the artists, the Guggenheim conservation department employed computer programmers and technicians to help determine best practice for preserving “The Erl King”.

The original work was exhibited side by side with its emulated version in the exhibition “Seeing Double: Emulation in Theory and Practice”, along with a number of other digital artworks and their emulated counterparts. The exhibition and subsequent symposium, “Echoes of Art: Emulation as a Preservation Strategy”, provided forums for artists, preservation experts and the public to put emulation to the test.
READ THE FULL PAPER: Jones-EMG2004.pdf
RETURN TO SCHEDULE

Michael Craig-Martin’s “Becoming”: A Conservation Case Study of a Digital Work of Art
Pip Laurenson
Sculpture Conservator for Electronic Media, Tate (UK)

This case study describes the development of a conservation plan for Michael Craig-Martin’s “Becoming”. “Becoming” is the first computer-based digital work to be acquired by Tate and offers a useful starting point from which to explore the conservation of digital art. Described by the artist as “a representational lava-lamp”, “Becoming” has many of the trappings of a conventional work of art in that it is a framed, stand-alone object that hangs on a gallery wall and provides what is essentially a visual experience.

This work is a computer application that controls the visibility or transparency of eighteen images. Decisions about whether images become active and fade up or down are linked to the number of images visible on the screen at any moment. These decisions are made by the program according to probability-based logic, which determines the variability of the system. The work is presented on an eighteen-inch LCD screen attached to a computer running Windows XP. The program was written in Director, with the drawings created in Flash. The rendition of the colors, the resolution of the images, the degrees of transparency and the speed of change form the basic parameters of the work.
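The probability-based fade logic described above can be illustrated with a toy sketch. This is emphatically not Craig-Martin's actual program (which was written in Director); it is a hypothetical Python analogue in which the chance that a hidden image fades up, or a visible one fades down, depends on how many of the eighteen images are currently on screen.

```python
import random

# Toy sketch only: NOT the artist's actual program. It imitates the described
# behaviour: each tick, the probability that an image becomes active or fades
# out depends on how many of the eighteen images are currently visible.

NUM_IMAGES = 18

def step(visible, rng=random):
    """Return the updated set of visible image indices after one tick."""
    visible = set(visible)
    # The fewer images on screen, the more likely a hidden one becomes active.
    p_up = 1.0 - len(visible) / NUM_IMAGES
    for i in range(NUM_IMAGES):
        if i not in visible and rng.random() < 0.2 * p_up:
            visible.add(i)        # image i begins fading up
        elif i in visible and rng.random() < 0.2 * (1.0 - p_up):
            visible.discard(i)    # image i begins fading down
    return visible
```

Run repeatedly, such a rule keeps the number of visible images hovering around an equilibrium while remaining unpredictable in detail, which is the variability the abstract describes.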

This paper describes how, in collaboration with the programmer and the artist, the conservator is able to identify the functional and aesthetic elements that are important to the preservation of the work as a whole. The starting point for the construction of a conservation plan is to respond to the risks and associated dependencies of these elements. The closing remarks will draw some general conclusions about the development of procedures applicable to the acquisition of similar works and the evaluation of our criteria for successful conservation in this area.
RETURN TO SCHEDULE

Playing History with Games: Steps towards Historical Archives of Computer Gaming
Henry Lowood
Curator for History of Science & Technology Collections, Stanford University

The historical preservation of interactive software raises a number of issues. In this paper, the focus will be on the particular difficulties presented by computer games, videogames and, to a lesser extent, interactive simulations. Lowood will present the vantage points of both the historian of these media and the curator of collections, first with respect to the problem of defining the objects to be preserved, then with regard to some practical projects planned or underway at Stanford.

Computer and video games are dynamic, interactive and immersive. All of these qualities shape or derive from the interaction of player and game components (hardware, software, game design). They also underline the variability of this medium and its dependence on player input. Thus, games exist in a media space somewhere between the text, the experience and the performance, confounding preservation strategies that rely on notions of content fixity taken from other media. Artifact or activity? Hardware and software objects alone cannot document the medium of the computer game. What is saved by preserving consoles, hardware and software alone, without recording game play, for example?

Just as important, the variability of this medium reflects the nature of games as software, in that both the content and the code itself can be changed. Perhaps the most important trend in contemporary game design is the modifiability of published games by the player community, whether in the creation of game “mods” such as the Half-Life modification Counter-Strike, the results of subversive play such as speedrunning, or the use of games as platforms for performances such as machinima. Understanding the degree to which game software is modified may also be enlightening with regard to other variable media, but regardless of such generalization, it is a vital step in thinking about problems ranging from how to define computer games as software objects to the development of metadata standards.

At a practical level, this paper will present some considerations in the development of The Machinima Archives, as well as some of the hurdles faced in the long-term preservation of Stanford’s substantial collection of computer game and videogame software. Looking forward, it will conclude with some of the early planning for the Archives of Wargames, Simulations and Modeling project.
READ THE FULL PAPER: Lowood-EMG2004.pdf and LowoodImages-EMG2004.pdf
RETURN TO SCHEDULE

Preservation-Worthy Digital Video; or, How to Drive your Library into Chapter 11
Jerome McDonough, Digital Library Development Team Leader, New York University

This session will provide an overview of what NYU has learned so far in trying to establish best practices for archiving digital video, including a brief technical overview of relevant characteristics of digital video, a discussion of the abstract requirements for preservation-worthy digital video, and some discussion of the costs of creating and maintaining a large-scale digital video archive.
READ THE FULL PAPER: McDonough-EMG2004.pdf

RETURN TO SCHEDULE

Collecting and Preserving Computing Equipment at the National Museum of American History
Beth Richwine
Senior Objects Conservator, National Museum of American History, Smithsonian Institution

The National Museum of American History has been collecting computing equipment since it first opened in 1964, but the collections also contain computing artifacts as old as astrolabes from the fifteenth century. Its collections related to electronic media span more than a century from the first telegraph machines to computers of the present day and include some of the earliest computers, such as the ENIAC and the UNIVAC. The “Information Age: People, Information and Technology” exhibition, currently on display at the museum, surveys the history of information technology and its relation to society from the origin of the telegraph to the present.

The curator in charge of the Computer History Collection is responsible for collecting what she feels will best represent the technology of our times and will focus on artifacts that are used in the processing of information. These objects include those that deal with the functions of encoding, decoding, storage and preservation of data and modification of data. Other collections in the museum, such as Engineering, Graphic Arts, Electricity and Photo History, also collect objects related to electronic media, such as typewriters, printing presses, ticker tape, telegraphs, adding machines and photographic equipment.

The collections come from various sources. Some are collected by the curators through contacts throughout the world, while a small portion comes from outdated technology that had been in use in the museum. At present, the holdings of the Computer History Collection alone number about 2,100 pieces of computing equipment and related items.

It is not possible for the museum to keep these machines in operating order because of the immense task of caring for more than three million objects in its entire collections. Rather, we attempt to maintain the condition of the objects in our care. Even this is sometimes a challenge because of the variety of storage conditions available to us both in the museum and at our offsite facilities.

Treatments have varied, based on the use, storage or desires of the curators, but have been primarily aesthetic. In rare cases, when we have been asked to make machines or portions of them operate for educational purposes and public benefit, solutions have had to be worked out. The degradation of modern materials has also created dilemmas in the care of some of these objects. Modern plastics not only degrade but also affect the surrounding materials on these machines. Since computers are a relatively new phenomenon, and so many of the materials they are made of are also newly developed, we really do not know how they will age or react over time.
RETURN TO SCHEDULE

Archiving the Avant-Garde: Preserving Digital/Media Art
Richard Rinehart
Director of Digital Media, Berkeley Art Museum/Pacific Film Archive & Digital Media Faculty, Department of Art Practice, University of California, Berkeley

Works of digital and Internet art, performance, installation, conceptual and other variable media art represent some of the most compelling and significant artistic creations of our time. These works constitute a history of alternative artistic practice, but because of their ephemeral, technical or otherwise variable natures, they also present significant obstacles to accurate documentation, access and preservation. Without strategies for preservation many of these vital works—and possibly whole new genres such as early Internet art—will be lost to future generations. Long-term strategies must closely examine the nature of ephemeral art and identify core aspects of these works to preserve. Will the future experience these works as physical traces and documentation? Emulated media artifacts? Dynamic cultural events re-performed? All of these? This presentation, focusing mainly on digital art, will outline the problems and introduce early collaborative efforts and strategies for preserving these art forms, with examples and up-to-date progress of the NEA-funded consortium project “Archiving the Avant-Garde”.
READ THE FULL PAPER: Rinehart-EMG2004.pdf and Rinehart-Appendices-EMG2004.pdf
RETURN TO SCHEDULE

Digital Preservation and BBC Domesday
Paul Wheatley
Virtual Learning Environment Service Team Leader and Former Project Manager of CAMiLEON, University of Leeds (UK)

Hardware and software technology is advancing quickly, so we know that technology obsolescence will lead to the loss of valuable digital materials unless we take action. But without a crystal ball to predict how technology will change in the future, taking the right action, and ensuring that the technologies we use do not themselves quickly become obsolete, will be very difficult.

Furthermore, testing any digital preservation strategies we devise against current digital materials will not be conclusive. Today, a PDF file will be recognized and rendered on an average PC with little more than a double click. In fifty years' time, we will need metadata to tell us what a PDF is and how to go about making sense of it. Testing digital preservation efforts on current materials will always be difficult.

The CAMiLEON (Creative Archiving at Michigan & Leeds: Emulating the Old on the New) Project took a different approach by looking back in time at the materials of yesteryear in order to inform the preservation strategy of the future. After devising new strategies for the migration and emulation of digital materials in order to combat technology obsolescence, CAMiLEON applied these strategies to old digital materials. This enabled a far more realistic test of the applied approaches and provided some insight as to how technology and standards might change in the future.

The main demonstrator addressed by CAMiLEON was the BBC Domesday Project resource. BBC Domesday was a forerunner of modern multimedia software. A groundbreaking interactive interface was combined with an incredible cross-section of digital resources, providing a social snapshot of life in the UK in the mid-1980s. But, dependent on obsolete videodisc technology and virtually lost to the ravages of time (all of twenty years since its release), BBC Domesday provided a complex test case for digital preservation techniques.

BBC Domesday worked well to illustrate and inform on many crucial digital preservation issues: the timeliness of preservation action, media-independent preservation, significant properties (what do we want to preserve?), and the longevity of preservation tools. It also formed an ideal demonstrator for the evaluation of emulation as a digital preservation strategy. As emulation’s many cynics have asked: can emulation really work in a digital preservation context, and, perhaps more importantly, do we need it? CAMiLEON’s work on rescuing BBC Domesday tackled these issues head on, revealing that emulation could work and could be made to last. It also demonstrated that emulation could be cost-effective and may be the only practical method of preserving much of our digital heritage.
READ THE FULL PAPER: Wheatley-EMG2004.pdf
RETURN TO SCHEDULE