Multispectral Imaging: People, Processes & Technology

Michael B. Toth

President, R.B. Toth Associates

Alberto Campagnolo, Erin Connelly and Heather Wacha setting up manuscripts at SIMS for multispectral imaging as part of the “Stains Alive” project.

Alberto Campagnolo, Heather Wacha and Erin Connelly have discussed the Stains Alive Project, citing our imaging technology, work processes and data output in support of this unique scientific study into stains on ancient manuscripts. This builds on almost two decades of work we’ve put into developing our equipment and techniques. As we continue our journey from the University of Pennsylvania’s Schoenberg Institute for Manuscript Studies (SIMS) and the Library of Congress on to the Universities of Wisconsin and Iowa (I’ll try to ignore the forecasts of 16°F and snow showers), I thought I’d discuss some of the more mundane aspects of these projects that often go unrecognized.

A small sample of the standardized data output from “Stains Alive” imaging at SIMS

The methodologies and technologies we use for multispectral imaging today are based on our 18 years of experience in narrowband multispectral imaging systems development. Yet for all the advances in the latest equipment – higher-resolution sensors, better signal-to-noise ratio, improved illumination panels – the success or failure of these projects depends on more than just the technology. A successful program also requires solid work processes and dedicated people. This is where systems integrators and program managers come in – not just to make sure all the technology is working together, but to ensure the project is fully supported by the processes and people as well. Without these – especially the latter – a project can yield some pretty pictures for scholars and conservators to gasp and drool over, but might not successfully produce and preserve the solid corpus of standardized data and metadata for future generations to study.

Multispectral imaging sequence of images in a darkened room, each illuminated by different wavelengths of narrowband LEDs shining on the manuscript from Ultraviolet to Infrared light

As Alberto noted in his blog, the current narrowband multispectral imaging system used for this project includes commercial-off-the-shelf hardware and software for digital spectral image capture and viewing with the integrated system. This includes customized image processing software developed by Bill Christens-Barry of Equipoise Imaging to allow users to exploit the spectral images, utilizing techniques developed in other scientific and cultural heritage studies.

Our Phase One high-pixel-count camera takes a series of high-quality digital images, each illuminated by a specific wavelength of light from banks of light-emitting diodes (LEDs). Everyone tends to get excited about being able to observe multispectral imaging of manuscripts: the sequences of various colored lights are visually compelling, you are seeing new features on an object, and you are part of leading-edge studies. But Heather, Erin and Alberto are learning that after a few sequences of images in a dark room, many people don’t have the patience for more and excuse themselves. System operators like Meghan Hill Wilson and the PRDT team in the Library of Congress, the CHIC team at the John Rylands Library in Manchester, the digitization team in the Duke Libraries, Cerys Jones at UCL, and Damian Kasotakis in the Sinai are the unsung heroes of these projects, as they work in dark rooms day after day setting up and imaging manuscript leaf after manuscript leaf. This is where checklists are needed to make sure mistakes don’t creep in.
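The capture routine described above can be sketched in a few lines of Python. This is a minimal illustration, not the project's actual control software: the wavelength list and the `set_led`/`expose` hooks are hypothetical stand-ins for whatever camera and illumination-panel interfaces a real system provides.

```python
# Sketch of a narrowband capture sequence: one frame per LED band,
# from ultraviolet through visible to infrared. Wavelengths and
# function names are illustrative, not the project's real interface.

WAVELENGTHS_NM = [365, 385, 420, 450, 470, 505, 530, 560, 590,
                  615, 630, 655, 700, 735, 780, 850, 940]

def capture_sequence(leaf_id, set_led=None, expose=None):
    """Cycle through each LED band and record one frame per band."""
    frames = []
    for nm in WAVELENGTHS_NM:
        if set_led:
            set_led(nm)          # switch the illumination panel to this band
        frame_name = f"{leaf_id}_{nm:04d}nm.tif"
        if expose:
            expose(frame_name)   # trigger the camera exposure
        frames.append(frame_name)
    return frames

frames = capture_sequence("0001r")
print(len(frames), frames[0])   # 17 0001r_0365nm.tif
```

A checklist, as the paragraph above notes, wraps around a loop like this: confirm focus, confirm the correct leaf identifier, confirm every band fired, before moving to the next leaf.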

Compressed pseudocolor image of SIMS manuscript inner cover digitally processed from a sequence of captured images

The resulting image set is then digitally processed and combined to reveal residues and features in the object that are not visible to the eye in natural light. These processed images generated from the captured images provide the data needed for research into stains and residues. Lots of data! Each archival 16-bit TIFF image from our current 60-megapixel Phase One monochrome camera is about 117 MB in size, and we capture 15–18 images in a sequence. So each sequence yields about 2 GB of captured data. Multiply that by the number of leaves imaged and we are quickly piling up data. By the end of this short project, the team will have collected about a quarter terabyte of captured data alone.
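The arithmetic behind those figures is easy to check. A rough sketch (raw pixel bytes only; real TIFF files add header and metadata overhead, which is why the quoted per-image size differs slightly):

```python
# Back-of-the-envelope check of the storage figures above.
pixels = 60e6                 # 60-megapixel monochrome sensor
bytes_per_pixel = 2           # 16-bit archival TIFF
image_mb = pixels * bytes_per_pixel / 1e6      # MB per captured image
sequence_gb = 17 * image_mb / 1e3              # 15-18 images per sequence
leaves_for_quarter_tb = 250 / sequence_gb      # leaves per ~250 GB
print(round(image_mb), round(sequence_gb, 1), round(leaves_for_quarter_tb))
# 120 2.0 123
```

So on the order of 120–125 imaged leaves is enough to reach a quarter terabyte of captured data.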

Workshop on multispectral imaging system and processing tools at SIMS

While the processed images are usually stored as 8-bit TIFF images, with multiple processed images available from each sequence – including some larger pseudocolor images – they add up to yet more data to store. With open-source image processing tools and training, scholars and conservators can now produce their own processed images to meet their research needs, and those user-generated images also need to be managed.


All these data require good metadata and file structures, for without them we would be blindly trying to find data across hard drives and the cloud. And when we found them, we wouldn’t be able to remember details about the imaging, spectral illumination or object. This highlights the additional unsung heroes of our multispectral imaging: the data managers and administrators. The dean of this cadre is Doug Emery, whose pioneering work on data management and preservation on the Archimedes Palimpsest Project was “recognized” by Program Director Will Noel’s dedication in his book (below):

“To Doug Emery, Whose critical contribution to this project goes unrecorded in this book. Sorry. Metadata doesn’t sell. Thank you so much! Will Noel”

Data Manager Doug Emery and Data Administrator Susan Marshall standardizing and organizing Sinai Palimpsests Project data and metadata

Doug’s work, and that of so many others responsible for the metadata and data output, has proven critical to multispectral imaging programs ranging from various palimpsest projects to David Livingstone’s Diaries, Top Treasures at the Library of Congress, and mummy masks around the globe. Starting with the Archimedes Palimpsest Metadata Standard (really a specification) Doug, Bill Christens-Barry and I developed over a decade ago, multispectral imaging data management has advanced on the shoulders of pioneers working with the International Image Interoperability Framework (IIIF), Dublin Core, the Text Encoding Initiative, and others. Only with the diligence and attention to detail provided by dedicated data managers and administrators have large amounts of multispectral image data been archived and made available online for global access.
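To make the metadata work concrete, here is a minimal sketch of what a per-image metadata record might look like as a JSON sidecar file. The field names and values are illustrative only; the actual projects follow formal specifications such as the Archimedes Palimpsest metadata standard and Dublin Core rather than this ad hoc structure.

```python
# Sketch of a per-image metadata sidecar (hypothetical fields/values).
# Real projects use formal standards; this only shows the kind of
# capture details that must be recorded to keep the data findable.
import json

record = {
    "object_id": "0001",                 # hypothetical object identifier
    "leaf": "1r",
    "wavelength_nm": 365,
    "illumination": "narrowband LED, ultraviolet",
    "camera": "Phase One 60 MP monochrome",
    "bit_depth": 16,
    "image_file": "0001r_0365nm.tif",
}
sidecar = json.dumps(record, indent=2)
print(sidecar)
```

Pairing every captured TIFF with a record like this is what lets a researcher, years later, know exactly which band illuminated which leaf.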

Training workshop for PACSCL members at SIMS on multispectral image processing and work flow to meet users’ diverse goals

For Stains Alive and our other multispectral imaging projects, we use the latest technology, which is always getting better. At SIMS and for the Philadelphia Area Consortium of Special Collections Libraries (PACSCL) we were able to try out the latest 100 MP Phase One camera back thanks to a loan from Digital Transitions. The CMOS sensor allowed us to autofocus and capture more detail in larger images, yielding even more high-quality data. With these new cameras, illumination panels, processors and other technologies for multispectral imaging, we also have to continuously improve our work processes. Most of all, we need to ensure the people on the team have all the resources they need to carry out their goals.

