Stanford University News Service
425 Santa Teresa Street
Stanford, California 94305-2245
Tel: (650) 723-2558
Fax: (650) 725-0247
December 5, 2005
Dawn Levy, News Service: (650) 725-1944, firstname.lastname@example.org
Laboratory notebooks are indispensable sidekicks to most scientists. Lining benchtops, these haggard tomes contain page after page crammed with protocols, snapshots, measurements, graphs, charts and sketches. Despite the digital onslaught, this old pen-and-paper standby has remained steadfast, providing a tangible record of completed research.
"Paper notebooks have infinite battery life," said Scott Klemmer, assistant professor of computer science. "Scientists will say, 'I have never had my laboratory notebook crash on me. My notebook never loses data. I have never had to troubleshoot my notebook, spend three hours on hold with IBM or wait for a new part for my notebook to come.'"
In a nutshell, paper is reliable.
But Klemmer has found a compromise between the new and the old. With his graduate student Ron Yeh, he has developed a software system, dubbed ButterflyNet, which gently eases scientists into the digital domain without forcing them to abandon their trusty pen and paper. Aided by special versions of these time-honored tools, ButterflyNet creates identical copies of handwritten notes in digital form, in effect crafting a digital laboratory notebook. Furthermore, the software compiles data gathered from a variety of other media—digital cameras, global positioning system (GPS) devices, wireless sensor networks—and automatically inserts it into the digital notebook on the appropriate "page."
"ButterflyNet has a browser where you can view all your notes and photographs side by side, organized by the time that you took them," said Yeh, the system's lead architect.
The technology is straightforward and user-friendly. The oversized digital pen operates just like an ordinary ink pen. Beyond its writing capabilities, the pen contains in its tip a small camera that captures each stroke as it glides across specialized "smart paper." These sheets, lined with faint ink dots serving as navigational guides, enable the software to later recreate the written page from the series of captured images. The digital data is stored in the pen's built-in memory chip until it can be transferred to a computer.
"This means that you have a notebook in the physical domain, and you also have a copy of that notebook in the electronic domain," Klemmer said. "To the scientists, having a digital backup of their work is a slam-dunk win."
The handwritten notes, each stroke chronologically ordered by the pen's internal clock, are imported by simply plugging the pen into the computer. Supplementary photos are then matched to the notes using the images' own timestamps from the camera's clock. Alternatively, simple written symbols, such as square brackets drawn on the notebook page, prompt ButterflyNet to insert a photo at the designated location by automatically selecting the photo snapped around the time the brackets were drawn.
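The matching step described above—pairing a bracket gesture with the photo snapped around the same time—amounts to a nearest-timestamp lookup. The following is a minimal sketch of that idea, not the actual ButterflyNet code; the function name, the two-minute matching window and the sample times are all hypothetical:

```python
from datetime import datetime, timedelta

def nearest_photo(gesture_time, photo_times, window=timedelta(minutes=2)):
    """Return the photo timestamp closest to a bracket gesture,
    provided it falls within the matching window; None otherwise."""
    if not photo_times:
        return None
    # Pick the photo whose clock time is closest to the gesture's time.
    best = min(photo_times, key=lambda t: abs(t - gesture_time))
    return best if abs(best - gesture_time) <= window else None

# Example: brackets drawn at 10:00:30; photos taken at 9:58 and 10:01.
gesture = datetime(2005, 12, 5, 10, 0, 30)
photos = [datetime(2005, 12, 5, 9, 58), datetime(2005, 12, 5, 10, 1)]
print(nearest_photo(gesture, photos))  # → 2005-12-05 10:01:00
```

Because both the pen and the camera keep their own clocks, a real system would also have to reconcile clock drift between the two devices before matching.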
Once uploaded into the ButterflyNet browser, the data can be organized, searched, exported to spreadsheets or shared among colleagues. The laborious and error-prone task of transcribing reams of notes or data into the computer by hand is eliminated.
The first reviews are in. Field-tested by a team of California biologists, ButterflyNet received high marks for increasing the amount of data collected and enabling the sharing of data with others.
Yet the early prototype is not without glitches, including the occasional insertion of the wrong photo or failure of the pen's camera to detect strokes, leaving holes in the notes. Despite these drawbacks, ButterflyNet's initial success has encouraged Klemmer and Yeh to devise improvements to both enhance its current usage with biologists and archeologists and expand its utility beyond field science. They envision the integration of additional media streams, such as audio and video recordings, and handwriting-recognition software to enhance the searching capabilities.
"While observation of field biologists inspired these techniques, the insight of combining physical and digital representations has broad applicability," Klemmer said. "We have also had significant interest from other communities, such as designers, anthropologists and the medical community."
He added, "This is the first time that there has been a system in this area that is actually engineered and implemented well enough that real people can use it without magic from behind the curtain."
Andreas Paepcke, senior research engineer in the Computer Science Department; Brian Lee, computer science graduate student; Boyko Kakaradov, computer science undergraduate; Jeannie Stamberger, biological sciences graduate student; and Chunyuan Liao and Francois Guimbretiere, graduate student and assistant professor in the Computer Science Department at the University of Maryland-College Park, respectively, also contributed to this research.
Anne Strehlow is a science-writing intern at the Stanford News Service.
Scott Klemmer, Computer Science: (650) 723-3692, email@example.com
Photos of collection of field data are available on the web at http://newsphotos.stanford.edu.
Email firstname.lastname@example.org or phone (650) 723-2558.