Computational CellScope LED dome: The Waller Lab develops algorithms and devices (including the LED dome pictured above) to enhance imagery for improved diagnostics. (Photos courtesy Laura Waller)

Enhanced microscopic resolution for improved diagnostics

June 17, 2015 by Thomas Walden Levy

Simple, low-cost computational techniques developed by Berkeley Engineering researcher Laura Waller are giving standard optical microscopes — and even smartphones — powerful new ways to see the minuscule.

With LED lights in a custom-built device, Waller can dramatically boost resolution by capturing several microphotographs of a cell sample while changing only the angle of the lighting. She then processes the set of photographs with algorithms that can greatly enhance resolution, build a 3-D image and make transparent objects visible.

This innovative process has several significant advantages. Waller’s methods allow phase imaging — used to provide contrast and make transparent structures visible — to be done digitally, avoiding expensive specialized lighting and cell-damaging dyes. Her techniques work like a kind of Photoshop, digitally blending a set of microphotographs into a single image with higher resolution and a far larger field of view than any individual photograph alone. These tools also let scientists digitally combine microphotographs of a relatively thick sample into a 3-D image and refocus out-of-focus originals.
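To give a flavor of the refocusing idea described above: under tilted illumination, a feature sitting a distance z out of focus appears laterally shifted by roughly z·tan(θ), so shifting each angled image back by that amount and averaging brings the chosen depth into focus. The sketch below is a toy illustration of this shift-and-add principle, assuming an idealized pixel-shift model; the function names and parameters are hypothetical, not the Waller Lab's actual code.

```python
import numpy as np

def refocus(images, angles, z, pixel_size=1.0):
    """Toy shift-and-add digital refocusing from multi-angle images.

    images: list of 2-D arrays, each captured under a different LED angle
    angles: list of (theta_x, theta_y) illumination angles in radians
    z: defocus distance to bring into focus (same units as pixel_size)
    """
    aligned = []
    for img, (tx, ty) in zip(images, angles):
        # A feature at depth z appears shifted by ~z*tan(theta) pixels,
        # so shift each image back by that amount to align depth z.
        dx = int(round(z * np.tan(tx) / pixel_size))
        dy = int(round(z * np.tan(ty) / pixel_size))
        aligned.append(np.roll(np.roll(img, -dy, axis=0), -dx, axis=1))
    # Averaging the aligned stack reinforces features at depth z and
    # blurs out features at other depths.
    return np.mean(aligned, axis=0)
```

In a real system the shifts are fractional and the reconstruction is done with far more sophisticated algorithms, but the sketch shows why simply changing the LED angle between exposures encodes depth information that software can exploit.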

EECS assistant professor Laura Waller. (Photo by Peg Skorpinski)

“We’re developing computational ways to do this without expensive hardware,” says Waller. “The concepts and algorithms have been around for a while; what’s new is using the technique of patterning illumination from different angles.”

Waller’s dual interests in building hands-on devices and exploring abstract theory go back to her early days growing up with a chemistry teacher mother and computer programmer father. “They were both nerds,” chuckles Waller. “I spent a lot of time doing electronics projects or fixing cars with my dad.”

She went on to study electrical engineering and computer science at MIT, earning her bachelor’s, master’s and doctoral degrees. While there, she interned in an optics lab, quickly dove into computational microscopy and never looked back.

“I really enjoy interdisciplinary research,” she says. “Bringing physical optics together with signal processing algorithms is a powerful combination that can do things which couldn’t even be imagined by super experts in either field alone.”

Waller came to Berkeley in 2012, joining the EECS faculty as an assistant professor. Since then, she and the team from her lab have focused on these types of interdisciplinary efforts. She says half her team’s innovations come from hands-on work with physical optics and design, and the rest from understanding the theory well enough to coax higher performance out of the signal-processing algorithms they work with.

“We have to figure out where to change things to make a better system,” says Waller. “Do you change the data processing or your physical system? Which is more practical? How do they interact?”

One recent project that conveys the power of Waller’s computational microscopy is a clip-on for the CellScope microscope developed by bioengineering department chair Dan Fletcher, which uses a Nexus 5 cell phone camera outfitted with a microscope lens. The new Computational CellScope uses a programmable LED dome for illumination, along with Android code written by a team of undergraduate students led by graduate student Zack Phillips, to enable 3-D and phase imaging for better disease diagnosis.

“I think it’s cool,” says Waller. “It makes the point that these computational microscopy ideas are so simple they can even be done on a cellphone.”

The challenge has been that typical low-power smartphone processors have limited memory and computational capability. For clinicians in remote areas using a CellScope to diagnose tuberculosis and malaria, this can be problematic. But with Waller’s techniques, out-of-focus images of patient blood can be refocused, giving clinicians an expanded view to search for diagnostic clues.

For Waller, who earlier this year was named a Bakar Fellow — a program run by the university’s Vice Chancellor for Research to translate research into commercialized products — this application is key.

“We want this to be a practical device that can actually be taken into the field, not just a research journal paper,” she says. “That’s a big part of our motivation; this has the potential to have an impact on real-world problems, like disease diagnosis.”