Case studies in forward thinking
Working at the interface of technology and society, Berkeley Engineering faculty are shaping the future with their ambitious research, in high-impact areas from health to work to infrastructure. Here are nine examples of the groundbreaking work taking place across the college:
Goal-oriented artificial intelligence
Intelligence, artificial or otherwise, requires learning. People have all kinds of learning styles, but how do robots learn best? That’s a question taken up by electrical engineering and computer sciences professor Pieter Abbeel as he trains the Berkeley Robot for the Elimination of Tedious Tasks, better known as BRETT.
There are two main educational philosophies for developing artificial intelligence. Deep supervised learning trains a neural network to understand input-output combinations from examples. Apple’s Siri uses deep supervised learning, as does Amazon’s Alexa. Deep reinforcement learning, on the other hand, develops artificial intelligence for goal-oriented tasks by repeatedly trying to achieve the desired goal, learning from both past successes and failures.
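The trial-and-error loop at the heart of reinforcement learning can be sketched in a few lines. This toy example uses tabular Q-learning on a one-dimensional "reach the goal" task; deep reinforcement learning replaces the table with a neural network, but the learn-from-success-and-failure loop is the same:

```python
import random

# Minimal sketch of reinforcement learning's trial-and-error loop, using
# tabular Q-learning on a toy 1-D "reach the goal" task. (Deep RL replaces
# this lookup table with a neural network; everything here is illustrative.)

N_STATES, GOAL = 5, 4          # positions 0..4; the goal is at the right end
ACTIONS = [-1, +1]             # step left or step right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1   # learning rate, discount, exploration

def train(episodes=500, seed=0):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(N_STATES)]   # q[state][action_index]
    for _ in range(episodes):
        s = 0
        while s != GOAL:
            # Mostly act greedily, but explore occasionally
            a = rng.randrange(2) if rng.random() < EPS else max((0, 1), key=lambda i: q[s][i])
            s2 = min(max(s + ACTIONS[a], 0), N_STATES - 1)
            r = 1.0 if s2 == GOAL else 0.0       # reward only for reaching the goal
            # Update the estimate from this success or failure
            q[s][a] += ALPHA * (r + GAMMA * max(q[s2]) - q[s][a])
            s = s2
    return q

q = train()
policy = [max((0, 1), key=lambda i: q[s][i]) for s in range(N_STATES)]
print(policy)  # the learned greedy policy steps right (action index 1) toward the goal
```

No example or demonstration is hand-coded; the correct behavior emerges purely from repeated attempts and the reward signal.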
What makes Abbeel’s BRETT research so novel is that his group was the first to show deep reinforcement learning succeeding on real robots (much other work in deep reinforcement learning focuses on teaching artificial intelligence to play video games). The ultimate goal is to create helper robots that not only learn how to perform assigned tasks but are also capable of coping with changing situations and unscripted moments. Future applications might be in homes and offices as well as in manufacturing and logistics.
Integrating drones into the airspace
Mail carriers were responsible for developing the early aviation systems in the United States, delivering airmail and opening new possibilities for air travel. After World War II, Congress began setting rules for the skies, resulting in the National Airspace System, which enabled the growth of both commercial and military uses of domestic airspace.
If Internet commerce trends continue, then parcel delivery may once again disrupt aviation systems. As companies such as Amazon and Google are experimenting with how to build the most effective package delivery drone, electrical engineering and computer sciences professor Claire Tomlin is using her expertise in control theory and machine learning to figure out the most efficient way to integrate unmanned aircraft into the National Airspace System.
Designing tomorrow’s airspace is not without its challenges, safety being chief among them. But Tomlin has an idea. She is investigating the possibility of using the space above sparsely populated regions, such as waterways and railways, as drone expressways, where tightly grouped unmanned aircraft could zing along at altitudes between 200 and 400 feet — high enough to not interfere with ground-based activities, but low enough to not be a nuisance for manned air travel.
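The expressway idea reduces, at its simplest, to a rule a flight planner could check. The sketch below is purely illustrative (the ground-class labels and waypoint format are hypothetical, not Tomlin's actual formulation): a plan is legal only if every waypoint sits over a sparsely populated corridor within the 200–400 foot band described above.

```python
# Illustrative sketch, not Tomlin's actual method: a flight plan is a list of
# (ground_class, altitude_ft) waypoints, and the "expressway" rule from the
# text says drones stay over sparsely populated corridors at 200-400 feet.

CORRIDOR_CLASSES = {"waterway", "railway"}   # hypothetical ground-class labels
MIN_ALT_FT, MAX_ALT_FT = 200, 400

def plan_is_legal(waypoints):
    """Return True if every waypoint is over a corridor at a legal altitude."""
    return all(
        ground in CORRIDOR_CLASSES and MIN_ALT_FT <= alt <= MAX_ALT_FT
        for ground, alt in waypoints
    )

print(plan_is_legal([("waterway", 250), ("railway", 390)]))  # True
print(plan_is_legal([("waterway", 250), ("suburb", 300)]))   # False: off-corridor
```

The real research problem, of course, is the control and coordination layer underneath such a rule: keeping tightly grouped vehicles safely separated while they satisfy it.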
Mobile diagnostic medicine
In developing countries and other low-resource areas, lack of access to medical facilities with diagnostic equipment means that otherwise treatable diseases run rampant.
Bioengineering professor Dan Fletcher is working to address that problem by building devices that enable decentralized diagnostic medicine.
“More people have access to mobile phones than clean water,” Fletcher says. “Could we harness mobile phone technology for disease diagnostic purposes?” Over the last eight years, Fletcher’s lab has built many iterations of the CellScope, converting ordinary smartphones into a powerful diagnostic tool.
In addition to detecting tuberculosis and malaria, CellScopes enable safe treatment of river blindness in Africa and testing for soil-transmitted parasitic worms. Besides sputum, blood and stool tests, the technology can also be used for primary care. A commercialized version of the CellScope helps detect ear infections in children, and a new device under development enables visualization of the retina for diabetic retinopathy screening.
Transit data trends
In recent years, the emergence of massive data sets from devices like mobile phones and connected vehicles has created unprecedented opportunities for urban mobility modeling. Going beyond traditional transportation models, which were built on census and demographic information, the new deluge of fine-grained data allows for more precise demand modeling — benefitting both public agencies and the driving public.
New mobility-as-a-service companies have millions of users in California, and hundreds of millions worldwide, all using similar apps; a better understanding of large-scale mobility patterns, particularly in mega-cities and dense urban environments, has the potential to unleash tomorrow’s smart city operations.
For Alexandre Bayen — professor of electrical engineering and computer sciences and civil and environmental engineering, as well as director of the Institute for Transportation Studies (ITS) — the future of large-scale mobility rests on open data platforms that allow for transit systems to be modeled holistically across scales of time and geography. ITS was established in 1947 by the state legislature to address a lack of investment in public infrastructure during the WWII years and has been responsible for many technological innovations since: algorithmically-controlled metering lights on the Bay Bridge, platoons of connected-automated vehicles on California roads and the launch of one of the first traffic-monitoring smart phone apps.
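One elementary step in this kind of data-driven demand modeling is aggregating anonymized trip records into an origin-destination matrix. The sketch below uses a made-up record schema (zone names and an hour-of-day field) purely to illustrate the idea:

```python
from collections import Counter

# Hedged sketch of a basic demand-modeling step: aggregate anonymized trip
# records (hypothetical schema, made-up sample data) into origin-destination
# counts for the morning peak. Real pipelines work at vastly larger scale.

trips = [  # (origin_zone, destination_zone, hour_of_day)
    ("Berkeley", "SF", 8), ("Berkeley", "SF", 9),
    ("Oakland", "SF", 8), ("SF", "Berkeley", 18),
]

def od_matrix(records, peak_hours=range(6, 10)):
    """Count trips per (origin, destination) pair during the given hours."""
    return Counter((o, d) for o, d, h in records if h in peak_hours)

demand = od_matrix(trips)
print(demand[("Berkeley", "SF")])  # 2 morning-peak trips
```

Counts like these, refreshed continuously from live data rather than from decennial censuses, are what let agencies see demand shifts as they happen.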
A blood test for cancer
Mechanical engineering professor Lydia Sohn is developing a simple blood test to reduce, or even eliminate, the need for invasive biopsy. Sohn’s test targets rogue cancer cells that carry the markers of a metastasizing tumor. She calls the search for these cells a needle-in-a-haystack problem: a blood sample might contain billions of cells, but only a handful are cancerous.
To isolate cancer cells, Sohn is building a microfluidic device that sits on a simple printed circuit board. Inertial and acoustic forces sort blood cells by size — cancer cells are larger than red blood cells and the majority of white blood cells — and don’t damage or alter the properties of the sample in a way that simple filtering or applying protein signatures might. Once the cells are sorted, they can be tested for biochemical and mechanical markers, revealing information about a cancer’s status.
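The selection criterion behind that sorting step can be illustrated as a simple size threshold. This is a deliberately simplified sketch: the real device uses inertial and acoustic forces rather than an explicit cutoff, and the diameters below are rough, illustrative values, not measurements from Sohn's lab.

```python
# Simplified sketch of size-based cell separation. Real microfluidic devices
# sort with inertial and acoustic forces; the underlying selection criterion
# can be pictured as a diameter threshold. All numbers are illustrative.

TYPICAL_DIAMETERS_UM = {          # approximate cell diameters, micrometers
    "red_blood_cell": 7.5,
    "white_blood_cell": 12.0,
    "tumor_cell": 20.0,           # circulating tumor cells tend to be larger
}

def flag_large_cells(cells, cutoff_um=15.0):
    """Route cells above the size cutoff to the 'candidate' outlet."""
    return [name for name, diameter in cells if diameter > cutoff_um]

sample = [("red_blood_cell", 7.2), ("white_blood_cell", 11.5), ("tumor_cell", 21.0)]
print(flag_large_cells(sample))  # ['tumor_cell']
```

Because the physical sorting is gentle, the flagged cells survive intact for the downstream biochemical and mechanical tests the text describes.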
Teaching robots to grasp
Despite years of research, robots remain remarkably clumsy. But it may be possible to train robust robotic grasping with deep learning, using a synthetic dataset of 6.7 million point clouds generated from thousands of 3-D models, according to recent work published by industrial engineering and operations research professor Ken Goldberg and Ph.D. student Jeff Mahler of the Berkeley Laboratory for Automation Science and Engineering (AUTOLAB). Traditional robots are programmed by painstaking human coding; now, with the right data, the machine-learning process can be shrunk from months to days. “Dex-Net 2.0 achieves 99 percent precision,” Goldberg says. “That includes many difficult-to-grasp objects it was not trained on, such as an old sneaker and a piece of fabric.”
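The learn-from-synthetic-data recipe can be shown in miniature. The toy sketch below is not the Dex-Net pipeline: it generates synthetic grasps described by a single hypothetical feature (a misalignment angle), labels them with a simple rule, and fits a one-parameter classifier, whereas Dex-Net trains a deep network on millions of rendered point clouds.

```python
import random

# Toy sketch, not the actual Dex-Net pipeline: generate synthetic grasps,
# label them with a simple alignment rule, and learn a decision threshold
# from the data. The "misalignment angle" feature is a made-up stand-in.

rng = random.Random(42)

def synth_grasp():
    """A grasp summarized by one feature: misalignment angle (degrees)."""
    angle = rng.uniform(0, 90)
    robust = angle < 30          # synthetic label: well-aligned grasps succeed
    return angle, robust

data = [synth_grasp() for _ in range(2000)]

def fit_threshold(samples):
    """Pick the angle cutoff that best separates robust from failed grasps."""
    best_t, best_acc = 0.0, 0.0
    for t in range(0, 91):
        acc = sum((a < t) == r for a, r in samples) / len(samples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

t, acc = fit_threshold(data)
print(t, round(acc, 3))  # recovers a cutoff near 30 degrees from data alone
```

The point of the miniature is the workflow, not the model: no grasp was ever labeled by hand, yet the classifier recovers the underlying rule, which is why synthetic data can collapse training time so dramatically.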
Better robot dexterity has immediate applications for factories and warehouses. Looking ahead, teachable robots might also be welcomed into hospitals and homes.
Light field imaging
As a Ph.D. student, Ren Ng worked on light field camera technology. A light field camera uses a new kind of sensor to capture fundamentally more information than a traditional camera — the four-dimensional field of light flowing along every ray into the camera, as opposed to a conventional two-dimensional image. Since a light field photograph contains more detailed image data, the technology allows a user to manipulate depth of field and focus in post-production, among other things.
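The post-production refocusing works by "shift and add": each sub-aperture view is shifted in proportion to its position in the aperture, then the views are averaged; objects at the chosen depth line up and become sharp while everything else blurs. A toy light field with one aperture axis and one spatial axis (instead of the full four dimensions) is enough to show the effect:

```python
# Sketch of synthetic refocusing: average the sub-aperture views of a light
# field, each shifted in proportion to its aperture position. A 2-D toy
# light field (one aperture axis u, one spatial axis x) keeps it short.

def refocus(light_field, alpha):
    """Shift-and-add refocus; alpha selects the focal depth."""
    n_u = len(light_field)
    width = len(light_field[0])
    out = [0.0] * width
    for u, view in enumerate(light_field):
        shift = round(alpha * (u - n_u // 2))   # per-view shift for this depth
        for x in range(width):
            src = x + shift
            if 0 <= src < width:
                out[x] += view[src] / n_u
    return out

# A point at the depth where its image slides one pixel per aperture step:
lf = [[0, 0, 1, 0, 0, 0, 0],     # u=0: point appears shifted left
      [0, 0, 0, 1, 0, 0, 0],     # u=1: centered
      [0, 0, 0, 0, 1, 0, 0]]     # u=2: shifted right
print(refocus(lf, alpha=1.0))    # views align: a sharp spike at x=3
print(refocus(lf, alpha=0.0))    # views misalign: energy spreads out (blur)
```

Choosing alpha after the fact is exactly the freedom a conventional camera gives up by integrating over the aperture at exposure time.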
After graduating from Stanford in 2006, Ng launched Lytro, a company that commercialized his doctoral research and brought consumer light field cameras to market. Today, the company is known for building camera systems for high-quality virtual reality and cinema. In 2015, Ng joined the electrical engineering and computer sciences faculty and is applying his expertise to computer vision, virtual reality and neuroimaging.
As part of research funded by the National Science Foundation, Ng is rethinking the way that hardware and software can be combined to fully capture light and produce images containing more information. Target applications for such computational imaging systems include everything from miniaturized lenses to cameras for virtual reality capture to imaging the brain.
Flexible electronics for medical imaging
Electrical engineering and computer sciences professor Ana Claudia Arias sees a disconnect between the way our electronics are currently built and the way we use them. Her proposal: design and build devices that can conform to our bodies. Specifically, she is looking for ways to change how medical diagnostics are performed.
“At Berkeley, we are bringing this vision of flexible, lightweight electronics to magnetic resonance imaging (MRI),” Arias said during a World Economic Forum talk. Current MRI technologies are good at producing images related to soft tissue health and function, but because of the way the machines are set up — the coils that produce the images are not immediately next to the body — the process can take some time.
Currently, Arias is developing flexible electronics using a method similar to the silkscreening process used to print T-shirts. Instead of ink, electronic materials are deposited on flexible membranes, which can then be manipulated into a variety of shapes.
Proteins as biomarkers
If human bodies are made up of molecular machines cycling through biological processes, then proteins are the fuel for life. Because of the role that proteins play in the way a body functions — they carry out genetic instructions, ignite metabolic reactions and transport critical material — they are useful biomarkers. Almost like time-stamped data packets, proteins contain crucial information, including the state of an illness, even one that is not yet showing outwardly. The problem is that these windows into overall health are like stars in the night sky: there are millions of proteins in a human body, and they are constantly changing.
“How do you measure proteins in a packet of life that is so small and so inaccessible?” bioengineering professor Amy Herr is fond of asking when presenting her work to build microfluidic tools capable of measuring proteins in each individual cell — among thousands of cells — all at once. The answer, it turns out, is found inside a smartphone.
This time, it’s not an app, but the way that cell phones and other mobile devices are manufactured that is useful. As part of her research, Herr is trying to figure out how to build new diagnostic equipment using the same advanced manufacturing processes used to build the electronics inside mobile phones.