Personal robots that can bake cookies, shoot pool and — in the hands of EECS professor Pieter Abbeel — fold laundry are evidence of a new generation in artificial intelligence, jump-started by a Silicon Valley tech company’s PR2 robots.
In spring 2012, the Floating Sensor Network project, led by associate professor of EECS Alexandre Bayen, launched a flotilla of 100 robots down the Sacramento River to provide data on water movement and pollutant spread.
This week the state of Nevada finalized new rules that will make it possible for self-driving cars to receive their own special driving permits. Do people notice a self-driving car and gawk? “We get a lot of thumbs up,” says Berkeley Engineering alum Anthony Levandowski (M.S.’03 IEOR), one of the leaders of Google’s self-driving car project. Google’s fleet of robotic cars has driven more than 200,000 miles over highways and city streets in California and Nevada.
From flying and crawling through quake-ravaged wreckage to performing dexterous feats of minimally invasive surgery and enabling paraplegics to walk, the vision of what robots and intelligent machines can do has come a long way since I first began the robotics effort at Berkeley in 1983.
Recently, the Department of Homeland Security issued a proposal for a disposable robot that could be used in search and rescue missions. This week, a lab at UC Berkeley unveiled a contender: a mechanical cockroach with wings. “What’s really interesting here,” says Ron Fearing, professor of electrical engineering and computer sciences, “is that we don’t have things that fly really well, that fly like birds. And we don’t have things that run really well, like a cockroach or a rat can. But combining the two, we can actually do more than with either of them by itself.”
Designing a robot to mimic the basic capabilities of motion and perception would be revolutionary, researchers say. Yet the challenges remain immense, far greater than artificial intelligence hurdles like speaking and hearing. The limits of today’s most sophisticated robots can be seen in a robotic towel-folding demonstration pioneered by a group of students at the University of California, Berkeley, last year. “Our end goal right now is to do an entire laundry cycle,” said Pieter Abbeel, a Berkeley computer scientist who leads the group.
Microsoft’s Kinect, a motion-tracking peripheral for the Xbox console that is packed with an irresistible blend of cameras and sensors, is finding popularity among researchers such as UC Berkeley engineering graduate student Patrick Bouffard. Working out of Professor Claire Tomlin’s lab, Bouffard built a Kinect-enhanced robotic helicopter that perceives objects in its path. A video of the device has been a viral hit on YouTube.
EECS grad student Patrick Bouffard, working with Professor Claire Tomlin in the Hybrid Systems Lab, has used Microsoft’s Kinect controller to create a quadcopter that can maneuver around obstacles autonomously. The developers attached the Kinect hardware to the vehicle; its sensor delivers a point cloud to the on-board computer, allowing the craft to map its surroundings and move about intelligently. A video documenting the project, posted on YouTube, is on track to go viral.
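The point cloud the Kinect delivers is, at bottom, a depth image reprojected through the camera's optics. A minimal sketch of that conversion is below — this is illustrative only, not the Berkeley team's code, and the intrinsic parameters (`FX`, `FY`, `CX`, `CY`) are assumed typical values for a Kinect-class 640×480 depth camera, not measured ones.

```python
import numpy as np

# Assumed pinhole-camera intrinsics for a 640x480 depth sensor
# (illustrative values, not calibrated Kinect parameters).
FX, FY = 580.0, 580.0   # focal lengths in pixels
CX, CY = 320.0, 240.0   # principal point

def depth_to_point_cloud(depth_m):
    """Convert an (H, W) depth image in meters to an (N, 3) point cloud.

    Pixels with depth 0 (no reading) are dropped. Points are [x, y, z]
    in the camera frame, via the pinhole model:
        x = (u - cx) * z / fx,   y = (v - cy) * z / fy
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # keep only valid depth readings

# Toy example: a 640x480 depth image with one valid pixel 2 m away,
# directly ahead of the camera center.
depth = np.zeros((480, 640))
depth[240, 320] = 2.0
cloud = depth_to_point_cloud(depth)
print(cloud.shape)  # (1, 3): the single valid pixel
print(cloud[0])     # [0. 0. 2.]: on the optical axis, 2 m out
```

An on-board planner would consume such a cloud frame by frame, checking candidate flight paths against nearby points — the heavy lifting in practice is done by established robotics middleware rather than hand-rolled code like this.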
A robotic exoskeleton called eLEGS enables people who have been paralyzed below the waist to walk again. The technology, the latest in a line of “human augmentation robotics systems” that Berkeley Bionics has created with the Robotics and Human Engineering Laboratory at the University of California, Berkeley, is geared toward consumers — the 6 million or so paraplegics in the U.S. who rely on wheelchairs.
A team of UC Berkeley researchers interested in domestic applications for robotics has shown that Willow Garage’s PR2 robot can be a handy household companion, notably at folding laundry. Now they’ve shown that if you give PR2 a sock, it can apply its knack for repetitive hand motions to that other regularly recurring chore: pairing socks.
Who wouldn’t want a robot that could make your bed or do the laundry? A team of Berkeley researchers has brought us one important step closer by, for the first time, enabling an autonomous robot to reliably fold piles of previously unseen towels. Robots that can do things like assembling cars have been around for decades. The towel-folding robot, however, is doing something very new, according to the researchers, doctoral student Jeremy Maitin-Shepard and assistant professor Pieter Abbeel, both of UC Berkeley’s Department of Electrical Engineering and Computer Sciences.
It’s no surprise that a Google search for Peter Norvig turns up tens of thousands of hits. Norvig (Ph.D. ’86 EECS) literally wrote the book on artificial intelligence, coauthoring a bestselling textbook on the subject with Professor Stuart Russell in 1995. As the senior computer scientist at NASA Ames Research Center, he led the team that developed the remote artificial intelligence software that flew aboard the Deep Space 1 spacecraft in 1999. And today, as Google’s director of research, Norvig is transforming the way information is organized and accessed on the Web.