Drone guidance assisted by augmented reality

Seeing is believing

Leveling up with augmented reality

When I was a kid, I would set off traversing orienteering courses through the hardwood forests of my native New England. The goal was to find a flag somewhere far off a trail using a topographic map and compass. It’s a very interactive process: I was constantly positioning the map, readjusting headings and trying to find discernible landmarks.

Sometimes I would end up lost; other times, with my bearing right on, I felt like a successful explorer, emerging from the untrammeled wilderness. It was a straightforward lesson about how using technology, albeit simple, changed the way I considered what was right in front of me.

I haven’t thought about orienteering in a long time. These days, my wayfinding needs are typically satisfied by opening Google Maps on my phone. But recently, as I sat down with electrical engineering and computer sciences (EECS) major Daniel Pok and computer science major Isabel Zhang, the co-founders of a relatively new student organization called VR@Berkeley, my mind wandered back to my experiences with maps and compasses.

We talked about how virtual reality (VR) and augmented reality (AR) — once relegated to movie screens and sci-fi novels — are poised to revolutionize computing. Forty years ago, we saw AR’s potential in an early scene of the original “Star Wars,” when Luke Skywalker encounters a hologram of a distressed Princess Leia.

Today, if the recent Pokémon Go craze (where players blend the fictional world of Pokémon with real-world environments) is any indication, AR has arrived.

Through a combination of hardware and software, AR and VR convert computing from a flat, two-dimensional screen to an immersive, interactive, three-dimensional experience. AR users wear a headset, but retain some visibility. Software continuously maps the user’s surroundings, localizing the headset in real time, and overlays digital images and interfaces onto the appropriate places in the real physical environment.
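The overlay step amounts to a standard pinhole-camera projection: once the headset’s localization provides its pose, each virtual object’s world position is projected into screen coordinates every frame so it appears anchored in place. Below is a minimal NumPy sketch of that projection; the matrices and values are illustrative, not the HoloLens API.

```python
import numpy as np

def project_to_screen(world_point, R, t, K):
    """Project a 3-D world point into 2-D pixel coordinates.

    R, t: camera rotation (3x3) and translation (3,) from the
    headset's pose tracking; K: 3x3 camera intrinsics.
    """
    cam = R @ world_point + t      # world frame -> camera frame
    if cam[2] <= 0:
        return None                # behind the camera; don't render
    uv = K @ cam
    return uv[:2] / uv[2]          # perspective divide -> pixels

# A virtual label anchored 2 m in front of the camera, slightly right.
K = np.array([[800.0, 0, 640], [0, 800.0, 360], [0, 0, 1]])
R, t = np.eye(3), np.zeros(3)      # headset at the world origin
pixel = project_to_screen(np.array([0.5, 0.0, 2.0]), R, t, K)
```

As the user moves, R and t change every frame while the anchored world point does not, which is what keeps the overlay pinned to the physical scene.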

Isabel Zhang, VR@Berkeley co-founder, with Gear VR goggles. (Photos by Noah Berger)

VR is a different experience: users wear an eye-covering headset, which serves as a wearable screen with embedded motion sensors — a fully synthetic digital experience that is completely divorced from actual physical surroundings.

Pokémon Go isn’t quite AR because it uses a phone screen instead of a headset, so it’s not completely immersive. The game does, however, give a glimpse of where the technology is heading.

So far, commercially available VR applications include gaming and entertainment. While a few limited AR products are available now, Silicon Valley-based consulting firm Digi-Capital predicts that the field will explode in market value, reaching $150 billion by 2020. Potential applications range from telemedicine to more intuitive control for robots on factory floors.

For Zhang, the technology is already life-changing. She was carrying heavy biology-related course loads with aspirations of pursuing a career in medicine when Pok introduced her to VR. “I got interested in VR and then took an intro computer science course. After that, I decided to switch to computer science,” Zhang says. “So it’s definitely changed my life.” Now she’s creating immersive animated short films. “Watching a film in VR can create so much more emotion and evoke a lot more out of a wide variety of people. Being able to contribute to that is exciting.”  

The VR club started with a handful of members in early 2015, and has since grown to 200 members from across campus working on a range of projects, including an augmented 3-D virus model that pops off the page of a biology textbook and the use of virtual reality to play the Campanile’s carillon.

“The idea is that VR is the ultimate medium,” Pok says. “When you get there, you can shape the world however you want.”

Pok, Zhang and fellow members of Berkeley’s VR club are advised by Allen Yang, executive director of the Center for Augmented Cognition (CAC), headquartered in Cory Hall. Yang is working at a new frontier, one where the topography of the physical world is being reconfigured by digital tools.

Yang came to Berkeley as an EECS postdoctoral researcher in 2006. After a stint in industry as employee number one at Atheer Labs, the maker of headsets that enable interactive computing, he returned to campus to lead CAC, along with faculty director Shankar Sastry, also dean of the college. The center opened this spring, after the college identified the need to integrate emerging research on virtual and augmented reality, including human-computer and human-robot interactions.

“We realized we need two things,” Yang says about the center’s founding.  “We need resources for researchers and students to be able to conduct projects or research in this area, and we need to have a community that can circle around this topic.”

Unlike many other university-based research areas, where commercial products follow theoretical research, in the case of augmented reality, Yang says, companies are moving faster than academia. “Industry has provided us a lot of significant problems to solve,” he says.

Future history

While the technology might be crackling with a sense of newness, Berkeley researchers have been laying the foundation for augmented reality’s theoretical and computational framework for more than a decade.

In 2005, Berkeley’s Teleimmersion Lab, led by EECS professor Ruzena Bajcsy, began collaborating with researchers from the University of Illinois Urbana-Champaign to build virtual teleportation systems, first through virtual choreography and dance. “We were looking to immerse people in virtual environments and see how they could collaborate remotely,” says Gregorij Kurillo, a research engineer in the lab. “A lot of our work was using computer vision to image and capture people in real time, send the data over the net and render the data in a virtual environment to allow some kind of interaction.”

In 2007, Berkeley engineers collaborated with Stanford researchers, creating virtual Tai Chi classes. “We showed that when you immerse people in a virtual environment with an instructor, they learn movements faster and are able to more accurately replicate what the teacher is showing,” Kurillo says. By 2009, the lab moved more toward interacting with data via a virtual geology platform with UC Davis and on virtual archaeology with UC Merced.


In 2010, Microsoft released its Kinect camera, which captures full-body 3-D images and is designed for the gaming community. Berkeley researchers saw a potential for other applications and began a partnership with physicians at UC Davis Medical Center to study how to use AR and VR technologies to improve telemedicine.

Recently, I met with Yang at the Center for Augmented Cognition on the third floor of Cory Hall to try to better understand what augmented reality looks like. After a couple of rounds of questions, he suggests I try the different headsets to see for myself. He retreats to his office for a minute to retrieve a Microsoft HoloLens headset.

He returns, and it’s time to augment my reality. He turns on the HoloLens and gives me a quick tutorial, which consists of demonstrating a pinching motion to use as a mouse function. The rest is pretty intuitive; after I put the device on my head, it tracks my gaze and gestures.

A detective game starts to unfold in this new world, which is not completely digital, but transformed just enough that I forget that I’m wearing odd headgear and moving in somewhat exaggerated gestures (swiping and pointing and pinching) in the middle of a public space. Real walls now contain maps, information and game menus. Elements are added to my surroundings — a garbage can containing clues appears next to a real chair, for instance.
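Placing a virtual object next to a real chair works by casting the user’s gaze as a ray and intersecting it with a surface the headset has already mapped. A toy version of that ray-plane intersection, with illustrative numbers (this is a generic geometry sketch, not HoloLens code):

```python
import numpy as np

def gaze_hit(origin, direction, plane_point, plane_normal):
    """Intersect a gaze ray with a detected real-world plane.

    Returns the 3-D point where a virtual object could be anchored,
    or None if the gaze points away from (or parallel to) the plane.
    """
    d = np.asarray(direction, float)
    n = np.asarray(plane_normal, float)
    denom = d @ n
    if abs(denom) < 1e-9:
        return None                           # ray parallel to plane
    s = ((np.asarray(plane_point, float) - origin) @ n) / denom
    return None if s < 0 else origin + s * d

# Head at 1.6 m, looking down 45 degrees toward the floor (y = 0 plane).
hit = gaze_hit(np.array([0.0, 1.6, 0.0]),
               np.array([0.0, -1.0, 1.0]) / np.sqrt(2),
               np.array([0.0, 0.0, 0.0]),
               np.array([0.0, 1.0, 0.0]))
```

A pinch gesture detected while the gaze ray reports a hit is what the system treats as a “click” on that spot.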

I have only just put on the headset, but already I’m “advancing the story,” as Yang calls it, with relative ease. Immediately the potential for AR to become a better and more intuitive way for humans to interact with machines becomes clear.

A new map

Yang and his team obtained HoloLens gear only a few months earlier, after being selected as one of five research groups sponsored by Microsoft. The Berkeley team proposed investigating using the HoloLens as an interface to control drones.

“We submitted a proposal together with DJI, the maker of one of the most popular drones, to lower the bar for everyday people to be able to control a drone through a new kind of interface, because existing controllers are very difficult to learn,” Yang says.

Like teleimmersion, studying the control systems undergirding drone flight has a history on campus. “The Berkeley Robotics Lab with Shankar Sastry, Claire Tomlin and others has conducted pioneering work in drone safety, drone control and unmanned drone maneuvering,” Yang says, “a very solid foundation for this kind of work.”

Now, a team of graduate students at the CAC is developing more intuitive interfaces for piloting drones and other robots that rely on high levels of human direction and control. “What we want is to reimagine the modality of training or working, especially with aerial drones,” Yang says. “Instead of just a master-client relationship, we want to make it more interactive.”

Oladapo Afolabi, an EECS Ph.D. student working on the HoloLens research project, explains one potential application for more advanced drone interactions. “The idea is that a remote user would have control over a group of robots, specifically quadcopters, like in a search-and-rescue scenario,” he says. “If the quadcopters are in a remote location, separated from the team, then the operator would be able to visualize what the drones see using an AR headset, and get a better understanding of what’s going on inside an unsafe environment. They can also set waypoints or coordinates where they want more robots to visit.”
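The dispatch step in the search-and-rescue workflow Afolabi describes — operator sets waypoints, drones get sent to them — can be sketched as a simple nearest-drone assignment. The names and coordinates below are hypothetical, and real systems layer path planning and collision avoidance on top of this.

```python
import math

def assign_waypoints(drones, waypoints):
    """Greedily assign each operator-set waypoint to the nearest free drone.

    drones: {name: (x, y, z)} current positions
    waypoints: list of (x, y, z) targets set by the operator
    Returns {name: waypoint}.
    """
    free = dict(drones)
    plan = {}
    for wp in waypoints:
        if not free:
            break                 # more waypoints than drones
        name = min(free, key=lambda n: math.dist(free[n], wp))
        plan[name] = wp
        del free[name]
    return plan

plan = assign_waypoints(
    {"quad_a": (0, 0, 0), "quad_b": (10, 0, 0)},
    [(9, 1, 2), (1, 0, 3)],
)
```

The AR headset’s contribution is the input side: the operator places those waypoint coordinates by looking and gesturing in a 3-D view of the scene rather than typing them in.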

The idea of improving communication between humans and devices operating with artificial intelligence is crucial to advancing robotic technology. “All of these virtual reality and augmented reality headsets are giving you that door that lets a machine talk to you and lets you see things in your surroundings as a machine would see them,” says Vicenç Rubies Royo, an EECS Ph.D. student who is also collaborating on the HoloLens project. “What you have right now are tools like joysticks and screens to send and receive information, but with augmented reality, the real 3-D environment becomes the interface, so you get and send information that is way more expressive.”

In all, eight faculty members are currently affiliated with CAC. Beyond engineering, other researchers are examining how to use these new tools to create and consume information across multiple disciplines, including autonomous vehicle navigation, content creation and new media. In a lot of ways, Berkeley researchers are trying to build the tools, maps and compasses to make our world more immersive. After all, Yang says, “In the future, a parallel digital world will exist side-by-side with the physical world.” And we’ll all need to know how to navigate there.

Allen Yang at the Center for Augmented Cognition

Envisioning a whole new reality

Not yet a year old, the Center for Augmented Cognition (CAC) was established to advance the new frontier of augmented and virtual reality. Allen Yang, the center’s executive director (pictured above), is joined by other engineering faculty including Ruzena Bajcsy, Francesco Borrelli, Lee Fleming, Ren Ng, Claire Tomlin and CAC faculty director and college dean Shankar Sastry. Also on the team are multimedia journalist Richard Koci Hernandez, assistant professor at the Graduate School of Journalism, and Oculus co-founder Jack McCauley (B.S.’86 EECS), now Innovator-in-Residence at the Jacobs Institute for Design Innovation.

Among the center’s first projects, CAC researchers have launched ISAACS — the Immersive Semi-autonomous Aerial Command System — which will use an augmented reality headset to make a more intuitive drone interface.

