As robots come out from behind their cages to work with humans on production floors, in warehouses and — eventually — inside our homes, we need to be certain we can prevent broken bones caused by machines that don’t know their own strength. Intelligent software and sensors already make it possible for people to work side by side with robots like Baxter, but to enable even closer interaction, a number of researchers are working on electronic skins (or e-skins) that give robots a more human-like sense of touch.
Over the weekend, a UC Berkeley research team shared its latest innovation in that area: a flexible sensor network topped with paper-thin OLED arrays that light up in response to pressure, creating an e-skin that is itself interactive. The research was published in the journal Nature Materials on Sunday and was partially funded by DARPA and the Department of Energy.
“Integrating sensors into a network is not new, but converting the data obtained into something interactive is the breakthrough,” said Chuan Wang, the study’s co-lead author, in a university release announcing the publication. “And unlike the stiff touchscreens on iPhones, computer monitors and ATMs, the e-skin is flexible and can be easily laminated on any surface.”
Applying greater pressure to the e-skin produces more intense light, providing immediate human-readable feedback. Because of that, in addition to giving robots a better sense of touch, researchers envision the technology being used in wallpapers that double as touchscreen displays, dashboard laminates that can be controlled with a wave of a driver’s hand, and e-skin bandages that serve as health monitors.
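The pressure-to-brightness behavior described above can be sketched as a simple proportional model. This is a hypothetical illustration only — the function name, pressure range and brightness scale are assumptions for the sketch, not details of the Berkeley team’s actual device:

```python
def brightness_from_pressure(pressure_kpa, max_pressure_kpa=50.0, max_brightness=255):
    """Map an applied pressure reading to an LED brightness level.

    Hypothetical linear model: brightness rises with pressure and
    saturates once the sensor's assumed maximum range is reached.
    """
    # Clamp the reading to [0, max_pressure_kpa] before scaling.
    fraction = min(max(pressure_kpa, 0.0) / max_pressure_kpa, 1.0)
    return round(fraction * max_brightness)

# A light press yields a dim glow; a firm press approaches full brightness.
print(brightness_from_pressure(5.0))    # dim
print(brightness_from_pressure(50.0))   # saturated at 255
```

In a real e-skin the response curve would be set by the sensor material and drive circuitry rather than a clamp-and-scale function, but the monotonic pressure-to-light relationship is the same idea the researchers describe.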
The next step, researchers say, is getting the sensors to respond to temperature and light as well as pressure.
[ photo by Ali Javey and Chuan Wang, courtesy of UC Berkeley ]