A Bug’s Life: Robots Programmed to Think Like Insects
By Kevin Ritchart
Recent advancements in nanotechnology have allowed engineers to build robots the size of small insects, but getting these minuscule mechanical marvels to behave like their real-life counterparts is proving to be much more of a challenge.
Engineers from Cornell University are working on a new type of programming that mimics the way an insect’s brain functions. The amount of computing power needed to perform such a feat would typically require each of these tiny robots to carry a desktop computer every time it took to the air, but the emergence of neuromorphic computer chips can significantly lessen a robot’s payload.
Unlike traditional chips, which process binary code, neuromorphic chips process spikes of electrical current that fire in complex combinations, much as neurons do in a human brain. Silvia Ferrari, Professor of Mechanical and Aerospace Engineering and Director of the Laboratory for Intelligent Systems and Controls at Cornell University, is leading a team of engineers in developing a new class of event-based algorithms that mimic brain activity and can be programmed onto neuromorphic chips. These chips would use less power than traditional processors, allowing engineers to pack more computation into a smaller footprint.
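To see what "processing spikes" rather than binary code means, consider a toy sketch of a leaky integrate-and-fire neuron, the basic spiking unit that neuromorphic hardware is commonly built around. This is an illustration of the general idea only, not Ferrari's algorithms or any particular chip's design, and every parameter value below is an arbitrary choice for demonstration.

```python
def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Minimal leaky integrate-and-fire neuron.

    Each timestep, the membrane potential decays (leaks) and then
    integrates the incoming current. When the potential crosses the
    threshold, the neuron emits a spike (1) and resets to zero;
    otherwise it stays silent (0).
    """
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # fire a spike
            potential = 0.0    # reset after firing
        else:
            spikes.append(0)   # no event this timestep
    return spikes

# A weak steady input makes the neuron fire only occasionally,
# while a strong input makes it fire immediately.
print(simulate_lif([0.3] * 10))
print(simulate_lif([1.2, 0.0, 1.2]))
```

The power savings come from this event-driven behavior: downstream circuitry only does work when a spike arrives, instead of clocking through every bit of data on every cycle.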
Ferrari’s team of engineers is working with the Harvard Microrobotics Laboratory, which has developed an 80-milligram RoboBee that’s been outfitted with vision, optical flow and motion sensors. Currently, the 3-centimeter-wide robots remain tethered to their power source, but researchers from both Harvard and Cornell are hopeful that neuromorphic chips will make the RoboBee more autonomous and adaptable to different environments without significantly increasing its weight.
"Getting hit by a wind gust or a swinging door would cause these small robots to lose control. We're developing sensors and algorithms to allow RoboBee to avoid the crash, or if crashing, survive and still fly," said Ferrari. "You can't really rely on prior modeling of the robot to do this, so we want to develop learning controllers that can adapt to any situation."
Cornell doctoral student Taylor Clawson, a member of Ferrari’s team, created a physics-based simulator model of the RoboBee and the aerodynamic forces it encounters with each wing stroke. The model can accurately predict RoboBee’s movements through complex environmental conditions.
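To get a flavor of what such a simulator computes, here is a deliberately simplified toy model of quasi-steady lift from a sinusoidal wing stroke, averaged over one wingbeat. This is not Clawson's simulator, and all of the numbers (wingbeat frequency, stroke amplitude, lift coefficient, wing area) are rough, made-up magnitudes for a RoboBee-scale wing.

```python
import math

def average_lift(freq_hz=120.0, amplitude_m=0.005, steps=1000):
    """Average quasi-steady lift of one flapping wing over a wingbeat.

    Uses the standard aerodynamic force form L = 0.5 * rho * Cl * A * v^2
    with the wingtip speed of a sinusoidal stroke, v(t) = A_s * w * cos(w t),
    integrated numerically over one stroke period.
    """
    rho = 1.225   # air density, kg/m^3
    cl = 1.8      # assumed lift coefficient for a flapping wing
    area = 3e-5   # assumed wing area, m^2 (insect scale)
    omega = 2 * math.pi * freq_hz
    dt = (1.0 / freq_hz) / steps
    total = 0.0
    for i in range(steps):
        v = amplitude_m * omega * math.cos(omega * i * dt)  # stroke speed
        total += 0.5 * rho * cl * area * v * v * dt          # force * dt
    return total * freq_hz  # mean force over one period, in newtons

print(f"mean lift per wing: {average_lift() * 1000:.3f} mN")
```

Even this crude version shows why the real model matters: the lift produced is a fraction of a millinewton, on the same order as the 80-milligram robot's weight, so small errors in predicting the aerodynamic forces translate directly into loss of control.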
Ferrari plans to help outfit RoboBee with new micro devices such as a camera, expanded antennae for tactile feedback, contact sensors on the robot's feet, and airflow sensors.
"We're using RoboBee as a benchmark robot because it's so challenging, but we think other robots that are already untethered would greatly benefit from this development because they have the same issues in terms of power," Ferrari said.
- What can scientists learn from the data collected by RoboBee?
- Are there other areas of scientific study where this type of nanotechnology could be useful?