Robots: Brain-Machine Interfaces

Posted 14 Aug 2009 at 09:55 UTC (updated 14 Aug 2009 at 15:25 UTC) by mwaibel

As reported a few days ago, insects have an amazing vision system, far surpassing any of our current sensing technology. Charles Higgins from the University of Arizona has now told the Robots podcast how he taps into the nerve cord of dragonflies to use them as extremely powerful sensors for his robots (compare his work on Neuromorphic Vision). The picture above shows an earlier version of the robot, with an onboard moth used as a sensor in a closed-loop control system, alongside a dragonfly.

The same episode also features an interview with Steve Potter at the Laboratory for NeuroEngineering at Emory and Georgia Tech, famous for his Hybrots (hybrid robots). Rather than interfacing with existing animals, he grows neural circuits in Petri dishes and hooks them up to the sensors and actuators of robots. Potter describes the resulting semi-living animals, or animats, and discusses both the technical and ethical implications of this technology. Tune in to this episode on brain-machine interfaces, or listen to two previous, related episodes on building robot flies and manufacturing insect-sized robots.
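The basic idea behind the moth-on-a-robot setup is classic closed-loop (negative-feedback) control: a neural signal is decoded into a sensor reading, mapped to an actuator command, and the robot's motion in turn changes what the animal sees. The sketch below illustrates that loop in the abstract; every name and the signal model are hypothetical simplifications, not Higgins's actual interface or decoding method.

```python
# Hypothetical sketch of a closed-loop control scheme of the kind described:
# an animal's neural response to visual motion is decoded and used to steer
# a robot, and the robot's turning changes the visual stimulus in return.
# All function names and dynamics here are illustrative assumptions.

def read_neural_signal(stimulus_direction):
    """Stand-in for an amplified spike-rate readout from visual neurons.
    Here the firing rate simply encodes the horizontal direction of
    perceived motion, in the range -1.0 .. 1.0 (idealized, noise-free)."""
    return stimulus_direction

def control_step(firing_rate, gain=0.5):
    """Map the decoded firing rate to a steering command for the robot."""
    return gain * firing_rate

# Simulated loop: the robot turns toward the perceived motion, which
# reduces the stimulus -- the defining property of negative feedback.
stimulus = 1.0
for _ in range(20):
    rate = read_neural_signal(stimulus)
    turn = control_step(rate)
    stimulus -= turn  # turning toward the target shrinks the error

print(stimulus)
```

With a gain between 0 and 1 the error decays geometrically toward zero; a gain above 2 would make this simple loop unstable, which is why real bio-hybrid interfaces need careful calibration of the decoded signal.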

