Robots: Brain-Machine Interfaces

Posted 14 Aug 2009 at 09:55 UTC (updated 14 Aug 2009 at 15:25 UTC) by mwaibel

As reported a few days ago, insects have an amazing vision system, far surpassing any of our current sensing technology. Charles Higgins from the University of Arizona has now told the Robots podcast how he taps into the nervous system of dragonflies to use them as extremely powerful sensors for his robots (compare his work on Neuromorphic Vision). The picture above shows an earlier version of the robot, with an onboard moth used as a sensor in a closed-loop control system, and a dragonfly. The same episode also features an interview with Steve Potter at the Laboratory for NeuroEngineering at Emory and Georgia Tech, famous for his Hybrots (hybrid robots). Rather than interfacing with existing animals, he grows neural circuits in Petri dishes and hooks them up to the sensors and actuators of robots. Potter describes the resulting semi-living animals, or animats, and discusses both the technical and ethical implications of this technology. Tune in to this episode on brain-machine interfaces, or listen to two previous, related episodes on building robot flies and manufacturing insect-sized robots.

