Self-Aiming Camera

Posted 9 May 2001 at 14:01 UTC by steve

Researchers at the Beckman Institute's AI Group and Robot Vision Lab have developed a new camera system that uses both audio and visual cues to point itself toward "interesting" activity. The camera is controlled by a neural network that simulates the superior colliculus of the human brain. A wide-angle camera and two microphones localize activity, and a second camera with a narrower field of view is then directed toward the target area. The system contains a database of sounds that helps it decide which types of sound are interesting, and over time it learns to associate sounds with the visual targets that produce them.
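Systems like this typically localize a sound by comparing its arrival times at the two microphones. As a rough illustration only (not the Beckman group's actual method or code), the sketch below estimates a bearing from the interaural time difference found by cross-correlating the two channels; the sample rate, microphone spacing, and test signal are made-up values:

```python
import math

def estimate_bearing(left, right, sample_rate, mic_spacing, speed_of_sound=343.0):
    """Estimate a sound source's bearing (degrees off the midline) from the
    time lag between two microphone channels, found by cross-correlation."""
    n = len(left)
    best_lag, best_score = 0, float("-inf")
    for lag in range(-n + 1, n):
        # Correlation of left shifted by `lag` against right.
        score = sum(left[i + lag] * right[i]
                    for i in range(max(0, -lag), min(n, n - lag)))
        if score > best_score:
            best_lag, best_score = lag, score
    itd = -best_lag / sample_rate  # positive when sound reaches the left mic first
    # Clamp to the physically possible range before taking arcsin.
    ratio = max(-1.0, min(1.0, itd * speed_of_sound / mic_spacing))
    return math.degrees(math.asin(ratio))

# Simulated example: an impulsive sound arriving 5 samples earlier at the left mic.
rate, spacing = 44100, 0.2          # 44.1 kHz sampling, mics 20 cm apart
sig = [0.0] * 200
sig[100] = 1.0                      # the "interesting" click, as heard on the right
left = sig[5:] + [0.0] * 5          # same click, 5 samples earlier on the left
bearing = estimate_bearing(left, sig, rate, spacing)
print(f"estimated bearing: {bearing:.1f} degrees")  # positive = toward the left mic
```

A real system would then drive the pan/tilt of the narrow-view camera toward that bearing and refine the fix visually; the neural-network controller described in the article learns this mapping rather than computing it in closed form.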
