Both animals and
machines are faced with similar problems when perceiving their environment.
We believe that strategies devised by evolution over the last 500 million years can help us design better machines.

The Vibrating Retina project aims to exploit naturally occurring vibrations onboard a robot to let each pixel scan a local area in space. This permits feature detection based on information from a single pixel, which reduces the problem of pixel mismatch and increases the effective resolution of the sensor (a toy sketch of this idea appears below). The members of this part of the Klab include Alberto Pesavento and Ania Mitros. Recent members of this part of the Klab include Dr. Oliver Landolt, Prof. Reid Harrison at the University of Utah, Prof. Charles (Chuck) Higgins at the University of Arizona, Prof. Timothy Horiuchi at the University of Maryland, and Theron Stanford.

In our efforts to combine visual and vestibular sensors we collaborate with Prof. Rahul Sarpeshkar at MIT and with Prof. Yu-Chong Tai at Caltech. We also collaborate with Prof. Timothy Horiuchi at the University of Maryland, Prof. Ernst Niebur at Johns Hopkins, and Prof. Chris Diorio at the University of Washington in Seattle. With them we are creating learning algorithms for an aVLSI model of the primate eye movement system (smooth pursuit and visually/auditorily triggered saccades). These learning algorithms are implemented using floating-gate technology to remove offsets and to enable the saccadic system to learn to set its gains by itself; floating gates allow the algorithms to be implemented on chip while consuming a minimal amount of area and power (a second sketch below illustrates the gain-learning idea).
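Below is a minimal, purely conceptual sketch of the vibrating-retina idea, not the Klab hardware: a single pixel whose line of sight is swept by a small sinusoidal vibration samples a toy one-dimensional scene at many closely spaced positions, so an edge can be detected from that pixel's temporal signal alone. The scene, vibration amplitude, and sampling parameters are illustrative assumptions.

    import numpy as np

    # Conceptual sketch (not the actual implementation): one pixel whose line of
    # sight is swept back and forth by a small sinusoidal vibration. Because the
    # same photoreceptor visits many nearby scene positions over time, an edge can
    # be detected from its temporal signal alone, without comparing two different
    # (possibly mismatched) pixels, and the positions it visits are spaced far
    # more finely than the pixel pitch.

    def scene(x):
        """Toy 1-D luminance profile with a step edge at x = 0.3."""
        return np.where(x < 0.3, 0.2, 0.9)

    pixel_center = 0.28   # nominal line of sight of this one pixel
    amplitude = 0.05      # vibration amplitude (illustrative, fraction of scene width)
    t = np.linspace(0.0, 1.0, 400)                 # one vibration period
    gaze = pixel_center + amplitude * np.sin(2.0 * np.pi * t)

    signal = scene(gaze)  # temporal output of the single vibrating pixel

    # Large temporal transients occur each time the gaze crosses the edge, so the
    # edge is detected from one pixel's output alone.
    edge_crossings = int(np.sum(np.abs(np.diff(signal)) > 0.3))
    print("edge crossings detected by one pixel:", edge_crossings)

    # The single pixel has sampled hundreds of positions within a span comparable
    # to one pixel pitch, which is what raises the effective resolution.
    print("positions sampled: %d across a span of %.2f" % (len(t), gaze.max() - gaze.min()))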
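The second sketch is likewise only conceptual and assumes nothing about the actual chip: it shows how a saccadic system can learn to set its own gain from post-saccadic retinal error using a simple delta rule. In the aVLSI implementation the adapted quantity is stored as charge on a floating gate; here it is an ordinary variable, and the initial gain, learning rate, motor noise, and trial count are illustrative assumptions.

    import numpy as np

    # Conceptual sketch of self-calibrating saccadic gain (not the chip circuit).
    rng = np.random.default_rng(0)

    gain = 0.6            # initial, mis-calibrated saccadic gain (ideal value: 1.0)
    learning_rate = 0.2

    for trial in range(60):
        # Target eccentricity in degrees (sign = direction, magnitude 5-20 deg).
        target = rng.choice([-1.0, 1.0]) * rng.uniform(5.0, 20.0)
        saccade = gain * target + rng.normal(0.0, 0.5)   # executed saccade + motor noise
        error = target - saccade                          # post-saccadic retinal error
        # Delta-rule update: the normalized error tells the system whether it
        # under- or over-shot, independent of target size.
        gain += learning_rate * (error / target)

    print("learned gain after adaptation: %.3f" % gain)   # approaches 1.0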