VISUALISE is an EU-funded project that aims to develop a refined understanding of retinal function in natural visual environments by examining the unique role that non-standard retinal ganglion cells play in dynamic visual processing.
The project will combine the efforts of physiologists, computational neuroscientists, neuromorphic electronic engineers, and roboticists to build novel theoretical and hardware models of biological retinal ganglion cell types for dynamic vision applications such as robotic navigation or pursuit.
The VISUALISE project aims to solve a number of related problems that existing retina processing models have so far ignored, namely:
- Investigating the dynamics of non-standard retinal ganglion cell types under natural-scene stimulation.
- Identifying the nonlinear transformations from natural scenes to the responses of non-standard ganglion cell types, and building the associated computational models, including the influence of response latency on encoding (see the sketch after this list).
- Statistically analysing how populations of retinal ganglion cells encode natural stimuli, including measuring how the cells interact.
- Emulating, in software and hardware, an event-based bio-inspired artificial retina that captures the dynamics and adaptive nature of both individual neurons and neuronal populations, together with their precisely timed and correlated spiking output.
- Experimentally studying, applying, and evaluating the bio-inspired artificial retina under challenging visual conditions in a robotic predator-prey scenario.
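As a rough illustration of the kind of computational model referred to in the second point, the sketch below implements a generic linear-nonlinear (LN) cascade with an explicit response latency. The filter shape, nonlinearity, and latency value are placeholder assumptions for illustration only, not the models that VISUALISE will develop.

```python
# Illustrative sketch only: a generic linear-nonlinear (LN) cascade of the kind
# commonly used to describe how a stimulus is transformed into a ganglion cell
# firing rate. The filter, nonlinearity, and latency below are placeholders,
# not the VISUALISE models.
import numpy as np

def ln_response(stimulus, temporal_filter, latency_bins=2):
    """Predict a firing-rate time series from a 1-D stimulus time series.

    stimulus        : array of stimulus intensity over time (one value per bin)
    temporal_filter : linear kernel applied to the recent stimulus history
    latency_bins    : extra delay (in time bins) before the response appears
    """
    # Linear stage: convolve the stimulus with the temporal filter.
    drive = np.convolve(stimulus, temporal_filter, mode="full")[: len(stimulus)]
    # Latency: cells respond some time after the stimulus, so shift the drive.
    drive = np.roll(drive, latency_bins)
    drive[:latency_bins] = 0.0
    # Nonlinear stage: a rectifying nonlinearity maps drive to firing rate.
    return np.maximum(drive, 0.0) ** 2

# Example: response of the model to a brief flash in an otherwise dark scene.
t = np.arange(0.0, 1.0, 0.01)                 # 10 ms time bins
stim = np.zeros_like(t)
stim[20:25] = 1.0                             # a 50 ms flash
kernel = np.exp(-np.arange(10) / 3.0)         # simple decaying filter (placeholder)
print(ln_response(stim, kernel).max())
```

A model fitted to real non-standard ganglion cells would estimate the filter and nonlinearity from recorded responses and would typically also include spatial structure and adaptation; the sketch only shows the overall stimulus-to-rate pipeline.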
How?
The retina is an extension of the brain: it forms embryonically from neural tissue and is connected to the brain by the optic nerve. It is the only source of visual information to the brain, and its accessibility makes it uniquely well suited to investigating neural coding.
The VISUALISE consortium will achieve its aims by (1) recording the activities of vertebrate retinal ganglion cells with multi-electrode arrays under dynamic natural stimulation, (2) analysing the functional response properties to expose new principles of spike encoding that bridge the gap between single-cell and population information processing, (3) exploiting these principles in multi-scale mathematical models that permit efficient digital circuit implementations for a next generation of real-time, event-based vision sensors, and (4) evaluating their effectiveness in a challenging high-speed robotic predator-prey scenario.
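To make the notion of an event-based sensor in step (3) concrete, the following sketch shows the basic change-detection principle behind such sensors: each pixel emits asynchronous ON/OFF events whenever its log intensity has changed by more than a threshold since its last event. The threshold, the frame-based input, and the function name are illustrative assumptions, not the sensor architecture developed in the project.

```python
# Illustrative sketch only: the basic principle behind an event-based vision
# sensor, in which each pixel emits asynchronous ON/OFF events when the
# log-intensity change since its last event exceeds a threshold. The threshold
# and frame-based input are simplifying assumptions, not the VISUALISE design.
import numpy as np

def frames_to_events(frames, threshold=0.2, eps=1e-6):
    """Convert a stack of intensity frames (T, H, W) into a list of events.

    Each event is (t, y, x, polarity), with polarity +1 (brighter) or -1 (darker).
    """
    log_frames = np.log(frames + eps)
    reference = log_frames[0].copy()          # per-pixel memory of the last event
    events = []
    for t in range(1, len(log_frames)):
        diff = log_frames[t] - reference
        on = diff > threshold
        off = diff < -threshold
        for y, x in zip(*np.nonzero(on)):
            events.append((t, y, x, +1))
        for y, x in zip(*np.nonzero(off)):
            events.append((t, y, x, -1))
        # Reset the reference only at pixels that fired, as in a real sensor.
        reference[on] = log_frames[t][on]
        reference[off] = log_frames[t][off]
    return events

# Example: a bright spot moving across a small dark scene generates sparse events.
frames = np.full((5, 8, 8), 0.1)
for t in range(5):
    frames[t, 4, t + 1] = 1.0                 # moving bright pixel
print(len(frames_to_events(frames)), "events")
```

Because events are generated only where and when the scene changes, the output stays sparse and precisely timed, which is what makes event-based sensing attractive for high-speed robotic tasks such as the predator-prey scenario in step (4).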
VISUALISE is a three-year project funded by the European Union Seventh Framework Programme (FP7-ICT-2011.9.11) under grant agreement No. 600954 ("VISUALISE"). The project commenced in April 2013.
For further information and contact details, please visit the project website.