Annu. Rev. Neurosci. 2001. 24:1193–1216. Copyright © 2001 by Annual Reviews. All rights reserved.
NATURAL IMAGE STATISTICS AND NEURAL REPRESENTATION
Eero P Simoncelli
Howard Hughes Medical Institute, Center for Neural Science, and Courant Institute of Mathematical Sciences, New York University, New York, NY 10003; e-mail: eero.simoncelli@nyu.edu
Bruno A Olshausen
Center for Neuroscience, and Department of Psychology, University of California, Davis, Davis, California 95616; e-mail: baolshausen@ucdavis.edu
Key Words
efficient coding, redundancy reduction, independence, visual cortex

Abstract
It has long been assumed that sensory neurons are adapted, through both evolutionary and developmental processes, to the statistical properties of the signals to which they are exposed. Attneave (1954) and Barlow (1961) proposed that information theory could provide a link between environmental statistics and neural responses through the concept of coding efficiency. Recent developments in statistical modeling, along with powerful computational tools, have enabled researchers to study more sophisticated statistical models for visual images, to validate these models empirically against large sets of data, and to begin experimentally testing the efficient coding hypothesis for both individual neurons and populations of neurons.
INTRODUCTION
Understanding the function of neurons and neural systems is a primary goal of systems neuroscience. The evolution and development of such systems is driven by three fundamental components: (a) the tasks that the organism must perform, (b) the computational capabilities and limitations of neurons (this would include metabolic and wiring constraints), and (c) the environment in which the organism lives. Theoretical studies and models of neural processing have been most heavily influenced by the first two. But the recent development of more powerful models of natural environments has led to increased interest in the role of the environment in determining the structure of neural computations.

The use of such ecological constraints is most clearly evident in sensory systems, where it has long been assumed that neurons are adapted, at evolutionary, developmental, and behavioral timescales, to the signals to which they are exposed.
Because not all signals are equally likely, it is natural to assume that perceptual systems should be able to best process those signals that occur most frequently. Thus, it is the statistical properties of the environment that are relevant for sensory processing. Such concepts are fundamental in engineering disciplines: Source coding, estimation, and decision theories all rely heavily on a statistical "prior" model of the environment.

The establishment of a precise quantitative relationship between environmental statistics and neural processing is important for a number of reasons. In addition to providing a framework for understanding the functional properties of neurons, such a relationship can lead to the derivation of new computational models based on environmental statistics. It can also be used in the design of new forms of stochastic experimental protocols and stimuli for probing biological systems. Finally, it can lead to fundamental improvements in the design of devices that interact with human beings.

Despite widespread agreement that neural processing must be influenced by environmental statistics, it has been surprisingly difficult to make the link quantitatively precise. More than 40 years ago, motivated by developments in information theory, Attneave (1954) suggested that the goal of visual perception is to produce an efficient representation of the incoming signal. In a neurobiological context, Barlow (1961) hypothesized that the role of early sensory neurons is to remove statistical redundancy in the sensory input. Variants of this "efficient coding" hypothesis have been formulated by numerous other authors (e.g. Laughlin 1981, Atick 1992, van Hateren 1992, Field 1994, Rieke et al 1995).

But even given such a link, the hypothesis is not fully specified. One needs also to state which environment shapes the system. Quantitatively, this means specification of a probability distribution over the space of input signals. Because this is a difficult problem in its own right, many authors base their studies on empirical statistics computed from a large set of example images that are representative of the relevant environment (a brief computational sketch of such an empirical estimate is given at the end of this section). In addition, one must specify a timescale over which the environment should shape the system. Finally, one needs to state which neurons are meant to satisfy the efficiency criterion, and how their responses are to be interpreted.

There are two basic methodologies for testing and refining such hypotheses of sensory processing. The more direct approach is to examine the statistical properties of neural responses under natural stimulation conditions (e.g. Laughlin 1981, Rieke et al 1995, Dan et al 1996, Baddeley et al 1998, Vinje & Gallant 2000). An alternative approach is to "derive" a model for early sensory processing (e.g. Sanger 1989, Foldiak 1990, Atick 1992, Olshausen & Field 1996, Bell & Sejnowski 1997, van Hateren & van der Schaaf 1998, Simoncelli & Schwartz 1999). In such an approach, one examines the statistical properties of environmental signals and shows that a transformation derived according to some statistical optimization criterion provides a good description of the response properties of a set of sensory neurons.

In the following sections, we review the basic conceptual framework for linking environmental statistics to neural processing, and we discuss a series of examples in which authors have used one of the two approaches described above to provide evidence for such links.
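As a concrete illustration of what "empirical statistics computed from a large set of example images" can mean in practice, the sketch below is our own addition, not part of the original article; the directory path, crop size, and the particular statistics chosen (pooled log-intensity histogram and ensemble-averaged power spectrum) are illustrative assumptions only.

# Illustrative sketch (an assumption, not from the article): estimating simple
# empirical statistics from a set of example natural images.
import glob
import numpy as np
from PIL import Image

def load_gray(path, size=256):
    """Load an image, convert to grayscale, and crop to a common size."""
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
    return img[:size, :size]          # assumes images are at least size x size

log_samples = []   # pooled log-intensity values
spectra = []       # per-image power spectra
for path in glob.glob("natural_images/*.png"):     # hypothetical image set
    img = load_gray(path) + 1.0                    # offset avoids log(0)
    log_img = np.log(img)
    log_samples.append(log_img.ravel())
    f = np.fft.fftshift(np.fft.fft2(log_img - log_img.mean()))
    spectra.append(np.abs(f) ** 2)

# Empirical distribution of log intensities, pooled over the ensemble
hist, edges = np.histogram(np.concatenate(log_samples), bins=100, density=True)

# Ensemble-averaged power spectrum; for natural images this typically
# falls off roughly as 1/f^2 with spatial frequency.
mean_spectrum = np.mean(spectra, axis=0)

Any such estimate stands in for the full probability distribution over input signals, which, as noted above, is difficult to specify directly.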
BASIC CONCEPTS
The theory of information was a fundamental development of the twentieth century. Shannon (1948) developed the theory in order to quantify and solve problems in the transmission of signals over communication channels. But his formulation of a quantitative measurement of information transcended any specific application, device, or algorithm and has become the foundation for an incredible wealth of scientific knowledge and engineering developments in acquisition, transmission, manipulation, and storage of information. Indeed, it has essentially become a theory for computing with signals.

As such, the theory of information plays a fundamental role in modeling and understanding neural systems. Researchers in neuroscience had been perplexed by the apparent combinatorial explosion in the number of neurons one would need to uniquely represent each visual (or other sensory) pattern that might be encountered. Barlow (1961) recognized the importance of information theory in this context and proposed that an important constraint on neural processing was informational (or coding) efficiency. That is, a group of neurons should encode as much information as possible in order to most effectively utilize the available computing resources. We will make this more precise shortly, but several points are worth mentioning at the outset.

1. The efficiency of the neural code depends both on the transformation that maps the input to the neural responses and on the statistics of the input. In particular, optimal efficiency of the neural responses for one input ensemble does not imply optimality over other input ensembles!

2. The efficient coding principle should not be confused with optimal compression (i.e. rate-distortion theory) or optimal estimation. In particular, it makes no mention of the accuracy with which the signals are represented and does not require that the transformation from input to neural responses be invertible. This may be viewed as either an advantage (because one does not need to incorporate any assumption regarding the form of representation, or the cost of misrepresenting the input) or a limitation (because such costs are clearly relevant for real organisms).

3. The simplistic efficient coding criterion given above makes no mention of noise that may contaminate the input stimulus. Nor does it mention uncertainty or variability in the neural responses to identical stimuli. That is, it assumes that the neural responses are deterministically related to the input signal. If these sources of external and internal noise are small compared with the stimulus and neural response, respectively, then the criterion described is approximately optimal. But a more complete solution should take noise into account, by maximizing the information that the responses provide about the stimulus (technically, the mutual information between stimulus and response). This quantity is generally difficult to measure, but Bialek et al (1991) and Rieke et al (1995) have recently developed approximate techniques for estimating it.
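The distinction drawn in point 3 between the entropy of the responses and the mutual information between stimulus and response can be made concrete with a toy numerical sketch. What follows is our illustration only, using simple plug-in histogram estimates; it is not the estimation technique of Bialek et al (1991) or Rieke et al (1995), and the simulated stimulus ensemble, response nonlinearity, and noise level are arbitrary assumptions.

# Toy illustration (an assumption-laden sketch, not the cited authors' method):
# plug-in estimates of response entropy and stimulus-response mutual information.
import numpy as np

rng = np.random.default_rng(0)

def entropy(counts):
    """Entropy in bits of a discrete distribution given by an array of counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Simulated stimulus: draws from a heavy-tailed ensemble, discretized into levels.
stim = np.digitize(rng.laplace(size=20000), bins=np.linspace(-4, 4, 15))

# Simulated response: a saturating nonlinearity of the stimulus plus internal
# noise, then discretized (e.g. into spike-count bins).
resp_cont = np.tanh(0.5 * stim - 3.5) + 0.1 * rng.normal(size=stim.size)
resp = np.digitize(resp_cont, bins=np.linspace(-1, 1, 15))

# Plug-in estimates: I(S;R) = H(S) + H(R) - H(S,R)
joint, _, _ = np.histogram2d(stim, resp, bins=(16, 16))
H_resp = entropy(joint.sum(axis=0))     # marginal entropy of the responses
H_stim = entropy(joint.sum(axis=1))     # marginal entropy of the stimulus
H_joint = entropy(joint.ravel())
mutual_info = H_stim + H_resp - H_joint

print(f"H(R) = {H_resp:.2f} bits, I(S;R) = {mutual_info:.2f} bits")
# With internal noise present, I(S;R) < H(R): the response entropy alone
# overstates how much the responses tell us about the stimulus.

In this toy setting, maximizing the response entropy (the simplistic criterion) and maximizing the mutual information coincide only when the noise terms are negligible, which is the approximation described above.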