
Structure of the biological neuron

• Information processing cell of the brain


• Neurons can come in various sizes and shapes
• Soma (cell body): contains the cell's nucleus and other vital components called organelles that perform specialized tasks
• Communication links of the neuron:
• Dendrite: the neuron receives its input electrical signals through the dendrites
• Axon: a tubular extension from the soma that carries electrical signals away from the soma to other neurons for processing
• It carries the action potential
• Axons may travel over large distances or terminate in local regions
• They usually terminate on the dendrites of other cells or on muscles
• Dendrites and axons are together called processes
Glial cells and Membranes
• Glial cells, also called neuroglia, are non-neuronal cells located within the central nervous system and the peripheral nervous system.
• They provide physical and metabolic support
to neurons, including neuronal insulation and
communication, and nutrient and waste
transport.
• Glial cells account for about nine-tenths of the cells in the brain.
• They come in three varieties, each with a
specialized role
• Star-shaped astroglia surround the neurons and isolate them from the smallest blood vessels of the brain, called capillaries
• They form the blood-brain barrier; across this barrier they selectively absorb nutrients from the blood and transfer them to the neurons
• They provide physical support and electrical isolation for neurons
• This isolation minimizes inter-neuron cross-talk
• Microglia: cells that move continuously between neurons and glia to clean up debris
• Oligodendroglia : send out membranous
processes that wrap themselves tightly around
axons forming a dense layer of spiraling
membranes called myelin sheath.
• (Note: the dynamic processes that underlie neural signalling occur within the cell membrane, described next.)
• The plasma membrane encloses the neuron
• It is a two-layered structure
• The membrane consists of molecules called phospholipids,
which spontaneously arrange themselves into a double layer.
• Each molecule has hydrophilic (“water loving”) heads on the
outside and hydrophobic (“water hating”) tails on the inside.
• hydrophilic heads face toward the surrounding watery
medium and tails are isolated from water facing one another
inside
• A variety of other proteins are embedded in the cell membrane
• The function of these proteins is
to regulate the transport of ions through pores in the membrane
to move ions across the membrane from one side to the other
• These proteins are referred to as ion channels
• Properties of ion channels
• They allow the passage of ions at a very high rate
• They are highly selective to a specific ion
• They can open or close depending upon a voltage or chemical signal
Resting potential
• Due to the difference in ion concentrations inside and outside the cell, there is a difference in electrical potential across the membrane, called the resting membrane potential
• It is negative, about -65 mV
Ion               Concentration inside (mM)   Concentration outside (mM)
Sodium (Na+)                  15                          150
Potassium (K+)               100                            5
Chloride (Cl-)                13                          150
• Because of these concentration differences, ions tend to diffuse into and out of the cell, depending on the concentration gradient.
• Example: the K+ concentration inside the cell is higher, so K+ tends to diffuse out
Nernst Equation
• Given the charge of an ion and its concentrations
inside and outside the cell, the equation predicts
the potential difference assuming the ion in
question is in a state of equilibrium where it
neither flows into nor out of the cell.
• Two forces that keep an ion X in equilibrium are:
• concentration gradient: ions have the tendency to diffuse inwards due to the greater concentration of ions on the outside
• electrostatic gradient: ions have the tendency to be pushed out of or into the cell, depending upon the electric field that builds up across the membrane due to the imbalance of charge
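For reference, this is the standard form of the Nernst equation, written with the usual inside-minus-outside sign convention (R is the gas constant, T the absolute temperature, F Faraday's constant, and z the valence of ion X; the worked K+ value is only an illustration using the table above):

```latex
% Nernst (equilibrium) potential for an ion X of valence z
E_X = \frac{RT}{zF}\,\ln\frac{[X]_{\text{outside}}}{[X]_{\text{inside}}}
% Illustration with the K+ values above (RT/F \approx 26.7\ \text{mV} at body temperature):
% E_{K^+} \approx 26.7\ \text{mV} \times \ln(5/100) \approx -80\ \text{mV}
```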
Graded potential
• Maintaining the resting potential is important: it determines how the neuron transmits information
• External signals arriving at synapses create disturbances in the cell potential called graded potentials
• When the external signal lowers the internal potential below the resting potential, the neuron is hyperpolarized
• When the external signal raises the internal potential in a positive direction with respect to the resting potential, the neuron is depolarized (hypopolarized)
• These disturbances are of the order of a few mV
• They are conducted along the neuron in the same way as electric charge is conducted along a resistive conductor
• Because dendrites have both resistance and capacitance, the disturbance decays exponentially towards zero in space
• The amount of fluid inside the membrane of the dendrite is responsible for the conduction of electricity
• Dendrites of large diameter contain more fluid, offer less resistance and therefore conduct better
• The graded potential also varies in time
• When an external stimulus disturbs the ionic balance of the resting membrane, a finite time is required for the diffusion and pumping of ions to restore the resting potential
• When the stimulus is removed, the potential is restored to its resting value within a small fraction of a second
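A rough sketch of the spatial decay mentioned above, using the standard passive-cable form; the length constant λ is an assumed symbol not defined in these notes:

```latex
% Passive decay of a graded potential with distance x from the disturbance
% V_0: size of the disturbance, \lambda: length constant of the dendrite
V(x) = V_0\, e^{-x/\lambda}
% \lambda increases with dendrite diameter, so thicker dendrites carry the
% disturbance farther before it decays, consistent with the notes above.
```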
Action potential
• Neurons receive inputs from other neurons along their dendrites at contact points called synapses; the resulting disturbances are post-synaptic potentials (PSPs)
• PSPs occur asynchronously in space and time at various points along the different dendrites of a neuron
• Each PSP is a decreasing or increasing voltage pulse that decays in space and time
• The cell soma receives these PSPs superimposed upon one another and integrates them
• The soma potential rises and then decays back to the resting value
• At the axon hillock there are ion channels (proteins) that are sensitive to the integrated soma potential
• These ion channels monitor the soma potential; when the cell potential exceeds a threshold value, say -40 mV, the neuron fires an action potential that is transmitted down the axon towards the synaptic terminals
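The integrate-and-fire behaviour described above can be sketched in code. This is a minimal leaky integrate-and-fire toy, not a biophysical model: the time constant, time step, input statistics and reset rule are illustrative assumptions; only the resting potential (-65 mV) and the threshold (-40 mV) come from the notes.

```python
import numpy as np

# Minimal leaky integrate-and-fire sketch (illustrative values, not biophysics)
V_REST = -65.0      # resting potential (mV), from the notes
V_THRESH = -40.0    # firing threshold at the axon hillock (mV), from the notes
TAU = 10.0          # membrane time constant (ms), assumed
DT = 1.0            # time step (ms), assumed

def simulate(psp_input, steps=100):
    """Integrate superimposed PSPs; emit a spike when the threshold is crossed."""
    v = V_REST
    spikes = []
    for k in range(steps):
        # leak back towards rest + integrated post-synaptic input at this instant
        v += DT * (-(v - V_REST) / TAU + psp_input[k])
        if v >= V_THRESH:          # soma potential exceeds the threshold
            spikes.append(k)       # neuron fires an action potential
            v = V_REST             # potential returns towards the resting value
    return spikes

rng = np.random.default_rng(0)
psps = rng.normal(loc=3.0, scale=1.0, size=100)   # toy asynchronous PSPs (mV per step)
print(simulate(psps))                              # time steps at which the neuron fires
```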
Chemical synapses
• Communication between neurons takes place through synapses
• Synapses are contact points
• At a synapse, unidirectional conduction of a signal takes place from the pre-synaptic to the post-synaptic membrane
• The synapse strengthens each signal and stretches it out, so that it has a larger amplitude and a longer duration than the pre-synaptic potential
Two step transmission process at synapse
 from electrical action potential to chemical transmitter
substance which is releases into synaptic cleft
 from chemical transmitter back to an electrical signal
post synaptic potential
Mathematical neuron model
• The model captures the internal firing behaviour of the biological neuron
• Positive weights model excitatory synapses
• Negative weights model inhibitory synapses
• Neuron activations xj change quickly, based on the integrated impinging signals
• The learning process involves changes in the weights
• Changes in the weights are implemented by a learning algorithm
• Rapidly changing xj = short-term memory
• Slowly changing wij = long-term memory
• The inner product represents the degree of match between the input vector and the weight vector
• The activation measures the similarity between the input vector and the weight vector
• We can therefore view the neuron as a filter
• It discriminates between inputs: inputs that are similar to the weight vector generate large positive activations
• Dissimilar inputs generate negative activations
• If S is orthogonal to Wj, the inner product is zero and the activation is zero
• If S is aligned with Wj, the activation is maximal
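A small sketch of this "neuron as a filter" view, computing the activation as the inner product xj = Σi wij si = Wjᵀ S in the notation used above; the vectors and the logistic signal function used here are illustrative assumptions.

```python
import numpy as np

def activation(w, s):
    """Inner product of weight vector w and input vector s: degree of match."""
    return float(np.dot(w, s))

def logistic(x, lam=1.0):
    """A common signal function; lam is an assumed steepness parameter."""
    return 1.0 / (1.0 + np.exp(-lam * x))

w = np.array([1.0, 2.0, -1.0])                 # weight vector Wj (illustrative)

s_aligned = 0.5 * w                            # aligned with Wj -> maximal activation
s_orthogonal = np.array([2.0, -1.0, 0.0])      # w . s = 0 -> zero activation
s_opposed = -0.5 * w                           # dissimilar input -> negative activation

for name, s in [("aligned", s_aligned), ("orthogonal", s_orthogonal), ("opposed", s_opposed)]:
    x = activation(w, s)
    print(f"{name:11s} activation = {x:+.2f}, signal = {logistic(x):.3f}")
```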
Neuron signal functions
To get around the discontinuity of the threshold signal function around zero, a smooth (sigmoidal) signal function such as the logistic function is used
The neuron updates its signal at discrete instants of time k by sampling its instantaneous activation xj(k)
By allowing the signal function to shift along the activation (qj) axis, the bias weight gives the neuron an extra degree of freedom, which extends the learning capabilities of the neuron
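In symbols, assuming the standard logistic form with gain λ and a shift qj along the activation axis (these symbols are assumptions; the equation on the original slide is not reproduced here):

```latex
% Discrete-time signal update: sample the instantaneous activation at instant k
s_j(k) = f\big(x_j(k)\big)
% Logistic signal function shifted along the activation axis by the bias term q_j
f(x) = \frac{1}{1 + e^{-\lambda\,(x - q_j)}}
```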
• In some applications, the logistic function is replaced by the hyperbolic tangent form tanh(kx), for some constant k > 0
• This is an example of a bipolar signal function, since its values range over (-1, 1)
Activations close to the center of the function elicit strong signal responses
Activations far away from the center of the function elicit weak signal responses
• The neuron thus has a receptive field within which it responds to input by generating strong signals
• Such neurons can therefore be used to recognize specific ranges of inputs, by tuning their receptive fields, with a suitable learning algorithm, to be centered around those regions of activity
• A signal function of this kind takes input from an infinite range of activations and transforms it smoothly into a value in the finite range (0, 1); for this reason it is called a squashing function
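A sketch of the signal functions mentioned in this section (threshold, logistic, hyperbolic tangent, and a Gaussian "receptive field" function); the gain, center and width parameters are illustrative assumptions.

```python
import numpy as np

# Common neuron signal functions; lam (gain), k, c (center) and sigma (width)
# are illustrative parameters, not values from the notes.

def threshold(x):
    """Binary threshold: discontinuous at zero."""
    return np.where(x >= 0.0, 1.0, 0.0)

def logistic(x, lam=1.0):
    """Squashes an infinite activation range smoothly into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-lam * x))

def bipolar_tanh(x, k=1.0):
    """Bipolar signal function: values range over (-1, 1)."""
    return np.tanh(k * x)

def gaussian(x, c=0.0, sigma=1.0):
    """Strong response near the center c, weak response far away (receptive field)."""
    return np.exp(-((x - c) ** 2) / (2.0 * sigma ** 2))

x = np.linspace(-4.0, 4.0, 9)
print(np.round(threshold(x), 3))
print(np.round(logistic(x), 3))
print(np.round(bipolar_tanh(x), 3))
print(np.round(gaussian(x), 3))
```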
Squashing functions
• A neural network N can be viewed as a
weighted graph in which artificial neurons are
nodes and directed weighted edges represent
connections between neurons
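As a small illustration of this weighted-graph view; the network size, input/output split and weight values below are arbitrary assumptions.

```python
import numpy as np

# A tiny network viewed as a weighted, directed graph: nodes are neurons,
# W[i, j] is the weight of the edge from neuron i to neuron j
# (zero means "no connection"). Sizes and values are illustrative assumptions.
n_neurons = 5                                  # neurons 0-2: inputs, 3-4: outputs
W = np.zeros((n_neurons, n_neurons))
W[0, 3], W[1, 3], W[2, 3] = 0.5, -1.2, 0.8     # edges into neuron 3
W[0, 4], W[2, 4] = 1.0, -0.3                   # edges into neuron 4

def edges(W):
    """List the directed weighted edges of the network graph."""
    return [(i, j, W[i, j]) for i in range(W.shape[0])
            for j in range(W.shape[1]) if W[i, j] != 0.0]

for i, j, w in edges(W):
    print(f"neuron {i} -> neuron {j}  weight {w:+.1f}")
```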
• How do we design a classifier that can separate the pattern space into different classes?
• What conditions need to be placed on the data set in order for these machines to classify data correctly?
Perceptron Learning
Least mean square learning
• Perceptron learning corrects the classification error on a misclassified pattern in each iteration (a minimal sketch of the rule is given below)
• Restrictions: the desired output has to be binary, and the pattern set has to be linearly separable
• These restrictions are overcome in LMS
• In LMS, the neuronal signal function is changed from a binary threshold to a linear function, and a linear (squared) error measure is used
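The sketch of the perceptron rule referred to above; the bipolar targets, fixed learning rate and toy AND-like data are illustrative assumptions, not part of the notes.

```python
import numpy as np

def perceptron_train(X, d, eta=0.1, epochs=100):
    """Perceptron learning: update weights only when a pattern is misclassified."""
    # augment each pattern with a constant 1 so the bias is learned as a weight
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(Xa.shape[1])
    for _ in range(epochs):
        errors = 0
        for x, target in zip(Xa, d):            # d holds bipolar targets (+1 / -1)
            y = 1.0 if np.dot(w, x) >= 0 else -1.0
            if y != target:                      # correct only misclassified patterns
                w += eta * target * x
                errors += 1
        if errors == 0:                          # converges if linearly separable
            break
    return w

# Toy linearly separable data (assumed for illustration): AND-like labels
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
d = np.array([-1.0, -1.0, -1.0, 1.0])
print(perceptron_train(X, d))
```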
Recursive (normalized LMS) weight update:
W(k+1) = W(k) + η e(k) X(k) / ||X(k)||²
where e(k) = d(k) − W(k)ᵀX(k) is the error on the current pattern
• X(k)/||X(k)|| is the unit vector in the direction of X(k)
• η is the learning rate
• η/||X(k)||² acts as a pattern-normalized learning rate
• Here the effective learning rate changes from iteration to iteration with the magnitude of the current pattern
• Larger-magnitude pattern vectors therefore induce a smaller effective weight change than smaller-magnitude vectors
Each weight update pushes the weight vector in the direction of the current pattern in an
attempt to reduce the error
(Figure: how the linear neuron orients itself along the data)
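A minimal sketch of the normalized LMS update above, fitted to a toy one-dimensional linear target; the data, learning rate and epoch count are illustrative assumptions.

```python
import numpy as np

def normalized_lms_train(X, d, eta=0.5, epochs=50):
    """Normalized LMS: W(k+1) = W(k) + eta * e(k) * X(k) / ||X(k)||^2."""
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])   # bias handled as an extra weight
    w = np.zeros(Xa.shape[1])
    for _ in range(epochs):
        for x, target in zip(Xa, d):
            e = target - np.dot(w, x)               # linear error for this pattern
            w += eta * e * x / np.dot(x, x)         # push w towards the current pattern
    return w

# Toy data (assumed): targets generated by d = 2*x + 1 with a little noise
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
d = 2.0 * X[:, 0] + 1.0 + 0.05 * rng.standard_normal(50)
print(normalized_lms_train(X, d))                   # weights should be close to [2.0, 1.0]
```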
