Neural Networks
Parveen Malik, Assistant Professor
Kalinga Institute of Industrial Technology (KIIT) University
[Figure: a neuron with dendrites, soma (cell body), myelin sheath and synaptic terminal]

Text Books:
* Neural Networks and Learning Machines – Simon Haykin
* Principles of Soft Computing – S. N. Sivanandam & S. N. Deepa
* Neural Networks using MATLAB – S. N. Sivanandam, S. Sumathi & S. N. Deepa

"A neural network is a massively parallel distributed processor made up of simple processing units that has a natural propensity for storing experiential knowledge and making it available for use."
It resembles the brain in two respects:
1. Knowledge is acquired by the network from its environment through a learning process.
2. Interneuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.

Why do we need Neural Networks?
Applications span machine learning, time-series modelling, natural language processing, image prediction, speech recognition and facial expression detection, as well as flight path simulation, prosthesis design, object discrimination, noise suppression, manufacturing process control, product design and analysis, real-time particle identification, chemical product design and analysis, integrated circuit chip layout, automated information services and customer payment processing systems.

CNS – Brain and Neuron
"The brain is a highly complex, non-linear and massively parallel computing machine."
[Figure: brain regions, including Wernicke's area (speech comprehension) and the cerebellum]

Neuron
The neuron is the structural unit of the central nervous system, i.e. the brain and spinal cord.
* About 100 billion neurons and 100 trillion synapses.
* Weight: 1.5 kg to 2 kg.
* Conduction speed: 0.6 m/s to 120 m/s.
* Power: about 20% of the body's energy budget (20-40 W) for roughly 10^16 operations per second.
* Signalling based on an ion-transport phenomenon.
* Fault tolerant, asynchronous firing, response time on the order of 10^-3 s.

"A neuron is a basic unit of the brain that processes and transmits information."
[Figure: neuron structure showing dendrites, nucleus, axon, nodes of Ranvier, Schwann cells, myelin sheath and axon terminal]
* Dendrite: receives signals from other neurons.
* Soma (cell body): processes the incoming signals.
* Myelin sheath: covers the axon and helps speed up nerve impulses.
* Axon: transmits the electric potential from the soma to the synaptic terminal and finally to other neurons, muscles or glands.
* Synaptic terminal: releases neurotransmitters to transmit information to the dendrites of other neurons.

Neuron Connection
Neurons connect at synapses, where the axon terminal of one neuron meets a dendrite of the next; electrical impulses carry the signal between neurons, and computation takes place within each neuron.
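Before turning to the historical models, here is a minimal sketch (not from the slides) of the two properties in the definition above: a single artificial neuron acquires knowledge from examples through a learning process and stores it in its synaptic weights. The dataset (logical AND), learning rate and epoch count are illustrative assumptions; the update rule is the classic perceptron rule.

```python
# Illustrative sketch only: a single hard-limit neuron trained with the
# perceptron update rule on logical AND. Data, learning rate and epoch
# count are assumptions, not taken from the slides.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # input patterns
t = np.array([0, 0, 0, 1])                      # targets: logical AND

w = np.zeros(2)   # synaptic weights -- where the acquired knowledge is stored
b = 0.0           # bias
eta = 0.1         # learning rate

for epoch in range(20):                 # learning process: repeated exposure to examples
    for x, target in zip(X, t):
        y = 1 if w @ x + b > 0 else 0   # hard-limit activation
        w += eta * (target - y) * x     # strengthen/weaken connections based on the error
        b += eta * (target - y)

print("weights:", w, "bias:", b)
print("outputs:", [1 if w @ x + b > 0 else 0 for x in X])  # [0, 0, 0, 1]
```

After a few passes over the data the unit reproduces the AND truth table; the acquired "knowledge" lives entirely in w and b.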
Human Brain (contd.)
Hierarchical learning in the brain inspired deep learning: low-level features are combined into intermediate-level features, which in turn give a high-level description of an object.

Historical Perspective
Modelling of the brain in the medical and applied-mathematics worlds.
* 1871 – Joseph von Gerlach: Reticular Theory – the nervous system is a single continuous network.
* 1873 – Camillo Golgi: discovered a chemical reaction (stain) for examining nervous tissue in much detail; proponent of the Reticular Theory (a single continuous network of nerve fibres).
* 1888 – Santiago Ramón y Cajal: Neuron Doctrine – the nervous system is a network of discrete individual cells.
* 1891 – Heinrich Wilhelm Gottfried von Waldeyer-Hartz: coined the term "neuron" and consolidated the Neuron Doctrine.

Historical Perspective (contd.)
* 1943 – Warren Sturgis McCulloch & Walter Pitts: McCulloch-Pitts model – a mathematical model of the human brain.
* 1957-58 – Frank Rosenblatt: Perceptron – a more accurate model of the neuron, computing y = f(Σ_i w_i x_i).
* 1965-68 – Multilayer Perceptron: mathematical modelling with an additional (hidden) layer.
* 1969 – Marvin Minsky & Seymour Papert: limitation of the perceptron – a single perceptron cannot solve problems that are not linearly separable.
* 1986 – Rumelhart: Backpropagation – solves the optimisation problem with respect to the hidden layers.
* 1989 – Kurt Hornik: Universal Approximation Theorem – an MLP with a single hidden layer can be used to approximate any continuous function.

Analogy between Biological Neural Network and Artificial Neural Network
[Figure: a biological neuron (dendrites, myelin sheath, Schwann cells, axon terminal) alongside its equivalent electrical model: inputs scaled by synaptic weights (the strength of connection), summed linearly and passed through a non-linear decision element to produce the output.]

Practical Neural Network (Single Neuron)
o = f(Σ_{i=1}^{m} w_i x_i)
where x_1, ..., x_m are the inputs, w_1, ..., w_m the synaptic weights, f the activation function and o the actual output.

Activation Functions
(The geometrical shape of each function is shown as a plot on the slides.)

Name                         | Mathematical Expression                          | Property
Hard Limit                   | f(x) = 1 if x > 0; 0 otherwise                   | Non-differentiable
Bipolar Hard Limit (Signum)  | f(x) = 1 if x > 0; 0 if x = 0; -1 if x < 0       | Non-differentiable
Sigmoid                      | f(x) = 1 / (1 + e^(-x))                          | Differentiable; output in (0, 1)
Hyperbolic Tangent           | f(x) = tanh x = (e^x - e^(-x)) / (e^x + e^(-x))  | Differentiable; output in (-1, 1)
Bipolar Sigmoid              | f(x) = (1 - e^(-x)) / (1 + e^(-x))               | Differentiable; output in (-1, 1)
Rectified Linear Unit (ReLU) | f(x) = max(0, x)                                 | Non-differentiable at x = 0
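To make the single-neuron model o = f(Σ w_i x_i) and the activation-function table concrete, here is a small sketch in Python (NumPy). The example inputs and weights are assumptions chosen only for illustration; the function definitions follow the table above.

```python
# Sketch of the single-neuron model o = f(sum_i w_i * x_i) with the
# activation functions listed in the table. Inputs and weights are assumed.
import numpy as np

def hard_limit(x):        # binary step: 1 if x > 0, else 0 (non-differentiable)
    return np.where(x > 0, 1.0, 0.0)

def signum(x):            # bipolar hard limit: +1, 0, or -1
    return np.sign(x)

def sigmoid(x):           # logistic sigmoid: 1 / (1 + e^(-x)), output in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh_fn(x):           # hyperbolic tangent: (e^x - e^-x) / (e^x + e^-x), output in (-1, 1)
    return np.tanh(x)

def bipolar_sigmoid(x):   # (1 - e^-x) / (1 + e^-x) = 2*sigmoid(x) - 1, output in (-1, 1)
    return 2.0 * sigmoid(x) - 1.0

def relu(x):              # rectified linear unit: max(0, x)
    return np.maximum(0.0, x)

def neuron(x, w, f):      # o = f(w . x): weighted sum followed by the activation
    return f(np.dot(w, x))

x = np.array([0.5, -1.0, 2.0])   # example inputs (assumed)
w = np.array([0.8, 0.2, -0.3])   # example synaptic weights (assumed)

for f in (hard_limit, signum, sigmoid, tanh_fn, bipolar_sigmoid, relu):
    print(f.__name__, neuron(x, w, f))
```

Swapping f changes only the shape of the output non-linearity; the weighted sum (the linear part of the neuron) stays the same.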
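The 1969 limitation and the 1989 universal-approximation result can be illustrated together with a tiny example: XOR is the classic function that no single hard-limit perceptron can compute, yet one hidden layer suffices. The weights below are hand-picked for illustration only (they are not from the slides and are not learned).

```python
# Minimal sketch: a single perceptron cannot separate XOR, but a network
# with one hidden layer of hard-limit units can. Weights are hand-chosen.
import numpy as np

def step(x):
    return np.where(x > 0, 1, 0)

def xor_mlp(x1, x2):
    h1 = step(x1 + x2 - 0.5)    # hidden unit 1 acts as OR
    h2 = step(x1 + x2 - 1.5)    # hidden unit 2 acts as AND
    return step(h1 - h2 - 0.5)  # output: OR and not AND  ->  XOR

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x1, x2), "->", int(xor_mlp(x1, x2)))  # prints 0, 1, 1, 0
```

Hidden unit 1 behaves like OR and hidden unit 2 like AND; the output unit computes "OR and not AND", which is exactly XOR, a function no single linear threshold on x1 and x2 can realise.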
