SIDDHARTH Y. PATEL (10IT32)

Submitted in partial fulfillment of the requirements for the degree of
Bachelor of Engineering in Information Technology

Under the Guidance of
Prof. FENIL KHATIWALA

DEPARTMENT OF INFORMATION TECHNOLOGY
C. G. PATEL INSTITUTE OF TECHNOLOGY
GUJARAT TECHNOLOGICAL UNIVERSITY, BARDOLI – 394350
[2012-2013]
CERTIFICATE

This is to certify that the seminar entitled "NEURAL NETWORK" has been submitted by PATEL SIDDHARTH Y. (10IT32) under my guidance, in partial fulfillment of the degree of Bachelor of Engineering in Information Technology of Gujarat Technological University, at the IT / CO Dept., CGPIT, Bardoli, during the academic year 2012-2013 (Semester-V).

Date:
Place: BARDOLI

Prof. Fenil Khatiwala, Guide
Prof. Devendra V. Thakor, Head of IT / CO Dept., CGPIT
ABSTRACT

This report is an introduction to Neural Networks. A small but effective overview of the whole content is presented first. Then the history of Neural Networks, which deals with a comparative study of how far Neural Networks have developed over the years, is presented, and a detailed historical background is provided. Next, the architecture of a Neural Network is explained with a figure. We then proceed to the resemblance with the brain, in which the comparison between the brain and neural networks, as well as between neurons and their artificial counterparts, is investigated and explained. The various types of neural networks are explained and demonstrated. One of the most important aspects of neural networks is their wide range of applications, a few of which are dealt with in the subsequent sections, after which the advantages and disadvantages are explained. The main question of interest, the future of Neural Networks, is then presented. Finally, this leads to a brief conclusion, and the report ends with the references.
ACKNOWLEDGEMENT

I take this opportunity to express my sincere thanks and deep gratitude to all those people who extended their wholehearted co-operation and helped me in completing this seminar report successfully. I express my deep and sincere gratitude to Mr. Fenil Khatiwala (Guide) and Mr. Devendra Thakor (Head of Department), who provided me the opportunity, inspiration and requisite facilities to complete this report in spite of their busy schedules, patiently solving my rather amateurish queries. I am also earnestly thankful to all the concerned faculties who supported and furnished information to make this endeavour a success.

PATEL SIDDHARTH Y. (100530116032)
INDEX

SR. NO    TOPIC
1         Introduction
1.1       What is a Neural Network?
1.2       Historical Background
1.3       Why Use Neural Networks?
2         Architecture of Neural Networks
2.1       Feed-Forward Network
2.2       Feedback Network
2.3       Network Layers
3         Human and Artificial Neurons
3.1       How the Human Brain Learns
3.2       From Human Neurons to Artificial Neurons
4         Applications of Neural Networks
4.1       Neural Networks in Industry
4.2       Neural Networks in Medicine
4.3       Neural Networks in Business
4.4       Stock Market
4.5       Character Recognition
4.6       Image Compression
4.7       Food Processing
4.8       Signature Analysis
4.9       Monitoring
5         Advantages of Neural Networks
6         Disadvantages of Neural Networks
7         Neural Networks in the Future
          Conclusion
          References
FIGURE INDEX

Sr. No    TITLE
1         Feed-forward network
2         Feedback network
3         Network layer
4         Components of a neuron
5         The synapse
6         The neuron model
1. INTRODUCTION

1.1 What is a Neural Network?

A Neural Network (NN) is an information processing paradigm that is inspired by the way biological nervous systems, such as the brain, process information. The key element of this paradigm is the novel structure of the information processing system. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. NNs, like people, learn by example. An NN is configured for a specific application, such as pattern recognition or data classification, through a learning process. Learning in biological systems involves adjustments to the synaptic connections that exist between the neurons. This is true of NNs as well.

1.2 Historical Background

Neural network simulations appear to be a recent development. However, this field was established before the advent of computers, and it has survived at least one major setback and several eras. Many important advances have been boosted by the use of inexpensive computer emulations. Following an initial period of enthusiasm, the field survived a period of frustration and disrepute. During this period, when funding and professional support was minimal, important advances were made by relatively few researchers. These pioneers were able to develop convincing technology which surpassed the limitations identified by Minsky and Papert, who had published a book (in 1969) in which they summed up a general feeling of frustration (against neural networks) among researchers; it was accepted by most without further analysis. Currently, the neural network field enjoys a resurgence of interest and a corresponding increase in funding. The history of neural networks can be divided into several periods:

1. First Attempts: There were some initial simulations using formal logic. The first artificial neuron was produced in 1943 by the neurophysiologist Warren McCulloch and the logician Walter Pitts, who developed models of neural networks based on their understanding of neurology. These models made several assumptions about how neurons worked. Their networks were based on simple neurons which were considered to be binary devices with fixed thresholds. The results of their model were simple logic functions such as "a or b" and "a and b". Another attempt was by using computer simulations, carried out by two groups (Farley and Clark, 1954; Rochester, Holland, Haibt and Duda, 1956). The first group (IBM researchers) maintained close contact with neuroscientists at McGill University, so whenever their models did not work, they consulted the neuroscientists. This interaction established a multidisciplinary trend which continues to the present day. But the technology available at that time did not allow them to do too much.
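The binary threshold neurons of McCulloch and Pitts are simple enough to sketch in a few lines of code. The snippet below is an illustration rather than the authors' original formulation; the function names, weights and thresholds are our own choices. A unit fires when its weighted input sum reaches a fixed threshold, which is enough to realize the logic functions "a and b" and "a or b" mentioned above.

```python
def mcp_neuron(inputs, weights, threshold):
    """A binary device with a fixed threshold: returns 1 (fire) or 0 (silent)."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

def a_and_b(a, b):
    # Both inputs must be active to reach the threshold of 2
    return mcp_neuron([a, b], weights=[1, 1], threshold=2)

def a_or_b(a, b):
    # A single active input already reaches the threshold of 1
    return mcp_neuron([a, b], weights=[1, 1], threshold=1)
```

Lowering or raising the threshold, or using negative (inhibitory) weights, yields other simple logic functions in the same way.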
2. Promising & Emerging Technology: Not only was neuroscience influential in the development of neural networks, but psychologists and engineers also contributed to the progress of neural network simulations. Rosenblatt (1958) stirred considerable interest and activity in the field when he designed and developed the Perceptron. The Perceptron had three layers, with the middle layer known as the association layer. This system could learn to connect or associate a given input to a random output unit. Another system was the ADALINE (Adaptive Linear Element), which was developed in 1960 by Widrow and Hoff (of Stanford University). The ADALINE was an analogue electronic device made from simple components. The method used for learning was different to that of the Perceptron; it employed the Least-Mean-Squares (LMS) learning rule.

3. Period of Frustration & Disrepute: In 1969 Minsky and Papert wrote a book in which they generalized the limitations of single-layer Perceptrons to multilayered systems. In the book they said: "...our intuitive judgment that the extension (to multilayer systems) is sterile". The significant result of their book was to eliminate funding for research with neural network simulations. The conclusions supported the disenchantment of researchers in the field. As a result, considerable prejudice against this field was activated.

4. Innovation: Although public interest and available funding were minimal, several researchers continued working to develop neuromorphically based computational methods for problems such as pattern recognition. During this period several paradigms were generated which modern work continues to enhance. Klopf (A. Henry Klopf) in 1972 developed a basis for learning in artificial neurons based on a biological principle for neuronal learning called heterostasis. Werbos (Paul Werbos, 1974) developed and used the back-propagation learning method; however, several years passed before this approach was popularized. Back-propagation nets are probably the most well known and widely applied of the neural networks today. In essence, the back-propagation net is a Perceptron with multiple layers, a different threshold function in the artificial neuron, and a more robust and capable learning rule. Amari (Shun-Ichi Amari, 1967) was involved with theoretical developments: he published a paper which established a mathematical theory for a learning basis (error-correction method) dealing with adaptive pattern classification. Anderson and Kohonen developed associative techniques independently of each other. Grossberg's (Steve Grossberg and Gail Carpenter, in 1988) influence founded a school of thought which explores resonating algorithms; they developed the ART (Adaptive Resonance Theory) networks based on biologically plausible models. Fukushima (Kunihiko Fukushima) developed a stepwise trained multilayered neural network for the interpretation of handwritten characters. The original network was published in 1975 and was called the Cognitron.
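The back-propagation method mentioned above can be illustrated with a tiny network. The sketch below is a minimal demonstration under assumed choices (the architecture, sigmoid activation, learning rate, initialization and toy task are ours, not Werbos's original formulation): a 2-input, 2-hidden, 1-output network learns the logical AND function by pushing each output error back through the weights.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(wh, wo, x):
    """One pass through a 2-input, 2-hidden, 1-output sigmoid network."""
    xb = list(x) + [1.0]                               # append bias input
    h = [sigmoid(sum(w * v for w, v in zip(row, xb))) for row in wh]
    hb = h + [1.0]                                     # bias for the output unit
    o = sigmoid(sum(w * v for w, v in zip(wo, hb)))
    return xb, hb, o

def train(samples, lr=0.5, epochs=5000, seed=1):
    """Back-propagation: each output error is propagated back through the weights."""
    rng = random.Random(seed)
    wh = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    wo = [rng.uniform(-1, 1) for _ in range(3)]
    for _ in range(epochs):
        for x, t in samples:
            xb, hb, o = forward(wh, wo, x)
            d_o = (o - t) * o * (1 - o)                # output-layer error signal
            for j in range(2):                         # hidden-layer error signals
                d_h = d_o * wo[j] * hb[j] * (1 - hb[j])
                for i in range(3):
                    wh[j][i] -= lr * d_h * xb[i]
            for j in range(3):
                wo[j] -= lr * d_o * hb[j]
    return wh, wo

# Learn logical AND from examples
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
wh, wo = train(data)
```

After training, the output rises above 0.5 only for the input (1, 1); the same loop learns any small input/output mapping of this kind.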
5. Re-Emergence: Progress during the late 1970s and early 1980s was important to the re-emergence of interest in the neural network field. Several factors influenced this movement. For example, comprehensive books and conferences provided a forum for people in diverse fields with specialized technical languages, and the response to conferences and publications was quite positive. The news media picked up on the increased activity, and tutorials helped disseminate the technology. Academic programs appeared and courses were introduced at most major universities (in the US and Europe). Attention is now focused on funding levels throughout Europe, Japan and the US, and as this funding becomes available, several new commercial applications in industry and financial institutions are emerging.

6. Today: Significant progress has been made in the field of neural networks, enough to attract a great deal of attention and to fund further research. Advancement beyond current commercial applications appears to be possible, and research is advancing the field on many fronts. Neurally based chips are emerging and applications to complex problems are developing. Clearly, today is a period of transition for neural network technology.

1.3 Why Use Neural Networks?

Neural networks, with their remarkable ability to derive meaning from complicated or imprecise data, can be used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques. A trained neural network can be thought of as an "expert" in the category of information it has been given to analyze. This expert can then be used to provide projections given new situations of interest and to answer "what if" questions. Other advantages include:

1. Adaptive Learning: An ability to learn how to do tasks based on the data given for training or initial experience.
2. Self-Organization: An NN can create its own organization or representation of the information it receives during learning time.
3. Real-Time Operation: NN computations may be carried out in parallel, and special hardware devices are being designed and manufactured which take advantage of this capability.
4. Fault Tolerance via Redundant Information Coding: Partial destruction of a network leads to a corresponding degradation of performance. However, some network capabilities may be retained even with major network damage.
2. ARCHITECTURE OF NEURAL NETWORKS

2.1 Feed-Forward Networks

Feed-forward NNs allow signals to travel one way only: from input to output. There is no feedback (no loops); i.e. the output of any layer does not affect that same layer. Feed-forward NNs tend to be straightforward networks that associate inputs with outputs. They are extensively used in pattern recognition. This type of organization is also referred to as bottom-up or top-down.

Simple feed-forward network
2.2 Feedback Networks

Feedback networks can have signals travelling in both directions, by introducing loops in the network. Feedback networks are very powerful and can get extremely complicated. They are dynamic; their 'state' changes continuously until they reach an equilibrium point. They remain at the equilibrium point until the input changes and a new equilibrium needs to be found. Feedback architectures are also referred to as interactive or recurrent, although the latter term is often used to denote feedback connections in single-layer organizations.

Feedback network

2.3 Network Layers

The commonest type of artificial neural network consists of three groups, or layers, of units: a layer of "input" units is connected to a layer of "hidden" units, which is connected to a layer of "output" units. The activity of the input units represents the raw information that is fed into the network. The activity of each hidden unit is determined by the activities of the input units and the weights on the connections between the input and the hidden units.
The behavior of the output units depends on the activity of the hidden units and the weights between the hidden and output units. This simple type of network is interesting because the hidden units are free to construct their own representations of the input: the weights between the input and hidden units determine when each hidden unit is active, and so by modifying these weights, a hidden unit can choose what it represents. We also distinguish single-layer and multi-layer architectures. The single-layer organization, in which all units are connected to one another, constitutes the most general case and is of more potential computational power than hierarchically structured multi-layer organizations. In multi-layer networks, units are often numbered by layer, instead of following a global numbering.

Network layer
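The layer-by-layer computation just described can be sketched directly: each unit's activity is a squashed weighted sum of the previous layer's activities. The weight values and the sigmoid squashing function below are illustrative assumptions; the report does not fix particular values or a particular activation function.

```python
import math

def sigmoid(z):
    """Squashing function: maps any weighted sum into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weights):
    """Activity of each unit = squashed weighted sum of the previous layer."""
    return [sigmoid(sum(w * x for w, x in zip(row, inputs))) for row in weights]

# Illustrative weights for a 2-input, 2-hidden, 1-output network
w_hidden = [[0.5, -0.3], [0.8, 0.2]]   # one row per hidden unit
w_output = [[1.0, -1.0]]               # a single output unit

hidden = layer([1.0, 0.0], w_hidden)   # hidden activities from the raw input
output = layer(hidden, w_output)       # output behavior from the hidden layer
```

Modifying the entries of `w_hidden` changes when each hidden unit becomes active, which is exactly the sense in which a hidden unit "chooses what it represents".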
3. HUMAN AND ARTIFICIAL NEURONS

3.1 How the Human Brain Learns

Much is still unknown about how the brain trains itself to process information, so theories abound. In the human brain, a typical neuron collects signals from others through a host of fine structures called dendrites. The neuron sends out spikes of electrical activity through a long, thin strand known as an axon, which splits into thousands of branches. At the end of each branch, a structure called a synapse converts the activity from the axon into electrical effects that inhibit or excite activity in the connected neurons. When a neuron receives excitatory input that is sufficiently large compared with its inhibitory input, it sends a spike of electrical activity down its axon. Learning occurs by changing the effectiveness of the synapses so that the influence of one neuron on another changes.

Components of a neuron

The synapse

3.2 From Human Neurons to Artificial Neurons

We construct these neural networks by first trying to deduce the essential features of neurons and their interconnections. We then typically program a computer to simulate these features. However, because our knowledge of neurons is incomplete and our computing power is limited, our models are necessarily gross idealizations of real networks of neurons.

The neuron model
4. APPLICATIONS OF NEURAL NETWORKS

4.1 Neural Networks in Industry

Given this description of neural networks and how they work, what real-world applications are they suited for? Neural networks have broad applicability to real-world business problems. In fact, they have already been successfully applied in many industries. Since neural networks are best at identifying patterns or trends in data, they are well suited for prediction or forecasting needs, including:

1. Sales forecasting
2. Industrial process control
3. Customer research
4. Data validation
5. Risk management
6. Target marketing

NNs are also used in the following specific paradigms: recognition of speakers in communications; diagnosis of hepatitis; recovery of telecommunications from faulty software; interpretation of multimeaning Chinese words; undersea mine detection; texture analysis; three-dimensional object recognition; hand-written word recognition; and facial recognition.

4.2 Neural Networks in Medicine

Neural Networks (NN) are currently a 'hot' research area in medicine, and it is believed that they will receive extensive application to biomedical systems in the next few years. At the moment, the research is mostly on modeling parts of the human body and recognizing diseases from various scans (e.g. cardiograms, CAT scans, ultrasonic scans, etc.). Neural networks are ideal for recognizing diseases using scans, since there is no need to provide a specific algorithm on how to identify the disease. Neural networks learn by example, so the details of how to recognize the disease are not needed. What is needed is a set of examples that are representative of all the variations of the disease. The examples need to be selected very carefully if the system is to perform reliably and efficiently. The quantity of examples is not as important as the quality.
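Learning by example, as described above, can be made concrete with the simple ADALINE unit from the historical background: the network is shown input/target pairs and adjusts its connection weights after each error, with no task-specific algorithm supplied. Everything below, from the learning rate to the toy task, is an illustrative assumption rather than any application described in this report.

```python
import random

def train_adaline(samples, lr=0.1, epochs=100, seed=0):
    """Adjust connection weights with the Least-Mean-Squares (delta) rule."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.5, 0.5) for _ in range(3)]   # two inputs plus a bias
    for _ in range(epochs):
        for (x1, x2), target in samples:
            y = w[0] * x1 + w[1] * x2 + w[2]         # linear unit output
            err = target - y                          # the error drives learning
            w[0] += lr * err * x1                     # strengthen or weaken each
            w[1] += lr * err * x2                     # connection in proportion
            w[2] += lr * err                          # to its input activity
    return w

def predict(w, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + w[2] >= 0 else -1

# Learn logical AND purely from examples, with targets coded as +1 / -1
data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w = train_adaline(data)
```

The same point the medicine section makes carries over here: the quality and representativeness of the example set, not a hand-written rule, determines what the unit ends up computing.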
4.3 Neural Networks in Business

Business is a diversified field with several general areas of specialization, such as accounting or financial analysis. Almost any neural network application would fit into one business area or financial analysis. There is some potential for using neural networks for business purposes, including resource allocation and scheduling. There is also a strong potential for using neural networks for database mining, that is, searching for patterns implicit within the explicitly stored information in databases. Most of the funded work in this area is classified as proprietary; thus, it is not possible to report on the full extent of the work going on. Most work involves applying neural networks, such as the Hopfield-Tank network, to optimization and scheduling.

4.3.1 Marketing

There is a marketing application which has been integrated with a neural network system. The Airline Marketing Tactician (a trademark abbreviated as AMT) is a computer system made of various intelligent technologies, including expert systems. A feed-forward neural network is integrated with the AMT and was trained using back-propagation to assist the marketing control of airline seat allocations. The adaptive neural approach was amenable to rule expression. Additionally, the application's environment changed rapidly and constantly, which required a continuously adaptive solution. The system is used to monitor and recommend booking advice for each departure. Such information has a direct impact on the profitability of an airline and can provide a technological advantage for users of the system. While it is significant that neural networks have been applied to this problem, it is also important to see that this intelligent technology can be integrated with expert systems and other approaches to make a functional system.

4.3.2 Credit Evaluation

The HNC Company, founded by Robert Hecht-Nielsen, has developed several neural network applications. One of them is the Credit Scoring system, which increases the profitability of the existing model by up to 27%. The HNC neural systems were also applied to mortgage screening. A neural network automated mortgage insurance underwriting system was developed by the Nestor Company. This system was trained with 5048 applications, of which 2597 were certified. The data related to property and borrower qualifications. In a conservative mode the system agreed with the underwriters on 97% of the cases; in the liberal mode it agreed on 84% of the cases. The system ran on an Apollo DN3000 and used 250K of memory while processing a case file in approximately 1 second. Neural networks were used to discover the influence of undefined interactions between the various variables; while these interactions were not defined, they were used by the neural system to develop useful conclusions. It is also noteworthy that neural networks can influence the bottom line.

4.4 Stock Market

The day-to-day business of the stock market is extremely complicated. Many factors weigh in on whether a given stock will go up or down on any given day. Since neural networks can examine a lot of information quickly and sort it all out, they can be used to predict stock prices.
4.5 Character Recognition

The idea of character recognition has become very important as handheld devices like the Palm Pilot are becoming increasingly popular. Neural networks can be used to recognize handwritten characters.

4.6 Image Compression

Neural networks can receive and process vast amounts of information at once, making them useful in image compression. With the Internet explosion and more sites using more images, using neural networks for image compression is worth a look.

4.7 Food Processing

The food industry is perhaps the biggest practical market for electronic noses, assisting or entirely replacing humans. Applications include inspection of food, grading the quality of food, fish inspection, fermentation control, checking mayonnaise for rancidity, automated flavor control, monitoring cheese ripening, verifying if orange juice is natural, beverage container inspection, and grading whisky.

4.8 Signature Analysis

Neural nets can be used as a mechanism for comparing signatures made (e.g. in a bank) with those stored. This is one of the first large-scale applications of neural networks in the USA, and is also one of the first to use a neural network chip.

4.9 Monitoring

Networks have been used to monitor the state of aircraft engines. By monitoring vibration levels and sound, early warning of engine problems can be given. British Rail has also been testing a similar application monitoring diesel engines.
5. ADVANTAGES OF NEURAL NETWORKS

1. Adapt to unknown situations.
2. Autonomous learning and generalization.
3. Robustness: fault tolerance due to network redundancy.
4. High accuracy: neural networks are able to approximate complex non-linear mappings.
5. Noise tolerance: neural networks are very flexible with respect to incomplete, missing and noisy data.
6. Independence from prior assumptions: neural networks do not make a priori assumptions about the distribution of the data, or about the form of the interactions between factors.
7. Ease of maintenance: neural networks can be updated with fresh data, making them useful for dynamic environments.

6. DISADVANTAGES OF NEURAL NETWORKS

1. The solutions produced are approximate rather than exact.
2. Large complexity of the network structure.
3. The neural network needs training to operate.
4. High processing time is required for large neural networks.
5. Neural network programs sometimes become unstable when applied to larger problems.

7. NEURAL NETWORKS IN THE FUTURE

1. Robots that can see, feel, and predict the world around them.
2. Improved stock prediction.
3. Common usage of self-driving cars.
4. Composition of music.
5. Handwritten documents automatically transformed into formatted word-processing documents.
6. Trends found in the human genome to aid in the understanding of the data compiled by the Human Genome Project.
7. Self-diagnosis of medical problems using neural networks.
CONCLUSION

The computing world has a lot to gain from neural networks. Their ability to learn by example makes them very flexible and powerful. Furthermore, there is no need to devise an algorithm in order to perform a specific task; i.e. there is no need to understand the internal mechanisms of that task. They are also very well suited to real-time systems because of their fast response and computational times, which are due to their parallel architecture. Neural networks also contribute to other areas of research, such as neurology and psychology. They are regularly used to model parts of living organisms and to investigate the internal mechanisms of the brain. Perhaps the most exciting aspect of neural networks is the possibility that some day 'conscious' networks might be produced; a number of scientists argue that consciousness is a 'mechanical' property and that 'conscious' neural networks are a realistic possibility. Finally, I would like to state that even though neural networks have a huge potential, we will only get the best of them when they are integrated with computing, AI, fuzzy logic and related subjects.
REFERENCES:

1. An Introduction to Neural Computing. Aleksander, I. and Morton, H., 2nd edition.
2. Neural Networks at Pacific Northwest National Laboratory. http://www.emsl.pnl.gov:2080/docs/cie/neural/neural.homepage.html
3. Industrial Applications of Neural Networks (research reports Esprit, I.F. Croall, J.P. Mason).
4. A Novel Approach to Modeling and Diagnosing the Cardiovascular System. http://www.emsl.pnl.gov:2080/docs/cie/neural/papers2/keller.wcnn95.abs.html
5. Artificial Neural Networks in Medicine. http://www.emsl.pnl.gov:2080/docs/cie/techbrief/NN.techbrief.ht
6. Neural Networks, by Eric Davalo and Patrick Naim.
7. Learning internal representations by error propagation, by Rumelhart, Hinton and Williams (1986).
8. Klimasauskas, CC. (1989). The 1989 Neuro Computing Bibliography. Hammerstrom, D. (1986). A Connectionist/Neural Network Bibliography.
9. DARPA Neural Network Study (October 1987 - February 1989). MIT Lincoln Lab.
10. Asimov, I. (1984, 1950), Robot, Ballantine, New York.
11. Electronic Noses for Telemedicine. http://www.emsl.pnl.gov:2080/docs/cie/neural/papers2/keller.ccc95.abs.html

THANK YOU