
Modeling the Neurons and Synapses

The idea of modeling many neurons in massively parallel connection was introduced

in the 1950s and 1960s under the general field known as artificial neural networks (ANN). The subsection below provides a brief history of ANN, describes how the field has advanced to its current stage, and highlights the difficulties currently encountered in modeling at the cellular level.

A brief historical background on artificial neural networks (ANN).

In the late 1950s, Frank Rosenblatt began to explore the functional properties of small networks of neurons, which he called Perceptrons (Rosenblatt, 1962). Perceptrons worked well in many experiments, but in the 1960s many mathematicians were skeptical of Rosenblatt's conclusions because they were drawn mainly empirically. In 1969, Marvin Minsky and Seymour Papert published the book Perceptrons, which established in a rigorous mathematical sense what Perceptrons can and cannot do. Minsky and Papert's book made it clear that Perceptron architectures were far more limited in what they could accomplish than Rosenblatt had claimed. As a result, work on Perceptrons and related neural network modeling lay dormant for almost a decade.

From the late 1980s to the 1990s, artificial neural networks again became a hot topic of research (more reading can be found in (Ng, 1997), where an extensive survey was conducted in 1994). The field was re-ignited by several factors, including: the publication of the books Parallel Distributed Processing by Rumelhart and McClelland in 1986/87 (Rumelhart and McClelland, 1986a,b, 1987a,b), and renewed interest from DARPA in re-examining neural information processing capability (DARPA neural
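The limitation Minsky and Papert analyzed can be illustrated with a small sketch (not drawn from the source): a single-layer Perceptron, trained with the classic perceptron learning rule, converges on a linearly separable problem such as logical AND but can never represent XOR, no matter how long it trains. The function names and learning-rate value below are illustrative assumptions.

```python
# Illustrative sketch (not the source's model): a single threshold unit
# trained with the perceptron learning rule. It converges on linearly
# separable problems (AND) but cannot represent XOR, which is the kind
# of limitation Minsky and Papert characterized mathematically.

def train_perceptron(samples, epochs=25, lr=0.1):
    """samples: list of ((x1, x2), target) pairs with targets in {0, 1}."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            # Perceptron learning rule: nudge weights toward the target.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def accuracy(samples, w, b):
    correct = sum(
        (1 if w[0] * x1 + w[1] * x2 + b > 0 else 0) == t
        for (x1, x2), t in samples
    )
    return correct / len(samples)

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

w, b = train_perceptron(AND)
print("AND accuracy:", accuracy(AND, w, b))  # converges to 1.0

w, b = train_perceptron(XOR)
print("XOR accuracy:", accuracy(XOR, w, b))  # never reaches 1.0: not linearly separable
```

The XOR failure is structural, not a matter of training time: no line in the input plane separates the two XOR classes, so no weight setting of a single threshold unit solves it. This is one reason later work turned to multi-layer networks.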
