
2. Defining Artificial Neural Networks

Artificial neural networks (ANNs) are mathematical or computational models inspired by the human central nervous system (particularly the brain) and capable of machine learning and pattern recognition in electronics engineering and related fields. Because the neural systems of humans and other animals are highly sophisticated, systems built in this manner are able to address difficult problems. Artificial neural networks are typically depicted as systems of densely interconnected
"neurons" capable of computing values from inputs.

Fig. 1 Architecture of Simple Neural Network

A neural network resembles a vast web of interconnected neurons that can number in the millions. These interconnected neurons carry out all of the body's parallel processing, and the human or animal body remains the best example of parallel processing.
Artificial neural networks today are made up of clusters of basic artificial neurons. This clustering is accomplished by forming layers that are then connected. How these layers connect is part of the "art" of creating networks that solve complicated real-world problems.
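A minimal sketch of this layering idea, again in plain Python with hypothetical weights, treats a layer as a cluster of neurons that all receive the same inputs:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weight_matrix, biases):
    """A layer is a cluster of neurons: every neuron receives the same
    inputs, applies its own weights and bias, and produces one output."""
    return [sigmoid(sum(x * w for x, w in zip(inputs, row)) + b)
            for row, b in zip(weight_matrix, biases)]

# A hypothetical layer of three neurons fed by two input values.
print(layer([0.5, -1.0], [[0.8, 0.2], [-0.3, 0.4], [0.1, 0.9]], [0.1, 0.0, -0.2]))
```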
Because of their superior capacity to derive meaning from complex or imprecise data, neural networks can be used to identify patterns and discover trends that are too complex for humans or other computational techniques to notice.

2.1 Background
Neural networks were inspired by research into the human central nervous system, particularly the brain. In an artificial neural network, simple artificial nodes called "neurons," "processing elements," or "units" are joined to build a network that mimics a biological neural network.

Fig. 2 Generic Neuron

An artificial neural network has no single formal definition. However, a set of statistical, mathematical, or computational models is generally referred to as a "neural network" if it shares the following features:
1. It consists of sets of adaptive weights, i.e., numerical parameters that are tuned by a learning algorithm.
2. It is capable of approximating non-linear functions of its inputs.

The adaptive weights are, conceptually, the strengths of the connections between neurons, and they are adjusted during training and prediction. Rather than a clear division of subtasks assigned to individual units, neural networks perform their functions collectively and in parallel across the units, much like biological neural networks. In statistics, cognitive psychology, and artificial intelligence, the term "neural network" usually refers to these artificial models, whereas theoretical and computational neuroscience use neural network models to simulate the real central nervous system.
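The following sketch, assuming a single sigmoid neuron and a simple gradient-descent learning rule (all values hypothetical), illustrates how adaptive weights acting as connection strengths can be tuned during training:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical single-neuron example: the adaptive weights are the
# connection strengths, tuned by repeated gradient-descent steps so that
# the neuron's non-linear output moves toward a target value.
weights, bias, rate = [0.1, -0.2], 0.0, 0.5
inputs, target = [1.0, 0.5], 1.0

for _ in range(100):
    out = sigmoid(sum(x * w for x, w in zip(inputs, weights)) + bias)
    error = target - out
    grad = error * out * (1.0 - out)  # error scaled by the sigmoid's slope
    weights = [w + rate * grad * x for w, x in zip(weights, inputs)]
    bias += rate * grad

print(weights, out)  # the weights have adapted and the output nears the target
```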

2.2 Working of Neural Networks


The many different ways these individual neurons can be clustered together are essential to how neural networks function. In the human brain, clustering happens in such a way that information can be processed in a dynamic, interactive, and self-organizing manner. In biology, neural networks are built in three dimensions from microscopic components, and these neurons appear to be capable of practically limitless interconnections. This is not the case for any current or proposed man-made network. With today's technology, integrated circuits are two-dimensional devices with a restricted number of layers available for connections, and this physical reality limits the types and scope of artificial neural networks that can be built in silicon. Artificial neural networks are therefore still simple groupings of primitive artificial neurons. This clustering is accomplished by forming layers that are then connected. How these layers connect is part of the "art" of engineering networks that solve real-world problems.
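How the connected layers work together can be sketched, under the same simplifying assumptions as the earlier examples (plain Python, hypothetical weights), as a forward pass in which each layer's outputs become the next layer's inputs:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(network, inputs):
    """Feed the inputs through the connected layers in order; each
    layer's outputs become the next layer's inputs."""
    values = inputs
    for weight_matrix, biases in network:
        values = [sigmoid(sum(v * w for v, w in zip(values, row)) + b)
                  for row, b in zip(weight_matrix, biases)]
    return values

# A hypothetical network: 2 inputs -> 3 hidden neurons -> 1 output neuron.
network = [
    ([[0.5, -0.4], [0.3, 0.8], [-0.6, 0.1]], [0.0, 0.1, -0.1]),
    ([[0.2, -0.5, 0.7]], [0.05]),
]
print(forward(network, [1.0, 0.0]))
```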
