
Code No: RR420507 Set No. 1
IV B.Tech II Semester Supplementary Examinations, June 2007
NEURAL NETWORKS
( Common to Computer Science & Engineering and Electronics &
Computer Engineering)
Time: 3 hours Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks
⋆⋆⋆⋆⋆

1. (a) Explain in detail the single layer artificial neural network, with a diagram.
(b) Explain in detail the multilayer artificial neural network, with a neat diagram. [8+8]

2. (a) Write the advantages and disadvantages of the perceptron. [8]
(b) Explain the Least Mean Square (LMS) algorithm. [8]
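For reference, a minimal sketch of the LMS (Widrow-Hoff) update w ← w + η e x, with instantaneous error e = d − wᵀx; the toy data set and the learning rate below are illustrative assumptions, not part of the question.

import numpy as np

# Toy data consistent with a linear model d = 2*x1 - x2 (an illustrative assumption)
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 2.0]])
d = np.array([2.0, -1.0, 1.0, -1.0])

w = np.zeros(2)    # initial weight vector
eta = 0.1          # learning rate (assumed value)

for epoch in range(100):
    for x, target in zip(X, d):
        e = target - w @ x     # instantaneous error e = d - w^T x
        w = w + eta * e * x    # LMS / Widrow-Hoff correction

print(w)           # approaches [2, -1] for this data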

3. Explain the backpropagation algorithm and derive the expressions for the weight update relations. [8+8]
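For reference, the weight-update relations the derivation arrives at are the generalized delta rule; the notation below (logistic units, target t_j, output o_j) is an assumption, not fixed by the question.

\Delta w_{ji} = \eta\,\delta_{j}\,o_{i},
\qquad
\delta_{j} =
\begin{cases}
(t_{j} - o_{j})\, f'(\mathrm{net}_{j}) & \text{for an output unit } j,\\
\bigl(\textstyle\sum_{k} \delta_{k} w_{kj}\bigr)\, f'(\mathrm{net}_{j}) & \text{for a hidden unit } j,
\end{cases}
\qquad
f'(\mathrm{net}_{j}) = o_{j}\,(1 - o_{j}) \text{ for the logistic } f.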

4. (a) What are the limitations of the Hopfield network? Suggest methods that may overcome these limitations. [4+4]
(b) A Hopfield network made up of five neurons is required to store the following three fundamental memories: [8]

ξ1 = [+1, +1, +1, +1, +1]^T
ξ2 = [+1, −1, −1, +1, −1]^T
ξ3 = [−1, +1, −1, +1, +1]^T
Evaluate the 5-by-5 synaptic weight matrix of the network.
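As a worked reference, a minimal sketch of the evaluation, assuming the standard Hebbian outer-product storage rule W = (1/N) Σ_µ ξ_µ ξ_µ^T with the self-connections set to zero (the 1/N normalisation follows Haykin's convention and is an assumption here):

import numpy as np

# Fundamental memories ξ1, ξ2, ξ3 from the question
xi = np.array([[+1, +1, +1, +1, +1],
               [+1, -1, -1, +1, -1],
               [-1, +1, -1, +1, +1]], dtype=float)

N = xi.shape[1]                            # number of neurons (5)
W = sum(np.outer(p, p) for p in xi) / N    # Hebbian outer-product storage rule
np.fill_diagonal(W, 0.0)                   # zero self-connections
print(W)                                   # the 5-by-5 synaptic weight matrix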

5. Discuss how "Winner-Take-All" in the Kohonen layer is implemented and explain the architecture. Also explain the training algorithm. [16]
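A minimal sketch of one Winner-Take-All training step; Euclidean distance for winner selection and the learning rate alpha are assumptions, since the question leaves these open.

import numpy as np

def kohonen_step(W, x, alpha=0.3):
    # W: (units, inputs) weight matrix of the Kohonen layer; x: one input vector
    distances = np.linalg.norm(W - x, axis=1)   # distance from x to every unit's weight vector
    winner = int(np.argmin(distances))          # Winner-Take-All: the closest unit wins
    W[winner] += alpha * (x - W[winner])        # only the winner's weights move toward x
    return winner

rng = np.random.default_rng(0)
W = rng.uniform(-1.0, 1.0, size=(4, 3))          # 4 competitive units, 3 inputs
print(kohonen_step(W, np.array([0.5, -0.2, 0.8])))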

6. Explain the architecture of the Grossberg layer and its training algorithm. [8+8]

7. Draw the architectural diagram of the ART network and explain its classification operation in detail. [4+6+6]

8. Describe how a neural network may be trained for a pattern recognition task.
Illustrate with an example. [16]

⋆⋆⋆⋆⋆

Code No: RR420507 Set No. 2
IV B.Tech II Semester Supplementary Examinations, June 2007
NEURAL NETWORKS
( Common to Computer Science & Engineering and Electronics &
Computer Engineering)
Time: 3 hours Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks
⋆⋆⋆⋆⋆

1. Distinguish between the use of neural networks as classifiers and as approximators
with the help of suitable examples. [8+8]

2. Briefly discuss linear separability and the solution to the EX-OR problem. Also suggest a network that can solve the EX-OR problem. [4+6+6]
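For reference, a minimal sketch of one such network: two hard-limiting (threshold) hidden units and one output unit with hand-picked weights; the particular thresholds are an illustrative assumption, and many other choices also solve EX-OR.

def step(s):                       # hard-limiting threshold unit
    return 1 if s > 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)       # hidden unit: OR of the inputs
    h2 = step(x1 + x2 - 1.5)       # hidden unit: AND of the inputs
    return step(h1 - h2 - 0.5)     # output: OR and not AND, i.e. EX-OR

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))  # prints 0 0 0, 0 1 1, 1 0 1, 1 1 0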

3. Explain the generalized delta rule and derive the weight updation for a multilayer feed forward neural network. [8+8]

4. (a) What are the limitations of the Hopfield network? Suggest methods that may overcome these limitations. [4+4]
(b) A Hopfield network made up of five neurons is required to store the following three fundamental memories: [8]

ξ1 = [+1, +1, +1, +1, +1]^T
ξ2 = [+1, −1, −1, +1, −1]^T
ξ3 = [−1, +1, −1, +1, +1]^T
Evaluate the 5-by-5 synaptic weight matrix of the network.

5. Explain Kohonen's method of unsupervised learning. Discuss an example of its application. [8+8]

6. Derive expressions for the weight updation involved in counter propagation. [16]
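For reference, the standard forms such a derivation arrives at, in the usual notation (assumed here): the Kohonen layer applies the instar rule to the winning unit j*, and the Grossberg layer applies the outstar rule to the weights leaving that winner.

% Kohonen (competitive) layer: only the winning unit j* is updated
w_{j^{*}}^{\text{new}} = w_{j^{*}}^{\text{old}} + \alpha\,\bigl(x - w_{j^{*}}^{\text{old}}\bigr)

% Grossberg (outstar) layer: only the weights leaving the winner j* are updated
v_{ij^{*}}^{\text{new}} = v_{ij^{*}}^{\text{old}} + \beta\,\bigl(y_{i} - v_{ij^{*}}^{\text{old}}\bigr)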

7. Draw the architectural diagram of the ART network and explain the function of each block in detail. [4+12]

8. Describe how a neural network may be trained for a pattern recognition task.
Illustrate with an example. [16]

⋆⋆⋆⋆⋆

Code No: RR420507 Set No. 3
IV B.Tech II Semester Supplementary Examinations, June 2007
NEURAL NETWORKS
( Common to Computer Science & Engineering and Electronics &
Computer Engineering)
Time: 3 hours Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks
⋆⋆⋆⋆⋆

1. (a) Consider a multilayer feed forward network, all the neurons of which operate
in their linear regions. Justify the statement that such a network is equivalent
to a single layer feed forward network. [8]
(b) What is the advantage of having hidden layers in an ANN? On what basis is
the number of hidden layers and the number of neurons in each hidden layer
selected? [3+5]

2. (a) Write the advantages and disadvantages of the perceptron. [8]
(b) Explain the Least Mean Square (LMS) algorithm. [8]

3. Explain the backpropagation algorithm and derive the expressions for the weight update relations. [8+8]

4. What is the Hopfield network? Describe how it can be used for analog-to-digital conversion. [4+12]

5. (a) What is the Kohonen layer architecture? Explain its features. [4+4]
(b) Explain Kohonen's learning algorithm. [4+4]

6. (a) Explain briefly the counter propagation training algorithm. [10]
(b) Explain the various applications of counter propagation. [6]

7. Draw the architectural diagram of the ART network and explain its classification operation in detail. [4+6+6]

8. Describe how a neural network may be trained for a pattern recognition task.
Illustrate with an example. [16]

⋆⋆⋆⋆⋆

Code No: RR420507 Set No. 4
IV B.Tech II Semester Supplementary Examinations, June 2007
NEURAL NETWORKS
( Common to Computer Science & Engineering and Electronics &
Computer Engineering)
Time: 3 hours Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks
⋆⋆⋆⋆⋆

1. (a) Give a flow graph model of an artificial neural network and explain its working.
(b) Distinguish between unipolar and bipolar activation functions used in artificial neural networks, giving at least two examples of each. [6+10]
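For reference, common examples of each kind (these particular functions are illustrative choices, not the only acceptable answers): unipolar functions produce outputs in [0, 1], bipolar functions in [−1, +1].

% Unipolar: binary step and logistic sigmoid
f(x) = \begin{cases} 1 & x \ge 0\\ 0 & x < 0 \end{cases},
\qquad
f(x) = \frac{1}{1 + e^{-x}}

% Bipolar: signum and hyperbolic tangent
f(x) = \operatorname{sgn}(x),
\qquad
f(x) = \tanh(x)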

2. Compare the similarities and differences between single layer and multilayer perceptrons, and discuss in what respects multilayer perceptrons are advantageous over single layer perceptrons. [6+6+4]

3. (a) Explain why it is preferable to have different values of η for weights leading to the units in different layers in a feed forward neural network. [8]
(b) Discuss a few tasks that can be performed by a backpropagation algorithm.
[8]

4. (a) Explain briefly the applications of the Boltzmann completion network. [4]
(b) With suitable examples, explain different types of associative memories. [3x4=12]

5. (a) What is the Kohonen layer architecture? Explain its features. [4+4]
(b) Explain Kohonen's learning algorithm. [4+4]

6. Derive expressions for the weight updation involved in counter propagation. [16]

7. (a) What are the advantages of the ART network? Discuss gain control in the ART network. [3+5]
(b) Discuss in detail the orienting subsystem in an ART network. [8]

8. Describe how a neural network may be trained for a pattern recognition task.
Illustrate with an example. [16]

⋆⋆⋆⋆⋆
