
Sardar Patel Institute of Technology, Mumbai

Department of Electronics and Telecommunication Engineering


B.Tech. Sem-VII (2022-2023)

Principles of Soft Computing

Experiment no. 3
Name: Shweta Choudhary UID: 2019120015

Aim: Madaline algorithm implementation for character recognition.

Software used: Google Colab, Python

Theory:
Madaline

■ Stands for Multiple Adaptive Linear Neuron.

■ It consists of many Adalines in parallel with a single output unit whose value is based on certain selection rules.
■ It uses the majority-vote rule (a small sketch of the selection rules follows this list).
■ On using this rule, the output unit answers either true or false, depending on the majority of the Adaline responses.
■ On the other hand, if the AND rule is used, the output is true if and only if both the inputs are true, and so on.
■ The training process of a Madaline is similar to that of an Adaline.
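
For illustration, the following is a minimal sketch (not part of the experiment code) of how a fixed output unit could combine bipolar (+1/-1) Adaline responses under the majority-vote rule and under the AND rule; the function names are hypothetical:

# Illustrative sketch only: combining bipolar Adaline outputs into a single
# Madaline output using two different selection rules.

def majority_vote(adaline_outputs):
    # Output +1 if more Adalines respond +1 than -1, otherwise -1.
    return 1 if sum(adaline_outputs) > 0 else -1

def and_rule(adaline_outputs):
    # Output +1 only if every Adaline responds +1.
    return 1 if all(o == 1 for o in adaline_outputs) else -1

print(majority_vote([1, 1, -1]))  # +1: two of the three Adalines respond +1
print(and_rule([1, 1, -1]))       # -1: not all Adalines respond +1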

Architecture

● It consists of "n" units in the input layer, "m" units in the Adaline layer, and "1" unit in the Madaline layer (a sketch of the signal flow is given just after this list).
● Each neuron in the Adaline and Madaline layers has a bias of excitation "1".
● The Adaline layer sits between the input layer and the Madaline layer; the Adaline layer is considered the hidden layer.
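
The signal flow through this architecture can be sketched as follows; this is an illustrative outline only, and the weight values W, b, v and b0 below are placeholders rather than the weights used later in the experiment:

import numpy as np

def madaline_forward(x, W, b, v, b0):
    # One forward pass: n inputs -> m Adalines -> 1 Madaline output.
    # x: (n,) input vector, W: (m, n) Adaline weights, b: (m,) Adaline biases,
    # v: (m,) fixed output weights, b0: fixed output bias.
    z_in = W @ x + b                  # net input of each Adaline
    z = np.where(z_in >= 0, 1, -1)    # bipolar activation of the hidden layer
    y_in = v @ z + b0                 # net input of the single Madaline unit
    return 1 if y_in >= 0 else -1     # final bipolar output

# Example with n = 2 inputs and m = 2 Adalines feeding one output unit
W = np.array([[0.5, 0.5], [-0.5, -0.5]])
b = np.array([0.2, 0.2])
v = np.array([0.5, 0.5])
b0 = 0.5
print(madaline_forward(np.array([1, -1]), W, b, v, b0))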

Uses

The use of a hidden layer gives the net a computational capability that is not found in
single-layer nets, but it also complicates the training process to some extent.
Training Algorithm

In this training algorithm, only the weights between the input layer and the hidden (Adaline)
layer are adjusted, while the weights for the output unit are fixed. The weights v1, v2, ..., vm
and the bias b0 that enter into output unit Y are determined so that the response of unit Y is 1
whenever a majority of the Adaline units send it a signal of 1 (the majority-vote rule described above).
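
As an illustration of this idea, the sketch below follows one common statement of the MRI (Madaline Rule I) update, in which only the Adaline weights and biases are adapted while the output unit stays fixed; the learning rate alpha and the variable names are assumptions made for the example:

import numpy as np

def mri_update(x, t, W, b, alpha=0.1):
    # One MRI training step (sketch). x: (n,) bipolar input, t: bipolar target,
    # W: (m, n) Adaline weights, b: (m,) Adaline biases.
    # The majority-vote output unit itself is never changed.
    z_in = W @ x + b
    z = np.where(z_in >= 0, 1, -1)
    y = 1 if z.sum() >= 0 else -1      # fixed majority-vote output unit
    if y == t:
        return W, b                    # no update when the response is already correct
    if t == 1:
        # Nudge the Adaline whose net input is closest to zero towards +1.
        j = np.argmin(np.abs(z_in))
        W[j] += alpha * (1 - z_in[j]) * x
        b[j] += alpha * (1 - z_in[j])
    else:
        # Push every Adaline with a positive net input towards -1.
        for k in np.where(z_in > 0)[0]:
            W[k] += alpha * (-1 - z_in[k]) * x
            b[k] += alpha * (-1 - z_in[k])
    return W, b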

Program Code & Output:

Code :-

from math import sqrt

def prepare_pattern(pattern):
    # Convert the '#'/'-' characters to 1/0 and normalise the vector to unit length.
    value_pattern = list(pattern.replace('#', '1').replace('-', '0'))
    value = list(map(int, value_pattern))
    root = sqrt(sum(value))
    for index, number in enumerate(value):
        value[index] /= root
    return value

def read_data(file_name):
    # Read a pattern file: number of patterns, width, height, then (label, rows) blocks.
    with open(file_name, newline='') as file:
        data = file.read().strip().splitlines()

    letters = []
    patterns = []
    number_of_patterns = int(data.pop(0))
    horizontal = int(data.pop(0))   # characters per row
    vertical = int(data.pop(0))     # number of rows per pattern

    for i in range(number_of_patterns):
        letter = data.pop(0)
        pattern = ''
        for row in range(vertical):
            pattern += data.pop(0)
        patterns.append(prepare_pattern(pattern))
        letters.append(letter)
    return letters, patterns

def training_data(data):
    # Each normalised training pattern becomes the weight vector of one Adaline.
    patterns = data[1]
    layers = []
    for i in range(len(patterns)):
        weights = []
        for j in range(len(patterns[i])):
            weights.append(patterns[i][j])
        layers.append(weights)
    return layers

def compute(test, layer, layer_index):
    # Net input of one Adaline: dot product of its weight vector with the test pattern.
    value = 0
    for i in range(len(layer[layer_index])):
        value += layer[layer_index][i] * test[i]
    return value

def madaline(layer, test_data):
    # For every stored character, pick the test pattern that gives the highest net input.
    outputs = []
    layer_index = 0
    for letter in test_data[0]:
        print(f'Letter {letter}')
        for test in test_data[1]:
            output = compute(test, layer, layer_index)
            print(output)
            outputs.append(output)
        found_result = max(outputs)
        found_letter = test_data[0][outputs.index(found_result)]
        layer_index += 1
        outputs = []
        print(f'Letter {found_letter} was recognized. '
              f'Level of confidence = {round(found_result, 2)}')

train_data = read_data('train.txt')
test_data = read_data('test.txt')
layer = training_data(train_data)
madaline(layer, test_data)

Train.txt :-

3
4
4
6
#--#
-##-
-##-
#--#
A
#--#
-##-
-#--
#---
Z
####
--#-
-#--
####

Test.txt :-

3
4
4
6
#---
-##-
-##-
#--#
A
#--#
-##-
----
#---
Z
####
--#-
-#--
###-
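
As a usage note, when the program is run in Google Colab the two input files can be created directly from the notebook before calling read_data; this is a minimal sketch assuming the exact contents listed above:

# Write train.txt with the contents shown in the "Train.txt" listing above;
# test.txt is created the same way from the "Test.txt" listing.
train_text = (
    "3\n4\n4\n"
    "6\n#--#\n-##-\n-##-\n#--#\n"
    "A\n#--#\n-##-\n-#--\n#---\n"
    "Z\n####\n--#-\n-#--\n####\n"
)

with open('train.txt', 'w') as f:
    f.write(train_text)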
Output :-

Conclusion:
In this experiment, we learned about the Madaline network and also implemented the Madaline
algorithm for recognising characters.
