
1) Hard computing, i.e., conventional computing, requires a precisely stated analytic model and often a lot of computation time.

Soft computing differs from conventional (hard) computing in that it is tolerant of imprecision, uncertainty, partial truth, and approximation. In effect, the role model for soft computing is the human mind.
2) Hard computing is based on binary logic, crisp systems, numerical analysis and crisp software, whereas soft computing is based on fuzzy logic, neural networks and probabilistic reasoning.
3) Hard computing has the characteristics of precision and categoricity; soft computing, those of approximation and dispositionality. Whereas in hard computing imprecision and uncertainty are undesirable properties, in soft computing the tolerance for imprecision and uncertainty is exploited to achieve tractability, lower cost, a high Machine Intelligence Quotient (MIQ) and economy of communication.
4) Hard computing requires programs to be written, uses two-valued logic, is deterministic, requires exact input data, is strictly sequential and produces precise answers; soft computing can evolve its own programs, can use multi-valued or fuzzy logic, incorporates stochastic methods, can deal with ambiguous and noisy data, allows parallel computations and can yield approximate answers. The contrast between two-valued and fuzzy logic is sketched below.
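To make point 4 concrete, here is a minimal Python sketch contrasting a crisp (two-valued) membership test with a fuzzy membership function. The concept "tall" and the 160-200 cm thresholds are invented purely for illustration.

```python
# Crisp (two-valued) vs. fuzzy (multi-valued) logic.
# The 'tall' thresholds below are arbitrary, chosen only for this example.

def crisp_tall(height_cm: float) -> bool:
    # Hard computing view: a person is either tall or not (binary, categorical).
    return height_cm >= 180.0

def fuzzy_tall(height_cm: float) -> float:
    # Soft computing view: membership in 'tall' is a degree in [0, 1],
    # ramping linearly from 160 cm (not tall) to 200 cm (fully tall).
    return min(1.0, max(0.0, (height_cm - 160.0) / 40.0))

for h in (150, 175, 185, 205):
    print(h, crisp_tall(h), round(fuzzy_tall(h), 2))
```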

Introduction: Associative Memory

Pattern association involves associating a new pattern with a stored pattern.


It is a simplified model of human memory.

Types of associative memory:

1. Heteroassociative memory
2. Autoassociative memory
3. Hopfield Net
4. Bidirectional Associative Memory (BAM)

These are usually single-layer networks.

The neural network is first trained to store a set of patterns of the form s : t, where s represents the input vector and t the corresponding output vector.

The network's memory is then tested by presenting it with patterns containing incorrect or missing information and checking whether it can still identify them.

Associative memory can be feedforward or recurrent.

Autoassociative memory cannot hold an infinite number of patterns.

Factors that affect this capacity include the complexity of each pattern and the similarity of the input patterns.

Autoassociative Memory

[Figure: autoassociative memory architecture]

The input and output vectors s and t are the same.


The Hebb rule is used as the learning algorithm; equivalently, the weight matrix can be calculated by summing the outer products of each input-output pair. The autoassociative application algorithm is then used to test the trained network, as in the sketch below.
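A minimal NumPy sketch of this store-and-recall procedure, assuming bipolar (+1/-1) patterns and a sign activation at recall; the stored pattern and the corrupted probe are invented for illustration.

```python
import numpy as np

# Store one bipolar pattern by an outer product (for several patterns,
# sum the outer products of each s : s pair).
s = np.array([1, 1, -1, -1])      # stored pattern (illustrative)
W = np.outer(s, s)                # Hebbian weight matrix

# Application (recall) step: present a corrupted copy of the pattern
# and apply the sign activation to the net input.
noisy = np.array([1, -1, -1, -1]) # one component flipped
recalled = np.sign(noisy @ W)
print(recalled)                   # recovers [ 1  1 -1 -1]
```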

Heteroassociative Memory

[Figure: heteroassociative memory architecture]

The input and output vectors s and t are different.

The Hebb rule is used as the learning algorithm; equivalently, the weight matrix can be calculated by summing the outer products of each input-output pair. The heteroassociative application algorithm is then used to test the trained network, as in the sketch below.
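The same idea for the heteroassociative case, again assuming bipolar vectors; here 4-component inputs are associated with distinct 2-component targets (both pairs invented for illustration).

```python
import numpy as np

# Two illustrative training pairs s : t with n = 4 inputs, m = 2 outputs.
S = np.array([[ 1,  1, -1, -1],
              [-1, -1,  1,  1]])
T = np.array([[ 1, -1],
              [-1,  1]])

# Weight matrix as the sum of outer products over all pairs.
W = sum(np.outer(s, t) for s, t in zip(S, T))

# Application step: recall the target associated with a stored input.
y = np.sign(S[0] @ W)
print(y)                          # prints [ 1 -1], the first target
```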


The Hebb Algorithm

1. Initialize all weights to zero: wij = 0, where i = 1, .., n and j = 1, .., m.

2. For each training pair s : t, repeat:

   Set the input activations: xi = si, where i = 1, .., n.

   Set the output activations: yj = tj, where j = 1, .., m.

   Adjust the weights: wij(new) = wij(old) + xi yj, where i = 1, .., n and j = 1, .., m.
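A direct transcription of these steps, assuming NumPy arrays for the training pairs. A learning-rate parameter alpha (see the next section) scales each update; the plain Hebb rule above corresponds to alpha = 1.

```python
import numpy as np

def hebb_train(pairs, n, m, alpha=1.0):
    # Hebb rule: w_ij(new) = w_ij(old) + alpha * x_i * y_j
    W = np.zeros((n, m))                # step 1: initialize weights to zero
    for s, t in pairs:                  # step 2: for each training pair s : t
        x = np.asarray(s, dtype=float)  # input activations  x_i = s_i
        y = np.asarray(t, dtype=float)  # output activations y_j = t_j
        W += alpha * np.outer(x, y)     # adjust every weight w_ij
    return W

# Example with the pairs from the heteroassociative sketch above.
W = hebb_train([([1, 1, -1, -1], [1, -1]),
                ([-1, -1, 1, 1], [-1, 1])], n=4, m=2)
print(W)
```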

Learning Rate

Data type: Real. Domain: [0, 1]. Typical value: 0.3.

Meaning: training parameter that controls the size of the weight and bias changes during learning of the training algorithm.

