




SIGNED: ……………………

REG No: BU/UG/2007/109






Qn. 1
Consider the following fitness function:
Fitness(<bitstring>) = number of 1’s in the bitstring where both adjacent bits are 0’s
For example, fitness(“010110100”) = 2, fitness(“100011011”) = 0, and fitness(“010101010”)= 4.
(Notice that 1's in the first or last position in the string are not counted in the fitness function,
even if adjacent to a 0.)
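The fitness function above can be sketched in Python (a minimal helper; the name `fitness` follows the question's notation):

```python
def fitness(bits: str) -> int:
    """Count the 1s whose left and right neighbours are both 0.

    Bits in the first or last position never qualify, since they
    lack one of the two neighbours.
    """
    return sum(
        1
        for i in range(1, len(bits) - 1)
        if bits[i] == "1" and bits[i - 1] == "0" and bits[i + 1] == "0"
    )
```

This reproduces the worked examples: fitness("010110100") = 2, fitness("100011011") = 0, and fitness("010101010") = 4.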
Assume the design of our genetic algorithm is:
(a) Create an initial population containing 4 random 9-bit strings.
(b) Discard the 2 least-fit ones (break ties randomly).
(c) Do a cross-over using the 2 most fit.
The 2 children that result, together with their parents, constitute the next generation.
(d) Randomly mutate 1 bit in 1 string in the population.
(e) Go to step (b)
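Steps (b)–(d) can be sketched as one generation update (a minimal sketch; the helper names and the fixed crossover point of 4 are illustrative assumptions):

```python
import random

def fitness(bits):
    # A 1 counts when flanked by 0s on both sides (a "010" window).
    return sum(1 for i in range(1, len(bits) - 1) if bits[i - 1:i + 2] == "010")

def next_generation(pop, point=4, rng=random):
    # (b) discard the 2 least fit by keeping the 2 most fit
    #     (ties are broken by sort order here, not randomly).
    p1, p2 = sorted(pop, key=fitness, reverse=True)[:2]
    # (c) one-point crossover of the two survivors.
    c1, c2 = p1[:point] + p2[point:], p2[:point] + p1[point:]
    new_pop = [p1, p2, c1, c2]
    # (d) flip one random bit in one random string.
    i = rng.randrange(len(new_pop))
    j = rng.randrange(len(new_pop[i]))
    s = new_pop[i]
    new_pop[i] = s[:j] + ("1" if s[j] == "0" else "0") + s[j + 1:]
    return new_pop
```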

Start with the initial population below and show what the next two (2) generations might look
like. Explain your reasoning.
(Your answer must be in the format below.)
Generation 0
Generation 1 Explanation
Generation 2 Explanation
Generation 0
fitness(“011110110”) = 0
fitness(“011001011”) = 1
fitness(“101101110”) = 0
fitness(“000010101”) = 2
Generation 1 Explanation
011001011    One-point crossover
000010101    Crossover mask: 000011111
011010101    Parents: 011001011, 000010101
000001011    Offspring: 011010101, 000001011

fitness("011001011") = 1
fitness("000010101") = 2
fitness("011010101") = 2  (most fit)
fitness("000001011") = 1

Mutating one bit of the fit individual 000010101 (flipping bit 3 gives 001010101, raising its fitness from 2 to 3) maintains diversity and helps the search escape local maxima; hence the mutation step.
Generation 2 Explanation
001010101    fitness("011001011") = 1  (discarded)
011010101    fitness("001010101") = 3
001010101    fitness("011010101") = 2
011010101    fitness("000001011") = 1  (discarded)

The mutation step flipped bit 3 of 000010101, producing 001010101.

One-point crossover
Crossover mask: 000011111
Parents: 001010101, 011010101
Offspring: 001010101, 011010101 (identical to the parents, since both share the same last five bits)
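The mask-based crossover used in both generations can be sketched as follows (a minimal sketch; the convention that a mask bit of 0 copies from the first parent and 1 from the second is inferred from the worked offspring):

```python
def mask_crossover(p1: str, p2: str, mask: str):
    # Mask bit '0' takes the bit from the first parent, '1' from the
    # second; the second child makes the complementary choices.
    c1 = "".join(a if m == "0" else b for a, b, m in zip(p1, p2, mask))
    c2 = "".join(b if m == "0" else a for a, b, m in zip(p1, p2, mask))
    return c1, c2
```

With mask "000011111" this reproduces Generation 1's offspring: mask_crossover("011001011", "000010101", "000011111") gives ("011010101", "000001011").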

The reproduction operators, crossover and mutation, create fitter individuals from a fairly fit or unfit population: crossover combines good genes from the two parents into offspring, and mutation modifies genes so that fitter individuals can appear.

(a) Diagnostic rules lead from observed effects to hidden causes? TRUE or FALSE
TRUE, because diagnostic rules provide possible explanations for what one observes or knows to be the case.
(b) What does it mean to say that entailment for first-order logic is semi decidable?
It means that algorithms exist that return YES to every entailed sentence, but no algorithm exists
that also returns NO to every non-entailed sentence.
(c) Describe each of the following AI concepts and briefly explain its most significant aspect:
i. Inference Rules
ii. Searle’s Chinese-Room Story
iii. Fuzzy Logic
iv. Vector-Space Model
v. Weight Space
Inference rules
Inference rules are conditional statements, each having two parts: an if clause and a then clause.
These rules give expert systems the ability to find solutions to diagnostic and prescriptive problems.

An expert system's rule base is made up of many such inference rules. They are entered as
separate rules and it is the inference engine that uses them together to draw conclusions.

The most significant aspect of inference rules

Inference rules support reasoning that closely resembles human reasoning. Reasoning is the cognitive process of looking for reasons for beliefs, conclusions, actions or feelings. Thus, when a conclusion is drawn, it is possible to understand how that conclusion was reached.

Searle’s Chinese-Room Story

Searle (1999) summarized the Chinese Room argument concisely:
Imagine a native English speaker who knows no Chinese locked in a room full of boxes of
Chinese symbols (a data base) together with a book of instructions for manipulating the symbols
(the program). Imagine that people outside the room send in other Chinese symbols which,
unknown to the person in the room, are questions in Chinese (the input). And imagine that by
following the instructions in the program the man in the room is able to pass out Chinese
symbols which are correct answers to the questions (the output). The program enables the person
in the room to pass the Turing Test for understanding Chinese but he does not understand a word
of Chinese.
Its most significant aspect
No digital computer can understand solely by virtue of running a formal program. Searle’s
Chinese-Room Story demonstrates the intrinsic inability of formal processing to produce
thinking, not to mention sentience. It therefore shows that digital computers cannot think.
That is, while suitably programmed computers may appear to converse in natural language, they
are not capable of understanding the language, even in principle.
Fuzzy Logic
Fuzzy logic is a form of multi-valued logic derived from fuzzy set theory to deal with reasoning
that is approximate rather than precise. In fuzzy logic the degree of truth of a statement can range
between 0 and 1 and is not constrained to the two truth values {true, false} of classical predicate logic.

Fuzzy logic usually uses IF-THEN rules, or constructs that are equivalent. Rules are usually
expressed in the form:
IF variable IS property THEN action
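A rule of this form can be sketched with a simple membership function (the triangular membership and the temperature numbers are illustrative assumptions):

```python
def warm(temp_c: float) -> float:
    # Triangular membership: fully "warm" at 25, fading to 0 at 15 and 35.
    return max(0.0, 1.0 - abs(temp_c - 25.0) / 10.0)

def fan_speed(temp_c: float, max_speed: float = 100.0) -> float:
    # IF temperature IS warm THEN run the fan, scaled by the degree of truth.
    return warm(temp_c) * max_speed
```

At 30 degrees the statement "temperature is warm" is true to degree 0.5, so the rule fires at half strength rather than all-or-nothing.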

The most significant aspect of fuzzy logic

Reasoning with uncertainty: - Knowledge is almost always incomplete and uncertain. The set
of methods for using uncertain knowledge in combination with uncertain data in the reasoning
process is called reasoning with uncertainty. Fuzzy logic is one such method.

Vector-Space Model
The vector space model is the most widely used method for information retrieval (IR). This
model is used to encode documents, where each document in the corpus is represented by a
vector whose elements are values associated with the words in the document.
These values can also be weighted to represent the importance of the terms in the semantics of
the document. A corpus of n documents is represented by an m × n matrix A, where m is the
number of words in the lexicon or the number of terms used to index the documents. The
element Aij represents the frequency of word i in document j. The column space of this term-by-
document matrix determines the semantics of the corpus.

The vector space model procedure can be divided into three stages. The first stage is the
document indexing where content bearing terms are extracted from the document text. The
second stage is the weighting of the indexed terms to enhance retrieval of document relevant to
the user. The last stage ranks the document with respect to the query according to a similarity
The most significant aspect of the Vector-Space Model
The model creates a space in which both documents and queries are represented by vectors.
Vector space model is an algebraic model for representing text documents (and any objects, in
general) as vectors of identifiers, such as, for example, index terms. It is used in information
filtering, information retrieval, indexing and relevancy rankings.
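The three stages can be sketched end to end (the toy corpus and query are illustrative; cosine similarity stands in for the unspecified similarity measure):

```python
import math

# Toy corpus: each column of A will be one document's term-frequency vector.
docs = ["the cat sat", "the cat and the dog", "dog bites dog"]
lexicon = sorted({w for d in docs for w in d.split()})           # indexing
A = [[d.split().count(term) for d in docs] for term in lexicon]  # m x n matrix

def cosine(u, v):
    # Similarity = angle between vectors, ignoring their lengths.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Encode the query in the same space and rank documents by similarity.
query = "the dog"
q = [query.split().count(term) for term in lexicon]
doc_vectors = list(zip(*A))                                      # columns of A
scores = [cosine(q, d) for d in doc_vectors]
```

For this query, "the cat and the dog" (the second document) scores highest, since it shares the most query terms.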

Weight Space
Weight space is the set of all possible values of weights. A weight, in an artificial neural
network, is a parameter associated with a connection from one neuron, M, to another neuron N. It
corresponds to a synapse in a biological neuron, and it determines how much notice the neuron N
pays to the activation it receives from neuron M. If the weight is positive, the connection is
called excitatory, while if the weight is negative, the connection is called inhibitory.
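The role of a weight can be sketched with a single neuron (the sigmoid activation is an illustrative choice):

```python
import math

def neuron_output(inputs, weights, bias=0.0):
    # Weighted sum of incoming activations, squashed by a sigmoid.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))
```

A positive (excitatory) weight pushes the output toward 1, a negative (inhibitory) weight pushes it toward 0, and a larger magnitude means neuron N pays more notice to that input.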

The most significant aspect of the Weight Space

Error surface: - When total error of a back propagation-trained neural network is expressed as a
function of the weights, and graphed (to the extent that this is possible with a large number of
weights), the result is a surface termed the error surface. The course of learning can be traced on
the error surface: as learning is supposed to reduce error, when the learning algorithm causes the
weights to change, the current point on the error surface should descend into a valley of the error surface. The "point" defined by the current set of weights is termed a point in weight space.
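Descending the error surface can be sketched with a single weight (the quadratic error E(w) = (w - 3)^2 and the learning rate are illustrative assumptions):

```python
def dE(w):
    # Derivative of the toy error surface E(w) = (w - 3) ** 2.
    return 2.0 * (w - 3.0)

w, lr = 0.0, 0.1
for _ in range(100):
    w -= lr * dE(w)  # each step moves the point in weight space downhill
```

After the loop, w sits close to 3, the bottom of the valley: each update moved the point in weight space a little way down the error surface.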

REFERENCES
Web sources, accessed on 27th November 2010.