
(eBook - Science) Goertzel, Ben - The Structure of Intelligence
Published by Able Scribe
https://www.scribd.com/doc/11372714/eBook-Science-Goertzel-Ben-The-Structure-of-Intelligence
10/16/2011

Dr. Ben Goertzel, a leader in the field of artificial intelligence, expounds on the nature and structure of AGI, or Artificial General Intelligence, as distinguished from the "expert systems" or "narrow AI" (such as voice-recognition software) so common today.

- 0.0 Psychology versus Complex Systems Science
- 0.1 Mind and Computation
- 0.2 Synopsis
- 0.3 Mathematics, Philosophy, Science
- 1.0 Rules
- 1.1 Stochastic and Quantum Computation
- 1.2 Computational Complexity
- 1.3 Network, Program or Network of Programs?
- 2.0 Thought as Optimization
- 2.1 Monte Carlo And Multistart
- 2.2 Simulated Annealing
- 2.3 Multilevel Optimization
- 3.0 Algorithmic Complexity
- 3.1 Randomness
- 3.2 Pattern
- 3.3 Meaningful Complexity
- 3.4 Structural Complexity
- 4.0 The Triarchic Theory Of Intelligence
- 4.1 Intelligence as Flexible Optimization
- 4.2 Unpredictability
- 4.3 Intelligence as Flexible Optimization, Revisited
- 4.4 Mind and Behavior
- 5.0 Justifying Induction
- 5.1 The Tendency to Take Habits
- 5.2 Toward A General Induction Algorithm
- 5.3 Induction, Probability, and Intelligence
- 6.0 The Structure-Mapping Theory of Analogy
- 6.1 A Typology of Analogy
- 6.2 Analogy and Induction
- 6.3 Hierarchical Analogy
- 6.4 Structural Analogy in the Brain
- 7.0 Structurally Associative Memory
- 7.1 Quillian Networks
- 7.2 Implications of Structurally Associative Memory
- 7.3 Image and Process
- 8.0 Deduction and Analogy in Mathematics
- 8.1 The Structure of Deduction
- 8.2 Paraconsistency
- 8.3 Deduction Cannot Stand Alone
- 9.0 The Perceptual Hierarchy
- 9.1 Probability Theory
- 9.2 The Maximum Entropy Principle
- 9.3 The Logic of Perception
- 10.0 Generating Motions
- 10.2 The Motor Control Hierarchy
- 10.3 A Neural-Darwinist Perceptual-Motor Hierarchy
- 11.0 Toward A Quantum Theory of Consciousness
- 11.1 Implications of the Quantum Theory of Consciousness
- 11.2 Consciousness and Emotion
- 12.0 The Structure of Intelligence
- 12.1 Design for a Thinking Machine

We have not yet discussed the possibility that analogies might be justified by analogy. Obviously, analogy as a general mode of thought cannot be justified by analogy; that would be circular reasoning. But, just as particular analogies can be justified by inductive or deductive background knowledge, so can particular analogies be justified by analogical background knowledge. In some cases, the mind may use analogical reasoning to determine how probable it is that similarity A between x and x′ will imply similarity B between x and x′, by observing which similarities have, in similar situations, led to which other similarities. Actually, this sort of analogical background information is a special kind of inductive background information, but it is worth distinguishing.
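The idea of estimating, from past situations, how often similarity A has led to similarity B can be sketched as a simple frequency count. This is a minimal illustration, not Goertzel's formalism; the record format (one set of co-occurring similarities per past situation) is invented for the sketch:

```python
def estimate_implication(history, a, b):
    """Estimate how probable it is that similarity `a` implies similarity `b`,
    as the fraction of past situations exhibiting `a` that also exhibited `b`.

    `history` is a list of sets, each holding the similarities observed
    together in one past situation (a hypothetical record format).
    """
    with_a = [s for s in history if a in s]
    if not with_a:
        return None  # no evidence either way
    return sum(1 for s in with_a if b in s) / len(with_a)

history = [
    {"same_shape", "same_function"},
    {"same_shape", "same_function", "same_color"},
    {"same_shape"},
    {"same_color"},
]
print(estimate_implication(history, "same_shape", "same_function"))  # 2/3
```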

Let us be more precise. Assume that processor P1 executes a long sequence of analogical reasoning processes, and processor P2 observes this sequence, recording for each instance a vector of the form (w,w′,f,w,w′,v,v′,R,r), where r is a number measuring the total prominence of the patterns recognized in that instance, and R is the set of patterns located.

The prominence of a pattern may, as in the previous chapter, be defined as the product of its intensity with its importance. The prominence of a set of patterns S may be crudely defined as |S|K, where |S| is the structural complexity of the set and K is some number representing the prominence of its elements. A very crude way to define K is as the average over all (y,z) in S of the importance of (y,z). A more accurate definition could be formulated by a procedure similar to Algorithm 3.1.
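The two prominence definitions above can be made concrete in a few lines. This is a sketch under stated assumptions: patterns are encoded as (intensity, importance) pairs, and the set's structural complexity |S| is crudely stood in for by the number of patterns, neither of which is how the book formally defines them:

```python
def prominence(intensity, importance):
    """Prominence of a single pattern: its intensity times its importance."""
    return intensity * importance

def set_prominence(patterns):
    """Crude prominence of a set S of patterns: |S| * K, where |S| is taken
    here as the number of patterns (a stand-in for structural complexity)
    and K is the average importance over the set."""
    if not patterns:
        return 0.0
    k = sum(importance for _, importance in patterns) / len(patterns)
    return len(patterns) * k

# Patterns as (intensity, importance) pairs -- a hypothetical encoding.
s = [(0.9, 0.5), (0.4, 0.7)]
print(set_prominence(s))  # 2 patterns, average importance 0.6
```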

Then processor P2 can seek to recognize patterns (y,z) with the property that when x is a pattern in (w,w′,f,w,w′,v,v′,R), r tends to be large. This is an optimization problem: maximize the correlation of r with the intensity of x, over the space of patterns in the first six components of the vector. But it is a particularly difficult optimization problem in the following sense: determining what entities lie in the space over which optimization is taking place is, in itself, a very difficult optimization problem. In other words, it is a constrained optimization problem with very unpleasant constraints. One very simple approach would be the following:

1. Using straightforward optimization or analogy, seek to recognize patterns in (x,x′,f,w,w′,v,v′,R).

2. Over the space of patterns recognized, see which ones correlate best with large r.

3. Seek to recognize new patterns in (x,x′,f,w,w′,v,v′,R) in the vicinity of the answer(s) obtained in Step 2.

Perhaps some other approach would be superior, but the difficulty is that one cannot expect to find patterns in a given narrow vicinity merely because functions in that region correlate well with r. The focus must be on the location of patterns, not the search for large correlation with r.
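The three-step procedure can be sketched as a recognize/score/refine loop. This is a toy illustration only: the pattern space (numbers), the neighborhood function, and the scoring stand-in for "correlation with r" are all invented for the sketch:

```python
def second_level_search(instances, candidates, intensity, neighbors, rounds=3):
    """Toy sketch of the three-step procedure.

    instances:  list of (vector, r) pairs from observed analogy episodes.
    candidates: initial pool of recognized patterns (Step 1's output).
    intensity(p, v): how strongly pattern p appears in vector v.
    neighbors(p):    patterns 'in the vicinity' of p (used in Step 3).
    """
    def score(p):
        # Crude stand-in for the correlation of r with p's intensity:
        # the average of intensity * r over the recorded instances.
        return sum(intensity(p, v) * r for v, r in instances) / len(instances)

    pool = list(candidates)
    best = pool[0]
    for _ in range(rounds):
        best = max(pool, key=score)            # Step 2: pick best correlate
        pool = [best] + list(neighbors(best))  # Step 3: search its vicinity
    return best

# Hypothetical setup: vectors and patterns are just numbers here.
instances = [(3.0, 10.0), (3.2, 8.0)]
intensity = lambda p, v: max(0.0, 1.0 - abs(p - v) / 5.0)
neighbors = lambda p: [p - 0.5, p + 0.5]
print(second_level_search(instances, [2.0, 5.0], intensity, neighbors))  # 3.0
```

The loop drifts from the initial candidate 2.0 toward 3.0, the region where intensity best tracks large r, illustrating how the search is driven by where patterns are found rather than by blind correlation-chasing.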

In this way analogy could be used to determine which analogies are likely to pay off. This might be called second-level analogy, or "learning by analogy about how to learn by analogy." And the same approach could be applied to the analogies involved in analyzing analogies, yielding third-level analogy, or "learning by analogy how to learn by analogy how to learn by analogy." Et cetera. These are tremendously difficult optimization problems, so that learning on these levels is likely to be rather slow. On the other hand, each insight on such a high level will probably have a great impact on the effectiveness of lower-level analogies.

Let us be more precise about these higher levels of learning. A processor which learns by second-level analogy must be connected to a processor which learns by analogy, in such a way that it has access to the inputs and the outputs of this processor. Similarly, a processor which learns by third-level analogy must be connected to a processor which learns on the second level in such a way that it has access to the inputs and the outputs of this second-level processor, and the inputs and outputs of this second-level processor include all the inputs and outputs of at least one first-level processor. In general, the absolute minimum number of inputs required for an n'th-level analogy processor is proportional to n: this is the case, for instance, if every n'th-level processor is connected to exactly one (n-1)'th-level processor. If each n'th-level processor is connected to k (n-1)'th-level processors for some k>1, then the number of inputs required for an n'th-level processor is [1-k^(n+1)]/[1-k].
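The input count is just a geometric series: with fan-in k, an n'th-level processor ultimately sees 1 + k + k^2 + ... + k^n processors' worth of I/O, which the closed form [1-k^(n+1)]/[1-k] sums. A quick numeric check (counting one unit of I/O per processor is an assumption made for the sketch):

```python
def min_inputs(n, k):
    """Processors feeding an n'th-level analogy processor with fan-in k:
    the geometric sum 1 + k + ... + k**n, i.e. (1 - k**(n+1)) / (1 - k)."""
    if k == 1:
        return n + 1  # the k=1 chain: exactly one processor per level
    return (1 - k ** (n + 1)) // (1 - k)

# Cross-check the closed form against the explicit sum.
for n in range(5):
    for k in range(2, 5):
        assert min_inputs(n, k) == sum(k ** i for i in range(n + 1))

print(min_inputs(3, 2))  # 1 + 2 + 4 + 8 = 15
```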

In general, if a set of analogical reasoning processors, with N_k of them learning on level k for each k≤n, is arranged such that each processor learning on level k is connected to all the inputs and outputs of some set of n_k processors on level k-1, then the question of network architecture is the question of the relation between the N_k and the n_k. For instance, if N_k n_k = 8N_{k-1}, then each (k-1)-level processor is being analyzed by eight different k-level processors; but if N_k n_k = N_{k-1}, then each (k-1)-level processor is being analyzed by only one k-level processor.
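The multiplicity claim is a connection count: level k makes N_k * n_k downward connections in total, spread over N_{k-1} targets, so each (k-1)-level processor is analyzed by N_k n_k / N_{k-1} processors on average. A minimal sketch (uniform spreading of connections is an assumption):

```python
def analyzers_per_processor(N_k, n_k, N_km1):
    """Average number of k-level processors analyzing each (k-1)-level
    processor: total downward connections divided by number of targets."""
    return (N_k * n_k) / N_km1

print(analyzers_per_processor(N_k=16, n_k=4, N_km1=8))  # 8.0: N_k n_k = 8 N_{k-1}
print(analyzers_per_processor(N_k=2,  n_k=4, N_km1=8))  # 1.0: N_k n_k = N_{k-1}
```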

This is a hierarchical analogy network: a hierarchy of processors learning by analogy how to best learn by analogy how to best learn by analogy... how to best learn by analogy. As will be explained in later chapters, in order to be effective it must be coupled with a structurally associative memory network, which provides a knowledge base according to which the process of analogy can be executed.
