Eigenvalues and Eigenvectors Apps
Christy Deken
Consider a system of differential equations that relates two different populations. We can think of one population as predators and the other as prey. Suppose the two populations are related by:
dx/dt = 2x - 3y
dy/dt = x - 2y
This is the same as the matrix differential equation for the vector x(t) = (x(t), y(t)):

x'(t) = A x(t),  where  A = [ 2  -3 ]
                            [ 1  -2 ].

This matrix has an eigenvector v1 = (3, 1).
This eigenvector has a specific meaning: if the population of the prey is three times that of the predators, that will always remain true. If x(0) is proportional to the eigenvector, then x'(t) = x(t), and therefore x(t) = x(0)e^t. The role of the eigenvalue λ associated with the eigenvector is seen from A v1 = λ v1, with λ = 1 here. The eigenvalue tells us the growth rate of the solution of the differential equation.
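The claim above can be checked numerically. The following is a minimal sketch (not part of the original paper) that uses NumPy to verify that v1 = (3, 1) is an eigenvector of A and that the eigenvalues of A are 1 and -1:

```python
# Sketch: numerically verifying the eigenvector claim for the
# predator-prey matrix A using NumPy.
import numpy as np

A = np.array([[2.0, -3.0],
              [1.0, -2.0]])

# A v1 should equal 1 * v1 for v1 = (3, 1), i.e. the eigenvalue is 1.
v1 = np.array([3.0, 1.0])
print(A @ v1)                       # -> [3. 1.]

# The full set of eigenvalues of A.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(sorted(eigenvalues))          # -> [-1.0, 1.0]
```

Since A v1 = 1 · v1, a solution starting proportional to v1 grows like e^t, matching the discussion above.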
Could one use eigenvectors in measuring the importance of a web site?
Imagine that a site is important if other important web sites link to it. If x1, x2, . . . , xn are the importances of the n different web sites of the world, we would want the importance of each web site to be proportional to the sum of the importances of every web site that links to it. This system of equations could look like this:
x1 = K(x345 + x67 + x34)
x2 = K(x14 + x245 + x99)
...
where K is the constant of proportionality, and the right-hand side of each equation is the sum of the importances of all the sites that link to the web site on the left. Now picture a huge (n x n) matrix A whose (i, j) entry a_ij is 1 if web site j links to web site i, and 0 otherwise. This allows us to rewrite the system as:
           n
x_i = K ·  Σ  a_ij x_j
          j=1
This shows that the vector of importances we are looking for is an eigenvector of the matrix A. To find it, we might compute all of the eigenvectors of A and use the one with all-positive components. Once that eigenvector is found, the most important web site is the one with the largest entry in it. The Google web search engine uses a variant of this idea to rank a large number of web sites.
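As a hedged sketch of this idea (not Google's actual algorithm), the dominant eigenvector can be found by power iteration. The 4-site link matrix below is invented for illustration; adding a small damping term, as PageRank does, makes every entry of the iteration matrix positive, so the Perron-Frobenius theorem guarantees a unique all-positive dominant eigenvector:

```python
# Sketch: PageRank-style power iteration on a made-up 4-site web.
# A[i][j] = 1 means site j links to site i (hypothetical link data).
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

n = A.shape[0]
S = A / A.sum(axis=0)        # column-stochastic: divide each column by out-degree
d = 0.85                     # damping factor, as in PageRank
M = d * S + (1 - d) / n      # every entry positive -> Perron-Frobenius applies

# Power iteration: the iterate converges to the dominant eigenvector of M,
# whose entries are all positive.
x = np.full(n, 1.0 / n)
for _ in range(100):
    x = M @ x
    x = x / x.sum()

print(np.argmax(x))          # index of the most important site
```

After convergence x satisfies M x = x, so x is the eigenvector with eigenvalue 1, and its largest entry marks the most important site, as described above.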
Now, let's talk about how to find eigenvectors and eigenvalues. Suppose λ = 0. Finding the eigenvectors is then the same as finding nonzero vectors in the null space of A. We can do this if det(A) = 0, but not otherwise. If we do not assume that λ = 0, it is still a null-space problem, but for the matrix A - λI, where I stands for the (n x n) identity matrix. This new matrix has a nonzero vector in its null space only if its determinant is equal to 0. We refer to this condition, det(A - λI) = 0, as the characteristic equation of A. If the null space of A - λI contains a nonzero vector, it is called an eigenspace of A with eigenvalue λ. Thus the algorithm for finding eigenvalues and eigenvectors, according to Evans M. Harrell II, is as follows:
I. First find the eigenvalues by solving the characteristic equation. Call the solutions λ1, . . . , λn.
II. For each eigenvalue λk, use row reduction to find a basis of the eigenspace Ker(A - λk I).
If λk is an eigenvalue, the existence of a nonzero vector in this null space is guaranteed. Any such vector is an eigenvector.
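The two steps can be illustrated on the 2x2 matrix from the predator-prey example. This is a sketch, not the paper's own computation; it uses the fact that for a 2x2 matrix the characteristic polynomial is λ² - trace(A)·λ + det(A), and it takes a null-space basis vector from the SVD of A - λI:

```python
# Sketch: the two-step eigenvalue/eigenvector algorithm for the
# 2x2 predator-prey matrix, using NumPy.
import numpy as np

A = np.array([[2.0, -3.0],
              [1.0, -2.0]])

# Step I: solve the characteristic equation det(A - lam*I) = 0.
# For a 2x2 matrix this is lam^2 - trace(A)*lam + det(A) = 0.
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
eigenvalues = np.roots(coeffs)          # the roots are 1 and -1

# Step II: for each eigenvalue, a nonzero vector in Ker(A - lam*I) is an
# eigenvector; the last right-singular vector of the SVD spans that
# null space, since A - lam*I is singular.
for lam in eigenvalues:
    _, _, Vt = np.linalg.svd(A - lam * np.eye(2))
    v = Vt[-1]
    print(lam, v)                       # each v satisfies A v = lam * v
```

For λ = 1 the null space of A - I is spanned by (3, 1) (up to normalization), recovering the eigenvector v1 from the first example.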
There are many more uses of eigenvectors and eigenvalues than those discussed here; this paper only begins to explore their applications. The examples above, however, demonstrate their importance.
References
1. Evans M. Harrell II, Eigenvectors and Eigenvalues (2001). Online at http://www.mathphysics.com/calc/eigen.html.
2. Herbert S. Wilf, University of Pennsylvania (April 2001). Online at http://www.math.utsc.utoronto.ca/b24/KendallWei.pdf.