Many kinds of mathematical objects can be treated as vectors: functions, harmonic modes, quantum states, and
frequencies, for example. In these cases, the concept of direction loses its ordinary meaning, and is given an
abstract definition. Even so, if this abstract direction is unchanged by a given linear transformation, the prefix
"eigen" is used, as in eigenfunction, eigenmode, eigenstate, and eigenfrequency.
Contents
1 History
2 Definitions
2.1 Left and right eigenvectors
3 Characteristic equation
3.1 Example
4 Existence and multiplicity of eigenvalues
4.1 Shear
1 of 18 1/29/2009 5:14 PM
Eigenvalue, eigenvector and eigenspace - Wikipedia, the free encyclopedia http://en.wikipedia.org/wiki/Eigenvector
History
Eigenvalues are often introduced in the context of linear algebra or matrix theory. Historically, however, they
arose in the study of quadratic forms and differential equations.
In the 18th century, Euler studied the rotational motion of a rigid body and discovered the importance of the principal axes.
As Lagrange realized, the principal axes are the eigenvectors of the inertia matrix.[1] In the early 19th century,
Cauchy saw how their work could be used to classify the quadric surfaces, and generalized it to arbitrary
dimensions.[2] Cauchy also coined the term racine caractéristique (characteristic root) for what is now called
eigenvalue; his term survives in characteristic equation.[3]
Fourier used the work of Laplace and Lagrange to solve the heat equation by separation of variables in his
famous 1822 book Théorie analytique de la chaleur.[4] Sturm developed Fourier's ideas further and he brought
them to the attention of Cauchy, who combined them with his own ideas and arrived at the fact that symmetric
matrices have real eigenvalues.[2] This was extended by Hermite in 1855 to what are now called Hermitian
matrices.[3] Around the same time, Brioschi proved that the eigenvalues of orthogonal matrices lie on the unit
circle,[2] and Clebsch found the corresponding result for skew-symmetric matrices.[3] Finally, Weierstrass
clarified an important aspect in the stability theory started by Laplace by realizing that defective matrices can
cause instability.[2]
In the meantime, Liouville studied eigenvalue problems similar to those of Sturm; the discipline that grew out of
their work is now called Sturm-Liouville theory.[5] Schwarz studied the first eigenvalue of Laplace's equation on
general domains towards the end of the 19th century, while Poincaré studied Poisson's equation a few years
later.[6]
At the start of the 20th century, Hilbert studied the eigenvalues of integral operators by viewing the operators as
infinite matrices.[7] He was the first to use the German word eigen to denote eigenvalues and eigenvectors in
1904, though he may have been following a related usage by Helmholtz. "Eigen" can be translated as "own",
"peculiar to", "characteristic", or "individual"—emphasizing how important eigenvalues are to defining the
unique nature of a specific transformation. For some time, the standard term in English was "proper value", but
the more distinctive term "eigenvalue" is standard today.[8]
The first numerical algorithm for computing eigenvalues and eigenvectors appeared in 1929, when Von Mises
published the power method. One of the most popular methods today, the QR algorithm, was proposed
independently by John G.F. Francis[9][10] and Vera Kublanovskaya[11] in 1961.[12]
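The power method mentioned above can be sketched in a few lines. The following pure-Python sketch (the test matrix is a hypothetical choice, not from the text) repeatedly applies A and normalizes; under the usual assumption of a strictly dominant eigenvalue, the iterate converges to the dominant eigenvector:

```python
import math

def power_method(A, steps=100):
    """Estimate the dominant eigenvalue/eigenvector of A by repeated application."""
    n = len(A)
    x = [1.0] * n
    for _ in range(steps):
        y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(v * v for v in y))
        x = [v / norm for v in y]
    # Rayleigh quotient x^T A x estimates the dominant eigenvalue
    Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    lam = sum(Ax[i] * x[i] for i in range(n))
    return lam, x

lam, x = power_method([[2.0, 1.0], [1.0, 2.0]])
print(round(lam, 6))   # 3.0, the dominant eigenvalue of this matrix
```

The QR algorithm is far more robust in practice, but the power method remains the standard first example of an iterative eigensolver.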
Definitions
See also: Eigenplane
Linear transformations of a vector space, such as rotation, reflection, stretching, compression, shear or any
combination of these, may be visualized by the effect they produce on vectors. In other words, they are vector
functions. More formally, in a vector space L, a vector function A is defined if for each vector x of L there
corresponds a unique vector y = A(x) of L. For the sake of brevity, the parentheses around the vector on which
the transformation is acting are often omitted. A vector function A is linear if it has the following two properties:
Additivity: A(x + y) = Ax + Ay
Homogeneity: A(αx) = αAx
where x and y are any two vectors of the vector space L and α is any scalar.[13] Such a function is variously
called a linear transformation, linear operator, or linear endomorphism on the space L.
The key equation in this definition is the eigenvalue equation, Ax = λx. That is to say that the vector x has the
property that its direction is not changed by the transformation A, but that it is only scaled by a factor of λ. Most
vectors x will not satisfy such an equation: a typical vector x changes direction when acted on by A, so that Ax is
not a multiple of x. This means that only certain special vectors x are eigenvectors, and only certain special
numbers λ are eigenvalues. Of course, if A is a multiple of the identity matrix, then no vector changes direction,
and all non-zero vectors are eigenvectors.
The requirement that the eigenvector be non-zero is imposed because the equation A0 = λ0 holds for every A
and every λ. Since the equation is always trivially true, it is not an interesting case. In contrast, an eigenvalue
can be zero in a nontrivial way. Each eigenvector is associated with a specific eigenvalue. One eigenvalue can
be associated with several or even with an infinite number of eigenvectors.
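A tiny numeric illustration of the eigenvalue equation Ax = λx (the diagonal matrix and the vectors are hypothetical choices, not from the text): the standard basis vectors of a diagonal matrix are eigenvectors, while a generic vector is not merely scaled:

```python
def mat_vec(A, v):
    """Apply a 2x2 matrix to a 2-vector."""
    return [A[0][0]*v[0] + A[0][1]*v[1],
            A[1][0]*v[0] + A[1][1]*v[1]]

A = [[2.0, 0.0],
     [0.0, 3.0]]      # diagonal matrix: the coordinate axes are its eigen-directions

x = [1.0, 0.0]        # eigenvector: A x = 2 x
print(mat_vec(A, x))  # [2.0, 0.0]

y = [1.0, 1.0]        # not an eigenvector: A y = (2, 3) is not a multiple of y
print(mat_vec(A, y))  # [2.0, 3.0]
```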
The eigenvectors corresponding to different eigenvalues are linearly independent,[16] meaning, in particular, that
in an n-dimensional space the linear transformation A cannot have more than n eigenvectors with different
eigenvalues.[17]
If a basis is defined in a vector space, all vectors can be expressed in terms of components. For finite-dimensional
vector spaces of dimension n, linear transformations can be represented by n × n square matrices.
Conversely, every such square matrix corresponds to a linear transformation for a given basis. Thus, in a
two-dimensional vector space R2 fitted with the standard basis, the eigenvector equation for a linear transformation
A can be written in the following matrix representation:

\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \lambda \begin{pmatrix} x \\ y \end{pmatrix}
Left and right eigenvectors

The word eigenvector formally refers to the right eigenvector xR. It is defined by the above eigenvalue
equation AxR = λRxR, and is the most commonly used eigenvector. However, the left eigenvector xL exists as
well and is defined, as a row vector, by xLA = λLxL.
Characteristic equation
When a transformation is represented by a square matrix A, the eigenvalue equation can be expressed as

(A - \lambda I)\,x = 0.

If A − λI had an inverse, both sides could be multiplied by the inverse to obtain the trivial solution x = 0. Thus we
require there to be no inverse, which by a standard result of linear algebra means that the determinant equals zero:

\det(A - \lambda I) = 0.

The determinant requirement is called the characteristic equation (less often, the secular equation) of A, and the
left-hand side is called the characteristic polynomial. When expanded, this gives a polynomial equation for λ.
Neither the eigenvector x nor its components appear in the characteristic equation.
Example
The matrix

A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}

defines a linear transformation of the real plane. The eigenvalues of this transformation are given by the
characteristic equation

\det\begin{pmatrix} 2-\lambda & 1 \\ 1 & 2-\lambda \end{pmatrix} = (2-\lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = 0.

The roots of this equation (i.e. the values of λ for which the equation holds) are λ = 1 and λ = 3. Having found
the eigenvalues, it is possible to find the eigenvectors. Considering first the eigenvalue λ = 3, we have

(A - 3I)\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} -1 & 1 \\ 1 & -1 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.

Both rows of this matrix equation reduce to the single linear equation x = y. To find an eigenvector, we are free
to choose any value for x, so by picking x = 1 and setting y = x, we find the eigenvector to be

\begin{pmatrix} 1 \\ 1 \end{pmatrix}.

We can check that this is an eigenvector by checking that A(1, 1)^T = (3, 3)^T = 3(1, 1)^T. For the eigenvalue
λ = 1, a similar calculation gives the single equation x = −y and the eigenvector (1, −1)^T.
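The matrix in this example did not survive the page extraction; A = [[2, 1], [1, 2]] is assumed below, since it is consistent with the stated eigenvalues (1 and 3) and with the condition x = y for λ = 3. A quick numeric check via the 2×2 characteristic polynomial λ² − (trace)λ + det = 0:

```python
import math

A = [[2.0, 1.0],
     [1.0, 2.0]]     # assumed example matrix, consistent with eigenvalues 1 and 3

# characteristic polynomial of a 2x2 matrix: lambda^2 - trace*lambda + det = 0
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = math.sqrt(tr * tr - 4 * det)
eigenvalues = sorted([(tr - disc) / 2, (tr + disc) / 2])
print(eigenvalues)    # [1.0, 3.0]

# check the eigenvector (1, 1) for lambda = 3: A (1,1)^T = (3, 3)^T
v = [1.0, 1.0]
Av = [A[0][0]*v[0] + A[0][1]*v[1], A[1][0]*v[0] + A[1][1]*v[1]]
print(Av)             # [3.0, 3.0]
```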
The difficulty of finding the roots of the characteristic polynomial, and hence the eigenvalues, increases rapidly
with the degree of the polynomial (the dimension of the vector space). There are exact solutions for
dimensions below 5, but for dimensions greater than or equal to 5 there are generally no exact solutions, and one
has to resort to numerical methods to find them approximately. For large symmetric sparse matrices, the Lanczos
algorithm is one method used to compute eigenvalues and eigenvectors.
As well as distinct roots, the characteristic equation may also have repeated roots. However, having repeated
roots does not imply there are multiple distinct (i.e. linearly independent) eigenvectors with that eigenvalue. The
algebraic multiplicity of an eigenvalue is defined as the multiplicity of the corresponding root of the
characteristic polynomial. The geometric multiplicity of an eigenvalue is defined as the dimension of the
associated eigenspace, i.e. number of linearly independent eigenvectors with that eigenvalue.
Over a complex space, the sum of the algebraic multiplicities will equal the dimension of the vector space, but
the sum of the geometric multiplicities may be smaller. In that case, there may not be enough eigenvectors to
span the entire space. This is intimately related to the question of whether a given matrix may be diagonalized
by a suitable choice of coordinates.
Shear
[Figure: Horizontal shear. The shear angle φ is given by k = cot φ.]

The matrix of a horizontal shear transformation is

A = \begin{pmatrix} 1 & k \\ 0 & 1 \end{pmatrix}.

The characteristic equation is λ2 − 2λ + 1 = (1 − λ)2 = 0, which has a
single, repeated root λ = 1. Therefore, the eigenvalue λ = 1 has algebraic multiplicity 2. The eigenvectors are
found as solutions of

(A - \lambda I)\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 & k \\ 0 & 0 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.

The last equation is equivalent to y = 0, which is a straight line along the x axis. This line represents the
one-dimensional eigenspace. In the case of shear, the algebraic multiplicity of the eigenvalue (2) is greater than
its geometric multiplicity (1, the dimension of the eigenspace). The eigenvector is a vector along the x axis. The
case of vertical shear, with transformation matrix

A = \begin{pmatrix} 1 & 0 \\ k & 1 \end{pmatrix},

is dealt with in a similar way; the eigenvector in vertical shear is along the y axis. Repeatedly applying the shear
transformation turns the direction of any vector in the plane closer and closer to the direction of the eigenvector.
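A numeric check of the multiplicities, assuming the horizontal shear matrix [[1, k], [0, 1]] (consistent with the characteristic equation (1 − λ)² = 0 stated in the text):

```python
import math

k = 0.5
A = [[1.0, k],
     [0.0, 1.0]]

# (A - I)x = 0 reduces to k*y = 0, i.e. y = 0: the eigenspace is the x axis,
# so geometric multiplicity 1 despite algebraic multiplicity 2.
v = [1.0, 0.0]
Av = [A[0][0]*v[0] + A[0][1]*v[1], A[1][0]*v[0] + A[1][1]*v[1]]
print(Av == v)          # True: v is an eigenvector with eigenvalue 1

# repeated shearing turns any vector toward the x axis
w = [0.0, 1.0]
for _ in range(1000):
    w = [w[0] + k * w[1], w[1]]
angle = math.degrees(math.atan2(w[1], w[0]))
print(angle < 1.0)      # True: nearly aligned with the x axis
```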
Unequal scaling
A transformation that scales by a factor k1 along the x axis and by a factor k2 along the y axis is represented by
the diagonal matrix

A = \begin{pmatrix} k_1 & 0 \\ 0 & k_2 \end{pmatrix},

whose characteristic equation (k1 − λ)(k2 − λ) = 0 has roots λ = k1 and λ = k2. Substituting λ = k1 into
(A − λI)x = 0 leaves the single condition y = 0. Thus, the eigenspace is the x-axis. Similarly, substituting λ = k2
shows that the corresponding eigenspace is the y-axis. In this case, both eigenvalues have algebraic and geometric
multiplicities equal to 1. If a given eigenvalue is greater than 1, the vectors are stretched in the direction of the
corresponding eigenvector; if less than 1, they are shrunk in that direction. Negative eigenvalues correspond to
reflections followed by a stretch or shrink. In general, matrices that are diagonalizable over the real numbers
represent scalings and reflections: the eigenvalues represent the scaling factors (and appear as the diagonal
terms), and the eigenvectors are the directions of the scalings.
The figure shows the case where k1 > 1 and 1 > k2 > 0. The rubber sheet is stretched along the x axis and
simultaneously shrunk along the y axis. After this transformation is applied many times, almost any vector on
the surface of the rubber sheet will be oriented closer and closer to the direction of the x axis (the direction of
stretching). The exceptions are vectors along the y-axis, which gradually shrink away to nothing.
Rotation
A rotation in a plane is a transformation that describes motion of a vector, plane, coordinates, etc., around a
fixed point. Clearly, for rotations other than through 0° and 180°, every vector in the real plane will have its
direction changed, and thus there cannot be any eigenvectors. But this is not necessarily true if we consider the
same matrix over a complex vector space.
A counterclockwise rotation in the horizontal plane about the origin at an angle φ is represented by the matrix

R = \begin{pmatrix} \cos\varphi & -\sin\varphi \\ \sin\varphi & \cos\varphi \end{pmatrix}.
The characteristic equation of R is λ2 − 2λ cos φ + 1 = 0. This quadratic equation has discriminant
D = 4(cos2 φ − 1) = −4 sin2 φ, which is a negative number whenever φ is not equal to a multiple of 180°. A
rotation of 0°, 360°, … is just the identity transformation (a uniform scaling by +1), while a rotation of 180°,
540°, …, is a reflection (a uniform scaling by −1). Otherwise, as expected, there are no real eigenvalues or
eigenvectors for rotation in the plane.
The characteristic equation has two complex roots λ1 and λ2. If we choose to think of the rotation matrix as a
linear operator on the complex two-dimensional space, we can consider these complex eigenvalues. The roots are
complex conjugates of each other: λ1,2 = cos φ ± i sin φ = e^{±iφ}, each with an algebraic multiplicity equal to 1,
where i is the imaginary unit.

The first eigenvector is found by substituting the first eigenvalue, λ1, back into the eigenvalue equation:

(R - \lambda_1 I)\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} -i\sin\varphi & -\sin\varphi \\ \sin\varphi & -i\sin\varphi \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.

The last equation is equivalent to the single equation x = iy, and again we are free to set x = 1 to give the
eigenvector

\begin{pmatrix} 1 \\ -i \end{pmatrix}.
Similarly, substituting in the second eigenvalue gives the single equation x = −iy, and so the eigenvector is given
by

\begin{pmatrix} 1 \\ i \end{pmatrix}.
Although not diagonalizable over the reals, the rotation matrix is diagonalizable over the complex numbers, and
again the eigenvalues appear on the diagonal. Thus rotation matrices acting on complex spaces can be thought
of as scaling matrices, with complex scaling factors.
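A numeric check of the complex eigenvalues e^{±iφ}, assuming the standard counterclockwise rotation matrix:

```python
import cmath
import math

phi = math.pi / 3
R = [[math.cos(phi), -math.sin(phi)],
     [math.sin(phi),  math.cos(phi)]]

# roots of the characteristic equation lambda^2 - 2 lambda cos(phi) + 1 = 0
tr = R[0][0] + R[1][1]
disc = cmath.sqrt(tr * tr - 4)          # negative discriminant -> complex roots
l1, l2 = (tr + disc) / 2, (tr - disc) / 2
print(abs(l1 - cmath.exp(1j * phi)) < 1e-12)   # True: l1 = e^{i phi}

# the eigenvector (1, -i) for l1 satisfies R v = l1 v
v = [1, -1j]
Rv = [R[0][0]*v[0] + R[0][1]*v[1], R[1][0]*v[0] + R[1][1]*v[1]]
print(all(abs(Rv[i] - l1 * v[i]) < 1e-12 for i in range(2)))   # True
```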
Infinite-dimensional spaces

If the vector space is an infinite-dimensional Banach space, the notion of eigenvalues can be generalized to the
concept of the spectrum. The spectrum is the set of scalars λ for which (T − λ)^{−1} is not defined; that is, such
that T − λ has no bounded inverse.

Clearly, if λ is an eigenvalue of T, λ is in the spectrum of T. In general, the converse is not true. There are
operators on Hilbert or Banach spaces which have no eigenvectors at all. Consider the bilateral shift on the
Hilbert space ℓ 2(Z) (that is, the space of all sequences of scalars … a−1, a0, a1, a2, … such that
\sum_n |a_n|^2 converges): it has no eigenvectors, yet its spectrum is non-empty.
In infinite-dimensional spaces, the spectrum of a bounded operator is always nonempty. This is also true for an
unbounded self adjoint operator. Via its spectral measures, the spectrum of any self adjoint operator, bounded or
otherwise, can be decomposed into absolutely continuous, pure point, and singular parts. (See Decomposition of
spectrum.)
The hydrogen atom is an example where both types of spectra appear. The eigenfunctions of the hydrogen atom
Hamiltonian are called eigenstates and are grouped into two categories. The bound states of the hydrogen atom
correspond to the discrete part of the spectrum (they have a discrete set of eigenvalues which can be computed
by Rydberg formula) while the ionization processes are described by the continuous part (the energy of the
collision/ionization is not quantized).
Eigenfunctions
A common example of such operators on infinite-dimensional spaces is the action of differential operators on
function spaces. As an example, on the space of infinitely differentiable functions, the process of differentiation
defines a linear operator, since

\frac{d}{dt}\bigl(af(t) + bg(t)\bigr) = a\frac{df(t)}{dt} + b\frac{dg(t)}{dt},

where f(t) and g(t) are differentiable functions and a and b are constants.
The eigenvalue equation for a linear differential operator is then a set of one or more differential equations, and
the eigenvectors are commonly called eigenfunctions. The simplest case is the eigenvalue equation for
differentiation of a real-valued function of a single real variable. In this case, the eigenvalue equation becomes
the linear differential equation

\frac{d}{dx}f(x) = \lambda f(x).

Here λ is the eigenvalue associated with the function f(x). This eigenvalue equation has a solution for every
value of λ. If λ is zero, the solutions are the constant functions; in general, the solutions are the exponential
functions

f(x) = Ae^{λx}.

If we expand our horizons to complex-valued functions, the value of λ can be any complex number. The
spectrum of d/dx is therefore the whole complex plane. This is an example of a continuous spectrum.
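The eigenfunction property can be checked numerically: a central finite difference approximates f′ and agrees with λf for f(x) = Ae^{λx} (the particular λ, A, and sample point are arbitrary illustrative choices):

```python
import math

lam, A = 0.7, 2.0
f = lambda x: A * math.exp(lam * x)      # candidate eigenfunction of d/dx

x0, h = 1.3, 1e-6
deriv = (f(x0 + h) - f(x0 - h)) / (2 * h)   # central-difference derivative
print(abs(deriv - lam * f(x0)) < 1e-4)       # True: f' = lam * f at x0
```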
Waves on a string

The displacement h(x, t) of a stretched string fixed at both ends, like the vibrating string of a musical
instrument, satisfies the wave equation, and separation of variables h(x, t) = X(x)T(t) leads to the pair of
ordinary differential equations

\frac{d^2 X}{dx^2} = -\frac{\omega^2}{c^2} X, \qquad \frac{d^2 T}{dt^2} = -\omega^2 T.

Each of these is an eigenvalue equation (the unfamiliar form of the eigenvalues, −ω²/c² and −ω², is chosen
merely for convenience). For any values of the eigenvalues, the eigenfunctions are given by

X(x) = \sin\left(\frac{\omega x}{c} + \phi\right), \qquad T(t) = \sin(\omega t + \psi).

If we impose boundary conditions -- that the ends of the string are fixed with X(x) = 0 at x = 0 and x = L, for
example -- we can constrain the eigenvalues. For those boundary conditions, we find

\sin(\phi) = 0 \quad\text{and}\quad \sin\left(\frac{\omega L}{c}\right) = 0.

Thus, the phase φ = 0 and the constant ω is constrained to take one of the values ω_n = nπc/L, where n is any
integer. Thus the clamped string supports a family of standing waves of the form

h_n(x, t) = \sin\left(\frac{n\pi x}{L}\right)\sin(\omega_n t).

From the point of view of our musical instrument, the frequency ωn is the frequency of the nth harmonic.
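A quick check of the allowed frequencies ω_n = nπc/L: the corresponding mode shapes sin(ω_n x/c) vanish at both clamped ends (the wave speed and string length values below are illustrative assumptions):

```python
import math

c, L = 340.0, 1.0                       # illustrative wave speed and string length
omegas = [n * math.pi * c / L for n in range(1, 4)]   # first three harmonics

for omega in omegas:
    X0 = math.sin(omega * 0.0 / c)      # displacement at the left end
    XL = math.sin(omega * L / c)        # displacement at the right end
    print(abs(X0) < 1e-9, abs(XL) < 1e-9)   # True True for every harmonic
```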
Eigendecomposition
The spectral theorem for matrices can be stated as follows. Let A be a square n × n matrix. Let q1 ... qk be an
eigenvector basis, i.e. an indexed set of k linearly independent eigenvectors, where k is the dimension of the
space spanned by the eigenvectors of A. If k = n, then A can be written

A = Q \Lambda Q^{-1}

where Q is the square n × n matrix whose i-th column is the basis eigenvector qi of A, and Λ is the diagonal
matrix whose diagonal elements are the corresponding eigenvalues, i.e. Λii = λi.
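For instance, with the symmetric matrix [[2, 1], [1, 2]] (a hypothetical example with eigenvalues 3 and 1 and eigenvectors (1, 1) and (1, −1)), multiplying Q Λ Q⁻¹ recovers A:

```python
Q = [[1.0, 1.0],        # columns are the eigenvectors (1,1) and (1,-1)
     [1.0, -1.0]]
Lam = [[3.0, 0.0],      # corresponding eigenvalues on the diagonal
       [0.0, 1.0]]

detQ = Q[0][0]*Q[1][1] - Q[0][1]*Q[1][0]        # = -2
Qinv = [[ Q[1][1]/detQ, -Q[0][1]/detQ],
        [-Q[1][0]/detQ,  Q[0][0]/detQ]]

def matmul(X, Y):
    """Multiply two 2x2 matrices."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = matmul(matmul(Q, Lam), Qinv)
print(A)    # [[2.0, 1.0], [1.0, 2.0]]
```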
Applications
Schrödinger equation
Molecular orbitals
In quantum mechanics, and in particular in atomic and molecular physics, within the Hartree-Fock theory, the
atomic and molecular orbitals can be defined by the eigenvectors of the Fock operator. The corresponding
eigenvalues are interpreted as ionization potentials via Koopmans' theorem. In this case, the term eigenvector is
used in a somewhat more general meaning, since the Fock operator is explicitly dependent on the orbitals and
their eigenvalues. If one wants to emphasize this aspect, one speaks of a nonlinear eigenvalue problem. Such
equations are usually solved by an iteration procedure, called in this case the self-consistent field method. In
quantum chemistry, one often represents the Hartree-Fock equation in a non-orthogonal basis set. This
particular representation is a generalized eigenvalue problem called the Roothaan equations.
Geology and glaciology

In geology, especially in the study of glacial till, eigenvectors and eigenvalues are used as a method by which a
mass of information about a clast fabric's constituents' orientation and dip can be summarized in 3-D space by
six numbers. In the field, a geologist may collect such data for hundreds or thousands of clasts in a soil sample,
which can then be compared graphically, such as in a Tri-Plot (Sneed and Folk) diagram,[19][20] or as a
Stereonet on a Wulff Net.[21] The output for the orientation tensor is in the three orthogonal (perpendicular)
axes of space. Eigenvectors output from programs such as Stereo32[22] are in the order E1 ≥ E2 ≥ E3, with E1
being the primary orientation of clast orientation/dip, E2 the secondary, and E3 the tertiary, in terms of
strength. The clast orientation is defined as the direction of the eigenvector, on a compass rose of 360°. Dip is
measured as the eigenvalue, the modulus of the tensor: this is valued from 0° (no dip) to 90° (vertical). The
relative values of E1, E2, and E3 are dictated by the nature of the sediment's fabric. If E1 = E2 = E3, the fabric
is said to be isotropic. If E1 = E2 > E3, the fabric is planar. If E1 > E2 > E3, the fabric is linear. See A Practical
Guide to the Study of Glacial Sediments by Benn & Evans, 2004.[23]
Factor analysis
In factor analysis, the eigenvectors of a covariance matrix or correlation matrix correspond to factors, and
eigenvalues to the variance explained by these factors. Factor analysis is a statistical technique used in the social
sciences and in marketing, product management, operations research, and other applied sciences that deal with
large quantities of data. The objective is to explain most of the covariability among a number of observable
random variables in terms of a smaller number of unobservable latent variables called factors. The observable
random variables are modeled as linear combinations of the factors, plus unique variance terms. Eigenvalues are
used in the analysis performed by Q-methodology software; factors with eigenvalues greater than 1.00 are
considered significant, explaining an important amount of the variability in the data, while eigenvalues less than
1.00 are considered too weak, not explaining a significant portion of the data variability.
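A minimal sketch of the eigenvalue-greater-than-1 rule for a two-variable correlation matrix (the correlation r = 0.6 is a hypothetical value; for two standardized variables the eigenvalues are 1 ± r):

```python
import math

r = 0.6
C = [[1.0, r],
     [r, 1.0]]        # correlation matrix of two variables (hypothetical r)

# eigenvalues from the 2x2 characteristic polynomial
tr = C[0][0] + C[1][1]
det = C[0][0]*C[1][1] - C[0][1]*C[1][0]
disc = math.sqrt(tr*tr - 4*det)
eigs = sorted([(tr - disc)/2, (tr + disc)/2], reverse=True)
print([round(e, 6) for e in eigs])     # [1.6, 0.4]

retained = [e for e in eigs if e > 1.0]
print(len(retained))                   # 1: only the first factor is kept
```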
Vibration analysis
Eigenvalue problems occur naturally in the vibration analysis of mechanical structures with many degrees of
freedom. The eigenvalues are used to determine the natural frequencies of vibration, and the eigenvectors
determine the shapes of these vibrational modes. The orthogonality properties of the eigenvectors allow
decoupling of the differential equations, so that the system can be represented as a linear combination of the
eigenvectors. The eigenvalue problem of complex structures is often solved using finite element analysis.
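A sketch of a natural-frequency computation for a hypothetical two-mass, three-spring chain (equal masses m between fixed walls, equal stiffnesses k): the squared natural frequencies are the eigenvalues of M⁻¹K.

```python
import math

m, k = 1.0, 4.0                 # illustrative mass and spring stiffness
# M^{-1} K for the chain wall-m-m-wall with three equal springs
B = [[2*k/m, -k/m],
     [-k/m, 2*k/m]]

# eigenvalues of the 2x2 matrix B are the squared natural frequencies omega^2
tr = B[0][0] + B[1][1]
det = B[0][0]*B[1][1] - B[0][1]*B[1][0]
disc = math.sqrt(tr*tr - 4*det)
omega_sq = sorted([(tr - disc)/2, (tr + disc)/2])
print([round(math.sqrt(w), 3) for w in omega_sq])   # [2.0, 3.464]
```

The eigenvectors of B (here (1, 1) and (1, −1)) are the in-phase and out-of-phase mode shapes.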
Eigenfaces
Tensor of inertia
In mechanics, the eigenvectors of the inertia tensor define the principal axes of a rigid body. The tensor of
inertia is a key quantity required in order to determine the rotation of a rigid body around its center of mass.
Stress tensor
In solid mechanics, the stress tensor is symmetric and so can be decomposed into a diagonal tensor with the
eigenvalues on the diagonal and eigenvectors as a basis. Because it is diagonal, in this orientation, the stress
tensor has no shear components; the components it does have are the principal components.
Eigenvalues of a graph
In spectral graph theory, an eigenvalue of a graph is defined as an eigenvalue of the graph's adjacency matrix A,
or (increasingly) of the graph's Laplacian matrix, which is either T − A or I − T^{−1/2} A T^{−1/2}, where T is a
diagonal matrix holding the degree of each vertex, and in T^{−1/2}, 0 is substituted for 0^{−1/2}. The kth principal
eigenvector
of a graph is defined as either the eigenvector corresponding to the kth largest eigenvalue of A, or the
eigenvector corresponding to the kth smallest eigenvalue of the Laplacian. The first principal eigenvector of the
graph is also referred to merely as the principal eigenvector.
The principal eigenvector is used to measure the centrality of its vertices. An example is Google's PageRank
algorithm. The principal eigenvector of a modified adjacency matrix of the World Wide Web graph gives the
page ranks as its components. This vector corresponds to the stationary distribution of the Markov chain
represented by the row-normalized adjacency matrix; however, the adjacency matrix must first be modified to
ensure a stationary distribution exists. The second principal eigenvector can be used to partition the graph into
clusters, via spectral clustering. Other methods are also available for clustering.
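A toy sketch of the stationary-distribution idea on a hypothetical 3-node graph: power iteration with a row-stochastic matrix P converges to the left eigenvector with eigenvalue 1.

```python
P = [[0.0, 0.5, 0.5],    # row-normalized adjacency matrix of a hypothetical graph
     [1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0]]

pi = [1/3, 1/3, 1/3]     # start from the uniform distribution
for _ in range(200):     # power iteration: pi <- pi P
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

# pi is now (approximately) a left eigenvector of P with eigenvalue 1: pi P = pi
check = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]
print(all(abs(check[j] - pi[j]) < 1e-9 for j in range(3)))  # True
print([round(p, 3) for p in pi])                            # [0.4, 0.4, 0.2]
```

Real PageRank additionally mixes in a damping/teleportation term so that the modified matrix is irreducible and aperiodic, guaranteeing a unique stationary distribution; the small chain above happens to have that property already.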
See also
Nonlinear eigenproblem
Quadratic eigenvalue problem
Eigenspectrum
Notes
1. ^ See Hawkins 1975, §2.
2. ^ See Hawkins 1975, §3.
3. ^ See Kline 1972, pp. 807-808.
4. ^ See Kline 1972, p. 673.
5. ^ See Kline 1972, pp. 715-716.
6. ^ See Kline 1972, pp. 706-707.
7. ^ See Kline 1972, p. 1063.
8. ^ See Aldrich 2006.
9. ^ J.G.F. Francis, "The QR Transformation, I" (part 1), The Computer Journal, vol. 4, no. 3, pp. 265-271 (1961); "The QR Transformation, II" (part 2), The Computer Journal, vol. 4, no. 4, pp. 332-345 (1962).
10. ^ John G.F. Francis (1934- ) devised the "QR transformation" for computing the eigenvalues of matrices. Born in London in 1934, he presently (2007) resides in Hove, England (near Brighton). In 1954 he worked for the National Research Development Corporation (NRDC). In 1955-1956 he attended Cambridge University. He then returned to the NRDC, where he served as assistant to Christopher Strachey. At this time he devised the QR transformation. In 1961 he left the NRDC to work at Ferranti Corporation, Ltd. and then at the University of Sussex. Subsequently, he held positions with various industrial organizations and consultancies. His interests encompassed artificial intelligence, computer languages, and systems engineering. He is currently retired. (See: http://www-sbras.nsc.ru/mathpub/na-net/db/showfile.phtml?v07n34.html#1 .)
11. ^ Vera N. Kublanovskaya, "On some algorithms for the solution of the complete eigenvalue problem", USSR Computational Mathematics and Mathematical Physics, vol. 3, pp. 637-657 (1961). Also published in: Zhurnal Vychislitel'noi Matematiki i Matematicheskoi Fiziki, vol. 1, no. 4, pp. 555-570 (1961).
12. ^ See Golub & van Loan 1996, §7.3; Meyer 2000, §7.3.
13. ^ See Beezer 2006, Definition LT on p. 507; Strang 2006, p. 117; Kuttler 2007, Definition 5.3.1 on p. 71; Shilov 1977, Section 4.21 on p. 77; Rowland, Todd and Weisstein, Eric W., "Linear transformation" (http://mathworld.wolfram.com/LinearTransformation.html), from MathWorld, a Wolfram Web Resource.
14. ^ See Korn & Korn 2000, Section 14.3.5a; Friedberg, Insel & Spence 1989, p. 217.
15. ^ For a proof of this lemma, see Shilov 1977, p. 109, and the lemma for the eigenspace.
16. ^ For a proof of this lemma, see Roman 2008, Theorem 8.2 on p. 186; Shilov 1977, p. 109; Hefferon 2001, p. 364; Beezer 2006, Theorem EDELI on p. 469; and the lemma for linear independence of eigenvectors.
17. ^ See Shilov 1977, p. 109.
18. ^ Definition according to Weisstein, Eric W., "Shear" (http://mathworld.wolfram.com/Shear.html), from MathWorld, a Wolfram Web Resource.
19. ^ Graham, D. and Midgley, N., 2000. Earth Surface Processes and Landforms (25), pp. 1473-1477.
20. ^ Sneed, E.D. and Folk, R.L., 1958. "Pebbles in the lower Colorado River, Texas, a study of particle morphogenesis". Journal of Geology 66(2): 114-150.
21. ^ "GIS-stereoplot: an interactive stereonet plotting module for ArcView 3.0 geographic information system" (http://dx.doi.org/10.1016/S0098-3004(97)00122-2).
22. ^ Stereo32 (http://www.ruhr-uni-bochum.de/hardrock/downloads.htm).
23. ^ Benn, D. and Evans, D., 2004. A Practical Guide to the Study of Glacial Sediments. London: Arnold. pp. 103-107.
24. ^ Xirouhakis, A.; Votsis, G.; Delopoulus, A. (2004), Estimation of 3D motion and structure of human faces (PDF), National Technical University of Athens, http://www.image.ece.ntua.gr/papers/43.pdf.
References
Akivis, Max A.; Goldberg, Vladislav V. (1969), Tensor Calculus (in Russian), Science Publishers, Moscow.
Aldrich, John (2006), "Eigenvalue, eigenfunction, eigenvector, and related terms", in Jeff Miller (ed.), Earliest Known Uses of Some of the Words of Mathematics, http://jeff560.tripod.com/e.html, retrieved on 22 August 2006.
Alexandrov, Pavel S. (1968), Lecture Notes in Analytical Geometry (in Russian), Science Publishers, Moscow.
Beezer, Robert A. (2006), A First Course in Linear Algebra, free online book under GNU licence, University of Puget Sound, http://linear.ups.edu/.
Bowen, Ray M.; Wang, Chao-Cheng (1980), Linear and Multilinear Algebra, Plenum Press, New York, NY, ISBN 0-306-37508-7.
Brown, Maureen (October 2004), Illuminating Patterns of Perception: An Overview of Q Methodology.
Carter, Tamara A.; Tapia, Richard A.; Papaconstantinou, Anne, Linear Algebra: An Introduction to Linear Algebra for Pre-Calculus Students, Rice University, online edition, http://ceee.rice.edu/Books/LA/index.html, retrieved on 19 February 2008.
Cohen-Tannoudji, Claude (1977), "Chapter II. The mathematical tools of quantum mechanics", Quantum Mechanics, John Wiley & Sons, ISBN 0-471-16432-1.
Curtis, Charles W., Linear Algebra: An Introductory Approach, 347 p., Springer; 4th ed. 1984, corr. 7th printing edition (August 19, 1999), ISBN 0387909923.
Demmel, James W. (1997), Applied Numerical Linear Algebra, SIAM, ISBN 0-89871-389-7.
Fraleigh, John B.; Beauregard, Raymond A. (1995), Linear Algebra (3rd ed.), Addison-Wesley Publishing Company, ISBN 0-201-83999-7 (international edition).
Friedberg, Stephen H.; Insel, Arnold J.; Spence, Lawrence E. (1989), Linear Algebra (2nd ed.), Englewood Cliffs, NJ: Prentice Hall, ISBN 0-13-537102-3.
Gelfand, I. M. (1971), Lecture Notes in Linear Algebra (in Russian), Science Publishers, Moscow.
Gohberg, Israel; Lancaster, Peter; Rodman, Leiba (2005), Indefinite Linear Algebra and Applications, Basel-Boston-Berlin: Birkhäuser Verlag, ISBN 3-7643-7349-0.
Golub, Gene H.; van Loan, Charles F. (1996), Matrix Computations (3rd ed.), Johns Hopkins University Press, Baltimore, MD, ISBN 978-0-8018-5414-9.
Golub, Gene F.; van der Vorst, Henk A. (2000), "Eigenvalue computation in the 20th century", Journal of Computational and Applied Mathematics 123: 35-65, doi:10.1016/S0377-0427(00)00413-1.
Greub, Werner H. (1975), Linear Algebra (4th ed.), Springer-Verlag, New York, NY, ISBN 0-387-90110-8.
Halmos, Paul R. (1987), Finite-Dimensional Vector Spaces (8th ed.), New York, NY: Springer-Verlag, ISBN 0387900934.
Hawkins, T. (1975), "Cauchy and the spectral theory of matrices", Historia Mathematica 2: 1-29, doi:10.1016/0315-0860(75)90032-4.
Hefferon, Jim (2001), Linear Algebra, online book, St Michael's College, Colchester, Vermont, USA, http://joshua.smcvt.edu/linearalgebra/.
Horn, Roger A.; Johnson, Charles F. (1985), Matrix Analysis, Cambridge University Press, ISBN 0-521-30586-1 (hardback), ISBN 0-521-38632-2 (paperback).
Kline, Morris (1972), Mathematical Thought from Ancient to Modern Times, Oxford University Press, ISBN 0-195-01496-0.
Korn, Granino A.; Korn, Theresa M. (2000), Mathematical Handbook for Scientists and Engineers: Definitions, Theorems, and Formulas for Reference and Review, 1152 p., Dover Publications, 2nd revised edition, ISBN 0-486-41147-8.
Kuttler, Kenneth (2007), An Introduction to Linear Algebra (PDF), online e-book, Brigham Young University, http://www.math.byu.edu/~klkuttle/Linearalgebra.pdf.
Lancaster, P. (1973), Matrix Theory (in Russian), Moscow: Science Publishers.
Larson, Ron; Edwards, Bruce H. (2003), Elementary Linear Algebra (5th ed.), Houghton Mifflin Company, ISBN 0-618-33567-6.
Lipschutz, Seymour (1991), Schaum's Outline of Theory and Problems of Linear Algebra, Schaum's outline series (2nd ed.), New York, NY: McGraw-Hill Companies, ISBN 0-07-038007-4.
Meyer, Carl D. (2000), Matrix Analysis and Applied Linear Algebra, Society for Industrial and Applied Mathematics (SIAM), Philadelphia, ISBN 978-0-89871-454-8.
Pigolkina, T. S. and Shulman, V. S. (1977), "Eigenvalue" (in Russian), in Vinogradov, I. M. (ed.), Mathematical Encyclopedia, Vol. 5, Soviet Encyclopedia, Moscow.
Pigolkina, T. S. and Shulman, V. S. (1977), "Eigenvector" (in Russian), in Vinogradov, I. M. (ed.), Mathematical Encyclopedia, Vol. 5, Soviet Encyclopedia, Moscow.
Roman, Steven (2008), Advanced Linear Algebra (3rd ed.), New York, NY: Springer Science + Business Media, ISBN 978-0-387-72828-5.
Sharipov, Ruslan A. (1996), Course of Linear Algebra and Multidimensional Geometry, online e-book in various formats on arxiv.org, Bashkir State University, Ufa, arXiv:math/0405323v1, ISBN 5-7477-0099-5, http://www.geocities.com/r-sharipov.
Shilov, Georgi E. (1977), Linear Algebra (translated and edited by Richard A. Silverman), New York: Dover Publications, ISBN 0-486-63518-X.
Shores, Thomas S. (2007), Applied Linear Algebra and Matrix Analysis, Springer Science+Business Media, ISBN 0-387-33194-8.
Strang, Gilbert (1993), Introduction to Linear Algebra, Wellesley-Cambridge Press, Wellesley, MA, ISBN 0-961-40885-5.
Strang, Gilbert (2006), Linear Algebra and Its Applications, Thomson, Brooks/Cole, Belmont, CA, ISBN 0-030-10567-6.
External links
Theory
Algorithms
Online calculators
arndt-bruenner.de (http://www.arndt-bruenner.de/mathe/scripts/engl_eigenwert.htm)
bluebit.gr (http://www.bluebit.gr/matrix-calculator/)
wims.unice.fr (http://wims.unice.fr/wims/wims.cgi?session=6S051ABAFA.2&+lang=en&+module=tool%2Flinear%2Fmatrix.en)