Determinants, Eigen Vectors

Eigen Vectors

An eigen vector is a vector that is scaled by a linear transformation, but not turned aside. Think of an eigen vector as an arrow whose direction is not changed. It may stretch, or shrink, as space is transformed, but it continues to point in the same direction. Most arrows will move, as illustrated by a spinning planet, but some, such as the arrow pointing at the north pole, continue to point in the same direction.

The scaling factor of an eigen vector is called its eigen value. An eigen value only makes sense in the context of an eigen vector, i.e. the arrow whose length is being changed.

In the plane, a rigid rotation of 90° has no eigen vectors, because every vector is turned off its line. However, the reflection that maps y to -y has the x and y axes as eigen vectors. Under this map the x axis is scaled by 1 and the y axis by -1; these are the eigen values corresponding to the two eigen vectors. All other vectors in the plane are moved off their lines.

The y axis, in the above example, is subtle. The direction of the vector has been reversed, yet we still call it an eigen vector, because it lives in the same line as the original vector. It has been scaled by -1, pointing in the opposite direction. An eigen vector stretches, or shrinks, or reverses course, or squashes down to 0. The key is that the output vector is a constant (possibly negative) times the input vector.
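
To make the reflection example above concrete, here is a minimal Python sketch; numpy and the variable names are my own additions, not part of the original discussion. It applies the reflection, written as a matrix acting on row vectors from the right, to the two axes and to a typical vector.

import numpy as np

# The reflection that maps y to -y, applied to row vectors on the right.
M = np.array([[1, 0],
              [0, -1]])

x_axis = np.array([1, 0])    # eigen vector with eigen value 1
y_axis = np.array([0, 1])    # eigen vector with eigen value -1

print(x_axis @ M)            # [1 0], which is 1 times the x axis
print(y_axis @ M)            # [0 -1], which is -1 times the y axis

# A typical vector is knocked off its line, so it is not an eigen vector.
v = np.array([1, 1])
print(v @ M)                 # [1 -1], not a multiple of [1 1]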

These concepts are valid over a division ring, as well as a field. Multiply by K on the left to build the K vector space, and apply the transformation, as a matrix, on the right. However, the following method for deriving eigen values and vectors is based on the determinant, and requires a field.

Finding Eigen Values and Vectors

Given a matrix M implementing a linear transformation, what are its eigen vectors and values? Let the vector x represent an eigen vector and let l be the eigen value. We must solve x*M = lx.

Rewrite lx as x times l times the identity matrix and subtract it from both sides. The right side drops to 0, and the left side is x*M - x*l*identity. Pull x out of both terms and write x*Q = 0, where Q is M with l subtracted from the main diagonal. The eigen vector x lies in the kernel of the map implemented by Q. The entire kernel is known as the eigen space, and of course it depends on the value of l.
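
Here is a small numeric sketch of this rearrangement, assuming the 2x2 matrix [2,1|1,2] and numpy purely for illustration. For a trial eigen value it builds Q and verifies that the corresponding eigen vector lands in the kernel of Q.

import numpy as np

M = np.array([[2, 1],
              [1, 2]])
l = 3                        # a trial eigen value of M
Q = M - l * np.eye(2)        # M with l subtracted from the main diagonal

x = np.array([1, 1])         # an eigen vector belonging to l = 3
print(x @ M)                 # [3 3], which is 3 times x
print(x @ Q)                 # [0. 0.], so x lies in the kernel of Q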

If the eigen space is nontrivial then the determinant of Q must be 0. Expand the determinant, giving a polynomial of degree n in l. (This is where we need a field, to pull all the entries to the left of l, and build a traditional polynomial.) This is called the characteristic polynomial of the matrix. The roots of this polynomial are the eigen values. There are at most n eigen values.
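
As an illustration, this sympy sketch expands the determinant of Q symbolically and solves for its roots; sympy and the sample matrix are my assumptions, not part of the original text.

from sympy import Matrix, eye, solve, symbols

l = symbols('l')
M = Matrix([[2, 1],
            [1, 2]])

Q = M - l * eye(2)           # M with l subtracted from the main diagonal
char_poly = Q.det()          # the characteristic polynomial, degree 2 in l

print(char_poly.expand())    # l**2 - 4*l + 3
print(solve(char_poly, l))   # [1, 3], the eigen values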

Substitute each root in turn and find the kernel of Q. We are looking for the set of vectors x such that x*Q = 0. Let R be the transpose of Q and solve R*x = 0, where x has become a column vector. This is a set of simultaneous equations that can be solved using Gaussian elimination.
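
Continuing the same illustrative example, the sketch below substitutes each root, transposes Q, and asks sympy for the kernel; nullspace() performs the row reduction. The basis vectors it returns are unique only up to scaling.

from sympy import Matrix, eye

M = Matrix([[2, 1],
            [1, 2]])

for l in [1, 3]:             # the eigen values found above
    Q = M - l * eye(2)
    R = Q.T                  # transpose, so the eigen vectors become columns
    kernel = R.nullspace()   # solves R*x = 0 by Gaussian elimination
    print(l, [list(v) for v in kernel])
# prints 1 [[-1, 1]] and 3 [[1, 1]]: a one-dimensional eigen space for each eigen value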

In summary, a fairly straightforward algorithm extracts the eigen values, by solving a polynomial of degree n, then derives the eigen space for each eigen value.

Some eigen values will produce multiple eigen vectors, i.e. an eigen space with more than one dimension. The identity matrix, for instance, has an eigen value of 1, and an n-dimensional eigen space to go with it. In contrast, an eigen value may have multiplicity greater than 1, yet only one independent eigen vector. This is illustrated by [1,1|0,1], a map that tilts the x axis counterclockwise and leaves the y axis alone. The eigen values are 1 and 1, and the only eigen vector, up to scaling, is [0,1], namely the y axis.
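
A quick sketch of this defective case, using the same assumed tools as above; the eigen space for l = 1 comes out one-dimensional even though the eigen value has multiplicity 2.

from sympy import Matrix, eye

M = Matrix([[1, 1],
            [0, 1]])         # tilts the x axis, leaves the y axis alone

Q = M - 1 * eye(2)           # the only eigen value is 1, with multiplicity 2
kernel = Q.T.nullspace()     # eigen space for l = 1
print([list(v) for v in kernel])   # [[0, 1]], just the y axis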

The Same Eigen Value

Let two eigen vectors have the same eigen value. Specifically, let a linear map multiply the vectors v and w by the scaling factor l. By linearity, 3v+4w is also scaled by l. In fact every linear combination of v and w is scaled by l. When a set of vectors has a common eigen value, the entire space spanned by those vectors is an eigen space, with the same eigen value.
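
A small numeric check of this observation; the diagonal matrix is an arbitrary choice in which two basis vectors share the eigen value 5.

import numpy as np

M = np.diag([5, 5, 7])       # the first two axes share the eigen value 5
v = np.array([1, 0, 0])
w = np.array([0, 1, 0])

u = 3 * v + 4 * w            # any linear combination of v and w
print(u @ M)                 # [15 20 0], which is 5 times u, so u is an eigen vector too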

This is not surprising, since the eigen vectors associated with l are precisely the kernel of the transformation defined by the matrix M with l subtracted from the main diagonal. This kernel is a vector space, and so is the eigen space of l.

Select a basis b for the eigen space of l. The vectors in b are eigen vectors, with eigen value l, and every eigen vector with eigen value l is spanned by b. Conversely, an eigen vector with some other eigen value lies outside the span of b.

Different Eigen Values

Different eigen values always lead to independent eigen spaces.

Suppose not, and consider the shortest counterexample. Thus c1x1 + c2x2 + … + ckxk = 0. Here x1 through xk are the eigen vectors, and c1 through ck are the coefficients that prove the vectors form a dependent set. Since the list is as short as possible, every coefficient is nonzero. Furthermore, the vectors represent at least two different eigen values.

Say the first 7 vectors share a common eigen value l. If these 7 vectors are dependent then one of them can be expressed as a linear combination of the other 6. Make this substitution in the relation above and find a shorter list of dependent eigen vectors that do not all share the same eigen value. The first 6 have eigen value l, and the rest have some other eigen value. Remember, we selected the shortest list, so this is a contradiction. Therefore the eigen vectors associated with any given eigen value are independent.

Scale all the coefficients c1 through ck by a common factor s; the sum of cixi is still zero. We will prove that, apart from such a common scaling factor, no other set of coefficients carries these eigen vectors to 0.

If there are two independent sets of coefficients that lead to 0, scale them so the first coefficients in each set are equal, then subtract. This gives a shorter linear combination of dependent eigen vectors that yields 0. More than one vector remains, else cjxj = 0, and xj is the 0 vector. We already showed these dependent eigen vectors cannot share a common eigen value, else they would be linearly independent; thus multiple eigen values are represented. This is a shorter list of dependent eigen vectors with multiple eigen values, which is a contradiction.

If a set of coefficients carries our eigen vectors to 0, it must be a scale multiple of c1, c2, c3, … ck.

Now take the sum of cixi and multiply by M on the right; in other words, apply the linear transformation. The image of 0 ought to be 0. Each term cixi becomes ci*li*xi, where li is the eigen value of xi, so each coefficient is effectively multiplied by the eigen value of its eigen vector. Suppose one of those eigen values is 0. Its terms drop out, leaving a shorter dependent list. That shorter list cannot represent two or more eigen values, since we chose the shortest such list; so it represents just one eigen value, and its vectors are dependent with a common eigen value, which we already ruled out. Therefore all the eigen values are nonzero, and every term survives. Since the eigen values are not all equal, the new coefficients are not a common scale multiple of c1 through ck. Yet they carry the eigen vectors to 0, and we proved that only scale multiples of c1 through ck can do that. This contradiction completes the proof.

Therefore the eigen spaces produced by different eigen values are linearly independent.
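
As a numeric sanity check (not a proof), here is a sketch with an assumed 2x2 matrix whose two eigen vectors have different eigen values; stacking them as the rows of a matrix with nonzero determinant confirms they are independent.

import numpy as np

M = np.array([[2, 1],
              [1, 2]])
v = np.array([1, 1])         # eigen vector with eigen value 3
w = np.array([1, -1])        # eigen vector with eigen value 1

# A nonzero determinant means the rows v and w are linearly independent.
print(np.linalg.det(np.array([v, w])))   # -2.0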

These results, for eigen values and eigen vectors, are valid over a division ring.

Axis of Rotation

Here is a simple application of eigen vectors. A rigid rotation in 3 space always has an axis of rotation.

Let M implement the rotation. The determinant of M, with l subtracted from its main diagonal, gives a cubic polynomial in l, and every cubic has at least one real root. Since lengths are preserved by a rotation, every real eigen value is ±1. A rotation has determinant 1 (a determinant of -1 indicates a reflection), the determinant is the product of the roots, and complex roots come in conjugate pairs whose product is positive; so an even number of real roots equal -1, and at least one real root equals 1. With l = 1, the space rotates through some angle θ about the corresponding eigen vector. That's why every planet, every star, has an axis of rotation.
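
Here is a hedged Python sketch of this fact. The particular rotation, built by composing two coordinate rotations, is an arbitrary choice for illustration; the axis is recovered as the eigen vector whose eigen value is 1.

import numpy as np

def rot_z(t):
    # rotation of the plane spanned by x and y, applied to row vectors
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, s, 0], [-s, c, 0], [0, 0, 1]])

def rot_x(t):
    # rotation of the plane spanned by y and z
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, s], [0, -s, c]])

# An arbitrary rigid rotation of 3 space: 40 degrees about z, then 25 degrees about x.
M = rot_z(np.radians(40)) @ rot_x(np.radians(25))

# Eigen vectors of the transpose match the row-vector convention x*M = l*x.
values, vectors = np.linalg.eig(M.T)
axis = vectors[:, np.argmin(np.abs(values - 1))].real

print(axis)                          # the axis of rotation
print(np.allclose(axis @ M, axis))   # True: the axis is fixed, eigen value 1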
