Linear Algebra 9 | Trace, Eigenspace, Eigendecomposition, Similarity, and Diagonalizable Matrix

Series: Linear Algebra

  1. Recall: Eigenvalue and Eigenvector

(1) The Definition of the Eigenvector and the Eigenvalue

Let matrix A be an n × n square matrix. Suppose we have a nonzero vector x ≠ 0. If there is a constant λ that satisfies,

Ax = λx,

then x is an eigenvector of A corresponding to the eigenvalue λ.

(2) Calculate the Eigenvector and the Eigenvalue

If we want to calculate the eigenvalues of A, we can construct the matrix A − λI and then set its determinant to zero,

det(A − λI) = 0.

Then, we can substitute each λ back into A − λI respectively, and calculate the non-trivial solutions of (A − λI)x = 0 in order to solve for the eigenvectors x.
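
As a quick numerical sanity check, here is a minimal NumPy sketch (the 2 × 2 matrix is an arbitrary example chosen only for this illustration):

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    # The eigenvalues are the roots of det(A - lambda*I) = 0;
    # the columns of V are the corresponding unit eigenvectors.
    eigvals, V = np.linalg.eig(A)
    print(eigvals)  # eigenvalues 5 and 2 (order may vary)

    # Each pair satisfies A v = lambda v (up to rounding error).
    for lam, v in zip(eigvals, V.T):
        assert np.allclose(A @ v, lam * v)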

(3) The Definition of Algebraic Multiplicity

The number of times λ appears as a root of,

det(A − λI) = 0,

is called the algebraic multiplicity of λ.

(4) The Definition of Geometric Multiplicity

The dimension of the subspace N(A − λI) is called the geometric multiplicity. The algebraic multiplicity and the geometric multiplicity can be equal sometimes, but they are not always equal to each other.

The geometric multiplicity is also the dimension of the eigenspace of λ.
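
As a small sketch of this difference (the 2 × 2 matrix below is an assumed example): here λ = 2 is a double root of det(A − λI) = 0, but N(A − 2I) is only one-dimensional.

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [0.0, 2.0]])

    # det(A - lambda*I) = (2 - lambda)^2, so lambda = 2 has
    # algebraic multiplicity 2.
    print(np.linalg.eigvals(A))  # [2. 2.]

    # Geometric multiplicity = dim N(A - 2I) = n - rank(A - 2I).
    n = A.shape[0]
    geo = n - np.linalg.matrix_rank(A - 2.0 * np.eye(n))
    print(geo)  # 1, strictly less than the algebraic multiplicity 2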

(5) The Definition of Trace

The trace of a matrix is the sum of the main diagonal entries, which is,

tr(A) = a11 + a22 + … + ann.

This is an important quantity because the trace of A equals the sum of all the eigenvalues of A. This is also to say that,

tr(A) = λ1 + λ2 + … + λn.

Proof:

Suppose we have matrix A as an n × n matrix with entries aij and main diagonal a11, a22, …, ann, then the determinant to calculate is the characteristic polynomial,

p(λ) = det(A − λI),

by the computational formula (the cofactor expansion), any term that picks an off-diagonal entry loses at least two diagonal factors, so the only contribution of degree n or n − 1 in λ comes from the product of the diagonal entries,

p(λ) = (a11 − λ)(a22 − λ) … (ann − λ) + (terms of degree at most n − 2),

and therefore the coefficient of λⁿ⁻¹ in p(λ) is (−1)ⁿ⁻¹(a11 + a22 + … + ann) = (−1)ⁿ⁻¹ tr(A).

Alternatively, we suppose λ1, λ2, …, λn are the eigenvalues of A, then each of them must satisfy,

det(A − λi I) = 0,

then, we can have the factorization of the characteristic polynomial,

p(λ) = (λ1 − λ)(λ2 − λ) … (λn − λ).

Because the coefficient of λⁿ⁻¹ in this factorization is (−1)ⁿ⁻¹(λ1 + λ2 + … + λn), thus, comparing the two coefficients of λⁿ⁻¹,

tr(A) = λ1 + λ2 + … + λn.
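
The conclusion is easy to confirm numerically (a minimal sketch; the random 5 × 5 matrix is just an assumed example):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5))

    # tr(A) equals the sum of the eigenvalues; complex eigenvalues
    # come in conjugate pairs, so their imaginary parts cancel.
    assert np.allclose(np.trace(A), np.sum(np.linalg.eigvals(A)))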

2. Recall: Orthonormal Transformation

(1) Orthonormal Transformation

If matrix Q has orthonormal columns, then QᵀQ = I, so for any vector x,

‖Qx‖² = (Qx)ᵀ(Qx) = xᵀQᵀQx = xᵀx = ‖x‖².

So we can see that applying the orthonormal matrix Q to x preserves its length.

(2) Orthogonal Transformation

If matrix U has orthogonal (but not unit-length) columns u1, u2, …, un, and suppose we have,

UᵀU = D, where D = diag(‖u1‖², ‖u2‖², …, ‖un‖²),

then,

‖Ux‖² = xᵀUᵀUx = xᵀDx ≠ ‖x‖² in general.

Thus the orthogonal matrix U applied to x will not preserve its length. So merely orthogonal (unnormalized) columns are not a good choice when we want a length-preserving transformation.
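
The two cases are easy to compare numerically. In this sketch, Q comes from a QR factorization (so its columns are orthonormal), and U rescales those columns so they stay orthogonal but are no longer unit length:

    import numpy as np

    rng = np.random.default_rng(1)
    Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # orthonormal columns
    U = Q * np.array([1.0, 2.0, 3.0])  # orthogonal columns of lengths 1, 2, 3

    x = rng.standard_normal(3)
    print(np.linalg.norm(x), np.linalg.norm(Q @ x))  # equal: ||Qx|| = ||x||
    print(np.linalg.norm(U @ x))  # different from ||x|| in general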

3. Eigenspace and Eigendecomposition

(1) The Definition of the Eigenspace

The eigenspace is a subspace whose basis corresponds to the set of all eigenvectors of A. This is to say, suppose we have the set of all eigenvectors of A as {v1, v2, …, vn}, then,

V = span{v1, v2, …, vn}

is known as the eigenspace of matrix A.

Proof of linear independence:

Generally, this is to prove that eigenvectors corresponding to distinct eigenvalues are linearly independent.

Suppose we have a set of eigenvectors {v1, v2, …, vn} of A corresponding to the distinct eigenvalues λ1, λ2, …, λn. This is to show that we have only a trivial solution (c1 = c2 = … = cn = 0) if we want to have,

c1 v1 + c2 v2 + … + cn vn = 0.

We assume, for contradiction, that vn is in the linear combination of all the other eigenvectors, and that all the other eigenvectors are linearly independent. Then, there are coefficients c1, c2, …, cn−1, not all zero, such that,

vn = c1 v1 + c2 v2 + … + cn−1 vn−1.    (*)

Multiply A on the left of both sides of (*),

A vn = c1 A v1 + c2 A v2 + … + cn−1 A vn−1,

then, since A vi = λi vi for each i,

λn vn = c1 λ1 v1 + c2 λ2 v2 + … + cn−1 λn−1 vn−1.    (**)

Also, multiply λn on both sides of (*),

λn vn = c1 λn v1 + c2 λn v2 + … + cn−1 λn vn−1.    (***)

Let (***) minus (**), then,

0 = c1 (λn − λ1) v1 + c2 (λn − λ2) v2 + … + cn−1 (λn − λn−1) vn−1.

Because vn is in the linear combination of all the other eigenvectors, the coefficients c1, c2, …, cn−1 are not all zero; also, because the eigenvalues are distinct, we have,

λn − λi ≠ 0 for i = 1, 2, …, n − 1.

Therefore, for the set {v1, v2, …, vn−1}, these vectors are not linearly independent. This causes a contradiction to our assumption. Therefore, all the eigenvectors corresponding to distinct eigenvalues are linearly independent.
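
Numerically, this independence shows up as the eigenvector matrix having full rank whenever the eigenvalues are distinct (a sketch; the matrix is an assumed example with eigenvalues 5 and 2):

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])  # distinct eigenvalues 5 and 2

    eigvals, V = np.linalg.eig(A)
    # Distinct eigenvalues => the eigenvector columns are linearly
    # independent, i.e. V has full rank.
    assert np.linalg.matrix_rank(V) == A.shape[0]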

(2) The Definition of the Eigenmatrix

Suppose we have an n × n diagonal matrix whose main diagonal is composed of the eigenvalues of A,

Λ = diag(λ1, λ2, …, λn),

then this matrix is called the eigenmatrix of A, and it is denoted by Λ.

(3) The Definition of the Eigendecomposition

Suppose we have a full-rank n × n matrix S whose column space is the eigenspace of A, meaning its columns are the eigenvectors of A, which is to say that,

S = [v1 v2 … vn].

Then we can have the conclusion that,

A = S Λ S⁻¹.

This is known as the eigendecomposition of A.

Proof:

Because {v1, v2, …, vn} is a basis of ℝⁿ and ∀ vi ∈ {v1, v2, …, vn}, by the definition of the eigenvalue and eigenvector,

A vi = λi vi,

then,

AS = A[v1 v2 … vn] = [A v1  A v2  …  A vn],

then,

AS = [λ1 v1  λ2 v2  …  λn vn],

then, pulling the eigenvalues out as a diagonal matrix on the right,

AS = [v1 v2 … vn] Λ = S Λ,

then, because S is full rank and thus invertible,

A S S⁻¹ = S Λ S⁻¹,

then,

A = S Λ S⁻¹.
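
The decomposition can be verified directly in NumPy (a sketch; the eigenvector matrix returned by np.linalg.eig plays the role of S here):

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    eigvals, S = np.linalg.eig(A)  # columns of S are the eigenvectors
    Lam = np.diag(eigvals)         # the eigenmatrix (capital lambda)

    # A = S @ Lam @ S^{-1}
    assert np.allclose(A, S @ Lam @ np.linalg.inv(S))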

(4) Eigendecomposition in an Orthonormal Way for the Symmetric Matrix (The Spectral Theorem)

In the definition of eigendecomposition above, we had the matrix S as a matrix whose column space is the eigenspace of A. For the symmetric matrix, we can always find an orthogonal eigenvector matrix (we are going to talk about this later). What if we normalize this matrix S and make it an orthonormal matrix? Suppose we have the set of all eigenvectors of A as {v1, v2, …, vn}, then if we define,

qi = vi / ‖vi‖ for i = 1, 2, …, n,

then we can construct matrix Q as

Q = [q1 q2 … qn],

thus, because an orthonormal matrix satisfies Q⁻¹ = Qᵀ, we have,

A = Q Λ Q⁻¹ = Q Λ Qᵀ.

We are going to analyze this orthonormality for the symmetric matrix later.
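
For a symmetric matrix, np.linalg.eigh returns exactly such an orthonormal Q, so we can already check Q⁻¹ = Qᵀ and A = QΛQᵀ numerically (a sketch with an assumed symmetric example):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])  # symmetric

    eigvals, Q = np.linalg.eigh(A)  # Q has orthonormal columns
    Lam = np.diag(eigvals)

    assert np.allclose(Q.T @ Q, np.eye(2))  # Q^T Q = I, so Q^{-1} = Q^T
    assert np.allclose(A, Q @ Lam @ Q.T)    # A = Q Lam Q^T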

(5) Geometrical Meaning of Eigendecomposition

Suppose we conduct the transformation A: ℝⁿ → ℝⁿ on a vector x, then we can have Ax equivalent to,

Ax = Q Λ Q⁻¹ x.

By the geometrical meaning of the eigenvector matrix Q (taking the orthonormal Q constructed above), we know that both Q and Q⁻¹ rotate the vector x by some angle while its length remains the same, while the geometrical meaning of the eigenmatrix Λ is only to scale the vector while its direction remains the same. This is to say that applying A means: rotate x into the eigenbasis with Q⁻¹, scale each coordinate with Λ, and rotate back with Q.
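
The three steps can be watched one at a time (a sketch reusing the symmetric example above, where Q is orthonormal): Qᵀ rotates x into the eigenbasis, Λ scales each coordinate, and Q rotates back.

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    eigvals, Q = np.linalg.eigh(A)
    Lam = np.diag(eigvals)

    x = np.array([1.0, 0.0])
    step1 = Q.T @ x      # rotate into the eigenbasis (length preserved)
    step2 = Lam @ step1  # scale along each eigen-direction
    step3 = Q @ step2    # rotate back (length preserved)

    assert np.isclose(np.linalg.norm(step1), np.linalg.norm(x))
    assert np.allclose(step3, A @ x)  # the three steps reproduce Ax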

(6) The Definition of Similar Matrix

If two matrices A and B are similar, then there is an invertible matrix S that satisfies,

B = S⁻¹ A S.
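
A quick numerical check that similarity preserves the eigenvalues (a sketch; S is a random matrix, which is invertible with probability 1):

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((3, 3))
    S = rng.standard_normal((3, 3))

    B = np.linalg.inv(S) @ A @ S  # B is similar to A
    # Similar matrices share the same eigenvalues.
    assert np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                       np.sort_complex(np.linalg.eigvals(B)))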

(7) The Definition of Diagonalizable Matrix

The matrix A is said to be diagonalizable if A is similar to a diagonal matrix. This is also to say that there is an invertible matrix S so that,

S⁻¹ A S = D,

where D is a diagonal matrix.

(8) The Definition of the Defective Matrix

Suppose we have a matrix A in which some eigenvalue has an algebraic multiplicity greater than its geometric multiplicity; then we can't have enough linearly independent eigenvectors for the eigendecomposition. Therefore, the matrix A in this case is not diagonalizable, and matrix A is called a defective matrix. For example, suppose we have matrix A as

A = [ 1  1 ]
    [ 0  1 ]

then the eigenvalues of A satisfy,

det(A − λI) = (1 − λ)² = 0,

then,

λ1 = λ2 = 1, with algebraic multiplicity 2.

Solving (A − I)x = 0, then the only eigenvector (up to scaling) is,

x = (1, 0)ᵀ, so the geometric multiplicity is only 1,

therefore, we can not construct a diagonal matrix in this case.
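
NumPy makes the defect visible (a sketch using the same 2 × 2 matrix): eig still returns two eigenvector columns, but they point in numerically the same direction, so the would-be eigenvector matrix S is singular.

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [0.0, 1.0]])

    eigvals, S = np.linalg.eig(A)
    print(eigvals)  # [1. 1.], a repeated eigenvalue

    # rank(S) < 2: only one linearly independent eigenvector exists,
    # so S is not invertible and A has no eigendecomposition.
    print(np.linalg.matrix_rank(S))  # 1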

(9) Ways to Know Diagonalizability

There are three ways to know whether a matrix is diagonalizable,

  • Eigenvectors corresponding to distinct eigenvalues are linearly independent. (So if matrix A has n distinct eigenvalues, then it's diagonalizable.)
  • If A and B are similar, then they have the same eigenvalues; and if one of them is diagonalizable, they are similar to the same diagonal matrix (up to the ordering of the eigenvalues).
  • The eigenvalues of A equal the eigenvalues of A transpose.

(10) Complex Eigenvalues

Suppose we have a matrix A whose characteristic polynomial det(A − λI) = 0 has no real roots; then, solving it, the eigenvalues come out as complex numbers. We are not going to talk about this in detail here, and we just reserve it for later discussions.
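
As a small preview (a sketch; the 90-degree rotation matrix used here is an assumed example of a matrix with no real eigenvalues):

    import numpy as np

    A = np.array([[0.0, -1.0],
                  [1.0,  0.0]])  # rotation by 90 degrees

    # det(A - lambda*I) = lambda^2 + 1 = 0 has no real roots.
    print(np.linalg.eigvals(A))  # [0.+1.j 0.-1.j]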