### Eigenspace and Multiplicity

##### 1. Direct sum of eigenspaces

The diagonalization process is actually the process of computing the eigenspaces **E**_{λ} = `nul`(**A** - `λ`**I**). The following result says that the different eigenspaces are independent. See here for the definition of direct sum and the relation to linear independence.
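As a sketch of computing **E**_{λ} = `nul`(**A** - `λ`**I**) numerically, the NumPy snippet below extracts a null-space basis from the singular value decomposition; the matrix `A`, the helper name `null_space_basis`, and the tolerance are illustrative assumptions, not part of the text.

```python
import numpy as np

def null_space_basis(M, tol=1e-10):
    # Right-singular vectors whose singular values are numerically zero
    # span nul(M); return them as the columns of the result.
    _, s, vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return vt[rank:].T

# Hypothetical symmetric matrix with eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Basis of the eigenspace E_1 = nul(A - 1*I).
E1 = null_space_basis(A - 1.0 * np.eye(2))

v = E1[:, 0]
print(np.allclose(A @ v, 1.0 * v))  # True: v is an eigenvector for eigenvalue 1
```

The SVD route is one common way to get a numerically stable null-space basis; symbolic row reduction would give the same subspace.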

Let `λ`_{1}, `λ`_{2}, ..., `λ`_{k} be distinct eigenvalues of **A**. Then the sum **E**_{λ1} + **E**_{λ2} + ... + **E**_{λk} of the corresponding eigenspaces is a direct sum.

Proof We prove by induction on the number of eigenspaces. The statement is trivially true for one eigenspace. Assume the statement is true for `k` - 1 eigenspaces; we prove it remains true for `k` eigenspaces. Specifically, for
**v**_{1} ∈ **E**_{λ1}, **v**_{2} ∈ **E**_{λ2}, ..., **v**_{k} ∈ **E**_{λk} satisfying

**v**_{1} + **v**_{2} + ... + **v**_{k} = **0**,

we need to prove **v**_{1} = **v**_{2} = ... = **v**_{k} = **0**.

Multiplying the equality by `λ`_{1} and applying **A** to it, respectively, we get

`λ`_{1}**v**_{1} + `λ`_{1}**v**_{2} + ... + `λ`_{1}**v**_{k} = **0**, `λ`_{1}**v**_{1} + `λ`_{2}**v**_{2} + ... + `λ`_{k}**v**_{k} = **0**.

Subtracting the two equalities, we get

(`λ`_{1} - `λ`_{2})**v**_{2} + ... + (`λ`_{1} - `λ`_{k})**v**_{k} = **0**.

Since the eigenspaces are subspaces, we have **v'**_{2} = (`λ`_{1} - `λ`_{2})**v**_{2} ∈ **E**_{λ2}, ..., **v'**_{k} = (`λ`_{1} - `λ`_{k})**v**_{k} ∈ **E**_{λk}. By the inductive assumption, **E**_{λ2} + ... + **E**_{λk} is already a direct sum. Thus from

**v'**_{2} + ... + **v'**_{k} = (`λ`_{1} - `λ`_{2})**v**_{2} + ... + (`λ`_{1} - `λ`_{k})**v**_{k} = **0**,

we have **v'**_{2} = ... = **v'**_{k} = **0**. Since `λ`_{1}, `λ`_{2}, ..., `λ`_{k} are distinct, the factors `λ`_{1} - `λ`_{i} are nonzero, and this further implies **v**_{2} = ... = **v**_{k} = **0**. Finally, from **v**_{1} + **v**_{2} + ... + **v**_{k} = **0**, we have **v**_{1} = **0**.

By this property of direct sum, the result above has the following consequence (`B`_{i} consisting of one vector).

Eigenvectors with distinct eigenvalues must be linearly independent.
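A quick numerical sanity check of this fact; the triangular matrix below is a hypothetical example whose eigenvalues 4, 2, 1 are visibly distinct.

```python
import numpy as np

# Hypothetical matrix: triangular, so its eigenvalues 4, 2, 1 sit on the diagonal.
A = np.array([[4.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 1.0]])

vals, vecs = np.linalg.eig(A)  # eigenvectors are the columns of vecs

# Distinct eigenvalues: the three eigenvectors must be linearly independent,
# i.e. the matrix whose columns they form has full rank.
print(np.linalg.matrix_rank(vecs))  # 3
```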

Recall that for an `n` by `n` matrix **A**, `det`(**A** - `λ`**I**) is a polynomial of degree `n`. If the polynomial has no multiple root, then the `n` distinct roots are `n` distinct eigenvalues `λ`_{1}, `λ`_{2}, ..., `λ`_{n}. Then each eigenvalue `λ`_{i} must yield at least one eigenvector **v**_{i}. Thus we have `n` eigenvectors **v**_{1}, **v**_{2}, ..., **v**_{n} ∈ **R**^{n} with distinct eigenvalues, which must be linearly independent and form a basis of **R**^{n}. The discussion leads to the following conclusion.

The characteristic polynomial has no multiple roots ⇒ The matrix is diagonalizable.

For example, once we know the eigenvalues of a 2 by 2 matrix are 5 and 15, we may conclude, without further computation, that the matrix has a basis of eigenvectors.
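A sketch of this conclusion in NumPy; the symmetric matrix below is a hypothetical example with eigenvalues 5 and 15, and we verify the diagonalization **A** = **V**`D`**V**^{-1}.

```python
import numpy as np

# Hypothetical 2x2 matrix with distinct eigenvalues 5 and 15.
A = np.array([[10.0, 5.0],
              [5.0, 10.0]])

vals, V = np.linalg.eig(A)  # columns of V are eigenvectors
D = np.diag(vals)

# No multiple roots => V is invertible, so A = V D V^(-1).
print(np.allclose(A, V @ D @ np.linalg.inv(V)))  # True
```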

Example In an earlier example, we know the eigenvalues of **A** must be from -5, -3, -1, 1, 3. Moreover, we also know that the eigenvalues must include -1, at least one of 1 and -3, and at least one of 3 and -5. Thus if **A** is 3 by 3, then **A** must have three distinct eigenvalues and must be diagonalizable.

Example The following lower triangular matrix

    [ 2.3       0       0       0  ]
    [ √2        π       0       0  ]
    [ √1.7     100      π²      0  ]
    [ π²        5      0.01    11  ]

has 4 distinct eigenvalues 2.3, `π`, `π`^{2}, 11 and has a basis of eigenvectors.
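The claim can be checked numerically; the snippet below builds the matrix above in NumPy and compares its computed eigenvalues with the diagonal entries.

```python
import numpy as np

# The lower triangular matrix from the example.
A = np.array([[2.3,           0.0,   0.0,      0.0],
              [np.sqrt(2.0),  np.pi, 0.0,      0.0],
              [np.sqrt(1.7),  100.0, np.pi**2, 0.0],
              [np.pi**2,      5.0,   0.01,     11.0]])

# For a triangular matrix the eigenvalues are exactly the diagonal entries.
vals = np.linalg.eig(A)[0]
print(np.allclose(sorted(vals.real), sorted([2.3, np.pi, np.pi**2, 11.0])))  # True
```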