### Properties

##### 1. About a single matrix

In this part, we compare the eigenvalues and eigenvectors of a matrix and its modifications: the transpose, a similar matrix, and the inverse.

- **A** and **A**^{T} have the same characteristic polynomial and eigenvalues.
- **A** is diagonalizable ⇔ **A**^{T} is diagonalizable.

Proof Since transposition does not change the determinant, we have

`det`(**A**^{T} - `λ`**I**) = `det`((**A** - `λ`**I**)^{T}) = `det`(**A** - `λ`**I**).

By the definition of diagonalization, we have

**A** = `PDP`^{-1} ⇔ **A**^{T} = (**P**^{-1})^{T} **D**^{T} **P**^{T} = `QD`^{T}**Q**^{-1},

where **Q** = (**P**^{-1})^{T} is still invertible and **D**^{T} is still diagonal.

The relation between eigenvectors of **A** and **A**^{T} is rather complicated in general: The eigenvectors of **A** are columns of **P**. The eigenvectors of **A**^{T} are columns of **Q** = (**P**^{-1})^{T}.
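As a quick numerical sanity check (a sketch, not from the source), we can verify with NumPy that **A** and **A**^{T} share eigenvalues, and that **Q** = (**P**^{-1})^{T} diagonalizes **A**^{T} when **P** diagonalizes **A**. The matrix `A` below is an arbitrary example.

```python
import numpy as np

# Arbitrary example matrix with distinct eigenvalues 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# A and A^T have the same eigenvalues.
eig_A = np.sort(np.linalg.eigvals(A))
eig_AT = np.sort(np.linalg.eigvals(A.T))
print(np.allclose(eig_A, eig_AT))  # True

# Diagonalize A = P D P^{-1}; then Q = (P^{-1})^T diagonalizes A^T.
D, P = np.linalg.eig(A)            # columns of P are eigenvectors of A
Q = np.linalg.inv(P).T             # columns of Q are eigenvectors of A^T
print(np.allclose(A.T, Q @ np.diag(D) @ np.linalg.inv(Q)))  # True
```

Note that the eigenvectors of **A** (columns of `P`) and of **A**^{T} (columns of `Q`) are different, even though the eigenvalues agree.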

- **A** and `PAP`^{-1} have the same characteristic polynomial and eigenvalues.
- **v** is an eigenvector of **A** with eigenvalue `λ` ⇔ `Pv` is an eigenvector of `PAP`^{-1} with eigenvalue `λ`.
- **A** is diagonalizable ⇔ `PAP`^{-1} is diagonalizable.

Proof The first part follows from the multiplicativity of the determinant:

`det`(`PAP`^{-1} - `λ`**I**) = `det`(**P**(**A** - `λ`**I**)**P**^{-1}) = `det`(**P**^{-1}**P**(**A** - `λ`**I**)) = `det`(**A** - `λ`**I**).

The second part follows from

`Av` = `λ`**v** ⇔ `PAP`^{-1}(`Pv`) = `λ`(**Pv**).

For the third part, we note that since **P** is invertible, we have

{**v**_{1}, **v**_{2}, ..., **v**_{n}} is a basis of eigenvectors for **A** ⇔ {`Pv`_{1}, `Pv`_{2}, ..., `Pv`_{n}} is a basis of eigenvectors for `PAP`^{-1}.

Then we apply the definition of diagonalizability in terms of bases of eigenvectors.
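The similarity properties above can also be checked numerically. The sketch below (with an arbitrary matrix `A` and an arbitrary invertible `P`, not from the source) confirms that **A** and `PAP`^{-1} share eigenvalues and that `Pv` is an eigenvector of `PAP`^{-1}.

```python
import numpy as np

# Arbitrary example: A has eigenvalues 5 and 2; P is any invertible matrix.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
P = np.array([[1.0, 2.0],
              [0.0, 1.0]])

B = P @ A @ np.linalg.inv(P)       # B = P A P^{-1} is similar to A

# Same eigenvalues.
print(np.allclose(np.sort(np.linalg.eigvals(A)),
                  np.sort(np.linalg.eigvals(B))))  # True

# If A v = lam v, then B (P v) = lam (P v).
lam, V = np.linalg.eig(A)
v = V[:, 0]                        # eigenvector of A for eigenvalue lam[0]
print(np.allclose(B @ (P @ v), lam[0] * (P @ v)))  # True
```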

- A square matrix **A** is invertible ⇔ 0 is not an eigenvalue of **A**.
- For invertible **A**: **v** is an eigenvector of **A** with eigenvalue `λ` ⇔ **v** is an eigenvector of **A**^{-1} with eigenvalue `λ`^{-1}.
- For invertible **A**: **A** is diagonalizable ⇔ **A**^{-1} is diagonalizable.

Proof In the discussion leading to the computation of eigenvalues, we showed

`λ` is an eigenvalue of **A** ⇔ **A** - `λ`**I** is not invertible.

In particular, taking `λ` = 0 gives us

0 is an eigenvalue of **A** ⇔ **A** is not invertible.

This is the same as the first part.

Now assume **A** is invertible and **v** is an eigenvector of **A** with eigenvalue `λ`. Then `λ` ≠ 0 and `Av` = `λ`**v**. Applying `λ`^{-1}**A**^{-1} to both sides, we get `λ`^{-1}**v** = **A**^{-1}**v**. This is the second part.

The third part can be proved by using either definition of diagonalizability, similar to the proof of the previous two problems.
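As a numerical illustration of the inverse properties (a sketch with an arbitrary invertible matrix, not from the source): the eigenvalues of **A**^{-1} are the reciprocals of those of **A**, with the same eigenvectors.

```python
import numpy as np

# Arbitrary invertible example with eigenvalues 2 and 5 (so 0 is not an eigenvalue).
A = np.array([[2.0, 0.0],
              [1.0, 5.0]])

lam, V = np.linalg.eig(A)
A_inv = np.linalg.inv(A)

# Eigenvalues of A^{-1} are the reciprocals of the eigenvalues of A.
print(np.allclose(np.sort(np.linalg.eigvals(A_inv)),
                  np.sort(1.0 / lam)))  # True

# The same vector v is an eigenvector of A^{-1}, with eigenvalue 1/lam.
v = V[:, 0]
print(np.allclose(A_inv @ v, v / lam[0]))  # True
```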

The properties above were stated for matrices, but they also hold for linear transformations. For example, **T**: **V** → **V** and its dual **T**^{*}: **V**^{*} → **V**^{*} have the same eigenvalues, and **T** is diagonalizable if and only if **T**^{*} is diagonalizable.