1. Eigenbases
In the linear transformation $y = Ax$,
the eigenvectors of $A$ may or may not form a basis for the space $R^n$; for now, assume that they do.
Assumption. $A$ : n x n → $x$ : n x 1
Assume $A$ has n eigenvectors. → non-defective matrix / diagonalizable $A$
$x$ = $c_{1} x_{1}$ + $c_{2} x_{2}$ + ··· + $c_{n} x_{n}$ → the basis eigenvectors are necessarily linearly independent, so every $x$ has such an expansion.
Substituting → $y$ = $Ax$ = $A$( $c_{1} x_{1}$ + $c_{2} x_{2}$ + ··· + $c_{n} x_{n}$ )
= $c_{1} A x_{1}$ + $c_{2} A x_{2}$ + ··· + $c_{n} A x_{n}$
= $c_{1} \lambda_{1} x_{1}$ + $c_{2} \lambda_{2} x_{2}$ + ··· + $c_{n} \lambda_{n} x_{n}$
= $c'_{1} x_{1}$ + $c'_{2} x_{2}$ + ··· + $c'_{n} x_{n}$, where $c'_{i} = c_{i} \lambda_{i}$
Meaning → the linear transformation $Ax$ is expressed as a simple linear combination of the eigenvectors: applying $A$ merely scales each coordinate $c_{i}$ by $\lambda_{i}$.
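The scaling of eigen-coordinates above can be checked numerically. This is a minimal sketch with an assumed 2 x 2 matrix (the values are not from the notes): expand $x$ in the eigenbasis, apply $A$, and confirm each coordinate is scaled by its eigenvalue.

```python
import numpy as np

# Hypothetical example matrix with a full set of eigenvectors (eigenbasis of R^2)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, X = np.linalg.eig(A)        # columns of X are the eigenvectors x_1, x_2

x = np.array([1.0, 2.0])
c = np.linalg.solve(X, x)        # coordinates of x in the eigenbasis: x = X c

# Coordinates of y = A x in the same basis: each c_i is scaled by lambda_i
c_prime = np.linalg.solve(X, A @ x)
print(np.allclose(c_prime, lam * c))   # True
```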
* Derivation of $AX = XD$
Assumption. In the linear transformation $y = Ax$, assume the eigenvectors form a basis for the space $R^n$. → non-defective matrix $A$

Theorem 1」 Basis of eigenvectors
If an n x n matrix $A$ has n distinct eigenvalues, then $A$ has a basis of eigenvectors
$x_{1}, x_{2}, \cdots, x_{n}$ for $R^{n}$.
→ Algebraic multiplicity, $M_{\lambda} = 1$
Geometric multiplicity, $m_{\lambda} = 1$
→ Defect $\Delta_{\lambda} = M_{\lambda} - m_{\lambda} = 0$ ( ∵ $M_{\lambda}$ ≥ $m_{\lambda}$ ≥ 1 )
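Theorem 1 can be illustrated numerically. A sketch with an assumed triangular matrix (its eigenvalues 2, 3, 5 are distinct by construction): the eigenvector matrix must then be nonsingular, i.e. the eigenvectors form a basis.

```python
import numpy as np

# Assumed example: lower-triangular matrix, eigenvalues are the diagonal 2, 3, 5
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 5.0]])
lam, X = np.linalg.eig(A)

print(len(np.unique(np.round(lam, 6))) == 3)   # True: three distinct eigenvalues
print(abs(np.linalg.det(X)) > 1e-10)           # True: eigenvectors linearly independent
```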
Theorem 2」 Symmetric matrix
A symmetric matrix has an orthonormal basis of eigenvectors for $R^n$.
Note.
A symmetric matrix has orthogonal eigenvectors: eigenvectors belonging to distinct eigenvalues are orthogonal.
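A quick numerical check of Theorem 2 with an assumed 2 x 2 symmetric matrix (values are illustrative only). `np.linalg.eigh`, NumPy's routine for symmetric matrices, returns eigenvectors that are orthonormal:

```python
import numpy as np

# Assumed symmetric example matrix
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, Q = np.linalg.eigh(A)   # columns of Q: orthonormal eigenvectors

print(np.allclose(Q.T @ Q, np.eye(2)))          # True: columns are orthonormal
print(np.allclose(Q @ np.diag(lam) @ Q.T, A))   # True: A = Q D Q^T
```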
2. Similarity of matrices. Diagonalization
Goal : use an eigenbasis to diagonalize $A$.
Def」 similar matrices, similarity transformation
An n x n matrix $\hat A$ is called similar to an n x n matrix $A$ if $\hat A$ = $P^{-1} A P$ for some (nonsingular) n x n matrix $P$.
This transformation, which gives $\hat A$ from $A$, is called a similarity transformation.
Theorem 3」 Eigenvalues and Eigenvectors of similar matrices
If $\hat A$ is similar to $A$, then $\hat A$ has the same eigenvalues as A.
Furthermore, if $x$ is an eigenvector of $A$, then $y = P^{-1} x$ is an eigenvector of $\hat A$ corresponding to the same eigenvalue.
Proof」
· Given : ① $Ax = \lambda x$ ② $\hat A$ = $P^{-1} A P$, where $P$ is an arbitrary nonsingular matrix
· Strategy : show that the eigenpair $(\lambda, x)$ of $A$ corresponds to the eigenpair $(\lambda, P^{-1} x)$ of $\hat A$
① → $Ax = \lambda x$
→ $P^{-1} Ax$ = $P^{-1}$ $\lambda x$
→ $P^{-1} A P P^{-1} x$ = $\lambda P^{-1} x$ ( inserting $P P^{-1} = I$ )
→ $\hat A$$P^{-1} x$ = $\lambda P^{-1} x$ by ②
∴ $\hat A$ has the same eigenvalue $\lambda$ as $A$, and the eigenvector of $\hat A$ corresponding to that $\lambda$ is $P^{-1} x$.
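Theorem 3 can be verified numerically. A sketch with an assumed matrix $A$ and an assumed nonsingular $P$ (neither is from the notes): $\hat A = P^{-1} A P$ has the same eigenvalues as $A$, and $P^{-1} x$ is an eigenvector of $\hat A$.

```python
import numpy as np

# Hypothetical A and nonsingular P
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
Ahat = np.linalg.inv(P) @ A @ P        # similar to A

lam_A, X = np.linalg.eig(A)
lam_hat, _ = np.linalg.eig(Ahat)
print(np.allclose(np.sort(lam_A), np.sort(lam_hat)))   # True: same eigenvalues

# An eigenvector x of A maps to y = P^{-1} x, an eigenvector of Ahat
x = X[:, 0]
y = np.linalg.inv(P) @ x
print(np.allclose(Ahat @ y, lam_A[0] * y))             # True
```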
Theorem 4」 Diagonalization of a Matrix
If an n x n matrix A has a basis of eigenvectors, then $D = X^{-1}AX$ is diagonal.
Here, X is the matrix with these eigenvectors as column vectors.
Also, $D^m = X^{-1}A^{m}X$ ($m = 2, 3,$ ···)
Proof」
Each column of $X$ satisfies $A x_{j} = \lambda_{j} x_{j}$, so column by column $AX = XD$. Since the eigenvectors form a basis, $X$ is nonsingular, and multiplying on the left by $X^{-1}$ gives $D = X^{-1} A X$. Repeating this, $D^m = (X^{-1} A X)(X^{-1} A X) \cdots (X^{-1} A X) = X^{-1} A^{m} X$.
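Theorem 4 in code, reusing the assumed example matrix from above (not from the notes): $D = X^{-1} A X$ comes out diagonal with the eigenvalues on the diagonal, and the power relation $D^m = X^{-1} A^{m} X$ holds.

```python
import numpy as np

# Assumed example matrix with an eigenbasis
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, X = np.linalg.eig(A)
Xinv = np.linalg.inv(X)

D = Xinv @ A @ X
print(np.allclose(D, np.diag(lam)))   # True: D is diagonal with the eigenvalues

# D^m = X^{-1} A^m X, checked for m = 3
print(np.allclose(np.linalg.matrix_power(D, 3),
                  Xinv @ np.linalg.matrix_power(A, 3) @ X))   # True
```

In practice this is what makes diagonalization useful: powers of $A$ reduce to powers of the scalar eigenvalues, $A^{m} = X D^{m} X^{-1}$.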