Understanding the proof “algebraic multiplicity of an eigenvalue of a real symmetric matrix is equal to its geometric multiplicity”
The algebraic multiplicity of each eigenvalue of a real symmetric matrix is equal to its geometric multiplicity.
(The instructor explained the proof in class. I have tried to reproduce it below, but I am having difficulty understanding certain parts. I hope the members of this mathematical community can assist me.)
The proof as explained in class:
Let $A$ be an $n\times n$ real symmetric matrix and $\lambda$ be an eigenvalue of $A$. If $r$ is the geometric multiplicity of $\lambda$, then the dimension of the eigenspace $E_\lambda=\{v\in\mathbb{R}^n : Av=\lambda v\}$ is $r$. Suppose we find an orthonormal set of vectors $v_1,v_2,\dots,v_r$ which constitutes a basis of $E_\lambda$. Now $n-r$ additional vectors $v_{r+1},v_{r+2},\dots,v_n$ are found such that $v_1,v_2,\dots,v_r,v_{r+1},\dots,v_n$ form an orthonormal basis of $\mathbb{R}^n$. Isn't this due to Gram–Schmidt orthogonalization?
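(Not part of the class proof — just a quick numerical sanity check of this extension step, a minimal sketch assuming NumPy; the orthonormal set below is an illustrative choice. Householder QR applied to $[\,v_1\ \dots\ v_r\ \mid\ I\,]$ orthonormalizes the columns from left to right, which is exactly the Gram–Schmidt idea.)

```python
import numpy as np

# An orthonormal set v_1, v_2 in R^4 (illustrative choice).
E = np.column_stack([[1., 1., 0., 0.],
                     [0., 0., 1., 1.]]) / np.sqrt(2.0)
n = E.shape[0]

# QR on [E | I] runs Householder orthogonalization over v_1, v_2, e_1, ..., e_n,
# so the first columns of S reproduce the v_i (up to sign) and the remaining
# columns complete them to an orthonormal basis of R^n.
S, _ = np.linalg.qr(np.hstack([E, np.eye(n)]))

print(np.allclose(S.T @ S, np.eye(n)))           # True: S is orthogonal
print(np.allclose(np.abs(S[:, :2]), np.abs(E)))  # True: v_1, v_2 kept (up to sign)
```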
We have $Av_i=\lambda v_i$ for $1\le i\le r$.
Let us write $A v_i=\sum_{j=1}^n c_{ij}v_j$ for $r+1\le i\le n$.
We can represent these $n$ equations in matrix form
\begin{equation*}
\begin{aligned}
& A\begin{bmatrix} v_1 & \dots & v_r & v_{r+1} & \dots & v_n \end{bmatrix}
=\begin{bmatrix} \lambda v_1 & \dots & \lambda v_r & \sum_{j=1}^n c_{r+1,j}v_j & \dots & \sum_{j=1}^n c_{nj} v_j \end{bmatrix}\\
\Rightarrow\ & A\begin{bmatrix} v_1 & \dots & v_r & v_{r+1} & \dots & v_n \end{bmatrix}
=\begin{bmatrix} v_1 & \dots & v_r & v_{r+1} & \dots & v_n \end{bmatrix}
\begin{bmatrix}
\lambda & & & & c_{r+1,1} & \dots & c_{n1}\\
 & \lambda & & & c_{r+1,2} & \dots & c_{n2}\\
 & & \ddots & & \vdots & \ddots & \vdots\\
 & & & \lambda & c_{r+1,r} & \dots & c_{nr}\\
 & & & & c_{r+1,r+1} & \dots & c_{n,r+1}\\
 & & & & \vdots & \ddots & \vdots\\
 & & & & c_{r+1,n} & \dots & c_{nn}
\end{bmatrix}\\
\Rightarrow\ & AS=SQ \qquad (\text{say})
\end{aligned}
\end{equation*}
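(Again not from the class — a minimal numerical sketch, assuming NumPy, that builds $S$ and $Q$ for a small illustrative symmetric matrix and checks $AS=SQ$ together with the special form of the first $r$ columns of $Q$.)

```python
import numpy as np

# Illustrative symmetric matrix; lam = 3 has geometric multiplicity r = 2.
A = np.array([[2., 1., 0.],
              [1., 2., 0.],
              [0., 0., 3.]])
lam = 3.0
n = A.shape[0]

# Orthonormal basis of E_lam, extended to an orthonormal basis of R^n via QR.
w, V = np.linalg.eigh(A)
E = V[:, np.isclose(w, lam)]                 # columns v_1, ..., v_r
r = E.shape[1]
S, _ = np.linalg.qr(np.hstack([E, np.eye(n)]))

Q = S.T @ A @ S                              # Q = S^{-1} A S since S is orthogonal
print(np.allclose(A @ S, S @ Q))             # True: AS = SQ
print(np.allclose(Q[:, :r], lam * np.eye(n)[:, :r]))  # first r columns are lam*e_i
```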
It is easy to see that $S$ is orthogonal (its columns form an orthonormal basis), hence invertible.
Thus,
\begin{equation}
\begin{aligned}
& AS=SQ\\
\Rightarrow\ & A=SQS^{-1}\\
\Rightarrow\ & A^T=\left(S^{-1}\right)^T Q^T S^T\\
\Rightarrow\ & A=\left(S^T\right)^T Q^T S^T \qquad (\text{since } A=A^T)\\
\Rightarrow\ & A=SQ^T S^{-1} \qquad\qquad\ \ (\text{since } S^T=S^{-1})\\
\Rightarrow\ & A=S\begin{bmatrix}
\lambda & & & & & & \\
 & \lambda & & & & & \\
 & & \ddots & & & & \\
 & & & \lambda & & & \\
c_{r+1,1} & c_{r+1,2} & \dots & c_{r+1,r} & c_{r+1,r+1} & \dots & c_{r+1,n}\\
\vdots & \vdots & \ddots & \vdots & \vdots & \ddots & \vdots\\
c_{n1} & c_{n2} & \dots & c_{nr} & c_{n,r+1} & \dots & c_{nn}
\end{bmatrix} S^{-1}
\end{aligned}
\end{equation}
(Except for the last equation, the above part was not done in class. Am I correct in deriving the last equation?)
Now, we obtain,
\begin{equation}
\begin{aligned}
& Q=S^{-1}AS\\
\Rightarrow\ & Q=S^T A S\\
\Rightarrow\ & Q^T=S^T A^T \left(S^T\right)^T\\
\Rightarrow\ & Q^T=S^T A S=Q
\end{aligned}
\end{equation}
which indicates that $Q$ is a symmetric matrix. Since the first $r$ columns of $Q$ are $\lambda e_1,\dots,\lambda e_r$, symmetry forces its first $r$ rows to be $\lambda e_1^T,\dots,\lambda e_r^T$, so it should look like
$$Q=\left[
\begin{array}{cccc|c}
\lambda & & & & \\
 & \lambda & & & \\
 & & \ddots & & \\
 & & & \lambda & \\
\hline
 & & & & \Large\textbf{C}
\end{array}\right]$$
where $C$ is a symmetric matrix of order $(n-r)\times(n-r)$.
Thus, it can easily be shown that $\det(A-kI)=\det(Q-kI)=(\lambda-k)^r\,\det(C-kI)$.
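(A numerical spot-check of the symmetry of $Q$, the vanishing off-diagonal block, and the determinant factorization, with the same illustrative $A$ as in the earlier sketch; assuming NumPy.)

```python
import numpy as np

A = np.array([[2., 1., 0.],
              [1., 2., 0.],
              [0., 0., 3.]])
lam, n = 3.0, 3

w, V = np.linalg.eigh(A)
E = V[:, np.isclose(w, lam)]
r = E.shape[1]
S, _ = np.linalg.qr(np.hstack([E, np.eye(n)]))
Q = S.T @ A @ S
C = Q[r:, r:]                                # the (n-r) x (n-r) block

print(np.allclose(Q, Q.T))                   # True: Q is symmetric
print(np.allclose(Q[:r, r:], 0.0))           # True: off-diagonal block vanishes

for k in (0.0, 1.5, 5.0):                    # det(A - kI) = (lam-k)^r det(C - kI)
    lhs = np.linalg.det(A - k * np.eye(n))
    rhs = (lam - k) ** r * np.linalg.det(C - k * np.eye(n - r))
    print(np.isclose(lhs, rhs))              # True
```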
Now the proof shows that $\lambda$ cannot be an eigenvalue of $C$, arguing by contradiction. Suppose there exists a nonzero $(n-r)\times 1$ vector $u$ such that $Cu=\lambda u$. Then the above form of $Q$ gives the following result:
\begin{equation}
Q\begin{bmatrix} 0\\ \vdots\\ 0\\ u \end{bmatrix}
=\begin{bmatrix} 0\\ \vdots\\ 0\\ \lambda u \end{bmatrix}
=\lambda\begin{bmatrix} 0\\ \vdots\\ 0\\ u \end{bmatrix}
\end{equation}
I have clearly understood the proof up to this point. But now it is claimed that $\begin{bmatrix} 0\\ \vdots\\ 0\\ u \end{bmatrix}$ is an eigenvector of $A$ corresponding to the eigenvalue $\lambda$, and I am not able to understand why.
The above relation says that $\begin{bmatrix} 0\\ \vdots\\ 0\\ u \end{bmatrix}$ is a $\lambda$-eigenvector of $Q$, and $Q$ is related to $A$ by $Q=S^{-1}AS$. From this relation we can only say that $S\begin{bmatrix} 0\\ \vdots\\ 0\\ u \end{bmatrix}$ is a $\lambda$-eigenvector of $A$. (I know that if $Q=S^{-1}AS$ and $v$ is a $\lambda$-eigenvector of $Q$, then $Sv$ is a $\lambda$-eigenvector of $A$; this does not tell us that $v$ itself is an eigenvector of $A$. But here, I think, it may be true because $S$ is orthogonal or because $A$ is symmetric. I am not sure how it works in our case.)
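(The general fact quoted in the parentheses is easy to check numerically — a minimal sketch, assuming NumPy, with random illustrative matrices: $Sw$ is a $\mu$-eigenvector of $A$, while $w$ itself generally is not.)

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))                  # a generic (non-symmetric) matrix
S, _ = np.linalg.qr(rng.standard_normal((4, 4))) # a random orthogonal S

Q = S.T @ A @ S                                  # Q = S^{-1} A S since S^{-1} = S^T
mu_all, W = np.linalg.eig(Q)
mu, w = mu_all[0], W[:, 0]                       # any eigenpair (mu, w) of Q

print(np.allclose(A @ (S @ w), mu * (S @ w)))    # True: Sw is a mu-eigenvector of A
print(np.allclose(A @ w, mu * w))                # False in general: w is not
```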
The last part of the proof says that $\begin{bmatrix} 0\\ \vdots\\ 0\\ u \end{bmatrix}$ cannot be a $\lambda$-eigenvector of $A$.
Here also I did not get the point. One of my classmates told me that $\begin{bmatrix} 0\\ \vdots\\ 0\\ u \end{bmatrix}$ lies in the span of $\{e_{r+1},e_{r+2},\dots,e_n\}$, because its first $r$ components are zero, and hence it lies in the span of $\{v_{r+1},v_{r+2},\dots,v_n\}$. This did not make sense to me. How does it imply a contradiction? If he is correct, could I derive a contradiction using the fact that eigenvectors corresponding to distinct eigenvalues must be independent?
linear-algebra proof-explanation
asked Aug 22 at 5:40
Bhargob
First you should realize that this is a consequence of the Principal Axis Theorem, that any real symmetric matrix has an orthogonal basis of eigenvectors with real eigenvalues. There are any number of proofs of the Principal Axis Theorem that you can refer to on the web by googling "proof of the principal axis theorem". Many of them go along the same lines as the proof of the more limited result your teacher gave. And reading them will hopefully help you understand your teacher's proof better.
– C Monsour
Aug 22 at 6:15