Why does the spectral norm equal the largest singular value?
This may be a trivial question yet I was unable to find an answer:
$$\| A \|_2 = \sqrt{\lambda_\text{max}(A^*A)} = \sigma_\text{max}(A)$$
where the spectral norm $\| A \|_2$ of a complex matrix $A$ is defined as $$\max \left\{ \| Ax \|_2 : \| x \|_2 = 1 \right\}.$$
How does one prove the first and the second equality?
linear-algebra matrices svd singular-values spectral-norm
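The two equalities can also be checked numerically. Below is a quick sanity check (a sketch using NumPy; the random complex matrix is an arbitrary example) comparing the operator 2-norm, the eigenvalue formula, and the largest singular value:

```python
import numpy as np

rng = np.random.default_rng(0)
# An arbitrary complex test matrix.
A = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))

spec_norm = np.linalg.norm(A, 2)                      # operator 2-norm ||A||_2
lam_max = np.max(np.linalg.eigvalsh(A.conj().T @ A))  # largest eigenvalue of A^*A
sigma_max = np.linalg.svd(A, compute_uv=False)[0]     # largest singular value

assert np.isclose(spec_norm, np.sqrt(lam_max))
assert np.isclose(spec_norm, sigma_max)
```

This proves nothing by itself, but it confirms that all three quantities agree to floating-point precision.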
What are your thoughts on the second equality? – Git Gud, Nov 30 '13 at 11:35

@GitGud Oh, does that mean singular values are defined in this way, i.e., the second equality is a definition? – mathemage, Nov 30 '13 at 12:23

It's not a definition. It's a direct consequence of the definition. Can you see this? – Git Gud, Nov 30 '13 at 12:26

math.stackexchange.com/questions/482170/2-norm-vs-operator-norm – Prahlad Vaidyanathan, Nov 30 '13 at 17:11

@GitGud One normally can't if one does not know about the Courant–Fischer characterisation. – Algebraic Pavel, Dec 1 '13 at 2:37
asked Nov 30 '13 at 10:43 by mathemage; edited Aug 21 at 11:05 by Rodrigo de Azevedo
2 Answers
Accepted answer (answered Nov 30 '13 at 14:09 by An Khuong Doan; edited Nov 17 '17 at 7:43 by user8795):
Put $B=A^*A$, which is a Hermitian matrix.

A linear transformation of a Euclidean vector space $E$ is Hermitian iff there exists an orthonormal basis of $E$ consisting of its eigenvectors.

Let $\lambda_1,\dots,\lambda_n$ be the eigenvalues of $B$ and $\left\{ e_1,\dots,e_n \right\}$ an orthonormal basis of $E$ consisting of corresponding eigenvectors.

For $x=a_1e_1+\dots+a_ne_n$ we have
$$\| x \| = \left\langle \sum_{i=1}^n a_ie_i, \sum_{i=1}^n a_ie_i \right\rangle^{1/2} = \sqrt{\sum_{i=1}^n |a_i|^2}$$
and
$$Bx = B\left( \sum_{i=1}^n a_ie_i \right) = \sum_{i=1}^n a_iB(e_i) = \sum_{i=1}^n \lambda_ia_ie_i.$$

Denote by $\lambda_{j_0}$ the largest eigenvalue of $B$. Therefore
$$\| Ax \| = \langle Ax,Ax \rangle^{1/2} = \langle x,A^*Ax \rangle^{1/2} = \langle x,Bx \rangle^{1/2} = \left\langle \sum_{i=1}^n a_ie_i, \sum_{i=1}^n \lambda_ia_ie_i \right\rangle^{1/2} = \sqrt{\sum_{i=1}^n \lambda_i|a_i|^2} \leq \max_{1\leq j\leq n}\sqrt{\lambda_j}\;\| x \|.$$

So, if $\| A \| = \max \left\{ \| Ax \| : \| x \| = 1 \right\}$, then $\| A \| \leq \max_{1\leq j\leq n}\sqrt{\lambda_j}$. (1)

Consider $x_0=e_{j_0}$, so that $\| x_0 \| = 1$ and $\| A \| \geq \| Ax_0 \| = \langle x_0,Bx_0 \rangle^{1/2} = \langle e_{j_0},\lambda_{j_0}e_{j_0} \rangle^{1/2} = \sqrt{\lambda_{j_0}}$. (2)

Combining (1) and (2) gives $\| A \| = \max_{1\leq j\leq n}\sqrt{\lambda_j}$, where the $\lambda_j$ are the eigenvalues of $B=A^*A$.

Conclusion: $$\| A \|_2 = \sqrt{\lambda_\text{max}(A^*A)} = \sigma_\text{max}(A).$$
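The two steps of this proof can be illustrated numerically (a sketch with an arbitrary random real matrix; NumPy's `eigh` supplies the orthonormal eigenbasis of $B$):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))   # arbitrary test matrix
B = A.T @ A                       # B = A^*A, symmetric positive semidefinite

lam, E = np.linalg.eigh(B)        # eigenvalues ascending, orthonormal eigenvectors
top = E[:, -1]                    # eigenvector for the largest eigenvalue

# Inequality (1): ||Ax|| <= sqrt(lambda_max) * ||x|| for an arbitrary x.
x = rng.standard_normal(5)
assert np.linalg.norm(A @ x) <= np.sqrt(lam[-1]) * np.linalg.norm(x) + 1e-12

# Equality (2): the bound is attained at the top eigenvector of B.
assert np.isclose(np.linalg.norm(A @ top), np.sqrt(lam[-1]))
```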
Why is $A^*A$ a unitary matrix? For $A$ a zero matrix or a general real diagonal matrix, this doesn't have to be unitary, right? – mathemage, Nov 30 '13 at 16:29

That's my mistake. That should be Hermitian. Fixed! – An Khuong Doan, Nov 30 '13 at 16:58

I think I understood the proof except for one part: in the "iff" statement, why does the reverse implication hold? For any unitary $U$ and diagonal non-Hermitian $\Lambda$, the matrix $B = U^\dagger \Lambda U$ has eigenvectors forming an orthonormal basis (namely, the columns of $U$). However, $B$ is not Hermitian since $\Lambda$ is not. This is a counterexample, right? – mathemage, Dec 1 '13 at 19:09

And (2) is still unclear to me: how can we say $\| A \| \geq \langle x,Bx \rangle^{1/2} = \langle e_{j_0},B(e_{j_0}) \rangle^{1/2}$? – sleeve chen, Jan 21 '15 at 0:01

@sleevechen For (1) we can say it because $\| A \|$ is the maximum over the set, and it was just shown that each element is $\leq$ the largest singular value times $\| x \| = 1$. For (2), since $\| A \|$ is the maximum, it must be $\geq$ any particular element, namely $\| Ae_{j_0} \| = \langle Ae_{j_0},Ae_{j_0} \rangle^{1/2} = \langle e_{j_0},A^*Ae_{j_0} \rangle^{1/2} = \langle e_{j_0},Be_{j_0} \rangle^{1/2}$. – ignoramus, Jun 8 '16 at 11:40
Second answer (answered Jan 31 at 22:44 by yulunz):
First of all,
$$\sup_{\|x\|_2 = 1} \|Ax\|_2 = \sup_{\|x\|_2 = 1} \|U\Sigma V^Tx\|_2 = \sup_{\|x\|_2 = 1} \|\Sigma V^Tx\|_2$$
since $U$ is unitary, that is, $\|Ux_0\|_2^2 = x_0^TU^TUx_0 = x_0^Tx_0 = \|x_0\|_2^2$ for any vector $x_0$.

Then let $y = V^Tx$. By the same argument, $\|y\|_2 = \|V^Tx\|_2 = \|x\|_2 = 1$ since $V$ is unitary, so
$$\sup_{\|x\|_2 = 1} \|\Sigma V^Tx\|_2 = \sup_{\|y\|_2 = 1} \|\Sigma y\|_2.$$

Since $\Sigma = \operatorname{diag}(\sigma_1, \dots, \sigma_n)$ with $\sigma_1$ the largest singular value, the supremum above equals $\sigma_1$ and is attained at $y = (1, 0, \dots, 0)^T$. You can also find the maximum by, for example, solving the problem with a Lagrange multiplier.
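This reduction can be sketched numerically (assuming NumPy; the matrix and vector are arbitrary examples): multiplying by the unitary factor does not change the 2-norm, and $\|\Sigma y\|_2$ over unit vectors is maximized by the first standard basis vector.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))   # arbitrary test matrix
U, s, Vt = np.linalg.svd(A)       # A = U @ diag(s) @ Vt, s sorted descending

x = rng.standard_normal(4)
x /= np.linalg.norm(x)            # a unit vector

# Multiplying by the unitary U leaves the 2-norm unchanged.
assert np.isclose(np.linalg.norm(U @ np.diag(s) @ Vt @ x),
                  np.linalg.norm(np.diag(s) @ Vt @ x))

# ||Sigma y||_2 over unit y is maximized at y = e_1, giving sigma_1.
e1 = np.zeros(4)
e1[0] = 1.0
assert np.isclose(np.linalg.norm(np.diag(s) @ e1), s[0])
assert np.linalg.norm(np.diag(s) @ (Vt @ x)) <= s[0] + 1e-12
```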