Frobenius Norm, Triangle Inequality, and Complex Conjugates
I found a thread that solved the problem I need to turn in (or confirmed that I had done it correctly) but doesn't really resolve some confusion I have regarding norms and inner products.
I need to show that the Frobenius norm obeys the general definition of a matrix norm, and only the triangle inequality is giving me any trouble, but that's been worked to death: Frobenius Norm Triangle Inequality
But I went about it somewhat differently and it's highlighted a few concepts I'm shaky on. Here is my approach:
Starting from the definition
$$
\|A\|_F = \left( \sum_{i=1}^m \sum_{j=1}^n |A_{ij}|^2 \right)^{1/2}.
$$
Now consider
$$
\|A+B\|_F = \left( \sum_{i=1}^m \sum_{j=1}^n |A_{ij}+B_{ij}|^2 \right)^{1/2}.
$$
Noting that each element $A_{ij}, B_{ij}$ can be thought of as a vector in $\mathbb{R}^2$, I can apply the good old-fashioned triangle inequality to the square root of each summand, so that $|A_{ij}+B_{ij}| \leq |A_{ij}|+|B_{ij}|$.
Squaring both sides gives me
$|A_{ij}+B_{ij}|^2 \leq |A_{ij}|^2+|B_{ij}|^2 + 2|A_{ij}||B_{ij}|$
I see that I'm on the right track, but I'm afraid I'm a bit stuck here.
If I sum over all elements, I get back
$\|A+B\|_F^2 \leq \|A\|_F^2 + \|B\|_F^2 + 2\sum_{i=1}^m \sum_{j=1}^n |A_{ij}||B_{ij}|$
Is my approach hopelessly flawed, or is there some way I can salvage this?
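As a numerical sanity check of the inequalities above (a quick sketch using NumPy; the helper `frob` is just the definition from the question, not a library routine), one can verify both the elementwise bound and the full Frobenius triangle inequality on random complex matrices:

```python
import numpy as np

rng = np.random.default_rng(0)

def frob(M):
    # Frobenius norm from the definition: sqrt of the sum of squared moduli.
    return np.sqrt((np.abs(M) ** 2).sum())

# Random complex test matrices (each entry is a "vector in R^2").
A = rng.standard_normal((3, 4)) + 1j * rng.standard_normal((3, 4))
B = rng.standard_normal((3, 4)) + 1j * rng.standard_normal((3, 4))

# Elementwise triangle inequality: |A_ij + B_ij| <= |A_ij| + |B_ij| ...
assert np.all(np.abs(A + B) <= np.abs(A) + np.abs(B) + 1e-12)

# ... and the full Frobenius triangle inequality.
assert frob(A + B) <= frob(A) + frob(B) + 1e-12
```

This checks the claims numerically; it does not, of course, replace the proof.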
Tags: matrices, inequality, norm
I think this sort of reasoning will eventually work (you will need to use the Cauchy–Schwarz inequality at some point), but what you are basically doing is reproving the triangle inequality for vectors, except with a single index replaced by a double index. The Frobenius norm of a matrix is exactly equal to the Euclidean norm of the vectorized version of the matrix, where you take the matrix and "unwrap" it into a very long vector.
– Nick Alger
Nov 21 '16 at 18:13
In every other proof I've seen, the Cauchy–Schwarz inequality is used to reach the last line I have. I'm not allowed to use any notion of a matrix product, because we haven't been provided such machinery. I have no idea where to go from here; if I can prove that $\sum_{i=1}^m \sum_{j=1}^n |A_{ij}||B_{ij}| \leq \left(\sum_{i=1}^m \sum_{j=1}^n |A_{ij}|^2\right)^{1/2} \left(\sum_{i=1}^m \sum_{j=1}^n |B_{ij}|^2\right)^{1/2}$ then I'm done, but I can't find a way to do so.
– BenL
Nov 21 '16 at 18:17
edited Apr 13 '17 at 12:21 by Community♦
asked Nov 21 '16 at 14:29 by BenL
1 Answer
The Frobenius norm of a matrix is identical to the standard Euclidean norm of the vectorized version of the matrix. So, the triangle inequality for vectors directly implies the triangle inequality for the Frobenius norm for matrices.
Let $\text{vec}(\cdot)$ be the vectorization operator that takes an $n$-by-$m$ matrix and unfolds it into a long vector of length $n \cdot m$, stacking each column below the previous one. For example,
$$\text{vec}\left(\begin{bmatrix}1 & 3 \\ 2 & 4\end{bmatrix}\right) = \begin{bmatrix}1 \\ 2 \\ 3 \\ 4\end{bmatrix}.$$
Incidentally, this is how the matrix is actually stored in a computer's memory in many programming languages.
Then from the definitions one can see that
$$\|M\|_{\text{F}} = \|\text{vec}(M)\|,$$
where the second norm is the standard Euclidean norm on vectors: in each case we sum the squares of all the entries and take the square root. Additionally, one can notice that
$$\text{vec}(A + B) = \text{vec}(A) + \text{vec}(B),$$
because in each case you just add each entry in one object to the corresponding entry in the other.
Hence,
\begin{align}
\|A + B\|_{\text{F}} &= \|\text{vec}(A+B)\| \\
&= \|\text{vec}(A) + \text{vec}(B)\| \\
&\le \|\text{vec}(A)\| + \|\text{vec}(B)\| \\
&= \|A\|_{\text{F}} + \|B\|_{\text{F}}.
\end{align}
Going from the second line to the third, we used the triangle inequality for vectors.
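The vectorization argument can be checked numerically with a short NumPy sketch (the `vec` helper below is hypothetical, implementing the column-stacking operator described above via column-major ordering):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((4, 3))

def vec(M):
    # Column-major ("Fortran") flattening: stack each column below the previous one.
    return M.flatten(order="F")

# The Frobenius norm equals the Euclidean norm of the vectorized matrix ...
assert np.isclose(np.linalg.norm(A, "fro"), np.linalg.norm(vec(A)))

# ... and vec is additive, so the vector triangle inequality transfers directly.
assert np.allclose(vec(A + B), vec(A) + vec(B))
assert np.linalg.norm(A + B, "fro") <= np.linalg.norm(vec(A)) + np.linalg.norm(vec(B)) + 1e-12
```

The `order="F"` flag matches the column-stacking convention in the worked example.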
Edit:
As requested, you can finish your proof (not involving vectorization) as follows. You're really close. So far, the proof in the original question has progressed to the following statement:
$$\|A+B\|_F^2 \leq \|A\|_F^2 + \|B\|_F^2 + 2\sum_{ij}|A_{ij}||B_{ij}|.$$
Applying the Cauchy–Schwarz inequality to the last term yields
$$\sum_{ij}|A_{ij}||B_{ij}| \le \left(\sum_{ij}|A_{ij}|^2\right)^{1/2}\left(\sum_{ij}|B_{ij}|^2\right)^{1/2} = \|A\|_F \|B\|_F.$$
Hence,
$$\|A+B\|_F^2 \leq \|A\|_F^2 + \|B\|_F^2 + 2\|A\|_F \|B\|_F.$$
Using the fact that $a^2 + b^2 + 2ab = (a + b)^2$, we get
$$\|A+B\|_F^2 \leq (\|A\|_F + \|B\|_F)^2,$$
which yields the desired triangle inequality after taking the square root of each side.
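The Cauchy–Schwarz step, which is the crux of the elementwise proof, can also be sanity-checked numerically (a sketch on random matrices; tolerances account for floating-point rounding):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 2))
B = rng.standard_normal((5, 2))

cross = (np.abs(A) * np.abs(B)).sum()   # sum_ij |A_ij| |B_ij|
fa = np.linalg.norm(A, "fro")
fb = np.linalg.norm(B, "fro")

# Cauchy-Schwarz on the "unwrapped" entries: the cross term is at most ||A||_F ||B||_F,
# which closes the proof via a^2 + b^2 + 2ab = (a + b)^2.
assert cross <= fa * fb + 1e-12
assert np.linalg.norm(A + B, "fro") ** 2 <= (fa + fb) ** 2 + 1e-9
```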
That's very interesting, but could you please address the work I've actually done? There is an enormous body of info on this, including what you've provided; I need feedback on method.
– BenL
Nov 21 '16 at 22:20
I think you are almost there; use the Cauchy–Schwarz inequality and then notice that $|x|^2 + |y|^2 + 2|x|\,|y| = (|x|+|y|)^2$.
– Nick Alger
Nov 22 '16 at 7:09
I guess the issue is that the Cauchy–Schwarz inequality I'm aware of is $\langle a,b \rangle \leq \|a\|\,\|b\|$. It seems like you're applying it to the product of two vector norms and then using that to say they're less than the product of two matrix norms. I don't see how this follows from definitions.
– BenL
Nov 22 '16 at 12:09
You may be interested in the first chapter of the excellent book "The Cauchy-Schwarz Master Class", where they talk about this inequality a great deal. This first chapter is legally available for free online: www-stat.wharton.upenn.edu/~steele/Publications/Books/CSMC/…
– Nick Alger
Nov 22 '16 at 19:13
answered Nov 21 '16 at 19:15 by Nick Alger, edited Nov 22 '16 at 7:26