Frobenius Norm, Triangle Inequality, and Complex Conjugates

I found a thread that solved the problem I need to turn in (or confirmed that I had done it correctly), but it doesn't really resolve some confusion I have regarding norms and inner products.



I need to show that the Frobenius norm obeys the general definition of a matrix norm, and only the triangle inequality is giving me any trouble, but that's been worked to death: Frobenius Norm Triangle Inequality



But I went about it somewhat differently and it's highlighted a few concepts I'm shaky on. Here is my approach:



Starting from the definition
$$
\|A\|_F = \left( \sum_{i=1}^m \sum_{j=1}^n |A_{ij}|^2 \right)^{1/2}.
$$



Now consider



$$
\|A+B\|_F = \left( \sum_{i=1}^m \sum_{j=1}^n |A_{ij}+B_{ij}|^2 \right)^{1/2}.
$$



Noting that each element $A_{ij}, B_{ij}$ can be thought of as a vector in $\mathbb{R}^2$, I can apply the good old-fashioned triangle inequality to the square root of each summand,

so that $|A_{ij}+B_{ij}| \leq |A_{ij}|+|B_{ij}|$.



Squaring both sides gives me



$|A_{ij}+B_{ij}|^2 \leq |A_{ij}|^2+|B_{ij}|^2 + 2|A_{ij}||B_{ij}|$



I see that I'm on the right track, but I'm afraid I'm a bit stuck here.



If I sum over all elements, I get back



$\|A+B\|_F^2 \leq \|A\|_F^2 + \|B\|_F^2 + 2\sum_{i=1}^m \sum_{j=1}^n |A_{ij}||B_{ij}|$



Is my approach hopelessly flawed, or is there some way I can salvage this?
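(Not a proof, of course, but as a sanity check on the inequality I'm trying to establish, here is a quick numerical experiment I ran. It computes the Frobenius norm directly from the definition, on random complex matrices since the entries $A_{ij}$ may be complex, and checks the triangle inequality; the helper name `frob` is just mine.)

```python
import numpy as np

rng = np.random.default_rng(0)

def frob(M):
    # Frobenius norm from the definition: square root of the sum of squared moduli.
    return np.sqrt(np.sum(np.abs(M) ** 2))

# Random complex matrices, since each entry is a complex number (a vector in R^2).
A = rng.standard_normal((3, 4)) + 1j * rng.standard_normal((3, 4))
B = rng.standard_normal((3, 4)) + 1j * rng.standard_normal((3, 4))

assert frob(A + B) <= frob(A) + frob(B)               # triangle inequality
assert np.isclose(frob(A), np.linalg.norm(A, 'fro'))  # agrees with NumPy's built-in
```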











  • I think this sort of reasoning will eventually work (you will need to use the Cauchy–Schwarz inequality at some point), but what you are basically doing is reproving the triangle inequality for vectors, with a single index replaced by a double index. The Frobenius norm of a matrix is exactly equal to the Euclidean norm of the vectorized version of the matrix, where you take the matrix and "unwrap" it into a very long vector.
    – Nick Alger
    Nov 21 '16 at 18:13










  • In every other proof I've seen, the Cauchy–Schwarz inequality is used to reach the last line I have. I'm not allowed to use any notion of a matrix product, because we haven't been provided such machinery. I have no idea where to go from here; if I can prove that $\sum_{i=1}^m \sum_{j=1}^n |A_{ij}||B_{ij}| \leq \left(\sum_{i=1}^m \sum_{j=1}^n |A_{ij}|^2\right)^{1/2} \left(\sum_{i=1}^m \sum_{j=1}^n |B_{ij}|^2\right)^{1/2}$ then I'm done, but I can't find a way to do so.
    – BenL
    Nov 21 '16 at 18:17















Tags: matrices, inequality, norm






asked Nov 21 '16 at 14:29 by BenL (last edited Apr 13 '17 at 12:21 by Community♦)






















1 Answer













The Frobenius norm of a matrix is identical to the standard Euclidean norm of the vectorized version of the matrix. So, the triangle inequality for vectors directly implies the triangle inequality for the Frobenius norm for matrices.



Let $\text{vec}(\cdot)$ be the vectorization operator that takes an $n$-by-$m$ matrix and unfolds it into a long vector of length $n \cdot m$, stacking each column below the previous one. For example,

$$\text{vec}\left(\begin{bmatrix}1 & 3 \\ 2 & 4\end{bmatrix}\right) = \begin{bmatrix}1 \\ 2 \\ 3 \\ 4\end{bmatrix}.$$



Incidentally, this is how a matrix is actually stored in a computer's memory in many programming languages.



Then from the definitions one can see that
$$\|M\|_{\text{Fro}} = \|\text{vec}(M)\|,$$
where the second norm is the standard Euclidean norm on vectors. In each case we sum the squares of all the entries and take the square root. Additionally, one can notice that
$$\text{vec}(A + B) = \text{vec}(A) + \text{vec}(B)$$
because in each case you just add each entry in one object to the corresponding entry in the other.



Hence,
\begin{align}
\|A + B\|_{\text{Fro}} &= \|\text{vec}(A+B)\| \\
&= \|\text{vec}(A) + \text{vec}(B)\| \\
&\le \|\text{vec}(A)\| + \|\text{vec}(B)\| \\
&= \|A\|_{\text{Fro}} + \|B\|_{\text{Fro}}.
\end{align}
Going from the second line to the third, we used the triangle inequality for vectors.
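(A sketch of this vectorization argument in NumPy, under the assumption that column stacking is what's wanted, which corresponds to Fortran-order flattening. The helper name `vec` is mine.)

```python
import numpy as np

def vec(M):
    # Column-stacking vectorization: Fortran ('F') order stacks columns in turn.
    return M.flatten(order='F')

A = np.array([[1.0, 3.0], [2.0, 4.0]])
B = np.array([[5.0, 7.0], [6.0, 8.0]])

# vec([[1,3],[2,4]]) = [1,2,3,4], as in the worked example above.
assert np.array_equal(vec(A), np.array([1.0, 2.0, 3.0, 4.0]))

# ||M||_Fro equals the Euclidean norm of vec(M), and vec is additive.
assert np.isclose(np.linalg.norm(A, 'fro'), np.linalg.norm(vec(A)))
assert np.array_equal(vec(A + B), vec(A) + vec(B))
```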




Edit:



As requested, you can finish your proof (not involving vectorization) as follows. You're really close. So far, the proof from the original question has progressed to the following statement:
$$\|A+B\|_F^2 \leq \|A\|_F^2 + \|B\|_F^2 + 2\sum_{ij}|A_{ij}||B_{ij}|.$$



Applying the Cauchy–Schwarz inequality to the last term yields:
$$\sum_{ij}|A_{ij}||B_{ij}| \le \left(\sum_{ij}|A_{ij}|^2\right)^{1/2}\left(\sum_{ij}|B_{ij}|^2\right)^{1/2} = \|A\|_F \|B\|_F.$$



Hence,
$$\|A+B\|_F^2 \leq \|A\|_F^2 + \|B\|_F^2 + 2\|A\|_F \|B\|_F.$$



Using the fact that $a^2 + b^2 + 2ab = (a + b)^2$, we get:
$$\|A+B\|_F^2 \leq (\|A\|_F + \|B\|_F)^2,$$
which yields the desired triangle inequality after taking the square root of each side.
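(A quick numerical check of the Cauchy–Schwarz step and the resulting bound, on random matrices; not part of the proof, just a sketch to confirm the chain of inequalities.)

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 5))
B = rng.standard_normal((4, 5))

# Cauchy-Schwarz step: sum_ij |A_ij||B_ij| <= ||A||_F ||B||_F
lhs = np.sum(np.abs(A) * np.abs(B))
rhs = np.linalg.norm(A, 'fro') * np.linalg.norm(B, 'fro')
assert lhs <= rhs + 1e-12

# Resulting bound: ||A+B||_F^2 <= (||A||_F + ||B||_F)^2
nAB = np.linalg.norm(A + B, 'fro')
assert nAB ** 2 <= (np.linalg.norm(A, 'fro') + np.linalg.norm(B, 'fro')) ** 2 + 1e-9
```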






  • That's very interesting, but could you please address the work I've actually done? There is an enormous body of info on this, including what you've provided; I need feedback on method.
    – BenL
    Nov 21 '16 at 22:20










  • I think you are almost there; use the Cauchy–Schwarz inequality and then notice that $|x|^2 + |y|^2 + 2|x|\,|y| = (|x|+|y|)^2$.
    – Nick Alger
    Nov 22 '16 at 7:09











  • I guess the issue is that the Cauchy–Schwarz inequality I'm aware of is $\langle a,b\rangle \leq \|a\|\,\|b\|$. It seems like you're applying it to the product of two vector norms and then using that to say they're less than the product of two matrix norms. I don't see how this follows from definitions.
    – BenL
    Nov 22 '16 at 12:09











  • You may be interested in the first chapter of the excellent book "The Cauchy Schwarz Master Class", where they talk about this inequality a great deal. This first chapter is legally available for free online: www-stat.wharton.upenn.edu/~steele/Publications/Books/CSMC/…
    – Nick Alger
    Nov 22 '16 at 19:13










answered Nov 21 '16 at 19:15 by Nick Alger (edited Nov 22 '16 at 7:26)










