Are there two different definitions of covariance?
























I have seen two different expressions for the covariance and wonder whether they are different:




$$\operatorname{Cov}(X,Y)=\sum_i p(X=X_i,Y=Y_i)\,(X_i-E(X))(Y_i-E(Y))$$




and




$$\operatorname{Cov}(X,Y)=\sum_{i,j} p(X=X_i,Y=Y_j)\,(X_i-E(X))(Y_j-E(Y))$$




Are the two expressions meant for different situations? I find this confusing.
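For concreteness, the double sum in the second expression can be evaluated on a small joint pmf and cross-checked against the shortcut $\operatorname{Cov}(X,Y)=E(XY)-E(X)E(Y)$. The numbers below are made up purely for illustration:

```python
# A small made-up joint pmf over (x, y) pairs (values are illustrative).
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

EX = sum(p * x for (x, y), p in joint.items())   # E(X) = 0.7
EY = sum(p * y for (x, y), p in joint.items())   # E(Y) = 0.6

# Second expression: sum over every (i, j) combination in the support.
cov = sum(p * (x - EX) * (y - EY) for (x, y), p in joint.items())

# Cross-check against the shortcut Cov(X,Y) = E(XY) - E(X)E(Y).
EXY = sum(p * x * y for (x, y), p in joint.items())
print(cov, EXY - EX * EY)   # both approximately -0.02
```

Both routes agree, which suggests the double sum is the general definition for a discrete joint distribution.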



























  • Only the first expression makes sense. When you draw several variables at the same time from a joint probability distribution, covariance measures the likelihood that a pair of those variables is simultaneously large or small. In the second case, the variables must have been drawn independently. If that were the case, they would have been drawn from marginal probability distributions, and it would then be incorrect to use the joint probability distribution for their integral measures such as covariance.
    – Aleksejs Fomins
    Aug 23 at 10:18










  • @AleksejsFomins Actually, I wonder how to derive $\operatorname{Cov}(X,Y)=0$ for independent $X,Y$ using the first expression; I think the second expression is needed to derive it.
    – user571299
    Aug 23 at 10:22










  • Sorry, I think I got myself confused as well. The two definitions are completely the same, because summing over all combinations of two variable values is the same as first summing over all possible values of one variable and then over the other. The word of caution I wrote before is about calculating the sample covariance; in that case, one should use the first expression to emphasize that both values were sampled simultaneously.
    – Aleksejs Fomins
    Aug 23 at 10:46











  • @AleksejsFomins Sorry, I don't quite understand. The second sum includes the term for $X=X_1,Y=Y_2$, but the first doesn't. Why should they be the same?
    – user571299
    Aug 23 at 10:50










  • Can you explain to me what you think the first sum does?
    – Aleksejs Fomins
    Aug 23 at 10:52
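Following the thread above, the claim that independence gives zero covariance can be checked numerically with the double-sum (second) expression, using a joint pmf that factors into marginals. The marginals below are made up for illustration:

```python
# Made-up marginals; under independence, p(x, y) = p(x) * p(y).
px = {0: 0.3, 1: 0.7}
py = {2: 0.5, 5: 0.5}

joint = {(x, y): px[x] * py[y] for x in px for y in py}

EX = sum(px[x] * x for x in px)   # E(X) = 0.7
EY = sum(py[y] * y for y in py)   # E(Y) = 3.5

# Double sum over all (i, j) pairs; the cross terms cancel pairwise.
cov = sum(p * (x - EX) * (y - EY) for (x, y), p in joint.items())
print(abs(cov) < 1e-12)   # True: Cov(X,Y) = 0 under independence
```

The cancellation happens precisely because the sum ranges over every $(i,j)$ combination, which is why the double-sum form is the one used in this derivation.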














asked Aug 23 at 10:07 by user571299










