Are there two different definitions of covariance?
I see two different expressions for the covariance and wonder whether they differ:
$$\operatorname{Cov}(X,Y)=\sum_i p(X=X_i,\,Y=Y_i)\,(X_i-E(X))(Y_i-E(Y))$$
and
$$\operatorname{Cov}(X,Y)=\sum_{i,j} p(X=X_i,\,Y=Y_j)\,(X_i-E(X))(Y_j-E(Y))$$
Are the two expressions meant for different situations? I find this confusing.
probability
asked Aug 23 at 10:07 by user571299
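To make the comparison concrete, here is a minimal numeric sketch with a made-up joint pmf, reading the single index in the first expression as running over the support pairs of the joint distribution:

```python
# Numeric check on a small made-up joint pmf (illustrative values only).
# Any grid cell not listed in `joint` has probability zero.
joint = {(0, 0): 0.5, (1, 1): 0.5}

ex = sum(p * x for (x, y), p in joint.items())  # E(X)
ey = sum(p * y for (x, y), p in joint.items())  # E(Y)

# First expression: a single index running over the support pairs (X_i, Y_i).
cov1 = sum(p * (x - ex) * (y - ey) for (x, y), p in joint.items())

# Second expression: all combinations (i, j), weighted by p(X=X_i, Y=Y_j).
xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})
cov2 = sum(joint.get((x, y), 0.0) * (x - ex) * (y - ey)
           for x in xs for y in ys)

print(cov1, cov2)  # both 0.25: off-support cells contribute probability zero
```

Under this reading the two sums agree, because every grid cell outside the support carries probability zero; the comments below reach the same conclusion.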
Only the first expression makes sense. When you draw several variables at the same time from a joint probability distribution, covariance measures how likely a pair of those variables is to be simultaneously large or small. In the second case, the variables must have been drawn independently; if so, they were drawn from the marginal probability distributions, and it is then incorrect to use the joint probability distribution for integral measures of them such as the covariance.
– Aleksejs Fomins
Aug 23 at 10:18
@AleksejsFomins Actually, I wonder how to derive $\operatorname{Cov}(X,Y)=0$ for independent $X,Y$ using the first expression; I think it takes the second expression to derive it.
– user571299
Aug 23 at 10:22
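For reference, here is a sketch of that derivation using the second expression: independence factors the joint pmf as $p(X=X_i,Y=Y_j)=p(X=X_i)\,p(Y=Y_j)$, and the double sum then splits into a product of two sums, each of which is zero:
$$
\begin{aligned}
\operatorname{Cov}(X,Y)
&=\sum_{i,j} p(X=X_i)\,p(Y=Y_j)\,(X_i-E(X))(Y_j-E(Y))\\
&=\Bigl(\sum_i p(X=X_i)\,(X_i-E(X))\Bigr)\Bigl(\sum_j p(Y=Y_j)\,(Y_j-E(Y))\Bigr)\\
&=\bigl(E(X)-E(X)\bigr)\bigl(E(Y)-E(Y)\bigr)=0.
\end{aligned}
$$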
Sorry, I think I got myself confused as well. The two definitions are completely the same, because summing over all value pairs with a single index (first expression) is the same as summing first over all possible values of one variable and then over the other (second expression). The word of caution I wrote before concerns calculating the sample covariance: there one should use the first expression, to emphasize that both values were sampled simultaneously.
– Aleksejs Fomins
Aug 23 at 10:46
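To illustrate that point about sample covariance, here is a minimal sketch with made-up paired data: the estimator uses a single index over the observed pairs, mirroring the first expression:

```python
# Sample covariance from paired observations (made-up data for illustration).
# A single index runs over the pairs: each (x_k, y_k) was sampled together,
# so there is no sum over all (i, j) combinations here.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 8.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Unbiased sample covariance: divide by n - 1.
sample_cov = sum((x - mean_x) * (y - mean_y)
                 for x, y in zip(xs, ys)) / (n - 1)
print(sample_cov)
```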
@AleksejsFomins Sorry, I don't quite understand. The second includes $X=X_1,\,Y=Y_2$, but the first doesn't. Why should they be the same?
– user571299
Aug 23 at 10:50
Can you explain to me what you think the first sum does?
– Aleksejs Fomins
Aug 23 at 10:52