Additivity of expected value
Let's say I have a set $\Omega$ with a probability $P$ on its subsets (the fact that it could be undefined on some subsets is not important here, so let's not worry about that). I have a random variable $(X,Y)\colon \Omega \to \mathbb{R}^2$ and a function $\phi\colon \mathbb{R}^2 \to \mathbb{R}$. I define the probability induced by the random variable as $P_{X,Y}(A)=P\bigl((X,Y)^{-1}(A)\bigr)$ for $A \subseteq \mathbb{R}^2$. A well-known result tells me that (writing $\omega=(x,y)\in\Omega$): $$\int_\Omega \phi\bigl((X,Y)(\omega)\bigr)\,dP=\int_{\mathbb{R}^2} \phi(x)\, dP_{X,Y}$$
Consequently, taking $\phi=x+y$, I have that $E[(X,Y)]=\int_\Omega (X,Y)(\omega)\,dP=\sum_{(x_i,y_j)}(X,Y)(x_i,y_j)\,p_{X,Y}(x_i,y_j)$ if $\Omega$ is countable, where $p_{X,Y}(x_i,y_j)$ is the probability that $(X,Y)(\omega)=(x_i,y_j)$.
What I don't get is why $E[X+Y]=E[X]+E[Y]$ when $X$ and $Y$ are integrable. For instance, if I had a random variable $X+Y$ with $X$ and $Y$ taking all positive integer values, I don't see why $$E[X+Y]=\sum_{n=1}^{\infty} n\, p_{X,Y}(X+Y=n)=\sum_{i=1}^{\infty} i\, p_X(X=i) + \sum_{j=1}^{\infty} j\, p_Y(Y=j)=E[X]+E[Y]$$
The integral interpretation doesn't help me figure this out either. I also wonder whether there is a criterion for knowing when integrating $\phi$ is additive with respect to the random variables. For instance, the expected value is obtained by taking $\phi=x+y$, and we have additivity.
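(As a quick numerical sanity check of the claim, not a proof, here is a minimal Monte Carlo sketch; the geometric distributions and the deliberate dependence between $X$ and $Y$ below are illustrative assumptions, not taken from the question.)

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000

    # Illustrative choice: X geometric on {1, 2, ...}; Y built from X so the
    # two are dependent (linearity of expectation needs no independence).
    X = rng.geometric(p=0.3, size=n)
    Y = X + rng.geometric(p=0.5, size=n)

    print(np.mean(X + Y))           # Monte Carlo estimate of E[X + Y]
    print(np.mean(X) + np.mean(Y))  # E[X] + E[Y]; should agree up to sampling noise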
Tags: probability, probability-theory, probability-distributions
asked Aug 20 at 10:22 by tommy1996q, edited Aug 20 at 11:43
2 Answers
Accepted answer (drhab, answered Aug 20 at 12:34):
$$\sum_{n} n\,\mathsf{P}\left(X+Y=n\right)=\sum_{n}\sum_{i+j=n}\left(i+j\right)\mathsf{P}\left(X=i\wedge Y=j\right)=$$ $$\sum_{i} i\sum_{j}\mathsf{P}\left(X=i\wedge Y=j\right)+\sum_{j} j\sum_{i}\mathsf{P}\left(X=i\wedge Y=j\right)=$$ $$\sum_{i} i\,\mathsf{P}\left(X=i\right)+\sum_{j} j\,\mathsf{P}\left(Y=j\right)$$
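(For completeness, a sketch of the two facts this chain relies on, assuming the terms are nonnegative or the double series converges absolutely so the rearrangement is legitimate:
$$\sum_{n}\sum_{i+j=n} a_{i,j}=\sum_{i}\sum_{j} a_{i,j}, \qquad \sum_{j}\mathsf{P}\left(X=i\wedge Y=j\right)=\mathsf{P}\left(X=i\right),$$
the second being marginalization over $Y$.)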
Yeah thanks, I had tried doing it this way and it turns out to work. The key (both with the integral and with the series) seems to be that you can separate the two variables, using in a clever way the fact that $P_{X,Y}(x_i,y_j)=P(X=x_i \wedge Y=y_j)$.
– tommy1996q, Aug 20 at 12:46
Actually, in order to prove that expectation is linear you can just use the more basic $\mathbb{E}\left(X+Y\right):=\int X\left(\omega\right)+Y\left(\omega\right)\,\mathsf{P}\left(d\omega\right)=\int X\left(\omega\right)\,\mathsf{P}\left(d\omega\right)+\int Y\left(\omega\right)\,\mathsf{P}\left(d\omega\right)=\mathbb{E}X+\mathbb{E}Y$. That is also more general (the rv's do not have to be discrete).
– drhab, Aug 20 at 12:50
You are absolutely right, I don't know why I wanted to mess things up. It was a good exercise anyway. Thanks!
– tommy1996q, Aug 20 at 13:44
You are welcome.
– drhab, Aug 20 at 13:56
Answer (Kavi Rama Murthy, answered Aug 20 at 10:26):
The formula you have written is not correct. $\phi$ is a function of two variables and you should write $\int \phi(x,y)\,dP_{X,Y}$ on the right side. $\phi=1$ doesn't give you anything; it gives you $1=1$! For $E(X+Y)=EX+EY$ take $\phi(x,y)=x+y$ and use the fact that the marginals of $P_{X,Y}$ are $P_X$ and $P_Y$.
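(Spelled out, one reading of this suggestion, as a sketch assuming $X$ and $Y$ are integrable so the integral may be split:
$$E(X+Y)=\int_{\mathbb{R}^2}(x+y)\,dP_{X,Y}=\int_{\mathbb{R}^2}x\,dP_{X,Y}+\int_{\mathbb{R}^2}y\,dP_{X,Y}=\int_{\mathbb{R}}x\,dP_X+\int_{\mathbb{R}}y\,dP_Y=EX+EY,$$
where the last step uses the marginal identities $P_X(B)=P_{X,Y}(B\times\mathbb{R})$ and $P_Y(B)=P_{X,Y}(\mathbb{R}\times B)$.)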
Yeah, of course $\phi$ is what you said, thanks.
– tommy1996q, Aug 20 at 11:43
Also, what do you mean by marginals? I was trying to rearrange the series or integrate in some useful way, but maybe you are talking about something different? (Or maybe it's just me who isn't aware of the term "marginal of $P_{X,Y}$".)
– tommy1996q, Aug 20 at 11:47
If $\phi$ depends only on $x$ then $\int \phi(x,y)\,dP_{X,Y}=\int \phi(x,y)\,dP_X$, and similarly when $\phi$ depends only on $y$. $P_X$ and $P_Y$ are the marginal distributions and $P_{X,Y}$ is the joint distribution.
– Kavi Rama Murthy, Aug 20 at 11:52
Sorry for the bad typos, I don't have much space. Hope I got it now. For the integral, $\int_{\mathbb{R}^2}(x+y)\,dP_{X,Y}=\int_{\mathbb{R}^2} x\,dP_{X,Y}+\int_{\mathbb{R}^2} y\,dP_{X,Y}$, and in the first integral $dP_{X,Y}=dP_X$, because $y$ isn't involved: for every subset $A$ of $\mathbb{R}$, the event $\{x\in A\}$ corresponds to $A\times\mathbb{R}$, so the integral can be reduced to one in $x$ only. For the series, maybe if I rearrange it this way it works: $\sum_{n=1}^{\infty} n\,p(X+Y=n)=\sum_{n}\sum_{i+j=n}(i+j)\,p(X=i,Y=j)$, and rearranging, $\sum_{i} i\sum_{j} p(X=i,Y=j)+\sum_{j} j\sum_{i} p(X=i,Y=j)=\sum_{i} i\,P(X=i)+\sum_{j} j\,P(Y=j)$.
– tommy1996q, Aug 20 at 12:42
Is this correct?
– tommy1996q, Aug 20 at 12:43