Additivity of expected value

Let's say I have a set $\Omega$ with a probability $P$ on its subsets (the fact that it could be undefined on some subsets is not important here; let's not think about that). I have a random variable $(X,Y)\colon \Omega \to \mathbb{R}^2$ and a function $\phi\colon \mathbb{R}^2 \to \mathbb{R}$. Then I define the probability induced by the random variable as $P_{X,Y}(A)=P((X,Y)^{-1}(A))$, with $A \subseteq \mathbb{R}^2$. A well-known result tells me that (writing $(x,y)=(X,Y)(\omega)$ for $\omega \in \Omega$): $$\int_\Omega \phi((X,Y)(\omega))\,dP=\int_{\mathbb{R}^2} \phi(x)\, dP_{X,Y}$$



Consequently, taking $\phi(x,y) = x+y$, I have that $E[X+Y]= \int_\Omega \phi((X,Y)(\omega))\, dP=\sum_{i,j} \phi(x_i,y_j)\, p_{X,Y}(x_i,y_j)$ if $\Omega$ is countable, where $p_{X,Y}(x_i,y_j)$ is the probability that $(X,Y)(\omega)=(x_i,y_j)$.



What I don't get is why $E[X+Y]=E[X]+E[Y]$ when $X$ and $Y$ are integrable. For instance, if I had a random variable $X+Y$ with $X$ and $Y$ assuming all positive integer values, I don't get why $$E[X+Y]=\sum_{n=1}^\infty n\, P(X+Y=n)=\sum_{i=1}^\infty i\, P(X=i) + \sum_{j=1}^\infty j\, P(Y=j)=E[X]+E[Y]$$



The integral interpretation doesn't help me figure this out either. I also wonder whether there is a criterion that tells us when integrating $\phi$ is additive with respect to the random variables. For instance, the expected value is obtained by taking $\phi(x,y)=x+y$, and there we have additivity.
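As a quick sanity check (an illustration added here, not part of the original question; the joint pmf is an arbitrary made-up assumption, and $X$, $Y$ need not be independent), the identity can be verified numerically by comparing $E[X+Y]$ computed from the joint distribution with $E[X]+E[Y]$ computed from the marginals:

```python
# Numerical sanity check of E[X+Y] = E[X] + E[Y] on a finite grid.
# The joint pmf is an arbitrary assumption; X and Y are dependent.
import itertools
import random

random.seed(0)
N = 5
pairs = list(itertools.product(range(1, N + 1), repeat=2))

# arbitrary joint pmf p(i, j), normalized to sum to 1
weights = {ij: random.random() for ij in pairs}
total = sum(weights.values())
p = {ij: w / total for ij, w in weights.items()}

# E[X+Y] straight from the joint distribution
e_sum = sum((i + j) * p[(i, j)] for (i, j) in pairs)

# marginals: p_X(i) = sum_j p(i, j) and p_Y(j) = sum_i p(i, j)
p_x = {i: sum(p[(i, j)] for j in range(1, N + 1)) for i in range(1, N + 1)}
p_y = {j: sum(p[(i, j)] for i in range(1, N + 1)) for j in range(1, N + 1)}
e_x = sum(i * q for i, q in p_x.items())
e_y = sum(j * q for j, q in p_y.items())

print(e_sum, e_x + e_y)                 # the two numbers agree
assert abs(e_sum - (e_x + e_y)) < 1e-12
```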







asked Aug 20 at 10:22, edited Aug 20 at 11:43 – tommy1996q




















2 Answers

















$$\sum_n n\,\mathsf{P}\left(X+Y=n\right)=\sum_n\sum_{i+j=n}\left(i+j\right)\mathsf{P}\left(X=i\wedge Y=j\right)=$$ $$\sum_i i\sum_j\mathsf{P}\left(X=i\wedge Y=j\right)+\sum_j j\sum_i\mathsf{P}\left(X=i\wedge Y=j\right)=$$ $$\sum_i i\,\mathsf{P}\left(X=i\right)+\sum_j j\,\mathsf{P}\left(Y=j\right)$$
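These equalities just regroup the terms of one double sum: first by $n=i+j$, then by $i$ and $j$ separately, which is legitimate here because all terms are nonnegative. A quick numerical check of the regrouping (the truncated geometric-style joint pmf below is an illustrative assumption, not part of the answer):

```python
# Check: sum_n n P(X+Y=n) equals sum_i i P(X=i) + sum_j j P(Y=j).
# Assumed joint pmf proportional to 2^-(i+j), truncated to {1,...,N}^2.
import itertools

N = 20
raw = {(i, j): 2.0 ** -(i + j)
       for i, j in itertools.product(range(1, N + 1), repeat=2)}
z = sum(raw.values())
p = {ij: v / z for ij, v in raw.items()}

# left side: group the terms of the double sum by n = i + j
lhs = sum(
    n * sum(p[(i, n - i)] for i in range(max(1, n - N), min(N, n - 1) + 1))
    for n in range(2, 2 * N + 1)
)

# right side: split (i + j) into i and j, then sum out the other variable
rhs = (sum(i * sum(p[(i, j)] for j in range(1, N + 1)) for i in range(1, N + 1))
       + sum(j * sum(p[(i, j)] for i in range(1, N + 1)) for j in range(1, N + 1)))

assert abs(lhs - rhs) < 1e-12  # both groupings of the same terms agree
```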






answered Aug 20 at 12:34 – drhab (accepted)




















• Yeah thanks, I have tried doing it this way and it turns out to work. The key (both with the integral and with the series) seems to be that you can separate the two variables, using in a clever way the fact that $P_{X,Y}(x_i,y_j)=P(X=x_i \wedge Y=y_j)$.
  – tommy1996q
  Aug 20 at 12:46










• Actually in order to prove that expectation is linear you can just use the more basic: $\mathbb{E}\left(X+Y\right):=\int X\left(\omega\right)+Y\left(\omega\right)\,\mathsf{P}\left(d\omega\right)=\int X\left(\omega\right)\,\mathsf{P}\left(d\omega\right)+\int Y\left(\omega\right)\,\mathsf{P}\left(d\omega\right)=\mathbb{E}X+\mathbb{E}Y$. That is also more general (the rv's do not have to be discrete); see the sketch after these comments.
  – drhab
  Aug 20 at 12:50











          • You are absolutely right, don't know why I wanted to mess things up. It was a good exercise anyway. Thanks!
            – tommy1996q
            Aug 20 at 13:44










          • You are welcome.
            – drhab
            Aug 20 at 13:56
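To illustrate the more basic, non-discrete route mentioned in the comment above, here is a small Monte Carlo sketch (the distributions are assumptions chosen purely for the example; $X$ and $Y$ are deliberately dependent and continuous):

```python
# Monte Carlo illustration that linearity needs neither discreteness nor
# independence: X ~ Exp(1) and Y = X^2 + N(0, 1) (assumed), so E[X] = 1,
# E[X^2] = 2, hence E[Y] = 2 and E[X+Y] = 3.
import random

random.seed(1)
n = 10**5
xs = [random.expovariate(1.0) for _ in range(n)]
ys = [x * x + random.gauss(0.0, 1.0) for x in xs]

def mean(v):
    return sum(v) / len(v)

print(mean(x + y for x, y in zip(xs, ys)) if False else
      mean([x + y for x, y in zip(xs, ys)]))  # approx. 3
print(mean(xs) + mean(ys))  # identical up to floating-point error,
                            # since sample means are linear too
```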






























The formula you have written is not correct: $\phi$ is a function of two variables, so you should write $\int \phi(x,y)\, dP_{X,Y}$ on the right side. $\phi=1$ doesn't give you anything; it gives you $1=1$! For $E(X+Y)=EX+EY$ take $\phi(x,y)=x+y$ and use the fact that the marginals of $P_{X,Y}$ are $P_X$ and $P_Y$.
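To make the marginal fact concrete, here is a small discrete sketch (added for illustration; the joint pmf is a made-up assumption): if $\phi$ depends only on $x$, integrating it against the joint law $P_{X,Y}$ agrees with integrating against the marginal $P_X$.

```python
# Discrete sketch: for phi depending only on x,
# the integral of phi dP_{X,Y} equals the integral of phi dP_X.
import itertools

support = range(1, 6)
raw = {(i, j): 1.0 / (i + j) for i, j in itertools.product(support, repeat=2)}
z = sum(raw.values())
p_joint = {ij: v / z for ij, v in raw.items()}  # assumed joint pmf

# X-marginal: p_X(i) = sum_j p(i, j)
p_x = {i: sum(p_joint[(i, j)] for j in support) for i in support}

def phi(x):          # any function of x alone
    return x ** 2

lhs = sum(phi(i) * q for (i, j), q in p_joint.items())  # against the joint law
rhs = sum(phi(i) * q for i, q in p_x.items())           # against the marginal
assert abs(lhs - rhs) < 1e-12
```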






answered Aug 20 at 10:26 – Kavi Rama Murthy




















• Yeah, of course $\phi$ is what you said, thanks
            – tommy1996q
            Aug 20 at 11:43










• Also, what do you mean by marginals? I was trying to rearrange the series or integrate in some useful way, but maybe you are talking about something different? (Or maybe it's just me who isn't aware of the term "marginal of $P_{X,Y}$".)
            – tommy1996q
            Aug 20 at 11:47











• If $\phi$ depends only on $x$ then $\int \phi(x,y)\, dP_{X,Y}= \int \phi(x,y)\, dP_X$, and similarly when $\phi$ depends only on $y$. $P_X$ and $P_Y$ are the marginal distributions and $P_{X,Y}$ is the joint distribution.
            – Kavi Rama Murthy
            Aug 20 at 11:52











• Sorry for the bad typos, I don't have space. Hope I got it now. For the integral, $\int_{\mathbb{R}^2}(x+y)\,dP_{X,Y}= \int_{\mathbb{R}^2} x\, dP_{X,Y} + \int_{\mathbb{R}^2} y\, dP_{X,Y}$, and in the first integral $dP_{X,Y}$ can be replaced by $dP_X$, because for every subset $A$ of $\mathbb{R}$ the event $\{x \in A\}$ corresponds to $A \times \mathbb{R} \subseteq \mathbb{R}^2$ ($y$ isn't involved), so the integral can be simplified in terms of $x$ only. For the series, maybe if I rearrange it this way it works: $\sum_{n=1}^\infty n\, p(X+Y=n)=\sum_n\sum_{i+j=n} (i+j)\, p(X=i,Y=j)$, and rearranging, $\sum_i i \sum_j p(X=i,Y=j) + \sum_j j \sum_i p(X=i,Y=j)=\sum_i i\, P(X=i) + \sum_j j\, P(Y=j)$.
            – tommy1996q
            Aug 20 at 12:42











          • Is this correct?
            – tommy1996q
            Aug 20 at 12:43









