Is this proof for linearity of Expectation correct?

I'm studying the first properties of expectation, and in my notes I have a proof of its linearity. However, my notes aren't fully clear, so I tried reinterpreting them, and this is the result:



Linearity of Expectation (proof):



Let $X$ and $Y$ be random variables, with $\text{Image}(X) = \{s_1, s_2, \dots, s_k, \dots\}$ and $\text{Image}(Y) = \{t_1, t_2, \dots, t_k, \dots\}$, where both sets have at most countably many elements (the random variables are discrete). Let's put $P_i = \mathbb{P}(X = s_i)$ and $Q_j = \mathbb{P}(Y = t_j)$, and finally $\pi_{ij} = \mathbb{P}[(X = s_i)\cap(Y = t_j)]$, which is the probability that $X = s_i$ and simultaneously $Y = t_j$.



We have that $\text{Image}(X+Y) \subseteq \{s_1+t_1, s_1+t_2, \dots, s_2+t_1, s_2+t_2, \dots, s_k+t_1, s_k+t_2, \dots\}$.



So by definition of expectation:



$E[X+Y] = \sum\limits_{i=1}^{\infty}\sum\limits_{j=1}^{\infty}(s_i+t_j)\,\mathbb{P}(X+Y = s_i+t_j) = \sum\limits_{i=1}^{\infty}\sum\limits_{j=1}^{\infty}(s_i+t_j)\,\pi_{ij}$



Using the distributive property of multiplication:



$E[X+Y] = \sum\limits_{i=1}^{\infty}\sum\limits_{j=1}^{\infty}s_i\,\pi_{ij} + \sum\limits_{i=1}^{\infty}\sum\limits_{j=1}^{\infty}t_j\,\pi_{ij} = \sum\limits_{i=1}^{\infty}s_i\sum\limits_{j=1}^{\infty}\pi_{ij} + \sum\limits_{j=1}^{\infty}t_j\sum\limits_{i=1}^{\infty}\pi_{ij}$



We now observe that by definition $\pi_{ij}$ is $\mathbb{P}[(X = s_i)\cap(Y = t_j)]$,



and therefore



$\sum\limits_{j=1}^{\infty}\pi_{ij} = \sum\limits_{j=1}^{\infty}\mathbb{P}[(X = s_i)\cap(Y = t_j)]$



Here the event $(Y = t_j)$ is the same as $Y^{-1}(t_j) = \mathcal{D}_j \subseteq \Omega$, where $\Omega$ is the sample space. Because $Y$ is a function, the sets $\mathcal{D}_j$ are disjoint, and so they form a partition of $\Omega$, i.e. $\bigcup\limits_{j=1}^{\infty}\mathcal{D}_j = \Omega$.



Using countable additivity of measures:



$\sum\limits_{j=1}^{\infty}\pi_{ij} = \mathbb{P}\Big(\bigcup\limits_{j=1}^{\infty}[(X = s_i) \cap (Y = t_j)]\Big) = \mathbb{P}\Big[(X = s_i) \cap \bigcup\limits_{j=1}^{\infty}(Y = t_j)\Big] = \mathbb{P}[(X = s_i) \cap \Omega] = \mathbb{P}(X = s_i)$



By reproducing the same procedure for $\sum\limits_{i=1}^{\infty}\pi_{ij}$, we find that



$\sum\limits_{i=1}^{\infty}\pi_{ij} = \mathbb{P}(Y = t_j)$



Substituting into $E[X+Y] = \sum\limits_{i=1}^{\infty}s_i\sum\limits_{j=1}^{\infty}\pi_{ij} + \sum\limits_{j=1}^{\infty}t_j\sum\limits_{i=1}^{\infty}\pi_{ij}$,



we get:



$E[X+Y] = \sum\limits_{i=1}^{\infty}s_i\,\mathbb{P}(X = s_i) + \sum\limits_{j=1}^{\infty}t_j\,\mathbb{P}(Y = t_j) = E[X] + E[Y]$.



Is it correct? My biggest doubt is the step where we state that $\mathbb{P}(X+Y = s_i+t_j) = \pi_{ij}$, because I expect that there could be more than one way to get $X+Y = s_i+t_j$, not only $X = s_i$ together with $Y = t_j$.
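
As a quick sanity check of the step in doubt, here is a minimal Python sketch (my own illustration with made-up values; nothing below comes from the notes). The values are chosen so that two different pairs $(i,j)$ give the same sum $s_i+t_j$, and the pairwise double sum, the value-grouped sum, and $E[X]+E[Y]$ are compared:

```python
from collections import defaultdict

# Illustrative values only: s and t are chosen so that 0+1 == 1+0,
# i.e. two different (i, j) pairs give the same value of X+Y.
s = [0, 1]                    # Image(X)
t = [0, 1]                    # Image(Y)
pi = [[0.1, 0.4],             # pi[i][j] = P(X = s[i] and Y = t[j])
      [0.3, 0.2]]             # entries sum to 1

# E[X+Y] as the double sum over pairs (the step used in the proof)
e_pairs = sum((s[i] + t[j]) * pi[i][j]
              for i in range(len(s)) for j in range(len(t)))

# E[X+Y] grouped by the value v = s_i + t_j; several pairs can share
# one v, which is exactly the collision the doubt is about.
dist = defaultdict(float)
for i in range(len(s)):
    for j in range(len(t)):
        dist[s[i] + t[j]] += pi[i][j]
e_grouped = sum(v * p for v, p in dist.items())

# E[X] + E[Y] from the marginals P_i = sum_j pi_ij, Q_j = sum_i pi_ij
e_x = sum(s[i] * sum(pi[i]) for i in range(len(s)))
e_y = sum(t[j] * sum(pi[i][j] for i in range(len(s))) for j in range(len(t)))

print(e_pairs, e_grouped, e_x + e_y)   # all three agree: 1.1 (up to rounding)
```

In this example $\mathbb{P}(X+Y=1)=0.7$ merges the two pairs $(s_1,t_2)$ and $(s_2,t_1)$, so the identity $\mathbb{P}(X+Y=s_i+t_j)=\pi_{ij}$ does not hold pair by pair; the double sum is better read as a sum over the partition of $\Omega$ into the events $(X=s_i)\cap(Y=t_j)$, which is what the rest of the proof actually uses.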











probability-theory expected-value

asked Sep 3 at 13:10 by Baffo rasta; edited Sep 3 at 13:48 by drhab




















1 Answer

















Accepted answer, by drhab (answered Sep 3 at 13:59):










Your proof is fine and shows that the expectation of the sum of two discrete random variables that are both defined on the same probability space is the sum of the individual expectations (if they exist, of course).



          You make use of the distributions of $X,Y,X+Y$ but that is not necessary.



There is a route closer to the definitions that works for all integrable random variables (not only discrete ones).



Let $X,Y$ be random variables defined on a probability space $(\Omega,\mathcal{A},\mathsf{P})$.



Then $Z=X+Y$, prescribed by $\omega\mapsto X(\omega)+Y(\omega)$, is also a random variable, and, if $X$ and $Y$ are integrable, we have:



$$\mathbb{E}Z := \int Z(\omega)\,\mathsf{P}(d\omega) = \int [X(\omega)+Y(\omega)]\,\mathsf{P}(d\omega) = \int X(\omega)\,\mathsf{P}(d\omega) + \int Y(\omega)\,\mathsf{P}(d\omega) = \mathbb{E}X + \mathbb{E}Y$$



The essential fact is just that for integrable functions $f,g$ and any measure $\mu$: $$\int (f+g)\,d\mu = \int f\,d\mu + \int g\,d\mu$$
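
On a discrete (here even finite) space, the integral with respect to the point measure $\mathsf{P}$ is just a weighted sum over sample points, so the argument can be checked directly. A minimal Python sketch, with made-up sample points and weights (`expect` is a hypothetical helper, not a library function):

```python
# E[Z] is a sum over outcomes omega weighted by P({omega}), so linearity
# is just linearity of the sum: no distributions of X, Y, or X+Y needed.
omega_probs = {"a": 0.2, "b": 0.5, "c": 0.3}   # P({omega}) on a finite Omega
X = {"a": 1.0, "b": 2.0, "c": 4.0}             # X(omega)
Y = {"a": 3.0, "b": 0.0, "c": 1.0}             # Y(omega)

def expect(f):
    """E[f] = sum over omega of f(omega) * P({omega})."""
    return sum(f[w] * p for w, p in omega_probs.items())

Z = {w: X[w] + Y[w] for w in omega_probs}      # Z = X + Y pointwise
print(expect(Z), expect(X) + expect(Y))        # both print 3.3 (up to rounding)
```

The point is only that the sum over $\omega$ splits termwise, which is the discrete case of $\int (f+g)\,d\mu = \int f\,d\mu + \int g\,d\mu$.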






• thanks for your answer, however the integral with respect to a measure is treated in a later part of my notes, so I can't use it in my proof. – Baffo rasta, Sep 3 at 14:13






• Yes, you can: by using the point measure $\mathbb{P}$. – amsmath, Sep 3 at 14:17










• As @amsmath says. But what is important for you is that you practice the definition of $\mathbb{E}X$ as it is given in your notes. If that definition makes use of the distribution of $X$, then you cannot do better (yet). Things will clear up later. – drhab, Sep 3 at 14:26









