$X \sim \operatorname{Poisson}(\lambda_1 = 5)$ and $Y \sim \operatorname{Poisson}(\lambda_2 = 15)$. Let $Z = X + Y$. Compute $\operatorname{Corr}(X, Z)$. [duplicate]

























This question already has an answer here:



  • $X \sim \mathrm{Poisson}(\lambda_1 = 5)$ and $Y \sim \mathrm{Poisson}(\lambda_2 = 15)$. Let $Z = X + Y$. Compute $\mathrm{Corr}(X, Z)$.

    2 answers



Let $X$ and $Y$ be independent random variables such that $X \sim \operatorname{Poisson}(\lambda_1 = 5)$ and
$Y \sim \operatorname{Poisson}(\lambda_2 = 15)$. Let $Z = X + Y$. Compute $\operatorname{Corr}(X, Z)$.



Answer:



$\operatorname{Var}(X) = 5$



$\operatorname{Var}(Y) = 15$



$\operatorname{Cov}(X, Z) = \operatorname{Cov}(X, X+Y) = \operatorname{Cov}(X, X) = \operatorname{Var}(X) = 5$



$\operatorname{Var}(Z) = \operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) = 5 + 15 = 20$



$$\operatorname{Corr}(X, Z) = \frac{\operatorname{Cov}(X, Z)}{\sqrt{\operatorname{Var}(X)}\,\sqrt{\operatorname{Var}(Z)}} = \frac{5}{\sqrt{5 \cdot 20}} = 0.5$$




I'm just wondering whether this statement is true for all covariance values, or only when $X$ and $Y$ are independent:



$\operatorname{Cov}(X, Z) = \operatorname{Cov}(X, X+Y) = \operatorname{Cov}(X, X) = \operatorname{Var}(X) = 5$



I couldn't find it on Wikipedia.
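A quick way to sanity-check the computed value is by simulation. Below is a minimal sketch, assuming NumPy is available; the seed and sample size are arbitrary.

    import numpy as np

    # Minimal check: estimate Corr(X, Z) for independent X ~ Poisson(5),
    # Y ~ Poisson(15), with Z = X + Y.  The analytic answer above is 0.5.
    rng = np.random.default_rng(0)   # arbitrary seed
    n = 1_000_000                    # arbitrary sample size

    x = rng.poisson(lam=5, size=n)
    y = rng.poisson(lam=15, size=n)
    z = x + y

    # np.corrcoef returns the 2x2 correlation matrix; [0, 1] is Corr(X, Z).
    print(np.corrcoef(x, z)[0, 1])   # prints a value close to 0.5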









This question has been asked before and already has an answer. If those answers do not fully address your question, please ask a new question.




















2 Answers






























$$\operatorname{Cov}(X,Z) = \operatorname{Cov}(X,X+Y)$$
Let $E(X) = \mu_X$ and $E(Y) = \mu_Y$; then
$$\operatorname{Cov}(X,X+Y) = E\big[(X-\mu_X)(X-\mu_X + Y - \mu_Y)\big],$$
which is
$$\operatorname{Cov}(X,X+Y) = E\big[(X-\mu_X)^2\big] + E\big[(X-\mu_X)(Y-\mu_Y)\big],$$
hence
$$\operatorname{Cov}(X,X+Y) = \operatorname{Var}(X) + E(XY) - \mu_X E(Y) - \mu_Y E(X) + \mu_X \mu_Y.$$
Due to the independence of $X$ and $Y$, we have
$$\operatorname{Cov}(X,X+Y) = \operatorname{Var}(X) + E(X)E(Y) - \mu_X\mu_Y - \mu_Y\mu_X + \mu_X\mu_Y,$$
i.e.
$$\operatorname{Cov}(X,X+Y) = \operatorname{Var}(X) + \mu_X\mu_Y - \mu_X\mu_Y - \mu_Y\mu_X + \mu_X\mu_Y = \operatorname{Var}(X).$$
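Stated more compactly, the same conclusion follows from the bilinearity of covariance, which also answers the question about general (non-independent) $X$ and $Y$:
$$\operatorname{Cov}(X, X+Y) = \operatorname{Cov}(X, X) + \operatorname{Cov}(X, Y) = \operatorname{Var}(X) + \operatorname{Cov}(X, Y),$$
so $\operatorname{Cov}(X, X+Y) = \operatorname{Var}(X)$ holds exactly when $\operatorname{Cov}(X, Y) = 0$; independence is a sufficient condition for that.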






Lemma. $\text{cov}(X, Y) = 0$ if $X$ and $Y$ are independent.

Proof. $$\text{cov}(X, Y) = \mathbb{E}[XY] - \mathbb{E}[X]\,\mathbb{E}[Y],$$

which is $0$ because the expected value of a product of independent random variables is the product of their expected values.

Original question

Now back to the original question:

$$\begin{align}
\text{cov}(X, Z) &= \text{cov}(X, X+Y)\\
&= \text{cov}(X, X) + \text{cov}(X, Y)\\
&= \text{var}(X) + 0\\
&= 5
\end{align}$$
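As a numerical companion to the lemma, here is a minimal sketch (assuming NumPy; seed and sample size are arbitrary) showing that the sample covariance of independent Poisson draws is near $0$, while $\text{cov}(X, X+Y)$ is near $\text{var}(X) = 5$.

    import numpy as np

    # Check the lemma and the covariance step by simulation.
    rng = np.random.default_rng(1)   # arbitrary seed
    n = 1_000_000                    # arbitrary sample size

    x = rng.poisson(5, n)            # X ~ Poisson(5)
    y = rng.poisson(15, n)           # Y ~ Poisson(15)

    print(np.cov(x, y)[0, 1])        # close to 0: cov(X, Y) = 0 under independence
    print(np.cov(x, x + y)[0, 1])    # close to 5: cov(X, X+Y) = var(X)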





