pdf of the difference of two exponentially distributed random variables

Suppose $v$ and $u$ are independent exponentially distributed random variables with parameters $\mu$ and $\lambda$, respectively.



How can we calculate the pdf of $v-u$?

















asked Feb 29 '12 at 22:37









Mathematics Lover












  • You might rely on a standard procedure.
    – Did
    Feb 29 '12 at 23:03


























3 Answers
































































































I too prefer to call the random variables $X$ and $Y$. You can think of $X$ and $Y$ as waiting times for two independent things (say $A$ and $B$ respectively) to happen. Suppose we wait until the first of these happens. If it is $A$, then (by the lack-of-memory property of the exponential distribution) the further waiting time until $B$ happens still has the same exponential distribution as $Y$; if it is $B$, the further waiting time until $A$ happens still has the same exponential distribution as $X$. That says that the conditional distribution of $X-Y$ given $X > Y$ is the distribution of $X$, and the conditional distribution of $X-Y$ given $X < Y$ is the distribution of $-Y$. Since $P(X>Y) = \frac{\lambda}{\mu+\lambda}$, that says the PDF for $X-Y$ is
$$ f(x) = \frac{\lambda \mu}{\lambda+\mu}
\begin{cases} e^{-\mu x} & \text{if } x > 0 \\
e^{\lambda x} & \text{if } x < 0 \end{cases}$$
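As a sanity check (my addition, not part of the answer), both the claimed probability $P(X>Y)=\frac{\lambda}{\mu+\lambda}$ and the positive tail implied by this pdf, $P(X-Y>t)=\frac{\lambda}{\lambda+\mu}e^{-\mu t}$ for $t \ge 0$, can be estimated by simulation; the parameter values below are arbitrary.

```python
import math
import random

# Monte Carlo sanity check of the accepted answer (parameter values arbitrary).
# With X ~ Exp(mu) and Y ~ Exp(lambda), the answer claims
# P(X > Y) = lambda/(mu + lambda), and integrating the pdf over (t, infinity)
# gives the tail P(X - Y > t) = lambda/(lambda + mu) * e^{-mu t} for t >= 0.
random.seed(0)
mu, lam, n = 2.0, 3.0, 200_000
xs = [random.expovariate(mu) for _ in range(n)]
ys = [random.expovariate(lam) for _ in range(n)]

p_gt = sum(x > y for x, y in zip(xs, ys)) / n
print(p_gt, lam / (mu + lam))  # empirical vs exact P(X > Y)

t = 0.5
tail = sum(x - y > t for x, y in zip(xs, ys)) / n
print(tail, lam / (lam + mu) * math.exp(-mu * t))  # empirical vs exact tail
```

With $200{,}000$ samples the empirical values land within about $\pm 0.005$ of the exact ones.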















    edited Jul 16 '13 at 19:03









    Darko Veberic











    answered Mar 1 '12 at 3:36









    Robert Israel








  • why $P(X>Y)=\frac{\lambda}{\mu+\lambda}$?
    – user65985
    Aug 14 '13 at 16:55

  • $X \sim \mathrm{Exp}(\mu)$ and $Y \sim \mathrm{Exp}(\lambda)$, then: $P(Y < X) = \int_0^\infty\int_0^x \mu e^{-\mu y}\,\lambda e^{-\lambda x}\,dy\,dx$
    – YellowPillow
    Nov 5 '15 at 5:50

  • @YellowPillow Doesn't that give $\frac{\mu}{\mu + \lambda}$?
    – Columbo
    Apr 13 '16 at 14:59

  • This can also be found by the convolution between two pdfs
    – Hunle
    Apr 28 '16 at 4:42




































The right answer depends very much on what your mathematical background is. I will assume that you have seen some calculus of several variables, and not much beyond that. Instead of using your $u$ and $v$, I will use $X$ and $Y$.

The density function of $X$ is $\lambda e^{-\lambda x}$ (for $x \ge 0$), and $0$ elsewhere. There is a similar expression for the density function of $Y$. By independence, the joint density function of $X$ and $Y$ is
$$\lambda\mu e^{-\lambda x}e^{-\mu y}$$
in the first quadrant, and $0$ elsewhere.

Let $Z=Y-X$. We want to find the density function of $Z$. First we will find the cumulative distribution function $F_Z(z)$ of $Z$, that is, the probability that $Z\le z$.

So we want the probability that $Y-X \le z$. The geometry is a little different when $z$ is positive than when $z$ is negative. I will do $z$ positive, and you can take care of negative $z$.

Consider $z$ fixed and positive, and draw the line $y-x=z$. We want to find the probability that the ordered pair $(X,Y)$ ends up below that line or on it. The only relevant region is in the first quadrant. So let $D$ be the part of the first quadrant that lies below or on the line $y=x+z$. Then
$$P(Z \le z)=\iint_D \lambda\mu e^{-\lambda x}e^{-\mu y}\,dx\,dy.$$

We will evaluate this integral by using an iterated integral. First we will integrate with respect to $y$, and then with respect to $x$. Note that $y$ travels from $0$ to $x+z$, and then $x$ travels from $0$ to infinity. Thus
$$P(Z\le z)=\int_0^\infty \lambda e^{-\lambda x}\left(\int_{y=0}^{x+z} \mu e^{-\mu y}\,dy\right)dx.$$

The inner integral turns out to be $1-e^{-\mu(x+z)}$. So now we need to find
$$\int_0^\infty \left(\lambda e^{-\lambda x}-\lambda e^{-\mu z} e^{-(\lambda+\mu)x}\right)dx.$$
The first part is easy: it is $1$. The second part is fairly routine. We end up with
$$P(Z \le z)=1-\frac{\lambda}{\lambda+\mu}e^{-\mu z}.$$
For the density function $f_Z(z)$ of $Z$, differentiate the cumulative distribution function. We get
$$f_Z(z)=\frac{\lambda\mu}{\lambda+\mu} e^{-\mu z} \quad\text{for } z \ge 0.$$
Please note that we only dealt with positive $z$. A very similar argument will get you $f_Z(z)$ at negative values of $z$. The main difference is that the final integration is from $x=-z$ on.
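A quick numerical cross-check of this CDF (my addition, with arbitrary parameter values): simulate $Z=Y-X$ and compare the empirical CDF with $1-\frac{\lambda}{\lambda+\mu}e^{-\mu z}$ at a few nonnegative $z$.

```python
import math
import random

# Numerical cross-check of the derived CDF (arbitrary parameter values).
# Here X has density lambda*e^{-lambda x}, Y has density mu*e^{-mu y},
# and Z = Y - X, so for z >= 0 we expect
# F_Z(z) = 1 - lambda/(lambda + mu) * e^{-mu z}.
random.seed(1)
lam, mu, n = 1.5, 0.7, 200_000
zs = [random.expovariate(mu) - random.expovariate(lam) for _ in range(n)]

for z in (0.0, 0.5, 2.0):
    empirical = sum(v <= z for v in zs) / n
    exact = 1 - lam / (lam + mu) * math.exp(-mu * z)
    print(f"z={z}: empirical={empirical:.4f}, exact={exact:.4f}")
```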















    edited Mar 1 '12 at 0:21

























    answered Mar 1 '12 at 0:00









    André Nicolas












    • Before the first double integral, should that be "below or on the line $y = x + z$"?
      – Patrick
      Mar 1 '12 at 0:10










    • @Patrick: Thank you for spotting the typo! Fixed.
      – André Nicolas
      Mar 1 '12 at 0:23



























There is an alternative way to get the result by applying the Law of Total Probability:

$$
P[W] = \int_Z P[W \mid Z = z]\,P[Z=z]\,dz
$$

As others have done, let $X \sim \mathrm{Exp}(\lambda)$ and $Y \sim \mathrm{Exp}(\mu)$. What follows is the only slightly unintuitive step: instead of directly calculating the PDF of $Y-X$, first calculate the CDF $P[Y-X \leq t]$ (we can then differentiate at the end).

$$
P[Y - X \leq t] = P[Y \leq t+X]
$$

This is where we'll apply total probability to get

$$
= \int_0^\infty P[Y \leq t+X \mid X=x]\,P[X=x]\, dx
$$
$$
= \int_0^\infty P[Y \leq t+x]\,P[X=x]\, dx
= \int_0^\infty F_Y(t+x)\, f_X(x)\, dx
$$
Note that substituting the CDF here is only valid if $t \geq 0$:
$$
= \int_0^\infty \left(1- e^{-\mu(t+x)}\right) \lambda e^{-\lambda x}\, dx
= \lambda \int_0^\infty e^{-\lambda x}\, dx - \lambda e^{-\mu t} \int_0^\infty e^{-(\lambda+\mu)x}\, dx
$$
$$
= \lambda \left[ \frac{e^{-\lambda x}}{-\lambda} \right]_0^\infty - \lambda e^{-\mu t} \left[ \frac{e^{-(\lambda+\mu)x}}{-(\lambda+\mu)} \right]_0^\infty
= 1 - \frac{\lambda e^{-\mu t}}{\lambda+\mu}
$$

Differentiating this last expression gives us the PDF:

$$
f_{Y-X}(t) = \frac{\lambda \mu\, e^{-\mu t}}{\lambda+\mu} \quad \text{for } t \geq 0
$$
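The total-probability integral can also be checked numerically (a sketch I am adding, not part of the answer): a midpoint Riemann sum over a truncated range should reproduce the closed form $1-\frac{\lambda e^{-\mu t}}{\lambda+\mu}$.

```python
import math

# Midpoint-rule check of the total-probability integral (parameter values
# and truncation point are arbitrary). The integral
#   int_0^inf (1 - e^{-mu(t+x)}) * lambda * e^{-lambda x} dx
# should match the closed form 1 - lambda*e^{-mu t}/(lambda + mu).
lam, mu, t = 2.0, 1.0, 0.8
dx, upper = 1e-4, 40.0

integral = sum(
    (1 - math.exp(-mu * (t + (k + 0.5) * dx))) * lam * math.exp(-lam * (k + 0.5) * dx) * dx
    for k in range(int(upper / dx))
)
closed_form = 1 - lam * math.exp(-mu * t) / (lam + mu)
print(integral, closed_form)  # agree to several decimal places
```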















        edited Aug 12 at 22:47









        RationalHusky











        answered Nov 11 '15 at 0:14









        Alex P. Miller























             
