Prove that the measure-theoretic definition of probability aligns with the basic one?
I often see people write
$$P(X \in A) = P(X \in A \mid Y \in B)\,P(Y \in B) + P(X \in A \mid Y \in B^c)\,P(Y \in B^c).$$
I want to formally justify this in a measure-theoretic setting.
We can write
$$P(X \in A) = \int 1_A(X)\,dP = \int 1_A(X)\left[1_B(Y) + 1_{B^c}(Y)\right]dP$$
$$= \int 1_A(X)\,1_B(Y)\,dP + \int 1_A(X)\,1_{B^c}(Y)\,dP$$
$$= P(X \in A,\, Y \in B) + P(X \in A,\, Y \in B^c).$$
And so we would be done if only I could prove that
$$P(X \in A,\, Y \in B) = P(X \in A \mid Y \in B)\,P(Y \in B).$$
This is true by definition in basic probability theory. Is it also true in measure theory?
So I am looking for two things:
- What is the definition of conditional probability in measure theory? Personally, I was only ever introduced to a conditional expectation, not a conditional probability.
- How does one prove that this abstract definition is identical to the equation above?
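The indicator-function decomposition above is easy to check numerically. The sketch below uses a hypothetical setup of my own choosing (not from the question): $X$ and $Y$ independent uniform draws on $\{0,\dots,9\}$, $A$ the even numbers, and $B = \{0,\dots,4\}$. Because $1_B(Y) + 1_{B^c}(Y) = 1$ pointwise, the split holds exactly on the empirical measure, not just approximately.

```python
import random

random.seed(0)

# Hypothetical setup (my own choice of A, B, and distributions):
# X, Y independent uniform on {0, ..., 9}; A = even numbers; B = {0, ..., 4}.
N = 100_000
samples = [(random.randrange(10), random.randrange(10)) for _ in range(N)]

def in_A(x): return x % 2 == 0   # indicator 1_A(X)
def in_B(y): return y < 5        # indicator 1_B(Y)

# Empirical versions of the three integrals in the derivation.
p_A        = sum(in_A(x) for x, y in samples) / N
p_A_and_B  = sum(in_A(x) and in_B(y) for x, y in samples) / N
p_A_and_Bc = sum(in_A(x) and not in_B(y) for x, y in samples) / N

# P(X in A) = P(X in A, Y in B) + P(X in A, Y in B^c), exactly
# (up to floating-point rounding) on the sample.
assert abs(p_A - (p_A_and_B + p_A_and_Bc)) < 1e-12
print(p_A, p_A_and_B + p_A_and_Bc)
```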
probability measure-theory conditional-expectation
The notions of conditional probabilities, conditional expectations and independence are only available in probability theory. Trying to generalize them to general measure spaces leads to all kinds of complications with hardly any use in analysis.
– Kavi Rama Murthy
Sep 8 at 11:54
asked Sep 8 at 11:12
Dalu
1 Answer
You're overthinking things. Just as in naive probability theory, in measure theory the definition of $P(A \mid B)$ is $\frac{P(A \cap B)}{P(B)}$. Of course, this only works when $P(B) > 0$, but that's a separate matter. As long as $P(B) > 0$, your equation (the "total probability formula") is entirely justified in measure theory, since it falls out algebraically once you plug in the definition of conditional probability.
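To spell out that plug-in step: substituting the ratio definition into each term of the formula in the question gives
$$P(X \in A \mid Y \in B)\,P(Y \in B) = \frac{P(X \in A,\, Y \in B)}{P(Y \in B)}\,P(Y \in B) = P(X \in A,\, Y \in B),$$
and likewise with $B^c$ in place of $B$ (assuming $P(Y \in B^c) > 0$ too), so the two terms sum to exactly the indicator-function decomposition derived in the question.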
answered Sep 8 at 12:09
Jack M