Matching hats problem: expected value of the number of matching hats
I've been reading a book where the following problem is stated:
"Suppose that $n$ people throw their hats in a box and then each picks up one hat at random. What is the expected value of $X$, the number of people who get back their own hat?"
It then continues to say:
"For the $i$th person, we introduce a random variable $X_i$ that takes the value $1$ if the person selects his/her own hat, and takes the value $0$ otherwise. Since $P(X_i = 1) = \frac{1}{n}$ and $P(X_i = 0) = 1 - \frac{1}{n}$, the mean of $X_i$ is
$E[X_i] = \sum_k k \cdot P(X_i = k) = \frac{1}{n}.$
The total number of people with their own hat is
$X = X_1 + X_2 + \cdots + X_n,$
and
$E[X] = E[X_1] + E[X_2] + \cdots + E[X_n] = n \cdot \frac{1}{n} = 1.$"
However, this seems a little suspicious to me.
The probabilities of each $X_i$ are not independent of one another. If they were, the probability of exactly $n-1$ people grabbing their own hats would be non-zero. However, we know it has to be zero: you cannot have everybody but one person get their own hat back, since that would mean the last person also gets theirs. So there is an interaction among the different possibilities.
But the calculation above seems to work! Running a simulation, the result is indeed 1. So I'm a bit confused as to why the argument above works. To me, this cannot be modeled by $n$ random variables that happily take values independently from one another.
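For reference, a minimal version of such a simulation (a Python sketch; the choice of $n = 10$ and the trial count are arbitrary) could look like this:

```python
import random

def simulate_hats(n, trials=100_000):
    """Estimate E[X], the expected number of people who get their own hat back."""
    total_matches = 0
    for _ in range(trials):
        hats = list(range(n))
        random.shuffle(hats)  # hats[i] is the hat that person i picks up
        total_matches += sum(1 for person, hat in enumerate(hats) if person == hat)
    return total_matches / trials

print(simulate_hats(10))  # prints a value close to 1, regardless of n
```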
combinatorics statistics
asked Aug 9 at 16:36
elelias
Think about it as everyone getting assigned a random hat at the same time. Or as everyone grabbing a random hat at the same time. This is the same as choosing a random hat from the box for each person. We expect this to result in exactly one person coincidentally getting their own hat.
– The Count
Aug 9 at 16:47
1 Answer
This is a crucial concept called linearity of expectation. You don't need the variables to be independent in order to add their expectations. As an example, think of flipping a coin. The expected number of heads is $\frac{1}{2}$, as is the expected number of tails. These are not independent; they are perfectly anticorrelated. Even so, the expected number of faces is $\frac{1}{2} + \frac{1}{2} = 1$.
answered Aug 9 at 16:44
Ross Millikan
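The coin example can also be checked numerically. Here is a minimal Python sketch (the trial count is an arbitrary choice): the heads and tails indicators of the same flip are perfectly dependent, yet their expectations still add.

```python
import random

trials = 100_000
sum_heads = sum_tails = sum_both = 0
for _ in range(trials):
    h = random.randint(0, 1)  # indicator: 1 if the flip lands heads
    t = 1 - h                 # indicator: 1 if tails -- completely determined by h
    sum_heads += h
    sum_tails += t
    sum_both += h + t

# Averages: roughly 0.5, 0.5, and exactly 1.0 despite the perfect dependence
print(sum_heads / trials, sum_tails / trials, sum_both / trials)
```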