Determine the variance of estimator T
I'm having trouble figuring out how to find the variance of the following estimator.
Let $X_1,X_2,\dots,X_n$ denote a random sample from a population which has a normal distribution with unknown mean $\mu$ and unknown variance $\sigma^2$. The statistic below is an estimator for $\sigma^2$, where $c$ is a constant.
$$T_c = \sum_{j=1}^n \frac{(X_j - \bar X)^2}{c}$$
I found the expectation of $T_c$ to be $\frac{\sigma^2(n-1)}{c}$ using the definition of $T_c$; however, I am stumped over how to determine $\operatorname{Var}(T_c)$.
I started to try to determine it like so but got stuck:
$\operatorname{Var}(T_c) = \mathbb E (T_c^2) - \left(\mathbb E (T_c)\right)^2$
$\operatorname{Var}(T_c) = \mathbb E \left(\left(\sum_{j=1}^n \frac{(X_j - \bar X)^2}{c}\right)^2\right) - \left(\frac{\sigma^2(n-1)}{c}\right)^2$
Any tips/solutions?
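As a quick sanity check of the identity $\operatorname{Var}(T) = \mathbb E(T^2) - (\mathbb E T)^2$, here is a small simulation of $T_c$ (my own sketch, not from the question; all parameter values and names are mine):

```python
import numpy as np

# Sanity check (a sketch; parameters are arbitrary) that the direct variance
# of simulated T_c values matches E[T^2] - (E[T])^2.
rng = np.random.default_rng(42)
n, sigma, c, trials = 5, 1.0, 2.0, 100_000
X = rng.normal(0.0, sigma, size=(trials, n))
T = ((X - X.mean(axis=1, keepdims=True)) ** 2).sum(axis=1) / c

lhs = T.var()                          # direct variance of the samples
rhs = (T ** 2).mean() - T.mean() ** 2  # E[T^2] - (E[T])^2
print(lhs, rhs)  # the two agree up to floating-point rounding
```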
estimation variance mean-square-error expected-value
Hi, welcome to math.SE. You can get displayed equations by enclosing them in double instead of single dollar signs; that makes them a lot easier to read, especially when you're mixing fractions, subscripts and superscripts. You can also get proper formatting for operators like $\operatorname{Var}$ by using `\operatorname{Var}`. For more information on how to typeset math on this site, please see this tutorial and reference.
– joriki, Aug 30 at 7:02
Hi, are the samples independent? Since they are normal, this would be expressed by the covariance being $0$ for different samples.
– P. Quinton, Aug 30 at 7:56
Anyway, you can always write $X_j - \bar X$ as a Gaussian random variable (because a sum of jointly Gaussian RVs is a Gaussian RV); then you would need to determine the covariance between any two of those. Finally, you can apply the method in this post: math.stackexchange.com/questions/442472/…
– P. Quinton, Aug 30 at 7:59
edited Aug 30 at 7:37
asked Aug 30 at 5:41
Marty
1 Answer
I suppose that the samples are independent, so they have $0$ covariance.
Observe that $X_i - \bar X$ is a Gaussian with mean $0$; let us compute the covariance of two of those:
\begin{align*}
\operatorname{Cov}[X_i-\bar X,\, X_j-\bar X] &= \mathbb E[(X_i-\bar X)(X_j-\bar X)]\\
&= \mathbb E[((X_i-\mu)-(\bar X-\mu))((X_j-\mu)-(\bar X-\mu))]\\
&= \mathbb E[(X_i-\mu)(X_j-\mu)] - 2\,\mathbb E[(X_i-\mu)(\bar X-\mu)] + \mathbb E[(\bar X-\mu)(\bar X-\mu)]\\
&= \sigma_{ij} - 2\,\frac{\sigma^2}{n} + \frac{\sigma^2}{n}\\
&= \sigma_{ij} - \frac{\sigma^2}{n}
\end{align*}
where $\sigma_{ii}=\sigma^2$ and, for $i\neq j$, $\sigma_{ij}=0$.
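A small simulation can confirm the covariance formula above, $\operatorname{Cov}[X_i-\bar X,\, X_j-\bar X] = \sigma_{ij} - \sigma^2/n$ (my own sketch, not part of the original answer; all names and parameter values are mine):

```python
import numpy as np

# Empirically check Cov[X_i - Xbar, X_j - Xbar] = sigma_ij - sigma^2/n
# for one off-diagonal (i != j) and one diagonal (i == j) entry.
rng = np.random.default_rng(0)
n, sigma, trials = 5, 2.0, 200_000
X = rng.normal(0.0, sigma, size=(trials, n))
D = X - X.mean(axis=1, keepdims=True)  # centered samples X_i - Xbar

emp_off = np.mean(D[:, 0] * D[:, 1])   # i != j: expect -sigma^2/n
emp_diag = np.mean(D[:, 0] ** 2)       # i == j: expect sigma^2 (1 - 1/n)
print(emp_off, -sigma**2 / n)
print(emp_diag, sigma**2 * (1 - 1 / n))
```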
Now let's apply the trick in this post: sum of squares of dependent gaussian random variables.
The goal is to determine the coefficients $\lambda_i$ in front of the chi-squared variables and then use the variance of $\chi_1^2$. So we must find the eigendecomposition of $\Sigma$, where $\Sigma_{ij}=\operatorname{Cov}[X_i-\bar X,\, X_j-\bar X]$. Observe that $\Sigma = \sigma^2\left(\mathbf I - \tfrac{1}{n}\mathbf 1\right)$; we are looking for the eigenvalues of this matrix. First note that $\Sigma - \sigma^2\mathbf I = -\tfrac{\sigma^2}{n}\mathbf 1$ (where $\mathbf 1$ denotes the all-ones matrix or the all-ones vector, depending on context) has all rows identical, so $\Sigma$ has the eigenvalue $\sigma^2$ with multiplicity $n-1$ (If a symmetric matrix $A$ has $m$ identical rows, show that $0$ is an eigenvalue of $A$ whose geometric multiplicity is at least $m-1$.).
Then observe that $\Sigma\,\mathbf 1 = \sigma^2\left(\mathbf I - \tfrac{1}{n}\mathbf 1\right)\mathbf 1 = \sigma^2(\mathbf 1 - \mathbf 1) = 0\cdot\mathbf 1$, so $0$ is also an eigenvalue.
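If you want to double-check the spectrum numerically, here is a minimal sketch (assuming NumPy; names and values are mine) that builds $\Sigma = \sigma^2(\mathbf I - \tfrac{1}{n}\mathbf 1)$ and inspects its eigenvalues:

```python
import numpy as np

# Build Sigma = sigma^2 (I - J/n), J the all-ones matrix, and check its
# eigenvalues: expect 0 once and sigma^2 with multiplicity n-1.
n, sigma = 6, 1.5
Sigma = sigma**2 * (np.eye(n) - np.ones((n, n)) / n)
eigvals = np.sort(np.linalg.eigvalsh(Sigma))
print(eigvals)  # smallest ~ 0, the remaining n-1 values ~ sigma^2
```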
Applying the trick from that post, we get that $T_c$ has the distribution of $\sigma^2/c$ times a sum of $n-1$ independent $\chi^2_1$ variables. The variance of $\chi_1^2$ is $2$, and so the variance you seek (modulo all the errors I made) is
$$\operatorname{Var}(T_c) = 2(n-1)\,\frac{\sigma^4}{c^2}$$
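A Monte Carlo check of this result (my own sketch, not part of the original answer; parameter values are arbitrary):

```python
import numpy as np

# Monte Carlo check that Var(T_c) is approximately 2 (n-1) sigma^4 / c^2.
rng = np.random.default_rng(1)
n, sigma, c, trials = 8, 1.0, 3.0, 200_000
X = rng.normal(0.0, sigma, size=(trials, n))
T = ((X - X.mean(axis=1, keepdims=True)) ** 2).sum(axis=1) / c

expected = 2 * (n - 1) * sigma**4 / c**2
print(T.var(), expected)  # the two should be close
```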
Comment: I am not satisfied by not being able to distinguish in my notation between the matrix of all ones and the vector of all ones; does anyone have a suggestion?
I just realized this post: math.stackexchange.com/questions/72975/… which confirms my result, but I don't think it explains why the $\chi^2$ property holds.
– P. Quinton, Aug 30 at 9:24
edited Aug 30 at 9:21
answered Aug 30 at 9:14
P. Quinton