How to calculate the bias of the statistic

A given statistic: $T_c = \sum_{j=1}^n \frac{(X_j - \bar X)^2}{c}$, where $c$ is a constant, is used as an estimator of the variance $\sigma^2$.



$X_1,\ldots, X_n$ denote a random sample from a population which has a normal distribution with unknown mean $\mu$ and unknown variance $\sigma^2$.



The statistic is distributed as $\chi^2_{n-1}$ (a chi-squared variate with $n-1$ degrees of freedom).



I am tasked to find the bias of $T_c$.



I know the formula for bias is $\mathbb{E}\,\hat\theta - \theta$.



I found $\theta$ as $\mu = n - 1$ for a chi-squared distribution with $n-1$ degrees of freedom.



However, I am confused as to how to calculate $\mathbb{E}\,\hat\theta$.



What I thought of doing was to calculate $\mathbb{E}\,T_c$, but I got stuck halfway through, so I am not sure if I am doing the right thing. Any help please; thanks in advance.










  • @StubbornAtom ah sorry just added the squared bracket for my $T_c$
    – Wei Xiong Yeo
    Sep 1 at 13:07










  • Are you sure $T_c$ has distribution $\chi^2_{n-1}$? Or did you mean to say that the $X_i$'s are $\chi^2_{n-1}$? Also, are the $X_i$'s independent?
    – Digitalis
    Sep 1 at 18:11










  • @Digitalis Yes I would assume they are independent. Also, I have edited my original post to include the distribution of $X_i$'s
    – Wei Xiong Yeo
    Sep 2 at 3:43














statistics estimation






edited Sep 2 at 19:48









Michael Hardy

asked Sep 1 at 8:30









Wei Xiong Yeo

2 Answers

Accepted answer (answered Sep 2 at 14:34 by Digitalis):
Since $T_c$ is an estimator of $\sigma^2$, $T_c$ will be unbiased for $\theta = \sigma^2$ if $\mathbb{E}_{\mu,\sigma^2}(T_c) = \sigma^2 \quad \forall\, (\mu, \sigma^2) \in \mathbb{R} \times \mathbb{R}_0^+$.

\begin{align*}
\mathbb{E}(T_c) &= \frac{1}{c}\,\mathbb{E}\Big(\sum_{j=1}^n X_j^2 + \bar{X}^2 - 2X_j\bar{X}\Big) \\
&= \frac{1}{c}\sum_{j=1}^n \mathbb{E}(X_j^2) + \mathbb{E}(\bar{X}^2) - 2\,\mathbb{E}(X_j\bar{X})
\end{align*}

Now calculate $\mathbb{E}(X_j^2)$, $\mathbb{E}(\bar{X}^2)$ and $\mathbb{E}(X_j\bar{X})$.

1. Using $\operatorname{Var}(X) = \mathbb{E}(X^2) - \mathbb{E}(X)^2$ we find $\mathbb{E}(X_j^2) = \sigma^2 + \mu^2$.

2.
\begin{align*}
\mathbb{E}(\bar{X}^2) &= \mathbb{E}\Big(\big(\tfrac{1}{n}\sum_i^n X_i\big)\big(\tfrac{1}{n}\sum_i^n X_i\big)\Big) \\
&= \frac{1}{n^2}\,\mathbb{E}\Big(\big(\sum_i^n X_i\big)\big(\sum_i^n X_i\big)\Big) \\
&= \frac{1}{n^2}\,\mathbb{E}\Big(\sum_{i \neq j} X_iX_j + \sum_i X_i^2\Big) \\
&= \frac{1}{n^2}\Big(\sum_{i \neq j} \mathbb{E}(X_i)\mathbb{E}(X_j) + \sum_i \mathbb{E}(X_i^2)\Big) \\
&= \frac{1}{n^2}\Big(n(n-1)\mu^2 + n(\mu^2 + \sigma^2)\Big) \\
&= \frac{(n-1)\mu^2 + \mu^2 + \sigma^2}{n} = \mu^2 + \frac{\sigma^2}{n}
\end{align*}

3.
\begin{align*}
\mathbb{E}(X_j\bar{X}) &= \frac{1}{n}\sum_i^n \mathbb{E}(X_jX_i) \\
&= \frac{1}{n}\big(\sum_{i \neq j} \mathbb{E}(X_j)\mathbb{E}(X_i)\big) + \frac{1}{n}\,\mathbb{E}(X_j^2) \\
&= \frac{1}{n}(n-1)\mu^2 + \frac{1}{n}(\mu^2 + \sigma^2) \\
&= \mu^2 + \frac{\sigma^2}{n}
\end{align*}

By using the values we have found we obtain

\begin{align*}
\mathbb{E}(T_c) &= \frac{1}{c}\sum_{j=1}^n \mathbb{E}(X_j^2) + \mathbb{E}(\bar{X}^2) - 2\,\mathbb{E}(X_j\bar{X}) \\
&= \frac{1}{c}\sum_{j=1}^n (\sigma^2 + \mu^2) + \Big(\mu^2 + \frac{\sigma^2}{n}\Big) - 2\Big(\mu^2 + \frac{\sigma^2}{n}\Big) \\
&= \frac{(n-1)\sigma^2}{c}
\end{align*}

So $T_c$ is biased for $\sigma^2$ for all values of $c \neq n-1$.
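As a sanity check on this result (my addition, not part of the original answer), a short Monte Carlo simulation reproduces $\mathbb{E}(T_c) = (n-1)\sigma^2/c$; the values of $n$, $c$, $\mu$ and $\sigma$ below are arbitrary illustrative choices:

```python
import random

random.seed(0)
n, c, mu, sigma = 10, 5.0, 2.0, 3.0   # arbitrary illustrative values
reps = 100_000

total = 0.0
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    total += sum((x - xbar) ** 2 for x in xs) / c   # one draw of T_c

mc_mean = total / reps                  # Monte Carlo estimate of E(T_c)
theory = (n - 1) * sigma ** 2 / c       # (n - 1) * sigma^2 / c = 16.2
bias = theory - sigma ** 2              # E(T_c) - sigma^2 = 7.2
print(mc_mean, theory, bias)
```

With these values the simulated mean lands close to the theoretical $16.2$, so the bias of $T_c$ here is about $7.2$; setting $c = n - 1 = 9$ would make it vanish.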






  • Thanks so much for the help!! I have a few questions though. For 2, I ended up trying something with $\operatorname{Var}(\bar X)$ (similar to 1.) and ended up with the same answer as yours. Is it possible to use $\operatorname{Var}(\bar X) = \mathbb{E}(\bar X^2) - \mathbb{E}(\bar X)^2$? For $\operatorname{Var}(\bar X)$, I took out $1/n$ and it became $1/n^2$, which eventually gives me the $\sigma^2/n$. In addition, I'm rather confused on your working for 2; I'm not exactly sure how you got from the 2nd to 3rd step. I'm quite confused on your working for 3 in general as well (from 1st to 2nd step, and first part of 3rd step).
    – Wei Xiong Yeo
    Sep 2 at 16:18










  • Yes, it is possible to use that formula, and it is actually much easier to calculate that way :). Concerning how to get from 2nd to 3rd in 2: it's just explicitly multiplying all the terms in both sums. Concerning number 3: it comes down to calculating $\mathbb{E}(X_jX_i)$ in two different cases. If $i = j$ then $\mathbb{E}(X_jX_i) = \mathbb{E}(X_j^2)$. If $i \neq j$ then $X_j$ and $X_i$ are independent and we have $\mathbb{E}(X_jX_i) = \mathbb{E}(X_j)\mathbb{E}(X_i)$.
    – Digitalis
    Sep 2 at 16:44











  • I don't know if I've ever before down-voted a clear and correct answer, but this seems to make the problem horribly more complicated than it really is.
    – Michael Hardy
    Sep 2 at 20:00

















Answer by Michael Hardy (score 1):













$$
\frac{1}{\sigma^2} \sum_{i=1}^n (X_i - \overline X)^2 \sim \chi^2_{n-1}.
$$
Therefore
$$
\operatorname{E}\left( \frac{1}{\sigma^2} \sum_{i=1}^n (X_i - \overline X)^2 \right) = n-1.
$$
So
$$
\operatorname{E}\left( \frac{1}{c} \sum_{i=1}^n (X_i - \overline X)^2 \right) = \frac{\sigma^2}{c}(n-1).
$$
Subtract $\sigma^2$ from that to get the bias.
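Not from the original answer, but the resulting bias, $\frac{\sigma^2}{c}(n-1) - \sigma^2$, is simple enough to package as a one-line function; the example values are hypothetical:

```python
def bias_tc(n: int, c: float, sigma2: float) -> float:
    """Bias of T_c = sum_i (X_i - Xbar)^2 / c as an estimator of sigma^2.

    E(T_c) = (n - 1) * sigma2 / c, so the bias is E(T_c) - sigma2.
    """
    return (n - 1) * sigma2 / c - sigma2

print(bias_tc(10, 9, 4.0))    # c = n - 1: unbiased, bias is 0.0
print(bias_tc(10, 10, 4.0))   # c = n: the familiar bias -sigma^2/n, about -0.4
```

The two calls show the special cases: $c = n-1$ recovers the unbiased sample variance, while $c = n$ gives the maximum-likelihood estimator with bias $-\sigma^2/n$.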






  • Hi, I understand why the first expectation is $n-1$ (because of chi-squared), but how did you manage to get the second result without doing the lengthy working as mentioned above? Is it by manipulating the $(n-1)$ result algebraically? So the first expectation found is solely used to find $\mathbb{E}\,\hat\theta$, and I just subtract $\theta$, which is $\sigma^2$, from it as per the bias formula?
    – Wei Xiong Yeo
    Sep 3 at 2:05







  • @WeiXiongYeo : $$ \begin{align} & \text{We have } \operatorname{E}\left( \frac{1}{\sigma^2} \sum_{i=1}^n (X_i - \overline X)^2 \right) = n-1. \\ \\ & \text{Multiplying both sides by } \frac{\sigma^2}{c}, \text{ we get } \frac{\sigma^2}{c} \operatorname{E}\left( \frac{1}{\sigma^2} \sum_{i=1}^n (X_i-\overline X)^2\right) = \frac{\sigma^2}{c}(n-1). \\ \\ & \text{Using linearity of expectation we get } \operatorname{E}\left( \frac{\sigma^2}{c} \cdot \frac{1}{\sigma^2} \sum_{i=1}^n (X_i-\overline X)^2 \right) = \frac{\sigma^2}{c}(n-1). \end{align} $$ Then cancel $\sigma^2$ from the top and the bottom.
    – Michael Hardy
    Sep 3 at 2:40







  • Thank you so much for the help too!!
    – Wei Xiong Yeo
    Sep 3 at 3:22






  • $$ \begin{align} & \text{We have} \\ & \operatorname{E}\left( \frac{1}{\sigma^2} \sum_{i=1}^n (X_i - \overline X)^2 \right) = n-1. \\ \\ & \text{Multiplying both sides by } \frac{\sigma^2}{c}, \text{ we get} \\ \\ & \frac{\sigma^2}{c} \operatorname{E}\left( \frac{1}{\sigma^2} \sum_{i=1}^n (X_i-\overline X)^2\right) = \frac{\sigma^2}{c}(n-1). \\ \\ & \text{Using linearity of expectation we get} \\ \\ & \operatorname{E}\left( \frac{\sigma^2}{c} \cdot \frac{1}{\sigma^2} \sum_{i=1}^n (X_i-\overline X)^2 \right) = \frac{\sigma^2}{c}(n-1). \end{align} $$
    – Michael Hardy
    Sep 5 at 2:26







  • @WeiXiongYeo : You have $\operatorname{var}(\chi^2_k) = 2k.$ So the variance of the first expression in my answer is $2(n-1).$ If you multiply both sides by $\sigma^2/c,$ you multiply the variance by $\sigma^4/c^2,$ so you get $$ \operatorname{var}\left( \frac{1}{c} \sum_{i=1}^n (X_i - \overline X)^2 \right) = \frac{\sigma^4}{c^2}\, 2(n-1). $$
    – Michael Hardy
    Sep 5 at 2:36
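To illustrate the variance claim in the last comment (my addition, not part of the thread), a quick simulation with hypothetical values of $n$, $c$ and $\sigma$ agrees with $\operatorname{var}(T_c) = \frac{\sigma^4}{c^2}\,2(n-1)$:

```python
import random

random.seed(1)
n, c, sigma = 10, 5.0, 3.0   # hypothetical values
reps = 100_000

vals = []
for _ in range(reps):
    xs = [random.gauss(0.0, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    vals.append(sum((x - xbar) ** 2 for x in xs) / c)   # one draw of T_c

mean = sum(vals) / reps
mc_var = sum((v - mean) ** 2 for v in vals) / reps      # simulated var(T_c)
theory = sigma ** 4 / c ** 2 * 2 * (n - 1)              # 2(n-1) * sigma^4 / c^2
print(mc_var, theory)
```

With these values the theoretical variance is $2 \cdot 9 \cdot 81 / 25 = 58.32$, and the simulated value should land close to it.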










Your Answer




StackExchange.ifUsing("editor", function ()
return StackExchange.using("mathjaxEditing", function ()
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix)
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
);
);
, "mathjax-editing");

StackExchange.ready(function()
var channelOptions =
tags: "".split(" "),
id: "69"
;
initTagRenderer("".split(" "), "".split(" "), channelOptions);

StackExchange.using("externalEditor", function()
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled)
StackExchange.using("snippets", function()
createEditor();
);

else
createEditor();

);

function createEditor()
StackExchange.prepareEditor(
heartbeatType: 'answer',
convertImagesToLinks: true,
noModals: false,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
);



);













 

draft saved


draft discarded


















StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f2901494%2fhow-to-calculate-the-bias-of-the-statistic%23new-answer', 'question_page');

);

Post as a guest






























2 Answers
2






active

oldest

votes








2 Answers
2






active

oldest

votes









active

oldest

votes






active

oldest

votes








up vote
0
down vote



accepted










Since $T_c$ is an estimator of $sigma^2$. $T_c$ will be unbiased for $theta = sigma^2$ if $mathbbE_mu,sigma^2(T_c) = sigma^2 quad forall mu, sigma^2 in mathbbR times mathbbR_0^+$



beginalign*
mathbbE(T_c)&= frac1cmathbbE Big( sum_j=1^n X_j^2 + barX^2 -2X_j barX Big) \
&= frac1c sum_j=1^n mathbbE(X_j^2) + mathbbE(barX^2) - 2 mathbbE (X_j barX)
endalign*



Now calculate $mathbbE(X_j^2),mathbbE(barX^2)$ & $ mathbbE (X_j barX)$.



1.
Using Var$(X) = mathbbE(X^2) - mathbbE(X)^2$ we find $mathbbE(X_j^2) = sigma^2 + mu^2$.



2.
beginalign*
mathbbE(barX^2) &= mathbbEBig( big(frac1n sum_i^n X_ibig) big(frac1n sum_i^n X_ibig)Big) \
&= frac1n^2 mathbbEBig( big(sum_i^n X_ibig) big(sum_i^n X_ibig)Big)\
&= frac1n^2 mathbbEBig( sum_i neq j X_iX_j + sum_i X_i^2Big) \
&=frac1n^2 Big(sum_i neq j mathbbE(X_i)mathbbE(X_j) + sum_i mathbbE(X_i^2)Big)\
&= frac1n^2 Big( n(n-1)mu^2 + n(mu^2 + sigma^2) Big)\
&= frac(n-1)mu^2 + mu^2 + sigma^2n =mu^2 + fracsigma^2n
endalign*



3.



beginalign*
mathbbE (X_j barX)&= frac1n sum_i^n mathbbE(X_jX_i) \
&=frac1n big( sum_ineq jmathbbE(X_j)mathbbE(X_i) big) + frac1n mathbbE(X_j^2)\
&= frac1n (n-1) mu^2 + frac1n(mu^2 + sigma^2)\
&= mu^2 + fracsigma^2n
endalign*
By using the values we have found we obtain



beginalign*
mathbbE(T_c)
&= frac1c sum_j=1^n mathbbE(X_j^2) + mathbbE(barX^2) - 2 mathbbE (X_j barX)\
&= frac1c sum_j=1^n (sigma^2 + mu^2) + ( mu^2 + fracsigma^2n ) - 2 (mu^2 + fracsigma^2n) \
&= frac(n-1)sigma^2c
endalign*



So $T_c$ is biased for $sigma^2$ for all values of $c neq n-1$






share|cite|improve this answer




















  • Thanks so much for the help!! I have a few questions though. For 2, I ended up trying something with Var($bar X$) (similar to 1.) and ended up with the same answer as yours. Is it possible to use Var($bar X$) = $mathbb E (bar X^2 )$ - $mathbb E ( bar X )^2$ ? For Var($bar X$), i took out 1/n and it became 1/$n^2$ , which eventually gives me the $sigma^2$ / n. In addition, I'm rather confused on your working for 2, I'm not exactly sure how you got from the 2nd to 3rd step. I'm quite confused on your working for 3 in general as well (from 1st to 2nd step, and first part of 3rd step)
    – Wei Xiong Yeo
    Sep 2 at 16:18










  • Yes it is possible to use that formula and it is actually much easier to calculate that way :). Concerning how to get from 2nd to 3rd in 2 it's just explicitly multiplying all the terms in both sums. Concerning number 3: It comes down to calculating $mathbbE(X_jX_i)$ in two different cases. If $i = j$ then $mathbbE(X_jX_i) = mathbbE(X_j^2)$. If $i neq j$ then $X_j$ and $X_i$ are independent and we have $mathbbE(X_jX_i) =mathbbE(X_j)mathbbE(X_i)$.
    – Digitalis
    Sep 2 at 16:44











  • I don't know if I've ever before down-voted a clear and correct answer, but this seems to make the problem horribly more complicated than it really is.
    – Michael Hardy
    Sep 2 at 20:00














up vote
0
down vote



accepted










Since $T_c$ is an estimator of $sigma^2$. $T_c$ will be unbiased for $theta = sigma^2$ if $mathbbE_mu,sigma^2(T_c) = sigma^2 quad forall mu, sigma^2 in mathbbR times mathbbR_0^+$



beginalign*
mathbbE(T_c)&= frac1cmathbbE Big( sum_j=1^n X_j^2 + barX^2 -2X_j barX Big) \
&= frac1c sum_j=1^n mathbbE(X_j^2) + mathbbE(barX^2) - 2 mathbbE (X_j barX)
endalign*



Now calculate $mathbbE(X_j^2),mathbbE(barX^2)$ & $ mathbbE (X_j barX)$.



1.
Using Var$(X) = mathbbE(X^2) - mathbbE(X)^2$ we find $mathbbE(X_j^2) = sigma^2 + mu^2$.



2.
beginalign*
mathbbE(barX^2) &= mathbbEBig( big(frac1n sum_i^n X_ibig) big(frac1n sum_i^n X_ibig)Big) \
&= frac1n^2 mathbbEBig( big(sum_i^n X_ibig) big(sum_i^n X_ibig)Big)\
&= frac1n^2 mathbbEBig( sum_i neq j X_iX_j + sum_i X_i^2Big) \
&=frac1n^2 Big(sum_i neq j mathbbE(X_i)mathbbE(X_j) + sum_i mathbbE(X_i^2)Big)\
&= frac1n^2 Big( n(n-1)mu^2 + n(mu^2 + sigma^2) Big)\
&= frac(n-1)mu^2 + mu^2 + sigma^2n =mu^2 + fracsigma^2n
endalign*



3.



beginalign*
mathbbE (X_j barX)&= frac1n sum_i^n mathbbE(X_jX_i) \
&=frac1n big( sum_ineq jmathbbE(X_j)mathbbE(X_i) big) + frac1n mathbbE(X_j^2)\
&= frac1n (n-1) mu^2 + frac1n(mu^2 + sigma^2)\
&= mu^2 + fracsigma^2n
endalign*
By using the values we have found we obtain



beginalign*
mathbbE(T_c)
&= frac1c sum_j=1^n mathbbE(X_j^2) + mathbbE(barX^2) - 2 mathbbE (X_j barX)\
&= frac1c sum_j=1^n (sigma^2 + mu^2) + ( mu^2 + fracsigma^2n ) - 2 (mu^2 + fracsigma^2n) \
&= frac(n-1)sigma^2c
endalign*



So $T_c$ is biased for $sigma^2$ for all values of $c neq n-1$






share|cite|improve this answer




















  • Thanks so much for the help!! I have a few questions though. For 2, I ended up trying something with Var($bar X$) (similar to 1.) and ended up with the same answer as yours. Is it possible to use Var($bar X$) = $mathbb E (bar X^2 )$ - $mathbb E ( bar X )^2$ ? For Var($bar X$), i took out 1/n and it became 1/$n^2$ , which eventually gives me the $sigma^2$ / n. In addition, I'm rather confused on your working for 2, I'm not exactly sure how you got from the 2nd to 3rd step. I'm quite confused on your working for 3 in general as well (from 1st to 2nd step, and first part of 3rd step)
    – Wei Xiong Yeo
    Sep 2 at 16:18










  • Yes it is possible to use that formula and it is actually much easier to calculate that way :). Concerning how to get from 2nd to 3rd in 2 it's just explicitly multiplying all the terms in both sums. Concerning number 3: It comes down to calculating $mathbbE(X_jX_i)$ in two different cases. If $i = j$ then $mathbbE(X_jX_i) = mathbbE(X_j^2)$. If $i neq j$ then $X_j$ and $X_i$ are independent and we have $mathbbE(X_jX_i) =mathbbE(X_j)mathbbE(X_i)$.
    – Digitalis
    Sep 2 at 16:44











  • I don't know if I've ever before down-voted a clear and correct answer, but this seems to make the problem horribly more complicated than it really is.
    – Michael Hardy
    Sep 2 at 20:00












up vote
0
down vote



accepted







up vote
0
down vote



accepted






Since $T_c$ is an estimator of $sigma^2$. $T_c$ will be unbiased for $theta = sigma^2$ if $mathbbE_mu,sigma^2(T_c) = sigma^2 quad forall mu, sigma^2 in mathbbR times mathbbR_0^+$



beginalign*
mathbbE(T_c)&= frac1cmathbbE Big( sum_j=1^n X_j^2 + barX^2 -2X_j barX Big) \
&= frac1c sum_j=1^n mathbbE(X_j^2) + mathbbE(barX^2) - 2 mathbbE (X_j barX)
endalign*



Now calculate $mathbbE(X_j^2),mathbbE(barX^2)$ & $ mathbbE (X_j barX)$.



1.
Using Var$(X) = mathbbE(X^2) - mathbbE(X)^2$ we find $mathbbE(X_j^2) = sigma^2 + mu^2$.



2.
beginalign*
mathbbE(barX^2) &= mathbbEBig( big(frac1n sum_i^n X_ibig) big(frac1n sum_i^n X_ibig)Big) \
&= frac1n^2 mathbbEBig( big(sum_i^n X_ibig) big(sum_i^n X_ibig)Big)\
&= frac1n^2 mathbbEBig( sum_i neq j X_iX_j + sum_i X_i^2Big) \
&=frac1n^2 Big(sum_i neq j mathbbE(X_i)mathbbE(X_j) + sum_i mathbbE(X_i^2)Big)\
&= frac1n^2 Big( n(n-1)mu^2 + n(mu^2 + sigma^2) Big)\
&= frac(n-1)mu^2 + mu^2 + sigma^2n =mu^2 + fracsigma^2n
endalign*



3.



beginalign*
mathbbE (X_j barX)&= frac1n sum_i^n mathbbE(X_jX_i) \
&=frac1n big( sum_ineq jmathbbE(X_j)mathbbE(X_i) big) + frac1n mathbbE(X_j^2)\
&= frac1n (n-1) mu^2 + frac1n(mu^2 + sigma^2)\
&= mu^2 + fracsigma^2n
endalign*
By using the values we have found we obtain



beginalign*
mathbbE(T_c)
&= frac1c sum_j=1^n mathbbE(X_j^2) + mathbbE(barX^2) - 2 mathbbE (X_j barX)\
&= frac1c sum_j=1^n (sigma^2 + mu^2) + ( mu^2 + fracsigma^2n ) - 2 (mu^2 + fracsigma^2n) \
&= frac(n-1)sigma^2c
endalign*



So $T_c$ is biased for $sigma^2$ for all values of $c neq n-1$






share|cite|improve this answer












Since $T_c$ is an estimator of $sigma^2$. $T_c$ will be unbiased for $theta = sigma^2$ if $mathbbE_mu,sigma^2(T_c) = sigma^2 quad forall mu, sigma^2 in mathbbR times mathbbR_0^+$



beginalign*
mathbbE(T_c)&= frac1cmathbbE Big( sum_j=1^n X_j^2 + barX^2 -2X_j barX Big) \
&= frac1c sum_j=1^n mathbbE(X_j^2) + mathbbE(barX^2) - 2 mathbbE (X_j barX)
endalign*



Now calculate $mathbbE(X_j^2),mathbbE(barX^2)$ & $ mathbbE (X_j barX)$.



1.
Using Var$(X) = mathbbE(X^2) - mathbbE(X)^2$ we find $mathbbE(X_j^2) = sigma^2 + mu^2$.



2.
beginalign*
mathbbE(barX^2) &= mathbbEBig( big(frac1n sum_i^n X_ibig) big(frac1n sum_i^n X_ibig)Big) \
&= frac1n^2 mathbbEBig( big(sum_i^n X_ibig) big(sum_i^n X_ibig)Big)\
&= frac1n^2 mathbbEBig( sum_i neq j X_iX_j + sum_i X_i^2Big) \
&=frac1n^2 Big(sum_i neq j mathbbE(X_i)mathbbE(X_j) + sum_i mathbbE(X_i^2)Big)\
&= frac1n^2 Big( n(n-1)mu^2 + n(mu^2 + sigma^2) Big)\
&= frac(n-1)mu^2 + mu^2 + sigma^2n =mu^2 + fracsigma^2n
endalign*



3.



beginalign*
mathbbE (X_j barX)&= frac1n sum_i^n mathbbE(X_jX_i) \
&=frac1n big( sum_ineq jmathbbE(X_j)mathbbE(X_i) big) + frac1n mathbbE(X_j^2)\
&= frac1n (n-1) mu^2 + frac1n(mu^2 + sigma^2)\
&= mu^2 + fracsigma^2n
endalign*
By using the values we have found we obtain



beginalign*
mathbbE(T_c)
&= frac1c sum_j=1^n mathbbE(X_j^2) + mathbbE(barX^2) - 2 mathbbE (X_j barX)\
&= frac1c sum_j=1^n (sigma^2 + mu^2) + ( mu^2 + fracsigma^2n ) - 2 (mu^2 + fracsigma^2n) \
&= frac(n-1)sigma^2c
endalign*



So $T_c$ is biased for $sigma^2$ for all values of $c neq n-1$







share|cite|improve this answer












share|cite|improve this answer



share|cite|improve this answer










answered Sep 2 at 14:34









Digitalis

318114




318114











  • Thanks so much for the help!! I have a few questions though. For 2, I ended up trying something with Var($bar X$) (similar to 1.) and ended up with the same answer as yours. Is it possible to use Var($bar X$) = $mathbb E (bar X^2 )$ - $mathbb E ( bar X )^2$ ? For Var($bar X$), i took out 1/n and it became 1/$n^2$ , which eventually gives me the $sigma^2$ / n. In addition, I'm rather confused on your working for 2, I'm not exactly sure how you got from the 2nd to 3rd step. I'm quite confused on your working for 3 in general as well (from 1st to 2nd step, and first part of 3rd step)
    – Wei Xiong Yeo
    Sep 2 at 16:18










  • Yes it is possible to use that formula and it is actually much easier to calculate that way :). Concerning how to get from 2nd to 3rd in 2 it's just explicitly multiplying all the terms in both sums. Concerning number 3: It comes down to calculating $mathbbE(X_jX_i)$ in two different cases. If $i = j$ then $mathbbE(X_jX_i) = mathbbE(X_j^2)$. If $i neq j$ then $X_j$ and $X_i$ are independent and we have $mathbbE(X_jX_i) =mathbbE(X_j)mathbbE(X_i)$.
    – Digitalis
    Sep 2 at 16:44











  • I don't know if I've ever before down-voted a clear and correct answer, but this seems to make the problem horribly more complicated than it really is.
    – Michael Hardy
    Sep 2 at 20:00
















  • Thanks so much for the help!! I have a few questions though. For 2, I ended up trying something with Var($bar X$) (similar to 1.) and ended up with the same answer as yours. Is it possible to use Var($bar X$) = $mathbb E (bar X^2 )$ - $mathbb E ( bar X )^2$ ? For Var($bar X$), i took out 1/n and it became 1/$n^2$ , which eventually gives me the $sigma^2$ / n. In addition, I'm rather confused on your working for 2, I'm not exactly sure how you got from the 2nd to 3rd step. I'm quite confused on your working for 3 in general as well (from 1st to 2nd step, and first part of 3rd step)
    – Wei Xiong Yeo
    Sep 2 at 16:18










  • Yes it is possible to use that formula and it is actually much easier to calculate that way :). Concerning how to get from 2nd to 3rd in 2 it's just explicitly multiplying all the terms in both sums. Concerning number 3: It comes down to calculating $mathbbE(X_jX_i)$ in two different cases. If $i = j$ then $mathbbE(X_jX_i) = mathbbE(X_j^2)$. If $i neq j$ then $X_j$ and $X_i$ are independent and we have $mathbbE(X_jX_i) =mathbbE(X_j)mathbbE(X_i)$.
    – Digitalis
    Sep 2 at 16:44











  • I don't know if I've ever before down-voted a clear and correct answer, but this seems to make the problem horribly more complicated than it really is.
    – Michael Hardy
    Sep 2 at 20:00















Thanks so much for the help!! I have a few questions though. For 2, I ended up trying something with Var($bar X$) (similar to 1.) and ended up with the same answer as yours. Is it possible to use Var($bar X$) = $mathbb E (bar X^2 )$ - $mathbb E ( bar X )^2$ ? For Var($bar X$), i took out 1/n and it became 1/$n^2$ , which eventually gives me the $sigma^2$ / n. In addition, I'm rather confused on your working for 2, I'm not exactly sure how you got from the 2nd to 3rd step. I'm quite confused on your working for 3 in general as well (from 1st to 2nd step, and first part of 3rd step)
– Wei Xiong Yeo
Sep 2 at 16:18




Thanks so much for the help!! I have a few questions though. For 2, I ended up trying something with Var($bar X$) (similar to 1.) and ended up with the same answer as yours. Is it possible to use Var($bar X$) = $mathbb E (bar X^2 )$ - $mathbb E ( bar X )^2$ ? For Var($bar X$), i took out 1/n and it became 1/$n^2$ , which eventually gives me the $sigma^2$ / n. In addition, I'm rather confused on your working for 2, I'm not exactly sure how you got from the 2nd to 3rd step. I'm quite confused on your working for 3 in general as well (from 1st to 2nd step, and first part of 3rd step)
– Wei Xiong Yeo
Sep 2 at 16:18












Yes it is possible to use that formula and it is actually much easier to calculate that way :). Concerning how to get from 2nd to 3rd in 2 it's just explicitly multiplying all the terms in both sums. Concerning number 3: It comes down to calculating $mathbbE(X_jX_i)$ in two different cases. If $i = j$ then $mathbbE(X_jX_i) = mathbbE(X_j^2)$. If $i neq j$ then $X_j$ and $X_i$ are independent and we have $mathbbE(X_jX_i) =mathbbE(X_j)mathbbE(X_i)$.
– Digitalis
Sep 2 at 16:44





Yes it is possible to use that formula and it is actually much easier to calculate that way :). Concerning how to get from 2nd to 3rd in 2 it's just explicitly multiplying all the terms in both sums. Concerning number 3: It comes down to calculating $mathbbE(X_jX_i)$ in two different cases. If $i = j$ then $mathbbE(X_jX_i) = mathbbE(X_j^2)$. If $i neq j$ then $X_j$ and $X_i$ are independent and we have $mathbbE(X_jX_i) =mathbbE(X_j)mathbbE(X_i)$.
– Digitalis
Sep 2 at 16:44













I don't know if I've ever before down-voted a clear and correct answer, but this seems to make the problem horribly more complicated than it really is.
– Michael Hardy
Sep 2 at 20:00




I don't know if I've ever before down-voted a clear and correct answer, but this seems to make the problem horribly more complicated than it really is.
– Michael Hardy
Sep 2 at 20:00










up vote
1
down vote













$$
frac 1 sigma^2 sum_i=1^n (X_i-overline X)^2 sim chi^2_n-1.
$$
Therefore
$$
operatorname Eleft( frac 1 sigma^2 sum_i=1^n (X_i - overline X)^2 right) = n-1.
$$
So
$$
operatorname Eleft( frac 1 c sum_i=1^n (X_i-overline X)^2 right) = fracsigma^2 c (n-1).
$$
Subtract $sigma^2$ from that to get the bias.






share|cite|improve this answer




















  • Hi, I understand why the first expectation is n-1 (because of chi-squared), but how did you manage to get the second result without doing the lengthy working as mentioned above? Is it by manipulating the (n-1) result algebraically? So the first expectation found is solely used to find $mathbb E hat theta$, and I just subtract $theta$ which is $sigma^2$ from it as per the bias formula?
    – Wei Xiong Yeo
    Sep 3 at 2:05







  • 1




    @WeiXiongYeo : $$ beginalign & textWe have operatorname Eleft( frac 1 sigma^2 sum_i=1^n (X_i - overline X)^2 right) = n-1. \ \ & textMultiplying both sides by frac sigma^2 c, text we get fracsigma^2 c operatorname Eleft( frac 1 sigma^2 sum_i=1^n (X_i-overline X)^2right) = fracsigma^2 c (n-1). \ \ & textUsing linearity of expectation we get operatorname Eleft( fracsigma^2 c cdot frac 1 sigma^2 sum_i=1^n (X_i-overline X)^2 right) = fracsigma^2 c (n-1). endalign $$ Then cancel $sigma^2$ from the top and the bottom.
    – Michael Hardy
    Sep 3 at 2:40







  • 1




    Thank you so much for the help too!!
    – Wei Xiong Yeo
    Sep 3 at 3:22






  • 1




    $$ beginalign & textWe have \ & operatorname Eleft( frac 1 sigma^2 sum_i=1^n (X_i - overline X)^2 right) = n-1. \ \ & textMultiplying both sides by frac sigma^2 c, text we get \ \ & fracsigma^2 c operatorname Eleft( frac 1 sigma^2 sum_i=1^n (X_i-overline X)^2right) = fracsigma^2 c (n-1). \ \ & textUsing linearity of expectation we get \ \ & operatorname Eleft( fracsigma^2 c cdot frac 1 sigma^2 sum_i=1^n (X_i-overline X)^2 right) = fracsigma^2 c (n-1). endalign $$
    – Michael Hardy
    Sep 5 at 2:26







  • 1




    @WeiXiongYeo : You have $operatornamevar(chi^2_k) = 2k.$ So the variance of the first expression in my answer is $2(n-1).$ If you multiply both sides by $sigma^2/c,$ you multiply the variance by $sigma^4/c^2,$ so you get $$ operatornamevarleft( frac 1 c sum_i=1^n (X_i - overline X right)^2 = fracsigma^4c^2 2(n-1). $$
    – Michael Hardy
    Sep 5 at 2:36














up vote
1
down vote













$$
frac 1 sigma^2 sum_i=1^n (X_i-overline X)^2 sim chi^2_n-1.
$$
Therefore
$$
operatorname Eleft( frac 1 sigma^2 sum_i=1^n (X_i - overline X)^2 right) = n-1.
$$
So
$$
operatorname Eleft( frac 1 c sum_i=1^n (X_i-overline X)^2 right) = fracsigma^2 c (n-1).
$$
Subtract $sigma^2$ from that to get the bias.






share|cite|improve this answer




















  • Hi, I understand why the first expectation is n-1 (because of chi-squared), but how did you manage to get the second result without doing the lengthy working as mentioned above? Is it by manipulating the (n-1) result algebraically? So the first expectation found is solely used to find $mathbb E hat theta$, and I just subtract $theta$ which is $sigma^2$ from it as per the bias formula?
    – Wei Xiong Yeo
    Sep 3 at 2:05











    @WeiXiongYeo : $$ \begin{align} & \text{We have } \operatorname{E}\left( \frac{1}{\sigma^2} \sum_{i=1}^n (X_i - \overline{X})^2 \right) = n-1. \\ \\ & \text{Multiplying both sides by } \frac{\sigma^2}{c}, \text{ we get } \frac{\sigma^2}{c} \operatorname{E}\left( \frac{1}{\sigma^2} \sum_{i=1}^n (X_i-\overline{X})^2\right) = \frac{\sigma^2}{c}\,(n-1). \\ \\ & \text{Using linearity of expectation we get } \operatorname{E}\left( \frac{\sigma^2}{c} \cdot \frac{1}{\sigma^2} \sum_{i=1}^n (X_i-\overline{X})^2 \right) = \frac{\sigma^2}{c}\,(n-1). \end{align} $$ Then cancel $\sigma^2$ from the top and the bottom.
    – Michael Hardy
    Sep 3 at 2:40











    Thank you so much for the help too!!
    – Wei Xiong Yeo
    Sep 3 at 3:22





















    @WeiXiongYeo : You have $\operatorname{var}(\chi^2_k) = 2k.$ So the variance of the first expression in my answer is $2(n-1).$ If you multiply both sides by $\sigma^2/c,$ you multiply the variance by $\sigma^4/c^2,$ so you get $$ \operatorname{var}\left( \frac{1}{c} \sum_{i=1}^n (X_i - \overline{X})^2 \right) = \frac{\sigma^4}{c^2}\, 2(n-1). $$
    – Michael Hardy
    Sep 5 at 2:36
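The variance formula in this comment can be checked by simulation as well. The following is a minimal sketch (the parameter values are my own illustrative assumptions, not from the thread) comparing the empirical variance of $T_c$ with $\frac{\sigma^4}{c^2}\,2(n-1)$:

```python
import numpy as np

# Illustrative (assumed) parameter values; none of these come from the thread.
n, c, mu, sigma = 10, 8.0, 5.0, 2.0
rng = np.random.default_rng(1)

# Draw many independent samples of size n and compute T_c for each one.
samples = rng.normal(mu, sigma, size=(400_000, n))
T_c = ((samples - samples.mean(axis=1, keepdims=True)) ** 2).sum(axis=1) / c

# Compare the sample variance of T_c with (sigma^4 / c^2) * 2(n-1).
empirical_var = T_c.var()
theoretical_var = sigma**4 / c**2 * 2 * (n - 1)
print(empirical_var, theoretical_var)  # the two numbers should be close
```

The scaling works because multiplying a random variable by the constant $\sigma^2/c$ multiplies its variance by $(\sigma^2/c)^2 = \sigma^4/c^2$.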












answered Sep 2 at 19:59









Michael Hardy
