Combining unbiased estimators
Let $\hat{\theta}_1$ and $\hat{\theta}_2$ be unbiased estimators of $\theta$ with non-singular variances $V_1$ and $V_2$ respectively. Then the minimum variance linear unbiased estimator of $\theta$ is obtained by combining $\hat{\theta}_1$ and $\hat{\theta}_2$ using weights proportional to the inverses of their variances. The result can be expressed in a variety of ways:

$$\hat{\theta} = \left(V_1^{-1} + V_2^{-1}\right)^{-1}\left(V_1^{-1}\hat{\theta}_1 + V_2^{-1}\hat{\theta}_2\right) = V_2\left(V_1 + V_2\right)^{-1}\hat{\theta}_1 + V_1\left(V_1 + V_2\right)^{-1}\hat{\theta}_2$$

The proof is an application of the principle of Generalized Least Squares. The problem can be formulated as a GLS problem by considering that:

$$\begin{bmatrix}\hat{\theta}_1\\\hat{\theta}_2\end{bmatrix} = \begin{bmatrix}I\\I\end{bmatrix}\theta + \epsilon$$

with

$$\operatorname{Var}(\epsilon) = \begin{bmatrix}V_1 & 0\\0 & V_2\end{bmatrix}$$

Applying the GLS formula $\hat{\theta} = (X'V^{-1}X)^{-1}X'V^{-1}y$ yields:

$$\hat{\theta} = \left(V_1^{-1} + V_2^{-1}\right)^{-1}\left(V_1^{-1}\hat{\theta}_1 + V_2^{-1}\hat{\theta}_2\right)$$
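As a quick numerical sanity check (a sketch with illustrative values, not part of the source), the inverse-variance-weighted combination can be compared against the stacked GLS estimate:

```python
import numpy as np

# Two unbiased estimates of a 2-vector theta, with known covariances V1, V2
theta1 = np.array([1.0, 2.0])
theta2 = np.array([1.4, 1.8])
V1 = np.array([[2.0, 0.3], [0.3, 1.0]])
V2 = np.array([[1.0, -0.2], [-0.2, 3.0]])

# Inverse-variance weighting: (V1^-1 + V2^-1)^-1 (V1^-1 th1 + V2^-1 th2)
W = np.linalg.inv(np.linalg.inv(V1) + np.linalg.inv(V2))
combined = W @ (np.linalg.inv(V1) @ theta1 + np.linalg.inv(V2) @ theta2)

# Stacked GLS problem: y = X theta + eps, X = [I; I], Var(eps) = blockdiag(V1, V2)
X = np.vstack([np.eye(2), np.eye(2)])
y = np.concatenate([theta1, theta2])
V = np.block([[V1, np.zeros((2, 2))], [np.zeros((2, 2)), V2]])
Vinv = np.linalg.inv(V)
gls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)

# The two routes agree
assert np.allclose(combined, gls)
```

The agreement of the two computations is exactly the content of the GLS argument above: with $X = [I;I]$, the GLS normal equations reduce to inverse-variance weighting.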
Expected value of SSH
Consider one-way MANOVA with $G$ groups, group $g$ having $n_g$ observations. Let $N = \sum_{g=1}^{G} n_g$ and let

$$D = \begin{bmatrix} 1_{n_1} & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & 1_{n_G} \end{bmatrix}$$

be the design matrix. Let $Q$ be the $N \times N$ residual projection matrix defined by

$$Q = I - 1_N\left(1_N' 1_N\right)^{-1} 1_N' = I - \frac{1}{N}U$$

where $U = 1_N 1_N'$.
Analyzing SSH
We can express SSH in terms of the data and find its expected value under a fixed-effects or under a random-effects model.
The following formula is used repeatedly to find the expected value of a quadratic form. If $Y$ is a random vector with $\operatorname{E}(Y) = \mu$ and $\operatorname{Var}(Y) = \Psi$, and $Q$ is symmetric, then

$$\operatorname{E}(Y'QY) = \mu'Q\mu + \operatorname{tr}(Q\Psi)$$
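This formula can be illustrated with a seeded Monte Carlo sketch (all values are illustrative, not from the source):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 3
mu = np.array([1.0, -2.0, 0.5])
A = rng.standard_normal((p, p))
Psi = A @ A.T + p * np.eye(p)        # a valid (positive definite) covariance
Q = np.diag([2.0, 1.0, 3.0])         # any symmetric matrix

# Exact value from the quadratic-form formula
exact = mu @ Q @ mu + np.trace(Q @ Psi)

# Monte Carlo estimate of E(Y'QY) for Y ~ N(mu, Psi)
Y = rng.multivariate_normal(mu, Psi, size=500_000)
mc = np.einsum('ij,jk,ik->i', Y, Q, Y).mean()

# Simulation agrees with the formula to within sampling error
assert abs(mc - exact) / exact < 0.05
```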
We can model:

$$\mathbf{Y} = D\boldsymbol{\mu} + \epsilon$$

where

$$\boldsymbol{\mu} \sim N(1\psi, \phi^2 I)$$

and

$$\epsilon \sim N(0, \sigma^2 I)$$

and $\epsilon$ is independent of $\boldsymbol{\mu}$. Thus

$$\operatorname{E}(\mathbf{Y}) = 1_N\psi \quad\text{and}\quad \operatorname{Var}(\mathbf{Y}) = \phi^2 DD' + \sigma^2 I$$
Consequently,

$$\operatorname{E}(SSH) = \operatorname{tr}\left[\left(D(D'D)^{-1}D' - \tfrac{1}{N}U\right)\left(\phi^2 DD' + \sigma^2 I\right)\right] = \phi^2(N - \tilde{n}) + \sigma^2(G - 1)$$

(the $\mu'Q\mu$ term vanishes because $\operatorname{E}(\mathbf{Y}) = 1_N\psi$ lies in the column space of both projections), and similarly

$$\operatorname{E}(SSE) = \sigma^2(N - G)$$

$$\operatorname{E}(SSTO) = \operatorname{E}(SSH) + \operatorname{E}(SSE) = \phi^2(N - \tilde{n}) + \sigma^2(N - 1)$$

where

$$\tilde{n} = \sum_{g=1}^{G}\frac{n_g}{N}\,n_g = \frac{1}{N}\sum_{g=1}^{G}n_g^2$$

is the group-size weighted mean of group sizes.
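These closed forms can be verified numerically by evaluating the trace expression directly (a sketch; group sizes and variance components are illustrative):

```python
import numpy as np

# Illustrative unequal group sizes and variance components
n_sizes = [2, 3, 5]
G, N = len(n_sizes), sum(n_sizes)
phi2, sigma2 = 1.7, 0.6

# Design matrix D: one column of ones per group
D = np.zeros((N, G))
row = 0
for g, n_g in enumerate(n_sizes):
    D[row:row + n_g, g] = 1.0
    row += n_g

P = D @ np.linalg.inv(D.T @ D) @ D.T       # projection onto group means
U = np.ones((N, N))
VarY = phi2 * (D @ D.T) + sigma2 * np.eye(N)

n_tilde = sum(n ** 2 for n in n_sizes) / N  # group-size weighted mean size
E_SSH = np.trace((P - U / N) @ VarY)        # mean term vanishes
E_SSE = np.trace((np.eye(N) - P) @ VarY)

assert np.isclose(E_SSH, phi2 * (N - n_tilde) + sigma2 * (G - 1))
assert np.isclose(E_SSE, sigma2 * (N - G))
```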
With equal groups, $n_g = n$, $N = nG$, and $\tilde{n} = n$, so that

$$\operatorname{E}(SSTO) = \phi^2 N\frac{G-1}{G} + \sigma^2(N-1) = \phi^2(N - n) + \sigma^2(N - 1)$$
Thus

$$\operatorname{E}(MSH) = \frac{\operatorname{E}(SSH)}{G-1} = n\phi^2 + \sigma^2$$

$$\operatorname{E}(MSE) = \frac{\operatorname{E}(SSE)}{N-G} = \sigma^2$$

$$\operatorname{E}\left(\frac{MSH - MSE}{n}\right) = \phi^2$$
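A seeded simulation of the equal-group random-effects model confirms these expected mean squares (a sketch; the parameter values are illustrative):

```python
import numpy as np

# Random-effects one-way layout with equal groups: Y_ig = mu_g + eps_ig,
# mu_g ~ N(0, phi^2), eps_ig ~ N(0, sigma^2)
rng = np.random.default_rng(1)
G, n = 6, 4
N = G * n
phi, sigma = 1.5, 0.8
reps = 50_000

mu = rng.normal(0.0, phi, size=(reps, G))
eps = rng.normal(0.0, sigma, size=(reps, G, n))
Y = mu[:, :, None] + eps

ybar_g = Y.mean(axis=2)                   # group means, shape (reps, G)
ybar = Y.mean(axis=(1, 2))                # grand means, shape (reps,)
SSH = n * ((ybar_g - ybar[:, None]) ** 2).sum(axis=1)
SSE = ((Y - ybar_g[:, :, None]) ** 2).sum(axis=(1, 2))
MSH = SSH / (G - 1)
MSE = SSE / (N - G)

# E(MSH) = n phi^2 + sigma^2, E(MSE) = sigma^2, E((MSH - MSE)/n) = phi^2
assert abs(MSH.mean() - (n * phi**2 + sigma**2)) < 0.2
assert abs(MSE.mean() - sigma**2) < 0.02
assert abs(((MSH - MSE) / n).mean() - phi**2) < 0.05
```

The last assertion shows why $(MSH - MSE)/n$ is the natural unbiased estimator of $\phi^2$ in the balanced case.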
Multivariate response
If we are sampling from a p-variate distribution in which

$$\mathbf{Y}_{ig} \sim \text{i.i.d. } N(\boldsymbol{\mu}_g, \Sigma)$$

and

$$\boldsymbol{\mu}_1, \ldots, \boldsymbol{\mu}_G \sim \text{i.i.d. } N(\boldsymbol{\psi}, \Phi), \text{ independently of } \mathbf{Y}_{ig}$$

then the analogous results are:

$$\operatorname{E}(SSE) = (N - G)\Sigma$$

and

$$\operatorname{E}(SSH) = (N - \tilde{n})\Phi + (G - 1)\Sigma$$

Note that

$$\operatorname{Var}(\bar{\mathbf{Y}}_{\cdot g}) = \Phi + \frac{1}{n_g}\Sigma$$

and that the group-size weighted average of these variances is:

$$\sum_{g=1}^{G}\frac{n_g}{N}\operatorname{Var}(\bar{\mathbf{Y}}_{\cdot g}) = \sum_{g=1}^{G}\frac{n_g}{N}\left[\Phi + \frac{1}{n_g}\Sigma\right] = \Phi + \frac{G}{N}\Sigma$$
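The weighted-average identity is a one-line matrix computation (a sketch with illustrative $p = 2$ covariance components):

```python
import numpy as np

# Illustrative unequal group sizes and p = 2 covariance components
n_sizes = [2, 3, 5]
G, N = len(n_sizes), sum(n_sizes)
Phi = np.array([[1.0, 0.3], [0.3, 2.0]])
Sigma = np.array([[0.5, 0.1], [0.1, 0.4]])

# Group-size weighted average of Var(Ybar_g) = Phi + Sigma / n_g
weighted = sum((n_g / N) * (Phi + Sigma / n_g) for n_g in n_sizes)

# The 1/n_g factors cancel against the n_g/N weights, leaving G/N
assert np.allclose(weighted, Phi + (G / N) * Sigma)
```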
The expectation of combinations of $SSH$ and $SSE$ of the form $a\,SSH + b\,SSE$ is $a\left[(N - \tilde{n})\Phi + (G - 1)\Sigma\right] + b\,(N - G)\Sigma$:

| $a$ | $b$ | $\operatorname{E}(a\,SSH + b\,SSE)$ |
|---|---|---|
| $1$ | $0$ | $(N - \tilde{n})\Phi + (G - 1)\Sigma$ |
| $0$ | $1$ | $(N - G)\Sigma$ |
| $\frac{1}{G-1}$ | $0$ | $\frac{N - \tilde{n}}{G-1}\Phi + \Sigma$ |
| $0$ | $\frac{1}{N-G}$ | $\Sigma$ |
| $\frac{1}{N - \tilde{n}}$ | $-\frac{G-1}{(N - \tilde{n})(N - G)}$ | $\Phi$ |
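The last row, which yields an unbiased estimator of $\Phi$, can be checked by plugging the expectations of $SSH$ and $SSE$ into the combination (a sketch with illustrative group sizes and covariance components):

```python
import numpy as np

# Illustrative group sizes and p = 2 covariance components
n_sizes = np.array([2, 3, 5])
G, N = len(n_sizes), n_sizes.sum()
n_tilde = (n_sizes ** 2).sum() / N
Phi = np.array([[1.0, 0.3], [0.3, 2.0]])
Sigma = np.array([[0.5, 0.1], [0.1, 0.4]])

# Expectations of the sums of squares under the random-effects model
E_SSH = (N - n_tilde) * Phi + (G - 1) * Sigma
E_SSE = (N - G) * Sigma

# Combination from the last row of the table recovers Phi exactly
a = 1 / (N - n_tilde)
b = -(G - 1) / ((N - n_tilde) * (N - G))
assert np.allclose(a * E_SSH + b * E_SSE, Phi)

# The MSE row recovers Sigma
assert np.allclose(E_SSE / (N - G), Sigma)
```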