
This concerns Eq. (3.7) of C R Rao's 1945 paper (see p.81 of this article). Can someone help me figure out the second equality in Eq. (3.7)?

His claim is that, since $\phi(x,\theta) = \Phi(T,\theta)\,\psi(x_1,\dots,x_n)$ from Eq. (3.6), one can write $$\theta = \int t \phi \prod dx_i = \int t\, \Phi(T,\theta)\, \psi(x_1,\dots,x_n) \prod dx_i = \int f(T)\,\Phi(T,\theta)\,dT,$$ for some function $f(T)$ of $T$, independent of $\theta$.

My question concerns the last equality. Prof. Rao seems to regard $t\,\psi(x_1,\dots,x_n) \prod dx_i$ as $f(T)\,dT$. Since $\psi(x_1,\dots,x_n)$ is essentially the conditional distribution of $x_1,\dots, x_n$ given $T$, and since $T$ is a sufficient statistic, it is true that $\psi(x_1,\dots,x_n)$ depends on $T$ but is independent of $\theta$. Moreover, integrating $t$ against this conditional distribution of $x_1,\dots, x_n$ given $T$ resembles $E[t\mid T]$, which is a function of $T$. However, I am not able to make these steps rigorous. Any help in this connection is greatly appreciated.

PS: I asked the same question on math.stackexchange here, but did not receive any response, so I am asking it here.


1 Answer


$\newcommand\th\theta$$T$ is a sufficient statistic and thus a random variable. So, the integral $\int f(T)\Phi(T,\theta)dT$ cannot possibly have a meaning.

The conclusions that Rao is trying to reach here are true, though:

(i) "there exists a function $f(T)$ of $T$, independent of $\theta$ and is an unbiased estimate of $\theta$".

This is true, leaving aside the faulty grammar of this statement. Indeed, $T$ is an abbreviation of $T(X)$, where $X$ is a random sample from the distribution $P_\theta$ and $T$ is a Borel-measurable function. That $T$ is sufficient means that (some version of the conditional expectation) $E_\th(t(X)|T(X))$ does not depend on $\th$ for any Borel-measurable function $t$ such that $E_\th t(X)$ exists for all $\th$. So (in view of the Doob–Dynkin lemma), we can write $E_\th(t(X)|T(X))=f(T(X))$ for some Borel-measurable function $f$, which is the same for all $\th$. Therefore, $$E_\th f(T(X))=E_\th E_\th(t(X)|T(X))=E_\th t(X).$$ So, if $t(X)$ is unbiased for $\th$ -- that is, if $E_\th t(X)=\th$ for all $\th$ -- then $f(T(X))$ is also unbiased for $\th$.
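This construction can be illustrated numerically. The following is a minimal sketch using a Bernoulli$(p)$ model (my choice of example, not one from the paper): here $t(X)=X_1$ is a crude unbiased estimator of $p$, $T=\sum_i X_i$ is sufficient, and the conditional expectation $E(t(X)\mid T)=T/n$ does not involve $p$, yet remains unbiased.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 10, 200_000

# reps independent samples of size n from Bernoulli(p)
X = rng.binomial(1, p, size=(reps, n))

t = X[:, 0]           # crude unbiased estimator t(X) = X_1
T = X.sum(axis=1)     # sufficient statistic T = sum of the X_i
f_T = T / n           # E(t(X) | T) = T/n in the Bernoulli model; no p appears

# Both averages should be close to p = 0.3, confirming unbiasedness of f(T(X))
print(t.mean(), f_T.mean())
```

Note that `f_T` is computed from `T` alone, mirroring the claim that $f(T(X))$ is a function of the sufficient statistic that does not depend on the unknown parameter.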

(ii) "the best unbiased estimate of $\th$ is an explicit function of the sufficient statistic".

This is true because $$Var_\th f(T(X))=Var_\th E_\th(t(X)|T(X))\le Var_\th t(X),$$ since, in general, $Var\,E(Y|Z)\le Var\, Y$.
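The inequality $Var\,E(Y|Z)\le Var\,Y$ is the law of total variance with the nonnegative term $E\,Var(Y|Z)$ dropped. A quick Monte Carlo check in the same illustrative Bernoulli setting (again my own example, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(1)
p, n, reps = 0.3, 10, 200_000

X = rng.binomial(1, p, size=(reps, n))
Y = X[:, 0]            # Y = t(X) = X_1, variance p(1-p) = 0.21
f_T = X.mean(axis=1)   # E(Y | T) = T/n, variance p(1-p)/n = 0.021

# Var E(Y|T) should be about n times smaller than Var Y here
print(Y.var(), f_T.var())
```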


As noted in the preface to this paper, "The author was just 25, and did not have a PhD degree!" This may be the reason why the paper was written in very imprecise language, which was archaic even in 1945, when the paper was written. Rao's later writings are much clearer and more precise.

  • Thanks Iosif. How do we prove that the $f(T(X))$ he is referring to is indeed $E_\theta(t(X)|T(X))$? – Commented Jan 7, 2022 at 1:31
  • @Ashok : There is no way to prove that. Indeed, as I said, the integral $\int f(T)\Phi(T,\theta)dT$, where his $f(T)$ is introduced in his paper, cannot possibly have a meaning. Therefore, there is no rigorous way to relate his non-rigorous $f(T)$ with the rigorous notion of the conditional expectation $E_\th(t(X)|T(X))$. In general, one cannot possibly prove rigorously that something rigorous is the same as something non-rigorous. What my answer gives is a rigorous interpretation of non-rigorous results in that non-rigorous paper. – Commented Jan 7, 2022 at 2:00
