
My question is about f-divergences and Richard Jeffrey's (1965) rule for updating probabilities in the light of partial information.

The set-up:

  • Let $p: \mathcal{F} \rightarrow [0,1]$ be a probability function on a finite algebra of propositions $\mathcal{F}$.
  • Suppose that the probabilities of the cells of a partition $\{E_{1}, \dots, E_{n}\} \subseteq \mathcal{F}$ shift from their prior values $p(E_{i})$ to posterior values $p'(E_{i}) = k_{i}$, where the $k_{i}$ sum to $1$.
  • Jeffrey's Rule then says that, for all $X$ in $\mathcal{F}$, $p'(X) = \sum_{i} p(X \mid E_{i})\, p'(E_{i})$. In other words, it is a fairly straightforward generalization of Bayesian conditioning to partial information. (A toy numerical sketch follows this list.)
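For concreteness, here is a minimal numerical sketch of the rule; the four-atom space, the prior, the two-cell partition, and the new weights are all made-up illustrative choices, not part of the question itself:

    import numpy as np

    # Toy four-atom space; prior, partition, and new weights are illustrative only.
    p = np.array([0.1, 0.3, 0.4, 0.2])                 # prior over atoms
    partition = [np.array([0, 1]), np.array([2, 3])]   # E_1 = {0,1}, E_2 = {2,3}
    k = np.array([0.7, 0.3])                           # posterior weights p'(E_i) = k_i

    def jeffrey_update(p, partition, k):
        """Jeffrey's Rule: rescale p within each cell E_i so that the cell carries mass k_i."""
        q = np.empty_like(p)
        for cell, k_i in zip(partition, k):
            q[cell] = k_i * p[cell] / p[cell].sum()    # p'(X) = sum_i p(X|E_i) p'(E_i)
        return q

    q = jeffrey_update(p, partition, k)
    print(q)        # [0.175 0.525 0.2   0.1  ]
    print(q.sum())  # 1.0 (up to float rounding)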

The concept of an f-divergence seems to be a fairly natural generalization of the Kullback-Leibler divergence. And minimizing the Kullback-Leibler divergence between the prior and posterior probability functions, subject to the constraints $p'(E_{i}) = k_{i}$, is known to agree with Jeffrey's Rule (Williams, 1980).
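To fix notation, by an f-divergence I mean the usual thing: for a convex function $f$ on $(0, \infty)$ with $f(1) = 0$,
$$D_{f}(p' \,\|\, p) = \sum_{\omega} p(\omega)\, f\!\left(\frac{p'(\omega)}{p(\omega)}\right),$$
where the sum runs over the atoms of the (finite) algebra; the Kullback-Leibler divergence is the special case $f(t) = t \log t$.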

Here is where I get stuck. I have seen it written that "minimizing an arbitrary f-divergence subject to the constraints $p'(E_{i}) = k_{i}$ is equivalent to updating by Jeffrey's Rule". However, I can only find proofs going in one direction, namely, that minimizing any f-divergence agrees with Jeffrey's Rule (e.g., Diaconis and Zabell, 1982, Theorem 6.1).
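As a sanity check on that direction, here is a small numerical sketch; the toy prior and partition are the same made-up ones as above, and the choice of the $\chi^{2}$-divergence, $f(t) = (t-1)^{2}$, is just one strictly convex instance of an f-divergence:

    import numpy as np
    from scipy.optimize import minimize

    p = np.array([0.1, 0.3, 0.4, 0.2])                 # same toy prior as above
    partition = [np.array([0, 1]), np.array([2, 3])]   # E_1, E_2
    k = np.array([0.7, 0.3])                           # target masses p'(E_i) = k_i

    def f_divergence(q, p, f):
        return np.sum(p * f(q / p))

    chi_sq = lambda t: (t - 1.0) ** 2                  # f(t) = (t-1)^2, strictly convex, f(1) = 0

    # One equality constraint per cell: q must place mass k_i on E_i
    # (which also forces q to sum to 1).
    constraints = [
        {"type": "eq", "fun": lambda q, cell=cell, ki=ki: q[cell].sum() - ki}
        for cell, ki in zip(partition, k)
    ]

    res = minimize(f_divergence, np.full(4, 0.25), args=(p, chi_sq),
                   bounds=[(0.0, 1.0)] * 4, constraints=constraints)
    print(res.x)   # approx. [0.175, 0.525, 0.2, 0.1], i.e. the Jeffrey update above

Of course, this only illustrates the direction that is already proved; my question is about the converse.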

Q: Is it also true that only f-divergences agree with Jeffrey's Rule? Or might there be some non-f-divergence $\mathcal{D}$ such that minimizing it subject to the same constraints also yields the Jeffrey update?

Any pointers would be awesome.

Refs:

  • Jeffrey, R. C. (1965). The Logic of Decision. New York: McGraw-Hill.
  • Williams, P. M. (1980). "Bayesian Conditionalisation and the Principle of Minimum Information." The British Journal for the Philosophy of Science, 31(2), 131–144.
  • Diaconis, P., & Zabell, S. L. (1982). "Updating Subjective Probability." Journal of the American Statistical Association, 77(380), 822–830.
