Fix $m$ arbitrary values $x_1, x_2, \ldots, x_m$ in $[0,1]$, and an integer $n \le m$. Obtain a random $n$-subset $S$ of $\{1,2,\ldots,m\}$ by drawing $n$ indices without replacement. Define the random variable $X = \sum_{i \in S} x_i$ and let $\mu=E[X]$.

Is it the case that

$$\Pr[X \ge \mu + \epsilon\mu] ~=~ \exp(- \Omega(\epsilon^2 \mu))~?~~~~~(1)$$

Following the hint here, we can use McDiarmid's inequality to prove the weaker bound

$$\Pr[X \ge \mu + \epsilon n] ~=~ \exp(-\Omega(\epsilon^2 n)),~~~~~(2)$$

but this bound depends on $n$ and is weaker when $\mu \ll n$.
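For intuition (not a proof), one can compare bound (1) against a simulation. The sketch below estimates $\Pr[X \ge (1+\epsilon)\mu]$ empirically for one concrete instance; the constant $3$ in the exponent of the comparison bound is an assumption chosen purely for illustration, and all parameter values are arbitrary.

```python
import math
import random

# Illustrative simulation (not a proof): empirically estimate
# Pr[X >= (1 + eps) * mu] for a sum over n draws without replacement,
# and compare against the conjectured bound exp(-eps^2 * mu / 3).
# The constant 3 in the exponent is an assumption for illustration.
random.seed(0)

m, n, eps, trials = 200, 50, 0.5, 20000
x = [random.random() for _ in range(m)]   # m fixed values in [0, 1]
mu = n * sum(x) / m                       # E[X], by symmetry of the draws

hits = 0
for _ in range(trials):
    S = random.sample(range(m), n)        # n indices, without replacement
    if sum(x[i] for i in S) >= (1 + eps) * mu:
        hits += 1

print(hits / trials, math.exp(-eps**2 * mu / 3))
```

In instances like this one the empirical tail probability sits well below the conjectured bound, which is consistent with (1) but of course does not establish it.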

A related but different question was asked here.


[EDIT: My original question asked first whether bound (1) holds in all applications of McDiarmid's inequality where $X=f(x_1,\ldots,x_n)$ is a function of $n$ independent random variables and $f$ changes by at most 1 when any single $x_i$ is changed. For the record, the answer to that seems to be NO. For a counterexample, take $$\textstyle X = f(x_1, x_2, \ldots, x_n) = \big|k \,-\, \big((\sum_{i=1}^n x_i) \bmod 2k\big)\big|,$$ where each $x_i$ is i.i.d. uniform over $[0,1]$ and, say, $k=\lceil n^{1/3}\rceil$ (chosen small enough that the period $2k$ is much less than the standard deviation $\sqrt{n/12}$ of the sum, so the sum is nearly uniform mod $2k$). Then $X$ is roughly uniformly distributed over $[0,k]$, so $\mu \approx k/2$, while $\Pr[X \ge \mu + \mu/2] = \Omega(1) \ne \exp(-\Omega(\mu))$.]
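The counterexample can be checked numerically. The sketch below takes $k = \lceil n^{1/3}\rceil$, so that the period $2k$ is well below the standard deviation $\sqrt{n/12}$ of the sum and the sum is close to uniform mod $2k$; the specific values of $n$ and the trial count are arbitrary choices for illustration.

```python
import math
import random

# Simulation sketch of the counterexample (illustrative, not a proof).
# With k = ceil(n^(1/3)), the period 2k is much smaller than the
# standard deviation sqrt(n/12) of the sum, so the sum mod 2k is
# close to uniform and X = |k - (sum mod 2k)| is roughly uniform
# on [0, k].
random.seed(1)

n = 1000
k = math.ceil(n ** (1 / 3))          # k = 10 for n = 1000
trials = 2000

samples = []
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    samples.append(abs(k - (s % (2 * k))))

mu_hat = sum(samples) / trials       # empirical mean, close to k/2
frac = sum(1 for v in samples if v >= 1.5 * mu_hat) / trials

# frac stays bounded away from 0, while exp(-mu_hat) -> 0 as n grows,
# so Pr[X >= 1.5*mu] = Omega(1) != exp(-Omega(mu)).
print(mu_hat, frac, math.exp(-mu_hat))
```

Empirically the upper-tail mass $\Pr[X \ge 1.5\hat\mu]$ comes out near $1/4$, exactly the constant-probability tail the counterexample predicts.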

  • It seems to me that multiplicative and additive are the same, of the form $\mathbb{P}(X \geqslant \mu + A)$, with $A = \varepsilon n$ in one case and $A = \varepsilon \mu$ in the other (the Wikipedia link states that this holds for all $A > 0$). If you want to state McDiarmid's inequality in multiplicative form, replace $A$ accordingly in the second case, i.e. take $\mu t = \varepsilon k$, and the bound $\exp(-\varepsilon^2 k/3)$ becomes $\exp(-(\mu t/k)^2 k/3) = \exp(-t^2 \mu^2/(3 k))$ (hence $c = 3k/\mu$). Of course, $c$ depends on $\mu$ here... Commented Sep 13, 2017 at 18:53
  • @Synia, In the form stated on Wikipedia, the bound is $$\Pr[X \ge \mu + A] \le \exp(-A^2/n),$$ where $n$ is the number of random variables. So (in the general case), if you substitute $A=\epsilon\mu$, the bound is $\exp(-\epsilon^2 \mu^2/n)$, not $\exp(-\epsilon^2 \mu)$. (And note that, likewise, in the application above we want $c$ to be a constant independent of $k$ and $\mu$, e.g. $c=3$.) Also, if you look at the proof of McDiarmid's inequality, the Doob martingale it uses can either increase or decrease in each step, which forces the use of an additive error bound. Commented Sep 13, 2017 at 21:07
  • Yes, I was suspecting something like that, namely that $c$ be independent of $n$ (no need to edit, I guess). Stated like this, the question of course becomes much more interesting. There may be inequalities available in the literature for the case of a sum that "beat" the classical McDiarmid inequality (I am thinking of Chatterjee's method of exchangeable pairs). I will try to look it up. Commented Sep 13, 2017 at 21:44
  • I realized that the answer to the first general question I asked is NO. I've edited the post to reflect this. Commented Sep 13, 2017 at 22:14
  • Looks likely that the answer to the current question is yes, from Theorem 10 of this paper and Proposition 5 of this one. Commented Sep 14, 2017 at 2:37
