$\newcommand\PP{\mathbb P}$

Surely $n_\min \lesssim d$, because the argument below works for $c = 1/4$ and $n=160d$.

We use that the number of "distinct" $v$ with respect to the classifiers $\textrm{sgn}\langle \cdot, z_i \rangle$ is $$ \sum_{i=0}^{d-1} \binom{n-1}{i} \le \left( \frac{ne}{d} \right)^d $$

The proof can be found in [Bürgisser and Cucker (2013), Lemma 13.7]. (Does anyone have a better reference?)
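
For completeness, the second inequality (the bound on the binomial sum) has a one-line proof via a standard generating-function trick, valid for $1 \le d \le n$: $$ \left(\frac{d}{n}\right)^{d} \sum_{i=0}^{d-1} \binom{n-1}{i} \le \sum_{i=0}^{n-1} \binom{n-1}{i} \left(\frac{d}{n}\right)^{i} = \left(1+\frac{d}{n}\right)^{n-1} \le e^{d}, $$ and multiplying through by $(n/d)^d$ gives $\sum_{i=0}^{d-1} \binom{n-1}{i} \le (ne/d)^d$.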

Let $X = \lvert \left\{ i ~:~ \langle z_i, e_1 \rangle > 0 \right\} \rvert$ for a basis vector $e_1$. Then, by the union bound, $$ \PP \left[ ~\exists v ~~ \lvert \left\{ i ~:~ \langle z_i, v \rangle > 0 \right\}\rvert < cn \right] \\ \le \left( \frac{ne}{d} \right)^d \PP \left[ X < cn \right] \\ \le \exp\left(d\log(n) - \frac{n}{16} + d - d \log{d}\right) \\ = \exp\big((\log(160) - 9)d\big) \xrightarrow{d \to \infty} 0, $$ since $\log(160) \approx 5.08 < 9$. (A small numerical illustration is given at the end.)

where we used the Chernoff bound on $\PP[X < cn]$, in the form $$ \PP\left[X \le (1 - \delta) \mathbb E[X] \right] \le \exp(-\frac12 \delta^2 \mathbb E[X]). $$
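
Spelled out: assuming the distribution of the $z_i$ is sign-symmetric, $\mathbb E[X] = n/2$, so $cn = n/4$ corresponds to $\delta = 1/2$ and $$ \PP[X < cn] \le \exp\left(-\frac12 \cdot \frac14 \cdot \frac{n}{2}\right) = \exp\left(-\frac{n}{16}\right), $$ which is the $-\frac{n}{16}$ term in the exponent above.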

As noted in the comment by Sandeep Silwal, $n_\min \ge d$ due to the strict inequality sign in the question. So the answer to the original question is $n_\min = \Theta(d)$.
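
For intuition, here is the promised numerical illustration (a sketch, not a proof, and it assumes Gaussian $z_i$: sampling random $v$ only probes typical directions, while the worst case over all $v$ is exactly what the union bound controls):

```python
# Monte Carlo illustration: with n = 160*d standard Gaussian z_i, every
# sampled direction v should keep well over cn = n/4 positive inner products.
import numpy as np

rng = np.random.default_rng(0)
c = 0.25
for d in (5, 10, 20):
    n = 160 * d
    Z = rng.standard_normal((n, d))        # rows are the z_i
    V = rng.standard_normal((2000, d))     # 2000 random trial directions v
    counts = (V @ Z.T > 0).sum(axis=1)     # |{i : <z_i, v> > 0}| for each v
    print(f"d={d}: min count {counts.min()} vs cn={int(c * n)}")
```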
