
Can we find $\sum_i x_i$ given $\{\sum_i x_i^{2n}\}_{n\in \mathbb{N}}$, assuming $\{x_i\}_{i\in\mathbb{N}}$ is a set of positive real numbers? Perhaps an easier question: can we find $\sum_i x_i$ given $\{\sum_i x_i^{n}\}_{n\in \mathbb{N}\setminus\{1\}}$? Intuitively it seems impossible without further assumptions on $\{x_i\}_{i\in\mathbb{N}}$. Is that true?


5 Answers


If you plot $\log f_n$ versus $n$, with $f_n=\sum_i x_i^{2n}$, then the asymptotic slope for large $n$ tends to $2\log x_{\max}$, which gives you the largest of the $x_i$; subtracting its contribution $x_{\max}^{2n}$ from $f_n$ and repeating gives the next-to-largest $x_i$, and so on. There will be issues of numerical accuracy, but as a matter of principle it should work.
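
As a quick illustration (not part of the answer; all values below are made up), here is a minimal Python sketch of this peeling procedure. The slope is estimated from consecutive values of $\log f_n$; the second round reads the slope at a moderate $n$, since the subtraction contaminates the tail, which is exactly the accuracy issue mentioned above.

```python
# Minimal sketch of the peeling idea (example values assumed).
import math

xs = [0.9, 0.6, 0.3]                        # hidden values to recover (made up)
f = [sum(x**(2*n) for x in xs) for n in range(1, 41)]

slope = math.log(f[-1]) - math.log(f[-2])   # asymptotic slope of log f_n in n
x_max = math.exp(slope / 2)                 # slope -> 2*log(max x_i)
print(x_max)                                # ~0.9

# Peel off the largest value and repeat for the next-largest x_i.
# As noted above, this is numerically delicate: the subtraction error grows
# relative to the remaining terms, so we read the slope at a moderate n
# rather than at the tail.
g = [fn - x_max**(2*n) for n, fn in enumerate(f, start=1)]
slope2 = math.log(g[20]) - math.log(g[19])
print(math.exp(slope2 / 2))                 # ~0.6
```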

  • This is neat. Thank you. Commented Apr 11, 2024 at 17:04

If $\sum x_i^2$ is finite, the sum $f(z)=\sum \frac{x_i^2}{1-x_i^2z}$ is a meromorphic function on the complex plane, and we know its Taylor series at 0: its coefficients $\sum_i x_i^{2(n+1)}$ are exactly the given even power sums. Thus we know $f$, hence the poles $z=x_i^{-2}$ of $f$, hence the $x_i$'s.
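
A minimal numerical sketch of this idea, under the extra assumption of finitely many $x_i$ (so $f$ is rational and a Padé approximant of the Taylor series recovers it exactly; the values are made up): the poles of $f$ sit at $z=x_i^{-2}$, so the roots of the Padé denominator give back the $x_i$.

```python
# Sketch: recover finitely many x_i from the Taylor coefficients of f.
from mpmath import mp, pade, polyroots, sqrt

mp.dps = 30
xs = [0.5, 0.3]                              # made-up values to recover
# Taylor coefficients of f at 0: a_n = sum_i x_i^(2n+2), the given data
a = [sum(mp.mpf(x)**(2*n + 2) for x in xs) for n in range(4)]

p, q = pade(a, 1, 2)                         # f = p/q with deg p = 1, deg q = 2
poles = polyroots(q[::-1])                   # polyroots wants leading coeff first
print(sorted(1/sqrt(z) for z in poles))      # recovers [0.3, 0.5]
```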


In terms of the finite and compactly supported measure $\mu=\sum x_j^2 \delta_{x_j^2}$, we are given the moments $\int t^k\, d\mu(t)=\sum_j x_j^{2k+2}$, $k=0,1,2,\ldots$.

Moment problems with compactly supported solutions are determinate, that is, they have unique solutions.

In fact, if this is combined with the Müntz–Szász theorem, we see that it would be enough to know $\sum x_j^{a+bN_n}$ to recover the $x_j$, as long as $\sum 1/N_n=\infty$. (In this case, define $\mu=\sum x_j^a \delta_{x_j^b}$.)
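
For a finitely atomic $\mu$ the determinacy is also effective: Prony's method recovers the atoms and weights from finitely many moments. A sketch under that finiteness assumption (values made up; this computational detour is not part of the answer):

```python
# Prony's method: recover an atomic measure from its moments.
import numpy as np

def prony(moments, r):
    """Recover r atoms t_j and weights w_j from m_k = sum_j w_j t_j^k."""
    m = np.asarray(moments, dtype=float)
    # the atoms are roots of z^r + c[r-1] z^(r-1) + ... + c[0], whose
    # coefficients solve a Hankel linear system in the moments
    H = np.array([[m[i + j] for j in range(r)] for i in range(r)])
    c = np.linalg.solve(H, -m[r:2*r])
    atoms = np.roots(np.concatenate(([1.0], c[::-1])))
    V = np.vander(atoms, r, increasing=True).T   # Vandermonde system for weights
    weights = np.linalg.solve(V, m[:r])
    return atoms.real, weights.real

xs = np.array([0.9, 0.5, 0.2])                       # made-up x_j
mu_moments = [np.sum(xs**2 * (xs**2)**k) for k in range(6)]   # int t^k dmu
atoms, weights = prony(mu_moments, 3)                # both equal x_j^2 here
print(np.sort(np.sqrt(atoms)))                       # recovers the x_j
```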


Assuming $x_i\ge0$ with $\sum_i x_i <\infty$, we have that $\phi(t):=\sum_i(e^{x_it}-1)=\sum_{k\ge1} \big(\sum_i x_i^k\big)t^k/k!$ is an entire function (we can expand the exponentials and exchange the order of summation, by absolute summability).

Now if for another non-negative sequence $y_i$ we have $\sum y_i <\infty$ and $\sum_i x_i^k=\sum_i y_i^k$ for all $k> N$, we consider the corresponding entire function $\psi(t):=\sum_i(e^{y_it}-1)$: then $\phi(t)-\psi(t)=\sum_{k=1}^N\big(\sum_i x_i^k - \sum_i y_i^k\big)t^k/k!$ is a polynomial with no constant term. It is therefore sufficient to prove that $\phi(t)-\psi(t)=o(|t|)$ for $t\to-\infty$: such a polynomial must then be identically zero, so $\sum_i x_i^k=\sum_i y_i^k$ for $1\le k\le N$ too. In particular, taking $N=1$, the sums $\sum_i x_i^k$ for $k\ge2$ determine $\sum_i x_i$, answering the question.

To show $\phi(t)-\psi(t)=o(|t|)$ for (real) $t\to -\infty$: each term $e^{x_it}-e^{y_it}$ tends to $0$ as $t\to-\infty$, and $|e^{x_it}-e^{y_it}|\le |t||x_i-y_i|\le |t|(x_i+y_i)$ because $\exp$ is $1$-Lipschitz on $\mathbb R_-$. The terms $(e^{x_it}-e^{y_it})/t$ are thus dominated by the summable sequence $x_i+y_i$, so by dominated convergence of series, $(\phi(t)-\psi(t))/t\to0$ as $t\to-\infty$; that is, $\phi(t)-\psi(t)=o(|t|)$.

  • Please ask if something is not clear. Commented Apr 12, 2024 at 16:15
  • Thank you. This is surprising and counter-intuitive to me. In your proof, the step that I am slightly confused with is the part where you show $\phi(t)-\psi(t) = o(1)$ by the dominated convergence theorem. It might be something trivial, but could you please elaborate? Commented Apr 12, 2024 at 17:30
  • You are right: I realised that $\phi(t)-\psi(t)=o(t)$ as $t\to-\infty$ is much easier to prove, and it is sufficient to conclude. Commented Apr 13, 2024 at 4:57
  • I think you have $\psi-\phi$ in one place, where you want $\phi-\psi$. Commented Apr 13, 2024 at 6:40

Here is what happens without assumptions.

Functions $p_n:=\sum_i x_i^n$ represent power-sum symmetric polynomials. By Newton's identities, we have $$E(t):=\sum_{k=0}^\infty e_k \,t^k = \exp\left(\sum_{k=1}^\infty \frac{(-1)^{k+1}p_k}{k} \,t^k \right).$$ Knowing all $p_{2n}$ is equivalent to knowing $$E(t)E(-t) = \exp\left(-\sum_{k=1}^\infty \frac{p_{2k}}{k} \,t^{2k} \right).$$ Hence, the question amounts to computing series $E(t)$ from the known series $E(t)E(-t)$.

Let $z:=t^2$ and $E(t) = f(z) + tg(z)$ be the decomposition into even and odd parts. Then $$E(t)E(-t) = f(z)^2 - zg(z)^2 = \exp\left(-\sum_{k=1}^\infty \frac{p_{2k}}{k} \,z^k \right).$$ To solve this, we can take an arbitrary series $g(z)$ and then compute $f(z)$ as $$f(z) = \sqrt{\exp\left(-\sum_{k=1}^\infty \frac{p_{2k}}{k} \,z^k \right) + zg(z)^2}.$$ Then we recover the values $p_n$ from $$\sum_{k=1}^\infty \frac{(-1)^{k+1}p_k}{k} \,t^k = \log( f(t^2) + tg(t^2)).$$
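
A sketch of this non-uniqueness in sympy (the particular $x_i$ and the choices of $g$ are made up): both choices of $g$ reproduce the same $p_2, p_4$, while $p_1$ and $p_3$ change.

```python
# Two choices of the odd part g give the same even power sums, different p_1.
import sympy as sp

t, z = sp.symbols('t z')
xs = [sp.Rational(1, 2), sp.Rational(1, 3)]       # made-up x_i
psum = lambda k: sum(x**k for x in xs)            # true power sums p_k

# E(t)E(-t) = f(z)^2 - z g(z)^2 is determined by p_2, p_4, ... alone
A = sp.exp(-(psum(2)*z + psum(4)/2 * z**2))       # truncated at z^2

for g in [sp.Integer(0), sp.Integer(1)]:          # two arbitrary odd parts
    f = sp.sqrt(A + z*g**2)
    E = (f + t*g).subs(z, t**2)
    logE = sp.series(sp.log(E), t, 0, 5).removeO()
    # read off p_k from log E = sum (-1)^(k+1) p_k t^k / k
    pk = [sp.simplify((-1)**(k + 1) * k * logE.coeff(t, k)) for k in (1, 2, 3, 4)]
    print(pk)   # p_2 = 13/36 and p_4 = 97/1296 both times; p_1, p_3 change
```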

ADDED. If the set $\{x_i\}$ is finite, then $E(t)$ is a polynomial and so is $E(t)E(-t)$, and the problem can be solved by factoring the latter polynomial.
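
A sketch of this finite case (example values assumed): Newton's identities turn the even power sums into the elementary symmetric functions of the $x_i^2$, and the roots of the resulting polynomial give back the $x_i$, with positivity fixing the signs.

```python
# Recover a finite set of positive x_i from its even power sums alone.
import numpy as np

xs = np.array([0.7, 0.4, 0.1])                      # made-up positive x_i
n = len(xs)
q = [np.sum((xs**2)**k) for k in range(1, n + 1)]   # given data p_{2k}

# Newton's identities: k e_k = sum_{j=1}^{k} (-1)^(j-1) e_{k-j} q_j
e = [1.0]
for k in range(1, n + 1):
    e.append(sum((-1)**(j - 1) * e[k - j] * q[j - 1] for j in range(1, k + 1)) / k)

# prod_i (z - x_i^2) = sum_k (-1)^k e_k z^(n-k); its roots are the x_i^2
coeffs = [(-1)**k * e[k] for k in range(n + 1)]
rec = np.sort(np.sqrt(np.roots(coeffs).real))
print(rec, rec.sum())   # recovers the x_i, hence sum x_i (signs fixed by positivity)
```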

  • The question remains whether we can find positive real sequences $x_1,x_2,\dots$ and $y_1,y_2,\dots$ such that $\sum_i x_i^{2k}=\sum_i y_i^{2k}$ for all $k\geq 1$, but $\sum x_i\neq \sum y_i$; or even more strongly, $\sum_i x_i^k=\sum_i y_i^k$ for all $k\geq 2$, but $\sum x_i\neq \sum y_i$. Commented Apr 11, 2024 at 18:27
  • The $x_i$ being positive is an assumption, while as I put in the disclaimer my answer is assumption-free. Surely, assumptions like this one make the question harder. Commented Apr 11, 2024 at 18:32
  • Yes, I wasn't criticizing your answer but just emphasizing that it doesn't answer the original question. Commented Apr 11, 2024 at 19:26
  • @RichardStanley I would conjecture that the equality of the series for a set of exponents $k$ with $\sum 1/k=\infty$ implies the equality for all $k\ge1$. Commented Apr 12, 2024 at 16:19
  • @RichardStanley, re, doesn't @FedorPetrov's answer address that? (Based on the timestamps, it might have appeared slightly after your comment.) Commented Apr 12, 2024 at 17:25
