
Story

I'm trying to prove the following identity:

$$\int_0^\infty \frac{\Xi(t)}{t^2 + \frac{1}{4}} \cos(xt) dt = \frac{1}{2} \pi (e^{\frac{1}{2}x} - 2e^{-\frac{1}{2}x} \psi(e^{-2x}))$$

where

$$\psi(x)=\sum_{n = 1}^{\infty} e^{-n^2 \pi x}$$

and

$$\Xi(t) = \xi(\frac{1}{2} + it)$$

is the xi function on the critical line.
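(For concreteness, here is a quick numerical sanity check of the identity with Python/mpmath; the helpers, the test value $x = 0.3$ and the cutoff at $t = 80$ are my own choices, harmless since $\Xi(t)$ decays like $e^{-\pi t/4}$.)

```python
# Numerical sanity check (my own scaffolding, not part of the proof).
# Xi(t) = xi(1/2 + i t) with the completed zeta
# xi(s) = s(s-1)/2 * pi^(-s/2) * Gamma(s/2) * zeta(s).
# Xi decays like exp(-pi*t/4), so truncating the integral at t = 80
# changes nothing at this working precision.
from mpmath import mp, mpf, pi, exp, cos, gamma, zeta, quad, nsum, inf

mp.dps = 20

def xi(s):
    return s * (s - 1) / 2 * pi**(-s / 2) * gamma(s / 2) * zeta(s)

def Xi(t):
    return xi(mpf('0.5') + 1j * t).real   # Xi is real on the real line

def psi(x):
    return nsum(lambda n: exp(-n**2 * pi * x), [1, inf])

x = mpf('0.3')
lhs = quad(lambda t: Xi(t) / (t**2 + mpf('0.25')) * cos(x * t), [0, 80])
rhs = pi / 2 * (exp(x / 2) - 2 * exp(-x / 2) * psi(exp(-2 * x)))
print(lhs, rhs)   # the two values should agree to many digits
```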

Problem

The only step I don't understand is the following equality:

$$-\frac{1}{4 i \sqrt{y}} \int_{\frac{1}{2} - i\infty}^{\frac{1}{2} + i\infty} \Gamma(\frac{1}{2}s) \pi^{-\frac{1}{2}s} \zeta(s) y^s ds = -\frac{\pi}{\sqrt{y}} \psi(\frac{1}{y^2}) + \frac{1}{2} \pi \sqrt{y}.$$

Where does the latter summand $\frac{1}{2} \pi \sqrt{y}$ come from? When I write $y^s$ as $(\frac{1}{y})^{-s}$ and substitute $s = 2w$, I only get the first summand. We know that, by Mellin inversion, $\psi(x)$ can be recovered from the Mellin inversion integral of $\Gamma(s) \pi^{-s} \zeta(2s)$.

Source

This is from Titchmarsh's book "The Theory of the Riemann Zeta-function", pp. 35–36.

  • $\begingroup$ you are allowed to upvote and accept my answer :) (the second part makes everything rigorous) and you easily get $\int_0^\infty \frac{\Xi(t)}{t^2 + \frac{1}{4}} \cos(xt) dt = \frac{1}{2} \pi (e^{\frac{1}{2}x} - 2e^{-\frac{1}{2}x} \psi(e^{-2x}))$ from it, since $\pi\int_0^\infty \frac{\Xi(t)}{t^2 + \frac{1}{4}} \cos(xt) dt =\frac{1}{2 i \pi} \int_{1/2-i\infty}^{1/2+i \infty} E(s) e^{x (s-1/2)} ds$ $\endgroup$ Commented Jun 29, 2016 at 3:28

1 Answer


Let $f(x) = 2\sum_{n = 1}^\infty e^{- \pi n^2 x^2}$ and $E(s) = \pi^{-s/2} \Gamma(s/2)\zeta(s)$.

For $Re(s) >0$ : $2\int_0^\infty x^{s-1} e^{-\pi n^2 x^2} dx = n^{-s} \pi^{-s/2}\Gamma(s/2) $ so we have for $Re(s) > 1$ :

$$E(s) = \Gamma(s/2) \pi^{-s/2} \zeta(s) = 2\sum_{n=1}^\infty \int_0^\infty x^{s-1} e^{-\pi n^2 x^2} dx = \int_0^\infty f(x) x^{s-1} dx \quad (1)$$
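A quick numerical check of $(1)$, say at $s = 3$, with Python/mpmath (the helpers are my own scaffolding; for small $x$ I evaluate $f$ through the relation $\theta(1/x) = x\,\theta(x)$ proved further down, purely so the series stays short at the quadrature nodes near $0$):

```python
# Numerical check of (1) at s = 3 (mpmath; helper names are mine).
from mpmath import mp, pi, exp, gamma, zeta, quad, nsum, inf

mp.dps = 20

def f(x):
    # f(x) = 2*sum_{n>=1} exp(-pi n^2 x^2); for small x, evaluate it
    # through theta(1/x) = x*theta(x) (proved below) so the series
    # converges quickly at the quadrature nodes near 0.
    if x < 1:
        return (f(1 / x) + 1) / x - 1
    return 2 * nsum(lambda n: exp(-pi * n**2 * x**2), [1, inf])

def E(s):
    return pi**(-s / 2) * gamma(s / 2) * zeta(s)

s = 3
print(E(s))                                            # Gamma(3/2) pi^(-3/2) zeta(3)
print(quad(lambda x: f(x) * x**(s - 1), [0, 1, inf]))  # should match
```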

By inverse Mellin transform : $$f(x) = \frac{1}{2 i \pi}\int_{\sigma-i\infty}^{\sigma+i\infty} E(s) x^{-s} ds$$

but this is only true for $\sigma > 1$, since $E(s)$ has a pole at $s=1$.

Note that for $Re(s) > 1$ : $\displaystyle\quad\frac{1}{s-1} = \int_0^1 x^{s-2} dx = \int_0^\infty \frac{1_{x < 1}}{x} x^{s-1} dx $,

and for $Re(s) < 1$ : $\displaystyle\quad-\frac{1}{s-1} = \int_1^\infty x^{s-2} dx = \int_0^\infty \frac{1_{x > 1}}{x} x^{s-1} dx $

Hence, at least for $Re(s) > 1$ :

$$E(s) - \frac{1}{s-1} = \int_0^\infty \left( f(x)- \frac{1_{x < 1}}{x}\right) x^{s-1} dx \qquad (2) $$

Indeed, this is true also for $Re(s) > 0$ (see below), and we get, for $Re(s) \in (0,1)$ :

$$E(s) = \int_0^\infty \left( f(x) - \frac{1}{x}\right) x^{s-1} dx$$
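Numerically, for instance at $s = 1/2$ inside the strip (same mpmath scaffolding and small-$x$ trick as above):

```python
# Numerical check at s = 1/2, inside the critical strip (mpmath, my
# own scaffolding).
from mpmath import mp, mpf, pi, exp, gamma, zeta, quad, nsum, inf

mp.dps = 20

def f(x):
    if x < 1:                      # use theta(1/x) = x*theta(x) for small x
        return (f(1 / x) + 1) / x - 1
    return 2 * nsum(lambda n: exp(-pi * n**2 * x**2), [1, inf])

def E(s):
    return pi**(-s / 2) * gamma(s / 2) * zeta(s)

s = mpf('0.5')
print(E(s))                                                      # approx -3.977
print(quad(lambda x: (f(x) - 1 / x) * x**(s - 1), [0, 1, inf]))  # should agree with E(1/2)
```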

Finally, by inverse Mellin transform, for $\sigma \in (0,1)$ : $$f(x) - \frac{1}{x} = \frac{1}{2 i\pi}\int_{\sigma -i \infty}^{\sigma +i \infty} E(s) x^{-s}ds$$ and $\displaystyle f(1/y) - y = \frac{1}{2 i\pi} \int_{\sigma -i \infty}^{\sigma +i \infty} E(s) y^{s}ds$ as expected.
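As a sanity check of this last display, take $\sigma = 1/2$ and $y = 2$; I fold the line integral onto $t \ge 0$ using $E(\bar s) = \overline{E(s)}$ and cut it off at $t = 80$, which is harmless since $E(\frac12 + it)$ decays like $e^{-\pi t/4}$ (all of this is my own scaffolding, not part of the argument):

```python
# Check of the inverse Mellin formula at sigma = 1/2, y = 2 (mpmath).
from mpmath import mp, mpf, pi, exp, gamma, zeta, quad, nsum, inf

mp.dps = 20

def E(s):
    return pi**(-s / 2) * gamma(s / 2) * zeta(s)

def f(x):
    return 2 * nsum(lambda n: exp(-pi * n**2 * x**2), [1, inf])

y = mpf(2)
# (1/(2*pi*i)) * int_{1/2 - i*inf}^{1/2 + i*inf} E(s) y^s ds,
# folded onto t >= 0 and truncated at t = 80.
line = quad(lambda t: (E(mpf('0.5') + 1j * t) * y**(mpf('0.5') + 1j * t)).real,
            [0, 80]) / pi
print(line, f(1 / y) - y)   # the two numbers should agree
```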


You can show $(2)$ converges for $Re(s) > 0$ by proving $\displaystyle\theta(x) = 1+ f(x) = \sum_{n =- \infty}^\infty e^{-\pi n^2 x^2}$ fulfills the functional equation $$\theta(1/x) = x\,\theta(x)$$ using the Poisson summation formula (see this proof).
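A quick numerical spot check of this relation (not a proof), e.g. at $x = 0.7$:

```python
# Spot check of theta(1/x) = x*theta(x) at x = 0.7 (mpmath).
from mpmath import mp, mpf, pi, exp, nsum, inf

mp.dps = 25

def theta(x):
    return 1 + 2 * nsum(lambda n: exp(-pi * n**2 * x**2), [1, inf])

x = mpf('0.7')
print(theta(1 / x), x * theta(x))   # should agree to working precision
```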

So that $\displaystyle f(1/x) = \theta(1/x)-1= x \,\theta(x)-1 = x(f(x)+1)-1$ and : $$\int_0^1 \left(f(x)- \frac{1}{x}\right) x^{s-1} dx = \int_1^\infty (f(1/y) - y) y^{-s+1} \frac{dy}{y^2} $$ $$= \int_1^\infty (y (f(y)+1)-1-y) y^{-s-1} dy = \int_1^\infty (y f(y)-1) y^{-s-1} dy$$ whence $$E(s) - \frac{1}{s-1} = \int_0^1+\int_1^\infty \left(f(x)-\frac{1_{x<1}}{x}\right) x^{s-1}dx $$ $$= \int_1^\infty \left( f(x) (x^{s-1} + x^{-s}) - x^{-s-1}\right) dx$$ converges for $Re(s) > 0$.

Riemann obtained the functional equation $E(s) = E(1-s)$ from $$E(s) - \frac{1}{s-1} + \frac{1}{s} = \int_1^\infty f(x) (x^{s-1} + x^{-s}) dx$$ whose right-hand side is entire and manifestly invariant under $s \mapsto 1-s$.
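A numerical check of this representation, evaluated at $s = 0.3$ and at $1 - s = 0.7$ (mpmath scaffolding of mine):

```python
# Check the entire representation at s = 0.3 and s = 0.7 (mpmath);
# the integral on the right is unchanged under s -> 1-s, which is
# exactly why E(s) = E(1-s) follows.
from mpmath import mp, mpf, pi, exp, gamma, zeta, quad, nsum, inf

mp.dps = 20

def f(x):
    return 2 * nsum(lambda n: exp(-pi * n**2 * x**2), [1, inf])

def E(s):
    return pi**(-s / 2) * gamma(s / 2) * zeta(s)

def rhs(s):
    return quad(lambda x: f(x) * (x**(s - 1) + x**(-s)), [1, inf])

for s in (mpf('0.3'), mpf('0.7')):
    print(E(s) - 1 / (s - 1) + 1 / s, rhs(s))   # equal in each row, and
                                                # rhs(0.3) == rhs(0.7)
```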
