
Suppose I have numerical estimates of discrete samples of a smooth function $C(t)$ at $t = a, 2a, \dots, Na = T$, and I want to compute numerically the integral of $f(t) = \frac{C(t)}{\sqrt{t}}$. In particular, I am interested in computing $$ I = \int_{t=0}^T dt \, f(t) \approx a \sum_{n=1}^{N} f(na) \equiv I_N \,. $$ If I apply standard error analysis for numerical integration, I know that the error in the integral is bounded by $$ \vert I-I_N \vert \leq \frac{M T}{12} a^2 , $$ where $$ M \equiv \max_{t \in [0,T]} \vert f''(t) \vert . $$ In this case, the function $f(t)$ has an integrable singularity due to the factor $1/\sqrt{t}$, which should not affect the numerical computation of the integral, yet $M$ diverges.
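For concreteness, a minimal sketch of the sum $I_N$ above, assuming the samples $C(a),\dots,C(Na)$ are stored in an array (the names `C_samples` and `rectangle_sum` are purely illustrative):

```python
import numpy as np

def rectangle_sum(C_samples, a):
    """Right-endpoint rectangle sum  I_N = a * sum_{n=1}^{N} C(n*a) / sqrt(n*a)."""
    N = len(C_samples)
    t = a * np.arange(1, N + 1)          # sample points t = a, 2a, ..., Na
    return a * np.sum(np.asarray(C_samples) / np.sqrt(t))
```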

How can I compute a bound on $\vert I-I_N \vert$ for this case?

  • There have been several papers written on numerical integration in the presence of singularities. Here's one that I wrote: G. Myerson, On ignoring the singularity, SIAM Journal on Numerical Analysis 28 (1991), 1803–1807. Searching for similar titles will get you many more. Commented Apr 9, 2024 at 4:32

1 Answer


(i) Your purported error bound is of course incorrect: consider e.g. $f(t)=t$, for which $f''\equiv0$, so your bound would force $I_N=I$, whereas the right-endpoint sum gives $I_N-I=aT/2$.

(ii) To get rid of the singularity, make the substitution $u=\sqrt t$, so that $$\int_0^T dt\,f(t)=\int_0^T \frac{dt}{\sqrt t}\,C(t)=2\int_0^{\sqrt T} du\,C(u^2)=2\int_0^{\sqrt T} du\,g(u),$$ where $g(u):=C(u^2)$, and then approximate $\int_0^{\sqrt T} du\,g(u)$ by an integral sum.
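A minimal sketch of this substitution approach, assuming $C$ can be evaluated at arbitrary points (with only fixed-spacing samples of $C(t)$ one would interpolate first; the function and parameter names are illustrative):

```python
import numpy as np

def integral_after_substitution(C, T, n_panels):
    """Approximate 2 * int_0^sqrt(T) C(u^2) du with a midpoint rule in u."""
    h = np.sqrt(T) / n_panels
    u = (np.arange(n_panels) + 0.5) * h    # midpoints in the u = sqrt(t) variable
    return 2.0 * h * np.sum(C(u ** 2))

# Illustrative use: C(t) = cos(t), T = 1; the integrand cos(u^2) is smooth.
print(integral_after_substitution(np.cos, 1.0, 1000))
```

After the substitution the integrand $C(u^2)$ is smooth, so the standard error bounds apply again; the practical caveat (raised in the comments below) is that one then needs values of $C$ at the points $u_k^2$, which are not equally spaced in $t$.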


Let us show that the error $|I-I_N|$ of your approximation will be on the order of $1/\sqrt N$ for large $N$ unless $C(0)=0$. Indeed, by rescaling, without loss of generality $T=1$ and then $a=1/N$. By the linearity of both $I$ and $I_N$ in $f$, it suffices to consider two cases:

  • Case 1, when $C(0)=0$

  • Case 2, when $C(t)=1$ for all $t$

In Case 1, the singularity disappears: the integrand $C(t)/\sqrt t$ extends continuously by $0$ to $t=0$ and has bounded variation on $[0,1]$, so $|I-I_N|=O(1/N)$, the error bound for the rectangle method.

In Case 2, $$I-I_N=\int_0^1\frac{dt}{\sqrt t}-\frac1N\,\sum_{n=1}^N\frac1{\sqrt{n/N}} =2-\frac1{\sqrt N}\,\sum_{n=1}^N\frac1{\sqrt n} \sim-\frac{\zeta(1/2)}{\sqrt N},$$ since $\sum_{n=1}^N n^{-1/2}=2\sqrt N+\zeta(1/2)+O(1/\sqrt N)$, and $\zeta(1/2)=-1.4603\dots$ (this rate is checked numerically after the proof).

Thus, in Case 2 and whenever $C(0)\ne0$, we have $|I-I_N|\asymp1/\sqrt N$. $\quad\Box$
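A quick numerical check of Case 2 (a sketch; the value of $\zeta(1/2)$ is hard-coded to avoid extra dependencies):

```python
import numpy as np

ZETA_HALF = -1.4603545088095868   # zeta(1/2)

# Case 2: C(t) = 1, T = 1, so I = int_0^1 dt/sqrt(t) = 2.
for N in (10 ** 2, 10 ** 4, 10 ** 6):
    n = np.arange(1, N + 1)
    I_N = np.sum(1.0 / np.sqrt(n / N)) / N
    print(N, 2.0 - I_N, -ZETA_HALF / np.sqrt(N))   # the last two columns converge to each other
```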


If the value of $C(0)$ is available as well, then one can use the "sample" values $C(n/N)$ much more effectively. Indeed, letting $f_0(t):=\frac{C(t)-C(0)}{\sqrt t}$, one gets a function $f_0$ that extends continuously to $[0,1]$ (with $f_0(0+)=0$, since $f_0(t)=C'(0)\sqrt t+O(t^{3/2})$) and has bounded variation, and $$I=\int_0^1\frac{dt}{\sqrt t}\,C(t) =C(0)\int_0^1\frac{dt}{\sqrt t}+\int_0^1 dt\,f_0(t) =2C(0)+J \approx 2C(0)+J_N,$$ where $$J:=\int_0^1 dt\,f_0(t),\quad J_N:=\frac1N\,\sum_{n=1}^N f_0(n/N).$$ So, $$|I-(2C(0)+J_N)|=|J-J_N|=O(1/N),$$ again the error bound of the rectangle method without singularities. So, we get an $O(1/N)$ error, instead of $\asymp1/\sqrt N$.
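A sketch comparing the naive sum with this corrected estimate, in the rescaled setting $T=1$, $a=1/N$, for the illustrative test function $C(t)=\cos t$ (my choice, not from the question); the reference value is computed on a fine grid via the $u=\sqrt t$ substitution:

```python
import numpy as np

def naive_estimate(C, N):
    """I_N = (1/N) * sum_{n=1}^N C(n/N) / sqrt(n/N), with T = 1."""
    t = np.arange(1, N + 1) / N
    return np.sum(C(t) / np.sqrt(t)) / N

def corrected_estimate(C, N):
    """2*C(0) + J_N, where f_0(t) = (C(t) - C(0)) / sqrt(t)."""
    t = np.arange(1, N + 1) / N
    return 2.0 * C(0.0) + np.sum((C(t) - C(0.0)) / np.sqrt(t)) / N

C = np.cos                                      # illustrative test function
u = (np.arange(200_000) + 0.5) / 200_000        # fine midpoint grid in u = sqrt(t)
I_ref = 2.0 * np.sum(np.cos(u ** 2)) / 200_000  # reference value of I

for N in (100, 1_000, 10_000):
    print(N, abs(I_ref - naive_estimate(C, N)),       # decays like ~1.46 / sqrt(N)
             abs(I_ref - corrected_estimate(C, N)))    # decays like O(1/N)
```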

If the value of $C'(0)$ is available as well, then one can also subtract the leading term $C'(0)\sqrt t$ of $f_0$ (whose integral over $[0,1]$ is $\tfrac23\,C'(0)$); the remainder $\frac{C(t)-C(0)-C'(0)t}{\sqrt t}$ vanishes like $t^{3/2}$ at $0$, so already the trapezoidal rule improves on $O(1/N)$. Peeling off further terms of the Taylor expansion of $C$ at $0$ in the same way and using higher-order quadratures, one can get an $O(1/N^p)$ error for any natural $p$ for which the needed derivatives of $C$ at $0$ are available.
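For illustration, a sketch of the first such continuation step, again with an illustrative test function ($C(t)=e^t$, so $C(0)=C'(0)=1$): subtract both the $C(0)/\sqrt t$ and the $C'(0)\sqrt t$ terms and apply the trapezoidal rule to the remainder:

```python
import numpy as np

def doubly_corrected_estimate(C, C0, dC0, N):
    """2*C(0) + (2/3)*C'(0) + trapezoidal rule for h(t) = (C(t) - C(0) - C'(0)*t) / sqrt(t).

    h(0+) = 0, so the left endpoint contributes nothing to the trapezoidal sum.
    """
    t = np.arange(1, N + 1) / N
    h = (C(t) - C0 - dC0 * t) / np.sqrt(t)
    trapezoid = (np.sum(h[:-1]) + 0.5 * h[-1]) / N
    return 2.0 * C0 + (2.0 / 3.0) * dC0 + trapezoid

u = (np.arange(200_000) + 0.5) / 200_000        # reference: I = 2 * int_0^1 exp(u^2) du
I_ref = 2.0 * np.sum(np.exp(u ** 2)) / 200_000

for N in (100, 1_000, 10_000):
    err = abs(I_ref - doubly_corrected_estimate(np.exp, 1.0, 1.0, N))
    print(N, err)                                # decays roughly like 1/N^2
```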

  • Thanks!! I have a few observations. (i) The error bound applies to all functions belonging to $C^2([0,T])$ with non-zero derivative, of course. This is a well-known bound for the trapezoidal rule, and it applies in this case. (ii) I know how to show that the integral is finite, but in your case I do not have direct samples of $g(u)$, only of $C(t)$. Are you implicitly suggesting that it should be better (in terms of errors in the numerical integration) to use fewer points to estimate the last integral than to use more points to estimate the first? Commented Apr 8, 2024 at 21:51
  • @knuth : (ia) If your purported error bound applied to all functions in $C^2([0,T])$ with non-zero (second?) derivative, then, by continuity, it would apply to all functions in $C^2([0,T])$ with zero second derivative (in my example, $f'=1\ne0$ and $f''=0$). (ib) Your approximation formula is not the one from the trapezoidal rule. Commented Apr 9, 2024 at 2:48
  • @knuth : The error of your approximation will be on the order of $1/\sqrt N$, even when $C(t)=1$ for all $t$. One can do significantly better than that. Commented Apr 9, 2024 at 3:30
  • Thank you for your very detailed answer!! Yes, I meant the second derivative, thanks for pointing it out. I have a remaining question, which is unrelated to how I first stated the problem but might be relevant and, I feel, might complicate your analysis. Suppose that $C(t)$ is now an oscillating function. Is the same discrete sampling (namely, at fixed spacing) again going to give errors of $\mathcal{O}\left(\frac1N\right)$, or does the oscillation make the approximation worse or even impossible? In other words: is $J_N$ still a good approximation, even if $f_0$ oscillates? Commented Apr 9, 2024 at 17:52
  • I know that for oscillating integrands there exist specific libraries to perform the numerical integration, and I therefore expect a simple, linear sampling of the function not to be enough. However, your analysis seems (to me, at least) not to depend on this property. Commented Apr 9, 2024 at 17:54
