
I would like to find an optimal probability density function (PDF) $f$. Given a constant $b > 0$,

$$ \begin{array}{ll} \underset {f} {\text{minimize}} & C \\ \text{subject to} & 1 + \frac{b}{x} \displaystyle\int_{0}^{x}f(t)dt \leq C, \text{ for any } x \in [0,\infty) \\\\ & \frac{1}{b} \left(\displaystyle\int_{0}^{x}(t+b)\cdot f(t)dt+x\cdot\int_{x}^{\infty}f(t)dt \right) \leq C, \text{ for any }x\in(b,\infty) \\\\& \displaystyle\int_{0}^{\infty}f(t)dt=1, \\ & f(t)\geq 0, \ \text{for any $t\geq 0$.}\end{array} $$

I do not know how to solve it. Are there any systematic ways or general ideas for dealing with such a functional optimization problem?


My idea

Here I can introduce another optimization variable $B\geq b$ such that the probability density is restricted to the interval $[0,B]$, i.e. $\int_{0}^{B}f(t)dt=1$. Then the original optimization problem can be rewritten as

$\min\limits_{B,f(t)} C$

s.t. $1+\frac{b\cdot\int_{0}^{x}f(t)dt}{x}\leq C, \text{ for any }x\in[0,B]$

$\quad\frac{\int_{0}^{x}(t+b)\cdot f(t)dt+x\cdot\int_{x}^{B}f(t)dt}{b}\leq C, \text{ for any }x\in(b,B]$

$\quad\int_{0}^{B}f(t)dt=1$,

$\quad B\geq b$

$\quad f(t)\geq 0, \ \text{for any $t\geq 0$.}$

We take a look at the second constraint $\frac{\int_{0}^{x}(t+b)\cdot f(t)dt+x\cdot\int_{x}^{B}f(t)dt}{b}\leq C$. Taking the derivative of the left-hand side with respect to $x$, we find (by the Leibniz rule) that it equals $f(x)+\frac{1}{b}\int_{x}^{B}f(t)dt\geq 0$, which implies that the left-hand side is non-decreasing with respect to $x\in (b,B]$. Hence the second constraint can be reduced to its value at $x=B$, namely $1+\frac{\int_{0}^{B}t\cdot f(t)dt}{b}\leq C$. Hence the optimization problem can be rewritten as
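As a sanity check, this derivative can be computed symbolically with $f$ left as an abstract function; a minimal sketch using sympy:

```python
import sympy as sp

# t: integration variable, x: constraint index, B: support bound, b: transfer cost
t, x, B, b = sp.symbols('t x B b', positive=True)
f = sp.Function('f')

# Left-hand side of the second constraint
lhs = (sp.Integral((t + b) * f(t), (t, 0, x))
       + x * sp.Integral(f(t), (t, x, B))) / b

# Leibniz rule: d/dx LHS = f(x) + (1/b) * Integral(f(t), (t, x, B))
deriv = lhs.diff(x)
expected = f(x) + sp.Integral(f(t), (t, x, B)) / b
print(sp.simplify(deriv - expected))  # 0, so the LHS is non-decreasing when f >= 0
```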

$\min\limits_{B,f(t)} C$

s.t. $1+\frac{b\cdot\int_{0}^{x}f(t)dt}{x}\leq C, \text{ for any }x\in[0,B]$

$\quad 1+\frac{\int_{0}^{B}t\cdot f(t)dt}{b}\leq C$

$\quad\int_{0}^{B}f(t)dt=1$,

$\quad B\geq b$

$\quad f(t)\geq 0, \ \text{for any $t\geq 0$.}$
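One practical way to handle the remaining continuum of first constraints is to discretize: check the constraint on a finite grid of $x$ values and represent $f$ as a piecewise-constant density, which turns the reduced problem (for a fixed $B$) into a finite linear program in $(f_1,\dots,f_n,C)$. A minimal sketch with scipy; the choices $b=1$, $B=2b$, and the grid size are illustrative assumptions, not part of the original problem:

```python
import numpy as np
from scipy.optimize import linprog

b, B, n = 1.0, 2.0, 400       # assumed: b = 1, support bound B = 2b, n grid cells
h = B / n
t = (np.arange(n) + 0.5) * h  # cell midpoints; f is piecewise constant per cell
x = (np.arange(n) + 1.0) * h  # constraint checkpoints (cell right edges)

# Variables z = (f_1, ..., f_n, C); objective: minimize C.
c_obj = np.zeros(n + 1)
c_obj[-1] = 1.0

# First constraint at each checkpoint x_j: (b/x_j) * h * sum_{i<=j} f_i - C <= -1
A1 = np.tril(np.ones((n, n))) * h * (b / x)[:, None]
# Reduced second constraint: (1/b) * h * sum_i t_i f_i - C <= -1
A2 = (h * t / b)[None, :]
A_ub = np.block([[A1, -np.ones((n, 1))],
                 [A2, -np.ones((1, 1))]])
b_ub = -np.ones(n + 1)

# Normalization: h * sum_i f_i = 1
A_eq = np.append(h * np.ones(n), 0.0)[None, :]
b_eq = np.array([1.0])

bounds = [(0, None)] * n + [(None, None)]  # f_i >= 0, C free
res = linprog(c_obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.fun)  # close to 1 + 1/sqrt(2) ~ 1.707 (up to discretization error)
```

The discretization only enforces the first constraint at grid points, so the LP value can undershoot the continuous optimum by $O(h)$; refining the grid tightens the answer.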


The reason why I have this optimization problem

I am trying to solve a cost-optimization problem for a distributed storage system. The constant $b$ is the cost of transferring the data object between servers. $f(t)$ is the PDF of the storage duration of a data copy on a server; the storage cost is proportional to the storage duration (we assume a storage cost rate of 1 per unit of time), and after time $t$ the copy is dropped from the server, so it incurs no further storage cost. The algorithm is thus a randomized online algorithm whose goal is to find the optimal PDF $f(t)$ minimizing the overall cost of storing and transferring data in the system. After some simplifications of my model I finally arrived at this functional optimization problem, but I do not know what to do next.

This problem has infinitely many constraints (one instance of the first and second constraints for each continuous value $x\in [0,\infty)$), and this is the main obstacle to solving it.

  • $\begingroup$ How is this related to your previous question? $\endgroup$ Commented May 7, 2023 at 10:42
  • $\begingroup$ Is it possible to prove that the optimal solution of PDF must be in the form of a constant? (i.e., t is uniformly distributed?) $\endgroup$ Commented May 8, 2023 at 2:31
  • $\begingroup$ I find that if we make the first constraint an equality constraint, then we can get $f(t)$ to be a constant. $\endgroup$ Commented May 8, 2023 at 2:32

1 Answer


Differentiating the left-hand side of the second condition gives $f(x) + b^{-1}\int_x^\infty f(t)\, dt$, which is $\ge 0$. So the second condition can be replaced by its limit as $x \to \infty$, namely $X:=\int_0^\infty t f(t)\, dt \le b(C-1)$, where $X$ denotes the yet unknown first moment of $f(t)\,dt$. Since $X$ gives a lower bound on $C$, you'd want $X$ to be as small as possible, in other words concentrating the weight of $f(t)\,dt$ near $t=0$.

Next, the first condition can be rewritten as $\int_0^x f(t) dt \le b^{-1} (C-1) x$. It counterbalances the idea of concentrating the weight of $f(t)dt$ near the origin by the bound $f(t)\le b^{-1}(C-1)$ (more precisely, if $f(t)\ge F>0$ on some interval $[0,T]$, then it must be that $F\le b^{-1}(C-1)$).

At this point, one reasonable assumption would be that the first condition does not increase the lower bound on $C$ obtained from the second condition, which translates to the bound $f(t) \le b^{-2} X$. Saturating this bound gives a step function distribution $f(t) = b^{-2} X \Theta(B-t)$ for some $B>0$. Using this $f(t)$ to check its normalization and compute the moment $X$ results in the relations $b^{-2} BX = 1$, $X = \frac{1}{2} (B/b)^2 X$, or $B=\sqrt{2} b$, $X=b/\sqrt{2}$. Then, solving $X=b(C-1)$ gives $C=1+1/\sqrt{2} \approx 1.7$, corresponding to $f(t) = \frac{b^{-1}}{\sqrt{2}} \Theta(\sqrt{2}b - t)$.
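The resulting pair can be checked numerically against the constraints; a small sketch with numpy, taking $b=1$ for concreteness:

```python
import numpy as np

b = 1.0
C = 1.0 + 1.0 / np.sqrt(2.0)         # candidate optimum, C ~ 1.707
height = 1.0 / (np.sqrt(2.0) * b)    # f(t) = height on [0, sqrt(2)*b], 0 after
supp = np.sqrt(2.0) * b

def F(x):
    """CDF of the step-function density f(t) = height * Theta(supp - t)."""
    return np.clip(x * height, 0.0, 1.0)

xs = np.linspace(1e-6, 10.0 * b, 10_000)  # grid standing in for [0, infinity)

# First condition: 1 + b * F(x) / x <= C for all x (tight on [0, supp])
lhs1 = 1.0 + b * F(xs) / xs

# Reduced second condition: 1 + X/b <= C, with X = supp/2 for the uniform density
lhs2 = 1.0 + (supp / 2.0) / b

print(lhs1.max() <= C + 1e-9, abs(lhs2 - C) < 1e-9)  # True True
```

Both conditions hold with equality where they bind, which is consistent with the heuristic of saturating the bounds.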

The above argument used a heuristic assumption, so it is not airtight, but it does give you a feasible pair of $C$ and $f(t)$. Perhaps showing that the assumption is necessary for optimality would give you a complete argument.

  • $\begingroup$ But what I need is the justified optimal solution of the PDF.... Not just feasible solution.... $\endgroup$ Commented May 8, 2023 at 2:18
