
Max-convolution of a sum of functions?

Hello.

$R, F_1, F_2, F_3$ are random (non-convex, non-concave) 2D matrices of size $100 \times 100$.
$R$ is a linear combination of $F_1, F_2, F_3$.

Explicitly, $R = w_1 F_1 + w_2 F_2 + w_3 F_3$, where $w_1, w_2, w_3$ are real numbers with $-1 \leq w_i \leq 1$.

Now, the max-convolution in our problem is defined as follows, using $L_1$ and squared-$L_2$ penalty terms:

$D_R(x,y) = \max_{dx,dy} \left( R(x+dx, y+dy) - d_1 |dx| - d_2 |dy| - d_3(dx)^2 - d_4(dy)^2 \right)$
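
For concreteness, here is a minimal brute-force sketch of this operation in NumPy. The boundary rule (offsets are restricted so that $(x+dx, y+dy)$ stays inside the matrix) and the optional search window are assumptions on my part; the nested loops are meant as a readable reference, not an efficient implementation.

```python
import numpy as np

def max_convolution(F, d1, d2, d3, d4, radius=None):
    """Brute-force D_F(x, y) = max over (dx, dy) of
    F[x+dx, y+dy] - d1*|dx| - d2*|dy| - d3*dx^2 - d4*dy^2,
    with (x+dx, y+dy) kept inside the matrix (assumed boundary rule)."""
    n, m = F.shape
    r = radius if radius is not None else max(n, m)  # search window half-width
    D = np.full((n, m), -np.inf)
    for x in range(n):
        for y in range(m):
            for dx in range(max(-r, -x), min(r, n - 1 - x) + 1):
                for dy in range(max(-r, -y), min(r, m - 1 - y) + 1):
                    val = (F[x + dx, y + dy]
                           - d1 * abs(dx) - d2 * abs(dy)
                           - d3 * dx * dx - d4 * dy * dy)
                    if val > D[x, y]:
                        D[x, y] = val
    return D
```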

The question is whether we can express the above max-convolution $D_R$ as a weighted sum of the max-convolutions of the $F_i$'s, namely $D_{F_1}$, $D_{F_2}$, $D_{F_3}$.

The motivation is that we have many $R$ matrices, each of which can be written as a linear combination of $F_1, F_2, F_3$. The intuition is that we could save a lot of computation by running the expensive max-convolution only three times (rather than thousands of times) and then reconstructing $D_R$ from some weighted combination of $D_{F_1}$, $D_{F_2}$, $D_{F_3}$.

Based on numerical simulations with random matrices, I think the ingredients should be $D_{F_1}, D_{F_2}, D_{F_3}, F_1, F_2, F_3, w_1, w_2, w_3$, and $d_1, \dots, d_4$. Please note that I've included the discount terms $d_1, \dots, d_4$.
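
As a sanity check along the lines of those simulations, here is a sketch (reusing the `max_convolution` function from the snippet above) that compares $D_R$ against the naive reconstruction $w_1 D_{F_1} + w_2 D_{F_2} + w_3 D_{F_3}$. The matrix size and discount values are placeholders chosen to keep the brute force cheap; on random inputs the naive sum generally does not match $D_R$, since the max does not commute with linear combinations, which is why the extra ingredients seem to be needed.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20                                  # smaller than 100x100 to keep the brute force fast
d1, d2, d3, d4 = 0.5, 0.5, 0.05, 0.05   # placeholder discount terms
F = [rng.standard_normal((n, n)) for _ in range(3)]
w = rng.uniform(-1.0, 1.0, size=3)
R = sum(wi * Fi for wi, Fi in zip(w, F))

D_R = max_convolution(R, d1, d2, d3, d4)
D_F = [max_convolution(Fi, d1, d2, d3, d4) for Fi in F]

# Naive linear reconstruction: typically does NOT equal D_R,
# because the max over (dx, dy) is not a linear operation.
naive = sum(wi * Di for wi, Di in zip(w, D_F))
print("max |D_R - naive|:", np.max(np.abs(D_R - naive)))
```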

I'm not sure whether this is a known problem; I couldn't find a similar result via a Google search. Does anyone have an idea on how to approach it? If there isn't a closed-form decomposition, what would be a good approximation?
