I assume that $\theta(y) = 1$ if $y \geq 0$ and $\theta(y) = 0$ otherwise. In that case you can "linearize" your constraint 5 as follows: add new variables $y_t$ for $t \in \{1, \dots, T\}$ and replace constraint 5 with the following three linear constraints:

i) $y_t \geq b_t$ for all $t \in \{1, \dots, T\}$.

ii) $y_t \geq \eta b_t$ for all $t \in \{1, \dots, T\}$.

iii) $\sum_{t=1}^{t'} y_t \leq 0$ for all $t' \in \{1, \dots, T\}$.

This works precisely because $0 < \eta \leq 1$: under that assumption, constraints i) and ii) together say $y_t \geq \max(b_t, \eta b_t) = \theta(b_t)\,b_t + (1-\theta(b_t))\,\eta b_t$, and since lowering any $y_t$ only helps constraint iii), the system is feasible exactly when these lower bounds themselves satisfy iii). The entire problem would be a convex optimization problem if the functions $h(d_t-b_t)b_t$ were concave in $b_t$ for all $t$.
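As a quick sanity check, here is a minimal numeric sketch of that equivalence. It assumes constraint 5 has the cumulative form $\sum_{t=1}^{t'} \big[\theta(b_t)\,b_t + (1-\theta(b_t))\,\eta b_t\big] \leq 0$ (which is what i)-iii) reconstruct); the function names and the value of $\eta$ are just for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
eta = 0.4  # any value with 0 < eta <= 1

def nonlinear_feasible(b, eta):
    """Assumed form of the original constraint 5: all cumulative sums of
    theta(b_t)*b_t + (1 - theta(b_t))*eta*b_t must stay <= 0,
    where theta(y) = 1 if y >= 0 and theta(y) = 0 otherwise."""
    terms = np.where(b >= 0, b, eta * b)
    return np.all(np.cumsum(terms) <= 0)

def linearized_feasible(b, eta):
    """Constraints i)-iii): there exists y with y_t >= b_t, y_t >= eta*b_t,
    and all partial sums of y <= 0. The pointwise-smallest feasible choice
    is y_t = max(b_t, eta*b_t); if that violates iii), no other y can."""
    y = np.maximum(b, eta * b)
    return np.all(np.cumsum(y) <= 0)

# The two feasibility tests should agree on every b vector.
for _ in range(100_000):
    b = rng.uniform(-1, 1, size=8)
    assert nonlinear_feasible(b, eta) == linearized_feasible(b, eta)
print("nonlinear and linearized constraints agree on all samples")
```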


Without concavity of the objective function, general solutions seem hard. However, the above transformation at least maps it to a problem of maximizing a (nonconcave) objective function over a convex set defined by linear inequality constraints, and the KKT conditions may now be easier to state.
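For concreteness, here is a sketch of those KKT conditions, assuming the problem is to maximize $\sum_{t=1}^T h(d_t - b_t)\, b_t$ subject only to i)-iii) and that $h$ is differentiable (your other constraints would contribute analogous terms). With multipliers $\alpha_t, \beta_t, \mu_{t'} \geq 0$ for i), ii), iii) respectively, and since all constraints are linear (so no further constraint qualification is needed), any local maximum $(b, y)$ must satisfy:

$$
\begin{aligned}
&\text{stationarity in } b_t: && h(d_t - b_t) - h'(d_t - b_t)\, b_t = \alpha_t + \eta\, \beta_t, \\
&\text{stationarity in } y_t: && \alpha_t + \beta_t = \sum_{t'=t}^{T} \mu_{t'}, \\
&\text{complementary slackness:} && \alpha_t (b_t - y_t) = 0, \quad \beta_t (\eta b_t - y_t) = 0, \quad \mu_{t'} \sum_{t=1}^{t'} y_t = 0.
\end{aligned}
$$

Because the objective is nonconcave, these conditions are necessary but not sufficient for global optimality.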
