Does there exist some $\gamma \ge 0$ such that the solution to the following ODE converges to $0$ as $t \to \infty$? $$y'(t) = \alpha y(t) - \gamma \sigma(t) (1-y^2(t))$$
We are also given $y(0) = 2/3$, $\alpha > 0$, $t_\sigma > 0$, and $\sigma(t) = 1/\left(1+e^{2(t-t_\sigma)}\right)$.
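For concreteness, here is a minimal numerical sketch of the two regimes described below, using scipy. The values $\alpha = 1$ and $t_\sigma = 1$ are my own illustrative choices, not part of the question:

```python
import numpy as np
from scipy.integrate import solve_ivp

alpha, t_sigma = 1.0, 1.0   # illustrative values (assumption, not given in the question)

def sigma(t):
    return 1.0 / (1.0 + np.exp(2.0 * (t - t_sigma)))

def rhs(t, y, gamma):
    return alpha * y - gamma * sigma(t) * (1.0 - y**2)

for gamma in (0.0, 5.0):    # small gamma vs. large gamma
    sol = solve_ivp(rhs, (0.0, 10.0), [2.0 / 3.0], args=(gamma,),
                    rtol=1e-10, atol=1e-12)
    print(f"gamma = {gamma}: y(10) ~ {sol.y[0, -1]:.4g}")
```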
Note that if $\gamma = 0$, the solution blows up to infinity, while if $\gamma$ is large enough, $y$ quickly becomes negative. I want to know whether there is some critical $\gamma$ that balances the two effects perfectly, so that $y$ neither blows up nor becomes negative.
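Assuming the conjectured dichotomy (small $\gamma$ blows up, large $\gamma$ goes negative) and monotone dependence on $\gamma$, one can at least locate a numerical candidate for such a critical value by bisection. In the sketch below, the finite horizon `T` and threshold `Y_BIG` are ad-hoc numerical stand-ins for "blows up" and "becomes negative", with the same illustrative parameters as above:

```python
import numpy as np
from scipy.integrate import solve_ivp

alpha, t_sigma = 1.0, 1.0             # illustrative values, as above
T, Y_BIG = 30.0, 1e3                  # finite horizon and blow-up threshold

def rhs(t, y, gamma):
    sigma = 1.0 / (1.0 + np.exp(2.0 * (t - t_sigma)))
    return alpha * y - gamma * sigma * (1.0 - y**2)

def blows_up(gamma):
    """True if y exceeds Y_BIG before crossing 0 on [0, T]."""
    def big(t, y, g): return y[0] - Y_BIG
    def neg(t, y, g): return y[0]
    big.terminal, neg.terminal = True, True
    sol = solve_ivp(rhs, (0.0, T), [2.0 / 3.0], args=(gamma,),
                    events=(big, neg), rtol=1e-12, atol=1e-12)
    if sol.t_events[0].size > 0:
        return True                    # hit the blow-up threshold first
    if sol.t_events[1].size > 0:
        return False                   # went negative first
    return sol.y[0, -1] > 2.0 / 3.0    # undecided at T: crude tie-break

lo, hi = 0.0, 10.0                     # blows_up(lo) is True, blows_up(hi) is False
for _ in range(60):                    # bisect the dichotomy
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if blows_up(mid) else (lo, mid)
print(f"candidate critical gamma ~ {0.5 * (lo + hi):.10f}")
```

Of course this only produces a numerical candidate for one specific choice of parameters; it says nothing about existence in general, nor about whether $y \to 0$ at the critical value.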
Wolfram Alpha is able to solve the ODE, but the solution is enormous and involves the hypergeometric function, so I don't think solving the ODE for this particular $\sigma$ is the best approach. I suspect the relevant properties of $\sigma$ are that it is analytic, with $0 < \sigma(t) < 1$, $\sigma'(t) < 0$, and $0 < \int_0^\infty \sigma(t)\,dt < \infty$.
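As a small sanity check on the last property (not essential to the question), the integral can be verified numerically against the closed form $\int_0^\infty \sigma(t)\,dt = \tfrac{1}{2}\ln\left(1+e^{2t_\sigma}\right)$, which follows from the substitution $u = 2(t - t_\sigma)$; again $t_\sigma = 1$ is only an illustrative value:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import expit   # stable logistic: sigma(t) = expit(-2(t - t_sigma))

t_sigma = 1.0                     # illustrative value
val, err = quad(lambda t: expit(-2.0 * (t - t_sigma)), 0.0, np.inf)
print(val, 0.5 * np.log(1.0 + np.exp(2.0 * t_sigma)))   # both ~ 1.0635
```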
I came across this problem while constructing a hard case for low-rank matrix reconstruction with gradient descent. Concretely, if such a $\gamma$ always exists, it would imply a lower bound on the sample complexity required to reconstruct low-rank matrices in a commonly studied setting, considered for example here.