Here’s a generalization on the Sinkhorn/RAS/IPFP algorithm that may get you what you want.
Given a tensor $T^{i_1i_2\dots}$, define $S_n(T) = \sum_{\text{all but }i_n} T^{i_1i_2\dots}$, i.e., the vector obtained by summing over every index except $i_n$. What I am guessing is that you want a way of going from $T$ to a new tensor $f(T)$ such that $S_n(f(T)) = 1$ (where $1$ is the vector of ones) for every $n$.
I don’t think you have given enough conditions to pin down $f$, but here is something that should work. For a given vector $v$, define the operator $D_{n,v}(T) = v^{i_n} T^{i_1i_2\dots}$ (note that there is no sum, even though the index $i_n$ appears twice). This generalizes multiplication by the diagonal matrix associated with $v$.
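In array terms, $D_{n,v}$ is just a broadcasted multiplication along one axis. A small sketch in NumPy (the function name is mine, chosen for illustration):

```python
import numpy as np

def scale_axis(T, v, n):
    """D_{n,v}(T): multiply T along axis n by the vector v, with no summation.

    Reshaping v to have its length on axis n and length 1 on every other
    axis lets NumPy broadcasting perform the diagonal-matrix scaling.
    """
    shape = [1] * T.ndim
    shape[n] = -1
    return T * np.asarray(v).reshape(shape)
```

For a matrix and $n = 0$, this reduces to the familiar `np.diag(v) @ T`.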
Now, the algorithm proceeds by cycling through all values of $n$ (i.e., from $1$ to the order of the tensor). At each step, compute $v = 1/S_n(T)$, where $T$ is the tensor from the previous step and the reciprocal is taken componentwise, then apply $D_{n,v}$ to obtain the next tensor. I claim that, as long as there exists a tensor with the same zero pattern as your starting tensor that satisfies the desired stochastic condition, this algorithm converges to the tensor you are looking for. Furthermore, operators $D_{n,v}$ with the same $n$ compose into a single operator of the same form, so the limiting tensor is obtained from the original by applying just one $D_{n,v}$ for each $n$.
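Here is a minimal sketch of the cyclic iteration (function names are mine; it assumes a nonnegative NumPy array; note also that the all-ones targets force the total mass to equal each dimension, so they are only simultaneously achievable when all axes have the same length):

```python
import numpy as np

def axis_sums(T, n):
    """S_n(T): sum over all indices except i_n, leaving a vector."""
    axes = tuple(k for k in range(T.ndim) if k != n)
    return T.sum(axis=axes)

def tensor_sinkhorn(T, sweeps=1000, tol=1e-10):
    """Cyclically rescale T so that every S_n(T) approaches the ones vector."""
    T = np.asarray(T, dtype=float).copy()
    for _ in range(sweeps):
        for n in range(T.ndim):
            s = axis_sums(T, n)
            v = np.where(s > 0, 1.0 / np.where(s > 0, s, 1.0), 0.0)  # skip all-zero slices
            shape = [1] * T.ndim
            shape[n] = -1
            T = T * v.reshape(shape)  # apply D_{n,v}
        if all(np.allclose(axis_sums(T, n), 1.0, atol=tol) for n in range(T.ndim)):
            break
    return T
```

For a strictly positive tensor of equal dimensions, a handful of sweeps already brings every $S_n(T)$ very close to the ones vector.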
The proof follows directly from Csiszar’s paper:
Csiszár, Imre. "$I$-divergence geometry of probability distributions and minimization problems." The Annals of Probability 3, no. 1 (1975): 146–158. https://www.jstor.org/stable/2959270
This is a version of the alternating projection algorithm, using the KL divergence as a “metric”: the algorithm converges to the “projection” of the starting tensor onto the space of tensors with the desired stochastic form. In fact, it’s not too hard to generalize the proof in that paper to drop the requirement that the operators be applied cyclically, as long as each value of $n$ occurs infinitely often.