I originally posted this on math.stackexchange, but it quickly got buried. I removed it not too long after, thinking of rewriting it for MO, but I didn’t have a chance to post it until now. Apologies if you were one of the lucky 24 people who already saw it there.
Background
The Cauchy-Binet formula states that
$$ \det(AB) = \sum_{S\in\tbinom{[n]}{m}} \det(A_{[m],S})\det(B_{S,[m]}), $$
where $A$ is an $m$ by $n$ matrix, $B$ is an $n$ by $m$ matrix, $[n]$ is notation for the set $\{1,\dots,n\}$ (similarly for $[m]$), $\binom{[n]}{m}$ is the set of all $m$-element subsets of $[n]$, and if $M$ is a $k$ by $l$ matrix and $J,K$ are subsets of $[k],[l]$, respectively, then $M_{J,K}$ denotes the submatrix of $M$ consisting of rows indexed by $J$ and columns indexed by $K$.
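For concreteness, here is a quick numerical check of the formula (a minimal numpy sketch; the dimensions and random integer matrices are just an arbitrary example):

```python
import numpy as np
from itertools import combinations

# Random instance just for illustration; m <= n as in the statement above.
rng = np.random.default_rng(0)
m, n = 3, 5
A = rng.integers(-3, 4, size=(m, n)).astype(float)
B = rng.integers(-3, 4, size=(n, m)).astype(float)

lhs = np.linalg.det(A @ B)
# Sum over all m-element subsets S of column indices of A (= row indices of B).
rhs = sum(np.linalg.det(A[:, S]) * np.linalg.det(B[S, :])
          for S in map(list, combinations(range(n), m)))

print(np.isclose(lhs, rhs))   # True
```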
Question
Is there a version of the Cauchy-Binet formula to evaluate the following sum for general integers $0\leq j\leq m$?
$$ ? = \sum_{S\in\tbinom{[n-j]}{m-j}} \det(A_{[m],S\cup T})\det(B_{S\cup T,[m]})\quad\quad (*)$$
Here $T=\{n-j+1,\dots,n\}$ so that the columns labeled by $T$ are forced to appear in the submatrices of $A$ included in the sum, and similarly for rows $n-j+1,\dots,n$ in the submatrices of $B$.
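Here is how I evaluate $(*)$ by brute force (a small numpy sketch; the name `star_sum` and the 0-based indexing, under which $T$ becomes the last $j$ column indices, are just my conventions for this post):

```python
import numpy as np
from itertools import combinations

def star_sum(A, B, j):
    """Brute-force evaluation of (*). 0-based indexing, so T is the set of
    the last j column indices of A (= last j row indices of B)."""
    m, n = A.shape
    T = list(range(n - j, n))
    return sum(
        np.linalg.det(A[:, list(S) + T]) * np.linalg.det(B[list(S) + T, :])
        for S in combinations(range(n - j), m - j)
    )

# Sanity check: j = 0 recovers ordinary Cauchy-Binet.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 6))
B = rng.standard_normal((6, 3))
print(np.isclose(star_sum(A, B, 0), np.linalg.det(A @ B)))   # True
```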
When $A^T=B$ is the (reduced) incidence matrix of a graph $G$, Kirchhoff's matrix-tree theorem and the effective resistance formula imply that I can compute this sum as the determinant of the (reduced) Laplacian matrix of the graph $G'$, where $G'$ is $G$ with all the edges indexed by $n-j+1,\dots,n$ contracted. The motivation for this question is thus to try to understand contraction in a slightly more general setting.
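To make this concrete, here is a self-contained check on a toy graph (a sketch under my reading that $A$ is the reduced incidence matrix, with one vertex row deleted, and that $G'$ is a multigraph whose self-loops are discarded):

```python
import numpy as np
from itertools import combinations

# Toy example (my own choice): a 4-cycle on vertices 0..3 with the chord (0,2).
# The edge order below fixes the column order of the incidence matrix.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
num_vertices, n = 4, len(edges)
m = num_vertices - 1               # rows of A: all vertices except the last

# Reduced incidence matrix A (drop vertex 3's row), so that A A^T is the
# reduced Laplacian and det(A A^T) counts spanning trees (matrix-tree theorem).
A = np.zeros((num_vertices, n))
for c, (u, v) in enumerate(edges):
    A[u, c], A[v, c] = 1.0, -1.0
A = A[:m, :]
B = A.T

j = 2                              # force the last j edges, i.e. T = {n-j+1,...,n}
T = list(range(n - j, n))          # same set in 0-based indexing

# Left-hand side: the sum (*) by brute force.
lhs = sum(
    np.linalg.det(A[:, list(S) + T]) * np.linalg.det(B[list(S) + T, :])
    for S in combinations(range(n - j), m - j)
)

# Right-hand side: spanning trees of G with the T-edges contracted, counted via
# the reduced Laplacian of the contracted multigraph (self-loops discarded).
parent = list(range(num_vertices))
def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x
for c in T:                        # contract each forced edge
    u, v = edges[c]
    parent[find(u)] = find(v)
classes = sorted({find(v) for v in range(num_vertices)})
idx = {r: i for i, r in enumerate(classes)}
L = np.zeros((len(classes), len(classes)))
for c, (u, v) in enumerate(edges):
    if c in T:
        continue
    a, b = idx[find(u)], idx[find(v)]
    if a != b:
        L[a, a] += 1; L[b, b] += 1; L[a, b] -= 1; L[b, a] -= 1
rhs = np.linalg.det(L[1:, 1:])     # delete one row/column of the contracted Laplacian

print(round(lhs), round(rhs))      # both equal 2: the spanning trees containing T
```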
This paper by Konstantopoulos gives a nice coordinate-free version of Cauchy-Binet, but I couldn't wrangle it into what I wanted.
It may also be possible that evaluating $(*)$ is equivalent to evaluating the permanent or something like that, so I shouldn’t expect anything nice after all. But I would be quite disappointed if that were so.
Other special cases
When $j=0$, $(*)$ reduces to ordinary Cauchy-Binet, and when $j=m$ it is of course just $\det(A_{[m],T}B_{T,[m]})$.
When $j=1$, $(*)$ is equal to $\det(AB)-\det(A_{[m],[n-1]}B_{[n-1],[m]})$, and by inclusion-exclusion I can extend this to all other $j$. However, for large $j$ this produces a mess of terms and is a rather inefficient way to evaluate the sum. Is there a simpler formula involving just one determinant?
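Concretely, the inclusion-exclusion expansion I have in mind is
$$ (*) = \sum_{U\subseteq T} (-1)^{j-|U|}\det\!\big(A_{[m],[n-j]\cup U}\,B_{[n-j]\cup U,[m]}\big), $$
which already involves $2^j$ determinants. Here is a quick numerical check (again a sketch with 0-based indexing):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)
m, n, j = 4, 7, 2
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, m))
T = list(range(n - j, n))              # 0-based indices of the forced columns/rows

# Brute force for (*), as in the earlier sketch.
star = sum(
    np.linalg.det(A[:, list(S) + T]) * np.linalg.det(B[list(S) + T, :])
    for S in combinations(range(n - j), m - j)
)

# Inclusion-exclusion over which indices of T are allowed to appear:
# each subset U of T contributes (-1)^(j-|U|) det(A_{[m],[n-j] cup U} B_{[n-j] cup U,[m]}).
incl_excl = 0.0
for k in range(j + 1):
    for U in combinations(T, k):
        cols = list(range(n - j)) + list(U)
        incl_excl += (-1) ** (j - k) * np.linalg.det(A[:, cols] @ B[cols, :])

print(np.isclose(star, incl_excl))     # True
```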