Federico Poloni

The permutation that you look for should exist for all nonsingular matrices $A$ --- the orthogonality is not necessary.

To see it, I will first present a determinant-based proof for the case of 2 submatrices. For simplicity, let us fix a partition of the rows and permute the columns rather than the other way around.

For each subset $\alpha$ of the rows of a matrix, you can divide the permutations in the Leibniz formula according to the image $\sigma(\alpha)$, hence getting the identity (generalized Laplace expansion) $$ \tag{*} \det A = \sum_{\substack{\beta\subset\{1,2,\dots,n\}\\ |\beta|=|\alpha|}} (-1)^{\text{something}} \det A(\alpha,\beta) \det A(\alpha^c,\beta^c). $$ The value of "something" depends on how one orders the elements of the various sets, and is not important here.
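As a quick numerical sanity check of ($*$) — a sketch, assuming NumPy, and taking one standard choice for "something", namely the sum of the indices in $\alpha$ and $\beta$:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))

alpha = [0, 2]                                  # a fixed row subset
comp_a = [i for i in range(n) if i not in alpha]

total = 0.0
for beta in itertools.combinations(range(n), len(alpha)):
    comp_b = [j for j in range(n) if j not in beta]
    # "something" here: sum of the indices in alpha and beta
    # (0- and 1-based sums have the same parity, since |alpha| = |beta|)
    sign = (-1) ** (sum(alpha) + sum(beta))
    total += sign * np.linalg.det(A[np.ix_(alpha, list(beta))]) \
                  * np.linalg.det(A[np.ix_(comp_a, comp_b)])

assert np.isclose(total, np.linalg.det(A))      # matches the full determinant
```

The assertion passes for any row subset $\alpha$, which pins down the sign convention numerically even though it plays no role in the argument.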

If $\det A$ is nonzero, then there is at least one nonzero summand; hence, for some $\beta$, $\det A(\alpha,\beta)$ and $\det A(\alpha^c,\beta^c)$ are both nonzero. Choose a column permutation that maps $\beta$ to $\{1,2,\dots,|\beta|\}$, and bingo.
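The existence claim can be checked by brute force. A minimal sketch, assuming NumPy (the helper name `find_beta` is made up for illustration), that searches for a $\beta$ making both minors in ($*$) nonzero:

```python
import itertools
import numpy as np

def find_beta(A, alpha, tol=1e-9):
    """Search for a column subset beta such that det A(alpha, beta) and
    det A(alpha^c, beta^c) are both nonzero; by the argument above, one
    exists whenever det A is nonzero."""
    n = A.shape[0]
    comp_a = [i for i in range(n) if i not in alpha]
    for beta in itertools.combinations(range(n), len(alpha)):
        comp_b = [j for j in range(n) if j not in beta]
        if (abs(np.linalg.det(A[np.ix_(list(alpha), list(beta))])) > tol
                and abs(np.linalg.det(A[np.ix_(comp_a, comp_b)])) > tol):
            return beta
    return None

# The anti-identity is nonsingular, but its leading 2x2 block is zero,
# so beta = {0, 1} fails and the search has to move columns around:
A = np.fliplr(np.eye(4))
print(find_beta(A, [0, 1]))  # -> (2, 3)
```

Moving columns 2 and 3 to the front is exactly the "bingo" permutation for this $A$.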

Expanding $A(\alpha,\beta)$ with the same formula inductively, one can obtain a generalization of ($*$) to more than two terms: for each row partition $(\alpha_1,\alpha_2,\dots,\alpha_k)$, $$ \det A = \sum_{\substack{\text{partitions }(\beta_1,\dots,\beta_k)\\ \text{s.t. } |\beta_i| = |\alpha_i|}} (-1)^{\text{something}} \det A(\alpha_1,\beta_1) \det A(\alpha_2,\beta_2) \cdots \det A(\alpha_k,\beta_k), $$ and then the same argument holds.
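Following the inductive argument, the full column permutation can be built block by block: pick $\beta_1$ so that both minors are nonzero, move those columns to the front, and recurse on the nonsingular trailing submatrix $A(\alpha_1^c,\beta_1^c)$. A brute-force sketch, assuming NumPy (function name and interface are hypothetical):

```python
import itertools
import numpy as np

def block_column_order(A, sizes, tol=1e-9):
    """Return a column order such that, after reordering the columns of the
    nonsingular matrix A, every diagonal block (rows partitioned
    consecutively with the given block sizes) is nonsingular."""
    n = A.shape[0]
    assert sum(sizes) == n
    cols = list(range(n))        # column labels not yet assigned
    order = []
    row0 = 0
    for k in sizes[:-1]:
        rows = list(range(row0, row0 + k))
        rest_rows = list(range(row0 + k, n))
        # find beta_i: both the block minor and the trailing minor nonzero
        for beta in itertools.combinations(cols, k):
            rest = [c for c in cols if c not in beta]
            if (abs(np.linalg.det(A[np.ix_(rows, list(beta))])) > tol
                    and abs(np.linalg.det(A[np.ix_(rest_rows, rest)])) > tol):
                order += list(beta)
                cols = rest      # recurse on the trailing submatrix
                break
        row0 += k
    return order + cols          # the last block takes the leftover columns

A = np.fliplr(np.eye(4))         # nonsingular, but the blocks fail as-is
order = block_column_order(A, [2, 2])
B = A[:, order]
assert abs(np.linalg.det(B[:2, :2])) > 1e-9
assert abs(np.linalg.det(B[2:, 2:])) > 1e-9
```

This enumeration is exponential in $n$; it only illustrates existence, not an efficient construction.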
