
I have two $2N\times 2N$ matrices, defined by blocks:

$$ A = \begin{bmatrix} a & 0 \\ 0 & 0 \end{bmatrix} $$ $$ B = \begin{bmatrix} 0 & 0 \\ 0 & b \end{bmatrix} $$

where $a$ and $b$ are $N\times N$ matrices. I assume that $A\ne B$, i.e. that $a$ and $b$ are not both zero.

Then I have a coordinate transformation defined by a generic invertible matrix $T$. Through this transformation $A$ becomes $A'$ and $B$ becomes $B'$:

$$ A' = T^{-1} A T $$ $$ B' = T^{-1} B T $$

Of course, it is not possible that $A'=B'$ (proof: conjugation by $T$ is invertible, so this would imply $A=B$).

My question is instead whether it is possible for the rows of $A'$ and $B'$ to be equal except for one (say the last):

$$ \left[A'\right]_{i,j} = \left[B'\right]_{i,j} \tag{1} $$

for $i=1\dots 2N-1$ and $j=1\dots 2N$ (here $i$ indexes the row and $j$ the column).

I guess that this cannot happen unless one of $A$ and $B$ is trivially zero, but I could not prove it, although it does not seem too difficult.
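For concreteness, here is a minimal NumPy sketch that tests condition (1) for randomly generated $a$, $b$, and $T$ (all values here are hypothetical test data, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 3

# Random nonzero N x N blocks a and b (hypothetical test data)
a = rng.standard_normal((N, N))
b = rng.standard_normal((N, N))

# Assemble A and B from the blocks
A = np.zeros((2 * N, 2 * N)); A[:N, :N] = a
B = np.zeros((2 * N, 2 * N)); B[N:, N:] = b

# A random matrix is invertible with probability 1
T = rng.standard_normal((2 * N, 2 * N))

Ap = np.linalg.solve(T, A @ T)  # A' = T^{-1} A T
Bp = np.linalg.solve(T, B @ T)  # B' = T^{-1} B T

# Condition (1): all rows except the last agree
print(np.allclose(Ap[:-1], Bp[:-1]))  # False for generic a, b, T
```

For random matrices this prints `False`; the question is whether *any* choice of $a$, $b$, $T$ can make it `True`.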

  • This is clearly possible if $a = 0$ and $b$ is non-zero with just one non-zero row, for example. And this is basically the only time it is possible: your condition implies $\operatorname{rank}(A^\prime-B^\prime) \leq 1$, so $\operatorname{rank}(A-B) \leq 1$, so one of $a$ or $b$ is $0$ and the other has rank $0$ or $1$. Commented May 13, 2019 at 10:47
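To make this comment concrete, here is a sketch (my own construction, assuming $a = 0$ and $b$ of rank one) that chooses $T$ so that condition (1) actually holds: setting the last column of $T$ to $(0, u)$ forces $T^{-1}$ to map $(0, u)$ to $e_{2N}$, confining $B'$ to the last row.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 2

# a = 0 and b of rank one, as in the comment
u = rng.standard_normal(N)
v = rng.standard_normal(N)
b = np.outer(u, v)

A = np.zeros((2 * N, 2 * N))             # a = 0 makes A the zero matrix
B = np.zeros((2 * N, 2 * N)); B[N:, N:] = b

# Choose T with last column (0, u): then T^{-1} (0, u) = e_{2N},
# so B' = T^{-1} B T = e_{2N} (v-row) is supported in the last row only.
T = rng.standard_normal((2 * N, 2 * N))
T[:, -1] = np.concatenate([np.zeros(N), u])
assert abs(np.linalg.det(T)) > 1e-12     # invertible (holds generically)

Ap = np.linalg.solve(T, A @ T)           # A' = 0
Bp = np.linalg.solve(T, B @ T)

print(np.allclose(Ap[:-1], Bp[:-1]))     # True: condition (1) holds
print(np.allclose(Ap, Bp))               # False: the last rows differ
```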

1 Answer


Let $S = T^{-1} = \begin{bmatrix} S_{00} & S_{01} \\ S_{10} & S_{11} \end{bmatrix}$. If all rows but the last of the matrices $A'$ and $B'$ are equal, then the same is true for $SA = A'T^{-1}$ and $SB = B'T^{-1}$ (right multiplication mixes columns but not rows). Computing the products \begin{align*} SA &= \begin{bmatrix} S_{00} a & 0 \\ S_{10} a & 0 \end{bmatrix}, \\ SB &= \begin{bmatrix} 0 & S_{01} b \\ 0 & S_{11} b \end{bmatrix}, \end{align*} you can see that all rows but the last are equal only when $S_{00} a = 0 = S_{01} b$ and the first $N-1$ rows of both $S_{10}a$ and $S_{11} b$ are zero.

Now suppose neither $a$ nor $b$ is identically zero, and pick vectors $\alpha$ and $\beta$ such that $a\alpha \ne 0 \ne b\beta$. By the above, the two vectors $S [\begin{smallmatrix} a\alpha \\ 0 \end{smallmatrix}]$ and $S [\begin{smallmatrix} 0 \\ b\beta \end{smallmatrix}]$ have non-zero components only in the very last row and hence are linearly dependent. But their preimages $[\begin{smallmatrix} a\alpha \\ 0 \end{smallmatrix}]$ and $[\begin{smallmatrix} 0 \\ b\beta \end{smallmatrix}]$ are linearly independent, so $S$ is not invertible, contrary to what was assumed.

Thus, for all but the last row of $SA$ and $SB$ to be equal, at least one of $a$ and $b$ must be identically zero (in which case the situation is obvious).
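As a quick numerical sanity check of this conclusion (a sketch on random matrices, not part of the proof), one can verify the rank identity behind the comment's argument, $\operatorname{rank}(A'-B') = \operatorname{rank}(a) + \operatorname{rank}(b)$, which exceeds $1$ whenever both blocks are nonzero, while condition (1) would force $\operatorname{rank}(A'-B') \leq 1$:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 3

a = rng.standard_normal((N, N))   # both blocks nonzero
b = rng.standard_normal((N, N))

A = np.zeros((2 * N, 2 * N)); A[:N, :N] = a
B = np.zeros((2 * N, 2 * N)); B[N:, N:] = b
T = rng.standard_normal((2 * N, 2 * N))   # generically invertible

D = np.linalg.solve(T, (A - B) @ T)       # D = A' - B'

# Similarity preserves rank, and A - B is block-diagonal, so
# rank(A' - B') = rank(a) + rank(b) >= 2 when a, b are both nonzero.
r = np.linalg.matrix_rank(D)
print(r == np.linalg.matrix_rank(a) + np.linalg.matrix_rank(b))  # True
```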

  • Taking inspiration from the comment of @NathanielJohnston, maybe it is possible to simplify the proof by removing the initial block calculation. Commented May 13, 2019 at 12:50
  • @DorianoBrogioli, I agree that it's essentially the same argument. The block calculation helped me see how to find the kernel of the $S$ matrix; that is all. Commented May 13, 2019 at 18:05
