
In the following matrix equation, all coefficients $a_{ij}>0$, all $a_i>0$, and the column sums of the matrix $A$ are all zero
(e.g. $-a_{11}+a_{21}+a_{31}=0$, etc.). This means that $A$ is singular: $|A| = 0$.

$$\begin{pmatrix} -a_{11} & a_{12} & a_{13} \\ a_{21} & -a_{22} & a_{23} \\ a_{31} & a_{32} & -a_{33} \\ \end{pmatrix} \begin{pmatrix} x_1\\ x_2\\ x_3\\ \end{pmatrix}= \begin{pmatrix} -a_1\\ -a_2\\ -a_3\\\end{pmatrix} $$

Conjecture:
If you now change the matrix $A$ by subtracting a positive value $d_i$ from each main-diagonal entry (all $d_i>0$), the changed matrix equation always has a unique solution, and that solution is positive (i.e. all $x_i>0$).

$$\begin{pmatrix} -a_{11}-d_1 & a_{12} & a_{13} \\ a_{21} & -a_{22}-d_2 & a_{23} \\ a_{31} & a_{32} & -a_{33}-d_3 \\ \end{pmatrix} \begin{pmatrix} x_1\\ x_2\\ x_3\\ \end{pmatrix}= \begin{pmatrix} -a_1\\ -a_2\\ -a_3\\\end{pmatrix} $$
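A quick numerical sketch of the conjecture (assuming NumPy; random instances chosen here for illustration, not a proof): build a matrix with positive off-diagonal entries and zero column sums, subtract positive $d_i$ on the diagonal, and solve.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Off-diagonal entries a_ij > 0; diagonal entries -a_ii chosen so that
# every column of A sums to zero.
A = rng.uniform(0.1, 1.0, size=(n, n))
np.fill_diagonal(A, 0.0)
np.fill_diagonal(A, -A.sum(axis=0))  # column sums of A are now zero

d = rng.uniform(0.1, 1.0, size=n)  # positive diagonal shifts d_i
a = rng.uniform(0.1, 1.0, size=n)  # positive right-hand side a_i

# Solve (A - D) x = -a for the modified matrix A - diag(d).
x = np.linalg.solve(A - np.diag(d), -a)
print(np.all(x > 0))  # the conjecture predicts True for every draw
```

Repeating this with fresh random draws (and larger $n$) never produces a non-positive component, consistent with the conjecture.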

Question 1:
Is this conjecture correct, and is it stated somewhere (e.g. in the literature or on the Internet) as a lemma?

Question 2:
Can the conjecture also be generalized to $n\times n$ matrices ($n>3$)?

Comment (Sep 27, 2024): It looks like your hypotheses can never hold. If you multiply both sides of your matrix equation on the left by the row vector $(1,\ 1,\ 1)$ and use associativity, then you find that $-a_1-a_2-a_3=0$, which contradicts all $a_i>0$.

1 Answer


Yes, this is true for all dimensions. It follows from the theory of M-matrices. The property that the columns sum to zero is enough to prove that $-A$ is a (singular) M-matrix, and then adding a positive diagonal matrix $D$ to it produces a (nonsingular) M-matrix. The inverse of a nonsingular M-matrix has non-negative entries; moreover, since all off-diagonal entries of $A$ are strictly positive, the matrix is irreducible, so the inverse is in fact entrywise positive, which gives $x_i>0$ rather than just $x_i\ge 0$.
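The argument above can be sketched numerically (a small check with NumPy on a random instance, not part of the original answer): form $-A$ as a Z-matrix with zero column sums, add a positive diagonal $D$, and verify the inverse is entrywise nonnegative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5  # the argument works in any dimension

# Build -A: nonpositive off-diagonal entries (a Z-matrix) with the
# positive diagonal chosen so that every column sums to zero, i.e. a
# singular M-matrix.
negA = -rng.uniform(0.1, 1.0, size=(n, n))
np.fill_diagonal(negA, 0.0)
np.fill_diagonal(negA, -negA.sum(axis=0))  # column sums are now zero

# Adding a positive diagonal D yields a nonsingular M-matrix.
D = np.diag(rng.uniform(0.1, 1.0, size=n))
B = negA + D

inv = np.linalg.inv(B)
print(np.all(inv >= 0))  # M-matrix theory predicts True
```

Since $(A - D)x = -a$ rewrites as $x = (-(A-D))^{-1} a = B^{-1} a$, a nonnegative $B^{-1}$ and positive $a$ give $x \ge 0$ entrywise (and $x > 0$ when $B^{-1}$ is strictly positive, as happens for irreducible $B$).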

A standard book on that topic is *Nonnegative Matrices in the Mathematical Sciences* by Berman and Plemmons (SIAM, 1994).

