
When $n=2m$, consider the following vectors $\mathbf{v}_1,\ldots, \mathbf{v}_n$ in $\mathbb{R}^n$:
$$\mathbf{v}_q=(v_{1,q},\ldots,v_{n,q}),\qquad v_{p,q}=\sin\Big(\frac{pq}{n+1}\pi\Big).$$
It is known that $\mathbf{v}_1,\ldots, \mathbf{v}_n$ are mutually orthogonal.
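This orthogonality is easy to confirm numerically. The following sketch (Python with numpy, which is assumed available) forms the matrix $V$ whose columns are $\mathbf{v}_1,\ldots,\mathbf{v}_n$ and checks that the Gram matrix $V^\top V$ is diagonal, each column having squared norm $(n+1)/2$:

```python
import numpy as np

n = 6  # any even n = 2m; here m = 3
p = np.arange(1, n + 1)
# V[:, q-1] is the vector v_q, with entries sin(p*q*pi/(n+1)) for p = 1..n
V = np.sin(np.outer(p, p) * np.pi / (n + 1))

G = V.T @ V  # Gram matrix of v_1, ..., v_n
# Off-diagonal entries vanish (mutual orthogonality), and each v_q
# has squared norm (n+1)/2.
assert np.allclose(G, (n + 1) / 2 * np.eye(n))
print("orthogonality verified")
```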

Suppose that $\sigma$ is a permutation of the set $\{1,\dots,n\}$, and define $$\mathbf{w}_j=\mathbf{v}_{\sigma(j)}+\mathbf{e}_{\sigma(j)},$$ where $\mathbf{e}_j$ is the $j$th standard basis vector in $\mathbb{R}^n$, with all entries equal to zero except for the $j$th component, which is $1$.

Q. Can we conclude that $\mathbf{w}_1,\ldots, \mathbf{w}_m$ form a linearly independent set in $\mathbb{R}^n$?
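Before seeking a proof, one can probe the question numerically. The sketch below (Python/numpy, assumed available) draws random permutations for several values of $m$ and checks that $\mathbf{w}_1,\ldots,\mathbf{w}_m$ always have rank $m$:

```python
import numpy as np

# Numerical probe: for several m, draw random permutations sigma and check
# that the m vectors w_j = v_{sigma(j)} + e_{sigma(j)} have full rank m.
rng = np.random.default_rng(0)
for m in (1, 2, 3, 5, 8):
    n = 2 * m
    p = np.arange(1, n + 1)
    V = np.sin(np.outer(p, p) * np.pi / (n + 1))  # columns are v_1, ..., v_n
    for _ in range(100):
        sigma = rng.permutation(n)                # random permutation (0-based)
        W = V[:, sigma[:m]] + np.eye(n)[:, sigma[:m]]  # columns w_1, ..., w_m
        assert np.linalg.matrix_rank(W) == m      # linearly independent
print("all random cases passed")
```

No counterexample turns up, which is consistent with an affirmative answer.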

P.S. Although this question deals with the given set of orthogonal real vectors $\mathbf{v}_1,\ldots,\mathbf{v}_n$, it can likely be generalized to a broader class of orthogonal vectors.

  • When examining $\mathbf{w}_j$, it should be noted that we are concentrating on $m$ arbitrary vectors $\mathbf{v}_j+\mathbf{e}_j$, which may not necessarily correspond to the first $m$ vectors. To clarify this point, a permutation is applied. Commented Oct 22, 2023 at 17:11

1 Answer


$\newcommand\v{\mathbf v}\newcommand\w{\mathbf w}$Let us show that the vectors $\w_1,\dots,\w_n$ are linearly independent. Of course, then the first $m$ of these $n$ vectors are linearly independent.

Without loss of generality, the permutation $\sigma$ is the identity permutation, because changing the order of vectors does not affect their linear independence (or the lack thereof).

The Gram matrix of $\v_1,\dots,\v_n$ is $\frac{n+1}2\,I$, where $I$ is the identity matrix, since each $\v_q$ has squared norm $\sum_{p=1}^n\sin^2\frac{pq\pi}{n+1}=\frac{n+1}2$. So, the matrix $V$ with columns $\v_1,\dots,\v_n$ is of the form $cQ$, where $c:=\sqrt{\frac{n+1}2}$ and $Q$ is an orthogonal matrix.

So, the Gram matrix of $\w_1,\dots,\w_n$ is $G:=(I+cQ)^\top(I+cQ)=(1+c^2)I+c(Q^\top+Q)$. For any $n\times 1$ matrix $x$ with $x^\top x=1$, the Cauchy--Schwarz inequality gives $|x^\top Qx|\le\|x\|\,\|Qx\|=1$, so $$x^\top Gx=1+c^2+2c\,x^\top Qx\ge1+c^2-2c=(c-1)^2>0,$$ since $c=\sqrt{\frac{n+1}2}>1$ for all even $n\ge2$. So, $G$ is positive definite and hence nonsingular, and hence $\w_1,\dots,\w_n$ are linearly independent.
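This lower bound can be checked numerically. In the sketch below (Python/numpy, assumed available), $c$ is computed as the common norm of the $\v_j$ rather than taken from a formula; the smallest eigenvalue of $G$ is then compared against $(c-1)^2$:

```python
import numpy as np

# Check the lower bound: the smallest eigenvalue of the Gram matrix
# G = (I + cQ)^T (I + cQ) is at least (c - 1)^2 > 0 for every even n >= 2.
for m in range(1, 9):
    n = 2 * m
    p = np.arange(1, n + 1)
    V = np.sin(np.outer(p, p) * np.pi / (n + 1))
    c = np.linalg.norm(V[:, 0])        # common norm of the v_j
    Q = V / c                          # orthogonal (indeed symmetric) matrix
    G = (np.eye(n) + c * Q).T @ (np.eye(n) + c * Q)
    lam_min = np.linalg.eigvalsh(G).min()
    assert c > 1                       # so the bound is strictly positive
    assert lam_min >= (c - 1) ** 2 - 1e-9
print("Gram matrices are positive definite")
```

In fact, since $Q$ here is symmetric with $Q^2=I$, its eigenvalues are $\pm1$, so the eigenvalues of $G$ are exactly $(1\pm c)^2$ and the bound is attained.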

(For $n=2$ one can also check directly: the determinant of the matrix with columns $\w_1,\w_2$ is $-1/2\ne0$, so $\w_1,\w_2$ are linearly independent.)
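The $n=2$ determinant is easy to confirm (a small numpy sketch):

```python
import numpy as np

# n = 2: columns of V are v_1, v_2; W has columns w_1 = v_1 + e_1, w_2 = v_2 + e_2
p = np.arange(1, 3)
V = np.sin(np.outer(p, p) * np.pi / 3)
W = V + np.eye(2)
assert np.isclose(np.linalg.det(W), -0.5)
print("det(W) = -1/2 confirmed")
```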

Thus, $\w_1,\dots,\w_n$ are linearly independent for any even $n$. $\quad\Box$

  • Suppose instead that the vectors $\mathbf{v}_j$ are orthonormal (which can be arranged by rescaling). Then $c=1$ and so $1+c^2-2c=0$. Commented Oct 23, 2023 at 8:12
  • Indeed, in that case, although the vectors $\mathbf{w}_1, \ldots, \mathbf{w}_n$ are never linearly independent, it seems that $\mathbf{w}_1, \ldots, \mathbf{w}_m$ are linearly independent. Commented Oct 23, 2023 at 8:15
  • @ABB : Your $\mathbf{v}_j$'s are orthogonal, but $c=\sqrt{(n+1)/2}>1$. Perhaps you now want to rescale the $\mathbf{v}_j$'s to make them orthonormal, but that would be another problem, right? Commented Oct 23, 2023 at 13:25
  • The matrix $S$, whose columns are $\mathbf{v}_1, \ldots, \mathbf{v}_n$, is a symmetric matrix satisfying the equation $S^2 = c^2 I$ for some positive scalar $c$. This implies that it has only the two eigenvalues $\pm c$. On the other hand, one may observe that all $n$ vectors $\mathbf{v}_j + c\,\mathbf{e}_j$ are eigenvectors corresponding to the eigenvalue $c$, since $S(\mathbf{v}_j + c\,\mathbf{e}_j)=c^2\mathbf{e}_j+c\,\mathbf{v}_j=c(\mathbf{v}_j + c\,\mathbf{e}_j)$. As $S\ne cI$, the eigenvalue $-c$ also occurs, so the eigenspace for $c$ has dimension less than $n$ and these vectors cannot be linearly independent. Commented Oct 23, 2023 at 14:29
  • You are right; it could be considered as another problem, and your argument works in this case. Commented Oct 23, 2023 at 14:33
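The identity $S^2=c^2I$ discussed in the comments, and the fact that each $\mathbf{v}_j+c\,\mathbf{e}_j$ is an eigenvector of $S$ for the eigenvalue $c$, can be verified numerically (Python/numpy sketch):

```python
import numpy as np

# S is the symmetric matrix with columns v_1, ..., v_n; then S @ S = c^2 * I,
# and each u_j = v_j + c*e_j satisfies S @ u_j = c * u_j.
n = 6
p = np.arange(1, n + 1)
S = np.sin(np.outer(p, p) * np.pi / (n + 1))
c = np.sqrt((n + 1) / 2)
assert np.allclose(S @ S, c**2 * np.eye(n))
for j in range(n):
    u = S[:, j] + c * np.eye(n)[:, j]
    assert np.allclose(S @ u, c * u)   # eigenvector for eigenvalue c
print("spectral claims verified")
```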
