$\begingroup$

I am looking for ways to obtain the extremal eigenvalues and eigenvectors of the skew-adjacency matrix of a directed graph without diagonalizing it. The graphs I am interested in are neither regular nor bipartite (though they do have a bounded maximum degree), and they may or may not be planar.

  1. Are there any bounds for either of the extremal eigenvalues of the skew-adjacency matrix?
  2. Are there any bounds to the entries of the eigenvectors corresponding to the extremal eigenvalues that I can obtain without diagonalizing the skew-adjacency matrix?
  3. Suppose that I know that the extremal eigenvalues of the skew-adjacency matrix are degenerate. Does this tell me anything useful related to the above questions?
$\endgroup$
  • $\begingroup$ Why not just square your matrix (so that it becomes symmetric) and then use standard software to find the largest eigenvalues of a symmetric matrix? (Or, if your software handles Hermitian matrices, multiply by $i$.) $\endgroup$ Commented Jan 15, 2015 at 15:20
  • $\begingroup$ Thanks for the question. I would like to know the (approximate) eigenvalues & eigenvectors of very large adjacency matrices, beyond what iterative methods can achieve. Ideally, I would like to extrapolate to infinite (but locally finite) graphs. $\endgroup$ Commented Jan 15, 2015 at 17:43
  • $\begingroup$ What is the "skew-adjacency matrix"? $\endgroup$ Commented Jan 15, 2015 at 22:01
  • $\begingroup$ It's like the adjacency matrix of an undirected graph, but with the orientation encoded in the signs of the entries. If $S$ is the skew-adjacency matrix of a directed graph and $i$, $j$ are two adjacent nodes of the graph, then $S_{ij}=+1$ if the arc between $i$ and $j$ points from $i$ to $j$. This means that $S_{ji}=-S_{ij}$, hence the name. $\endgroup$ Commented Jan 16, 2015 at 9:09
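To make that definition concrete, here is a minimal sketch of building $S$ from a list of arcs (my own illustration; the helper name `skew_adjacency` is not from the thread):

```python
import numpy as np

def skew_adjacency(n, arcs):
    """Skew-adjacency matrix of a directed graph on nodes 0..n-1.

    arcs is an iterable of pairs (i, j), each meaning an arc i -> j;
    the corresponding entries are S[i, j] = +1 and S[j, i] = -1.
    """
    S = np.zeros((n, n))
    for i, j in arcs:
        S[i, j] = 1.0
        S[j, i] = -1.0
    return S

# Directed triangle 0 -> 1 -> 2 -> 0
S = skew_adjacency(3, [(0, 1), (1, 2), (2, 0)])
assert np.allclose(S, -S.T)  # skew-symmetry: S_{ji} = -S_{ij}
```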

2 Answers

$\begingroup$
  1. Are there any bounds for either of the extremal eigenvalues of the skew-adjacency matrix?

Yes. In particular, the extremal eigenvalue bounds the "asymmetry of arcs" between large subsets of vertices. See, for example, "Discrepancy Inequalities for Directed Graphs", Discrete Applied Mathematics 176 (2014), pp. 30–42 (the article focuses on Markov chains, but similar results, though not as pretty, can be obtained for the adjacency case).

  2. Is there a way to obtain the eigenvector corresponding to either of the extremal eigenvalues without diagonalizing the skew-adjacency matrix?

Yes. Consider $Z = A - A^T$. Then the eigenvectors corresponding to the extreme eigenvalues of $Z$ are precisely $f \pm ig$, where $f$ and $g$ are the real left and right singular vectors of $Z$. This is a lemma in the paper above. Hence, you can apply the power method to $Z + iI$ to find one of the leading eigenvectors of $Z$; the other extreme eigenvector is its conjugate.
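A minimal numerical sketch of this suggestion (my own illustration, not code from the paper): since $Z$ is real skew-symmetric, its eigenvalues come in purely imaginary pairs $\pm i\mu$, and the shift by $iI$ makes $|i(\mu+1)|$ largest for the top positive $\mu$, so plain power iteration on $Z+iI$ converges to the extremal eigenvector.

```python
import numpy as np

def extremal_eigenpair(Z, iters=500, seed=0):
    """Power iteration on Z + iI for real skew-symmetric Z.

    Returns (lam, v) with Z v ~ lam v, where lam is the eigenvalue
    of Z with the largest positive imaginary part.
    """
    n = Z.shape[0]
    rng = np.random.default_rng(seed)
    # Random complex start: the all-ones vector may lie in ker(Z).
    v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    M = Z + 1j * np.eye(n)
    for _ in range(iters):
        v = M @ v
        v /= np.linalg.norm(v)
    lam = v.conj() @ Z @ v  # Rayleigh quotient for Z itself
    return lam, v

# Example: skew-adjacency matrix of a directed 5-cycle, whose
# extremal eigenvalue is 2*sin(2*pi/5)*i.
n = 5
A = np.zeros((n, n))
for k in range(n):
    A[k, (k + 1) % n] = 1.0   # arc k -> k+1 (mod n)
Z = A - A.T
lam, v = extremal_eigenpair(Z)
```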

  3. Are there any known results that may help with either of the above?

Help is above.

$\endgroup$
  • $\begingroup$ Isn't $Z^TZ=ZZ^T=-Z^2$? If so, then the suggested SVD is equivalent to diagonalization of $Z^2$, right? $\endgroup$ Commented Jan 22, 2015 at 11:13
  • $\begingroup$ Also, the cited paper deals only with strongly connected and aperiodic directed graphs, which is not the general case I'm interested in. $\endgroup$ Commented Jan 22, 2015 at 11:25
  • $\begingroup$ 1. Correct. However, since you only need the largest singular vector, you can apply the power method. In retrospect, it would be best to apply the power method to $Z+iI$ rather than to $Z^T Z$; this will give you the leading eigenvector. Note that it is important to use an appropriate initial vector (in particular, the all-ones vector is in the null space of $Z$). Regarding 2: the paper considers that case only in the context of Markov chains, but the same conditions (should) hold in general, at the cost of a (nasty) renormalization of your adjacency matrix. $\endgroup$ Commented Jan 22, 2015 at 19:43
$\begingroup$

Complexification. Let $A$ be a real $n\times n$ matrix such that $A^T=-A$, and define $B=iA$, $i=\sqrt{-1}$. We have $$ B^*=\overline{B}^T=-iA^T=iA=B, $$ so that $B$ is self-adjoint on $\mathbb C^n$; as a consequence, $$ B=\sum_{1\le j\le r}\lambda_j\mathbb P_j,\quad \lambda_j\in \mathbb R,\ \mathbb P_j \text{ orthogonal projection onto } E_{\lambda_j},\quad \oplus_{1\le j\le r}E_{\lambda_j}=\mathbb C^n, $$ so that $A=\sum_{1\le j\le r}(-i\lambda_j)\mathbb P_j$. This reduces the analysis of a skew-adjoint operator to that of a self-adjoint one.
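A small numerical check of this reduction (my own sketch, assuming NumPy): with $B=iA$, the Hermitian eigensolver `eigh` returns the real $\lambda_j$, and $A$ then has eigenvalues $-i\lambda_j$ with the same eigenvectors.

```python
import numpy as np

# Complexification: for real skew-symmetric A, B = iA is Hermitian,
# so the spectral theorem for self-adjoint matrices applies.
rng = np.random.default_rng(1)
X = rng.standard_normal((6, 6))
A = X - X.T                        # real, A^T = -A
B = 1j * A                         # B^* = B
assert np.allclose(B, B.conj().T)  # B is self-adjoint

lams, V = np.linalg.eigh(B)        # real eigenvalues, unitary V
eigs_A = -1j * lams                # eigenvalues of A are -i*lambda_j
# Check A V = V diag(-i*lambda_j), column by column.
assert np.allclose(A @ V, V @ np.diag(eigs_A))
```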

$\endgroup$
  • $\begingroup$ True, but in this case I have to diagonalize $B$. I am interested in bounding eigenvalues and eigenvectors without any eigendecomposition. Such bounds exist for adjacency matrices (see, for example, arxiv.org/abs/1403.1479) but not for skew-adjacency ones. $\endgroup$ Commented Feb 7, 2015 at 10:41
  • $\begingroup$ How about calculating $\sqrt{B^*B}=\vert B\vert$? $\endgroup$ Commented Feb 7, 2015 at 11:41
