
We call a matrix $M \in \mathbb{R}^{d \times d}$ transitive if it satisfies the following:

For any three vectors $u, v, w \in \mathbb{R}^d$: if $u^T M v > 0$ and $v^T M w > 0$, then $u^T M w > 0$.

EDIT: The case when $d = 1$ is simple. For $d > 1$ the following is true:

a) $M$ must be positive semidefinite: if $M$ is transitive and there exists $u$ such that $u^T M u < 0$, then setting $v = -u$ and $w = u$ gives $u^T M v = -u^T M u > 0$ and $v^T M w = -u^T M u > 0$, so transitivity forces $u^T M w = u^T M u > 0$, a contradiction.
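
A quick numerical illustration of a) (a minimal sketch in NumPy; the indefinite matrix below is just an arbitrary example):

```python
import numpy as np

# An indefinite matrix (not PSD): u^T M u < 0 for u = (0, 1).
M = np.array([[1.0, 0.0],
              [0.0, -1.0]])

u = np.array([0.0, 1.0])  # u^T M u = -1 < 0
v = -u                    # as in the argument above
w = u.copy()

print(u @ M @ v)  #  1.0 > 0
print(v @ M @ w)  #  1.0 > 0
print(u @ M @ w)  # -1.0 < 0: transitivity fails
```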

b) $M \ne I$: if $M = I$, we just need to construct three vectors such that $u^T v > 0$ and $v^T w > 0$ but $u^T w < 0$. Intuitively, this is possible because the sum of two acute angles can be obtuse; a concrete instance follows.
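
A concrete instance of b) in $d = 2$ (a minimal sketch; the angles $0^\circ, 80^\circ, 160^\circ$ are an arbitrary choice):

```python
import numpy as np

# Unit vectors at 0°, 80°, 160°: consecutive angles are acute (80°),
# but the angle between the first and the last is obtuse (160°).
angles = np.deg2rad([0.0, 80.0, 160.0])
u, v, w = (np.array([np.cos(a), np.sin(a)]) for a in angles)

print(u @ v)  # cos(80°)  ≈  0.17 > 0
print(v @ w)  # cos(80°)  ≈  0.17 > 0
print(u @ w)  # cos(160°) ≈ -0.94 < 0: so M = I is not transitive
```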

c) More generally, $M$ cannot be full-rank positive definite, because then the factor $L$ in the Cholesky decomposition $M = LL^T$ would also be full rank, and the problem could be reduced to case b) by a change of basis using $L$, as in the sketch below.
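
The reduction in c) can be checked numerically (a sketch, assuming some arbitrary full-rank positive definite $M$; the counterexample for $M = I$ is pulled back through the Cholesky factor):

```python
import numpy as np

# Any full-rank positive definite M, e.g.:
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
L = np.linalg.cholesky(M)  # M = L L^T with L invertible

# Counterexample triple for M = I (see b) above):
angles = np.deg2rad([0.0, 80.0, 160.0])
x, y, z = (np.array([np.cos(a), np.sin(a)]) for a in angles)

# Change of basis: with u = L^{-T} x, we get u^T M v = x^T y, etc.
Linv_T = np.linalg.inv(L.T)
u, v, w = Linv_T @ x, Linv_T @ y, Linv_T @ z

print(u @ M @ v)  # = x^T y > 0
print(v @ M @ w)  # = y^T z > 0
print(u @ M @ w)  # = x^T z < 0: transitivity fails
```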

Now I am stuck in the space of low-rank positive semidefinite matrices. Is there a way to characterize such matrices further? Can such a matrix even exist, apart from the zero matrix?

EDIT2: The deduction that $M$ is PSD in step a) implicitly assumed that $M$ is symmetric. Keith Kearnes' answer explicitly shows that $M$ must have rank at most 1 in that case. The case when $M$ is not symmetric is open, but I doubt it would be possible to prove anything in that case. I will wait a few days to get more answers. Thanks.

EDIT3: Darij Grinberg's answer proved that every transitive matrix must be symmetric. This completes the characterization.

  • b) isn't quite right for $d=1$. Actually, for $d=1$, any matrix with a non-negative entry is an example. (Jul 31, 2015 at 19:42)
  • Actually, now I think that the rank of such matrices must be 1. Every positive semidefinite matrix has an eigendecomposition, so I can again change my basis and reduce the problem to one where $M$ is diagonal with $k$ positive entries, $1 \le k \le d$. If $k = 1$ then fine; otherwise, if $k > 1$, I can use the counterexample in the $k$-dimensional space (used for b)) to create a counterexample here. – Pushpendre (Jul 31, 2015 at 21:45)
  • @Pushpendre: Be careful; diagonalization might mess with the scalar product. (Jul 31, 2015 at 23:09)

2 Answers


I assume from the wording of the question (positive semidefinite, Cholesky decomposition) that you intend $M$ to be symmetric. Write $M$ as $N^TN$ and let $V \leq \mathbb{R}^d$ be the range of $N$. For $u, v, w \in \mathbb{R}^d$, let $x = Nu$, $y = Nv$, $z = Nw$. Then your condition reduces to $$x^Ty>0 \;\&\; y^Tz>0 \Rightarrow x^Tz>0$$ on $V$. If the dimension of $V$ is 2 or more, then (as you observed in the last sentence of your item b)) it is possible to contradict this. That is, it is possible to find $x, y, z$ in the same plane in $V$ such that the angles between $x$ and $y$ and between $y$ and $z$ are acute, while the angle between $x$ and $z$ is obtuse. Once $x, y, z$ are known, you can solve for $u, v, w$ that contradict the original condition, as in the sketch below.
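
To make this concrete, here is a minimal sketch (assuming an arbitrary rank-2 factor $N$) that builds the violating triple by solving $Nu = x$, $Nv = y$, $Nw = z$:

```python
import numpy as np

# A rank-2 PSD matrix M = N^T N in R^3:
N = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
M = N.T @ N

# x, y, z in the range of N with acute/acute/obtuse pairwise angles:
angles = np.deg2rad([0.0, 80.0, 160.0])
x, y, z = (np.array([np.cos(a), np.sin(a)]) for a in angles)

# Solve N u = x, etc.; least squares picks one solution of each system.
u = np.linalg.lstsq(N, x, rcond=None)[0]
v = np.linalg.lstsq(N, y, rcond=None)[0]
w = np.linalg.lstsq(N, z, rcond=None)[0]

print(u @ M @ v)  # = (Nu)^T (Nv) = x^T y > 0
print(v @ M @ w)  # = y^T z > 0
print(u @ M @ w)  # = x^T z < 0: the original condition is contradicted
```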

On the other hand, if the dimension of $V$ is 1, there is no contradiction. In this case, $N$ is $1\times d$, so $u^TN^T = Nu$ is a real number, $u^TN^TNv = (Nu)(Nv)$, and you can argue that if $(Nu)(Nv) > 0$ and $(Nv)(Nw) > 0$, then the number $(Nu)(Nw)$ has the same sign as $$(Nu)(Nv)^2(Nw) = [(Nu)(Nv)][(Nv)(Nw)] > 0.$$ The condition that the dimension of $V$ is 1 is that $N$ (and $M$) have rank $1$.

So, the answer for symmetric $M$ is: $M$ is positive semidefinite of rank at most $1$.
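
As a sanity check of the rank-1 direction (a random-search sketch; the vector $n$ and the dimension are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# A rank-1 PSD matrix M = n n^T:
n = rng.standard_normal(4)
M = np.outer(n, n)

# A random search finds no violating triple, as the sign argument
# above predicts: u^T M v = (n.u)(n.v), and (n.u)(n.w) has the same
# sign as [(n.u)(n.v)][(n.v)(n.w)] > 0.
for _ in range(100_000):
    u, v, w = rng.standard_normal((3, 4))
    if u @ M @ v > 0 and v @ M @ w > 0:
        assert u @ M @ w > 0
```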


Here is a proof of the fact that any transitive matrix $M\in\mathbb{R} ^{d\times d}$ is symmetric. Together with the argument in the answer by Keith Kearnes, this proves that any transitive matrix $M\in\mathbb{R}^{d\times d}$ is a positive-semidefinite symmetric matrix of rank $\leq1$.

We first prove a lemma: If $p\in\mathbb{R}^{d}\setminus\left\{ 0\right\} $ and $q\in\mathbb{R}^{d}$ are two vectors such that every $u\in\mathbb{R}^{d}$ satisfying $p^{T}u>0$ satisfies $q^{T}u\geq0$, then

(1) there exists a nonnegative real $\lambda$ such that $q=\lambda p$.

Proof of (1). Let $p\in\mathbb{R}^{d}\setminus\left\{ 0\right\} $ and $q\in\mathbb{R}^{d}$ be two vectors such that every $u\in\mathbb{R}^{d}$ satisfying $p^{T}u>0$ satisfies $q^{T}u\geq0$. We have $p\neq0$ and thus $p^{T}p>0$.

If $p$ and $q$ were linearly independent, then there would be a vector $v\in\mathbb{R}^{d}$ satisfying $p^{T}v=1$ and $q^{T}v=-1$; but this would contradict the assumption that every $u\in\mathbb{R}^{d}$ satisfying $p^{T}u>0$ satisfies $q^{T}u\geq0$. Hence, $p$ and $q$ must be linearly dependent. Since $p\neq0$, this shows that there exists a $\lambda \in\mathbb{R}$ such that $q=\lambda p$. It remains to prove that this $\lambda$ is nonnegative. Indeed, recall that every $u\in\mathbb{R}^{d}$ satisfying $p^{T}u>0$ satisfies $q^{T}u\geq0$. Applying this to $u=p$, we get $q^{T}p\geq0$ (since $p^{T}p>0$). Since $q=\lambda p$, this rewrites as $\lambda p^{T}p\geq0$, and thus $\lambda\geq0$ (since $p^{T}p>0$). This finishes the proof of (1).

Another lemma, which is really well-known: If $V$ is a finite-dimensional vector space, and if $\phi$ is a linear endomorphism of $V$ such that $\phi$ sends every vector in $V$ to a scalar multiple of this vector, then

(2) the endomorphism $\phi$ is a scalar multiple of the identity.

(In order to prove (2), fix a basis of $V$ and see what $\phi$ does to the basis vectors and their pairwise sums.)

Now, let $M\in\mathbb{R}^{d\times d}$ be any transitive matrix. We need to prove that $M$ is symmetric.

If $M=0$, then this is obvious. Thus, WLOG assume that $M\neq0$.

Let $w\in\mathbb{R}^{d}$ be any vector such that $M^{T}w\neq0$. Then, $w\neq0$ (since $M^{T}w\neq0$).

Let $u\in\mathbb{R}^{d}$ be any vector such that $\left( M^{T}w\right) ^{T}u>0$. Let us now show that $w^{T}M\left( M^{T}u\right) \geq0$. Indeed, if $M^{T}u=0$, then this is clear. Otherwise, it follows from the transitivity of $M$: we have $w^{T}Mu=\left( M^{T}w\right) ^{T}u>0$ and $u^{T}M\left( M^{T}u\right) =\left( M^{T}u\right) ^{T}\left( M^{T}u\right) >0$ (since $M^{T}u\neq0$), so transitivity yields $w^{T}M\left( M^{T}u\right) >0$. Thus, we have proven that $w^{T}M\left( M^{T}u\right) \geq0$. Hence, $\left( MM^{T}w\right) ^{T}u=w^{T}MM^{T}u=w^{T}M\left( M^{T}u\right) \geq0$.

Let us now forget that we fixed $u$. We thus have shown that every $u\in\mathbb{R}^{d}$ satisfying $\left( M^{T}w\right) ^{T}u>0$ satisfies $\left( MM^{T}w\right) ^{T}u\geq0$. Thus, (1) (applied to $p=M^{T}w$ and $q=MM^{T}w$) yields that

(3) there exists a nonnegative real $\lambda$ such that $MM^{T}w=\lambda M^{T}w$.

Now, let us forget that we fixed $w$. We thus have proven that for every $w\in\mathbb{R}^{d}$ satisfying $M^{T}w\neq0$, we have (3). But (3) also holds for every $w\in\mathbb{R}^{d}$ satisfying $M^{T}w=0$ (since we can use $\lambda=0$). Hence, (3) holds for every $w\in\mathbb{R}^{d}$.

Let $V=M^{T}\left( \mathbb{R}^{d}\right) $ be the image of $M^T$. Then, $M\left( V\right) \subseteq V$ (because (3) holds for every $w\in\mathbb{R}^{d}$). Thus, $M$ restricts to an $\mathbb{R}$-linear endomorphism $\phi$ of $V$. This endomorphism $\phi$ sends every vector in $V$ to a scalar multiple of this vector (again because (3) holds for every $w\in\mathbb{R}^{d}$). Thus, $\phi$ must be a scalar multiple of the identity (due to (2)). In other words, there exists some $\mu\in\mathbb{R}$ such that every $v\in V$ satisfies $\phi\left( v\right) =\mu v$. In other words, there exists some $\mu\in\mathbb{R}$ such that every $w\in\mathbb{R}^{d}$ satisfies $MM^{T}w=\mu M^{T}w$ (by the definitions of $V$ and $\phi$). In other words, there exists some $\mu\in\mathbb{R}$ such that $MM^{T}=\mu M^{T} $. Consider this $\mu$.

We are working over $\mathbb{R}$. Hence, a well-known fact says that $\operatorname*{Ker}\left( MM^{T}\right) =\operatorname*{Ker}\left( M^{T}\right) $. Thus, from $M^{T}\neq0$, we obtain $MM^{T}\neq0$, so that $\mu M^{T}=MM^{T}\neq0$ and therefore $\mu\neq0$. Thus, we can transform $MM^{T}=\mu M^{T}$ into $M^{T}=\dfrac{1}{\mu}MM^{T}$. The matrix $M^{T}$ is thus symmetric (since $MM^{T}$ is symmetric). In other words, the matrix $M$ is symmetric.
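
As a quick consistency check of the final identities against the matrices the characterization allows (a sketch with an arbitrary rank-1 example $M = pp^T$, for which $\mu = p^Tp$):

```python
import numpy as np

# Rank-1 PSD matrix M = p p^T:
p = np.array([1.0, 2.0, 3.0])
M = np.outer(p, p)
mu = p @ p  # the scalar from the proof: M M^T = mu M^T

print(np.allclose(M @ M.T, mu * M.T))  # True
print(np.allclose(M, M.T))             # True: M is symmetric
```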

