
Let $V$ be a finite-dimensional real inner-product space (e.g., the space of $m \times n$ real matrices equipped with the Hilbert-Schmidt inner product $(A,B) \mapsto \mathrm{tr}(AB^\top)$), and let $f:V^2 \to \mathbb R$, $(x,y) \mapsto f(x,y)$ be a continuously differentiable function (say with Lipschitz-continuous gradient, if that helps).

What is a principled strategy for minimizing $f$ on $V^2$ subject to the constraint $x \perp y$, i.e., $\langle x, y\rangle = 0$?

Of course, one could try something like the following:

  • Do a gradient-descent step on $x$ and $y$.
  • Project $y$ onto the orthogonal complement of $x$.

I don't know how rigorous this is or whether it even converges.
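For concreteness, here is a minimal NumPy sketch of the scheme above (the names `project_out` and `alternating_pgd`, and the toy quadratic objective at the end, are mine, purely for illustration). Note that projecting only $y$ onto $x^\perp$ is not the exact Euclidean projection onto the set $\{(x,y) \mid \langle x, y\rangle = 0\}$, which would in general move both variables.

```python
import numpy as np

def project_out(x, y, eps=1e-12):
    """Project y onto the orthogonal complement of x. np.vdot flattens
    its arguments, so for matrices this is the Hilbert-Schmidt inner
    product tr(A B^T)."""
    x2 = np.vdot(x, x)
    if x2 < eps:  # x is (numerically) zero: every y is feasible
        return y
    return y - (np.vdot(x, y) / x2) * x

def alternating_pgd(grad_f, x0, y0, lr=1e-2, n_iters=1000):
    """The scheme from the question: one gradient step on (x, y),
    then restore feasibility by projecting y onto span(x)^perp.
    grad_f(x, y) must return the pair (df/dx, df/dy)."""
    x, y = x0.copy(), y0.copy()
    for _ in range(n_iters):
        gx, gy = grad_f(x, y)
        x, y = x - lr * gx, y - lr * gy  # unconstrained gradient step
        y = project_out(x, y)            # enforce <x, y> = 0
    return x, y

# Toy check (hypothetical objective): f(x, y) = ||x - a||^2 + ||y - b||^2
rng = np.random.default_rng(0)
a, b = rng.standard_normal(5), rng.standard_normal(5)
x, y = alternating_pgd(lambda x, y: (2 * (x - a), 2 * (y - b)), a, b)
print(np.vdot(x, y))  # ~ 0 up to floating-point error
```

Since the projection is the last operation of each iteration, the returned pair satisfies the constraint exactly (up to floating-point error), which is the behaviour I want.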

  • I guess, projecting onto the set $\{(x,y)\mid x \perp y\}$ should be doable, so you could do projected gradient descent, if you want to. But how about solving the problem by standard Lagrange multipliers? Depending on $f$, this might work directly.
    – Dirk
    May 25, 2021 at 17:42
    @Dirk Thanks for the input. This is reassuring, because the scheme I wrote above is pretty much projected gradient descent on the set $\{(x,y) \mid \langle x, y\rangle = 0\}$. Do you have a reference for (even local) convergence of projected gradient descent? Also, I'd rather not use Lagrangian methods, because I want the constraint to be satisfied exactly.
    – dohmatob
    May 25, 2021 at 19:06
  • $\begingroup$ "projected gradient descent non-convex" brings up several recent references. And with "Lagrange multipliers" I meant the standard approach to formulate optimality systems for constraint optimization problems. This is used to solve problems exactly. What you have in mind seems to be penalization approaches. $\endgroup$
    – Dirk
    May 26, 2021 at 6:58
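For reference, here is how the Lagrange-multiplier system mentioned above works out for this problem (a sketch, writing the constraint as $c(x,y) := \langle x, y\rangle = 0$, so that $\nabla_x c = y$ and $\nabla_y c = x$). Provided $(x,y) \neq (0,0)$, so that $\nabla c = (y, x)$ does not vanish, any constrained local minimizer satisfies
$$\nabla_x f(x,y) + \lambda\, y = 0, \qquad \nabla_y f(x,y) + \lambda\, x = 0, \qquad \langle x, y\rangle = 0$$
for some multiplier $\lambda \in \mathbb R$.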
