I have the following minimization problem:
$$ \arg \min_{z} \max_{U} \left\{ \frac{1}{2} {\left\| b - z \right\|}_{2}^{2} + \frac{\lambda}{2} \sum_{i = 1}^{n} {\left\| {U}^{T} {z}_{i} \right\|}_{2}^{2} \right\} \; \text{s. t.} \; {U}^{T} U = I $$
where $ {z}_{i} \in {\mathbb{R}}^{d} $, $ z = {\left[ {z}_{1}^{T}, {z}_{2}^{T}, \ldots, {z}_{n}^{T} \right]}^{T} \in {\mathbb{R}}^{nd} $ and $ U \in {\mathbb{R}}^{d \times k}, \; k < n $.
For fixed $ U $ this is a simple Least Squares problem with respect to $ z $.
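For concreteness, here is a minimal NumPy sketch of how I believe the $ z $ step looks, assuming $ b $ is partitioned into blocks $ {b}_{i} \in {\mathbb{R}}^{d} $ the same way as $ z $ (so the problem separates per block) and using $ {U}^{T} U = I $ to invert $ I + \lambda U {U}^{T} $ in closed form (the names `z_update`, `b_blocks` are mine, not part of any library):

```python
import numpy as np

def z_update(b_blocks, U, lam):
    """Minimize 0.5 * ||b - z||^2 + 0.5 * lam * sum_i ||U^T z_i||^2 for fixed U.

    The problem separates over the blocks: z_i = (I + lam * U U^T)^{-1} b_i.
    Since U^T U = I, the Woodbury identity gives
    (I + lam * U U^T)^{-1} = I - (lam / (1 + lam)) * U U^T.
    b_blocks has shape (n, d) with b_i as rows; U has shape (d, k).
    """
    proj = b_blocks @ U @ U.T                    # row i holds (U U^T b_i)^T
    return b_blocks - (lam / (1.0 + lam)) * proj
```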
For fixed $ z $ this is a PCA / Eigenvector / SVD problem with respect to $ U $ (the data is given by the set of $ {z}_{i} $).
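Similarly, if I'm not mistaken, maximizing $ \sum_{i = 1}^{n} {\left\| {U}^{T} {z}_{i} \right\|}_{2}^{2} = \operatorname{tr} \left( {U}^{T} \left( \sum_{i = 1}^{n} {z}_{i} {z}_{i}^{T} \right) U \right) $ subject to $ {U}^{T} U = I $ is attained by the top $ k $ eigenvectors of $ \sum_{i} {z}_{i} {z}_{i}^{T} $, i.e. the top $ k $ left singular vectors of $ \left[ {z}_{1}, {z}_{2}, \ldots, {z}_{n} \right] \in {\mathbb{R}}^{d \times n} $, so the $ U $ step would look like (again, my own sketch):

```python
import numpy as np

def U_update(z_blocks, k):
    """Maximize sum_i ||U^T z_i||^2 subject to U^T U = I for fixed z.

    The maximizer consists of the top-k eigenvectors of C = sum_i z_i z_i^T,
    i.e. the top-k left singular vectors of the d-by-n matrix [z_1, ..., z_n].
    z_blocks has shape (n, d) with z_i as rows; the result has shape (d, k).
    """
    Uleft, _, _ = np.linalg.svd(z_blocks.T, full_matrices=False)
    return Uleft[:, :k]
```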
The questions are:
- How could one solve this problem? Is there a unique minimizer at all?
- Is solving it iteratively (alternating between holding $ U $ and $ z $ fixed), as described above, guaranteed to converge to anything? Can it be shown to fail?
Thank You.
P. S.
I think this is not trivial, since the inner maximization over $ U $ is not concave and the constraint $ {U}^{T} U = I $ is not convex. Hence this is not a convex optimization problem, and the alternation is not guaranteed to produce anything.
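For what it's worth, this is the alternating scheme I have in mind (using the sketched `z_update` / `U_update` from above); it only lets me probe the behaviour numerically and is of course not a proof of convergence:

```python
import numpy as np

def alternate(b_blocks, k, lam, n_iter=100):
    """Alternate the z-step (inner min) and the U-step (inner max),
    printing the objective to see whether the iterates settle."""
    n, d = b_blocks.shape
    U = np.linalg.qr(np.random.randn(d, k))[0]   # random orthonormal start
    for _ in range(n_iter):
        z = z_update(b_blocks, U, lam)           # min over z for fixed U
        U = U_update(z, k)                       # max over U for fixed z
        obj = (0.5 * np.sum((b_blocks - z) ** 2)
               + 0.5 * lam * np.sum((z @ U) ** 2))
        print(obj)
    return z, U
```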