Given $A, B \in \mathbb{R}^{n \times k}$ and a symmetric positive definite $C \in \mathbb{R}^{n \times n}$, I would like to find an analytical solution for the matrix $X \in \mathbb{R}^{n \times n}$ that minimizes \begin{align} \lVert X A - B \rVert^2_{F} \end{align} subject to the hard constraint $$ C X = X C. $$
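(For concreteness, here is a minimal numerical sketch of the setup, assuming NumPy; the sizes and the random instance are my own choices, not part of the problem.)

```python
import numpy as np

n, k = 6, 4
rng = np.random.default_rng(0)
A = rng.standard_normal((n, k))
B = rng.standard_normal((n, k))
M = rng.standard_normal((n, n))
C = M @ M.T + n * np.eye(n)                      # a random S.P.D. matrix

def objective(X):
    return np.linalg.norm(X @ A - B, "fro")       # ||X A - B||_F

def commutator_norm(X):
    return np.linalg.norm(C @ X - X @ C, "fro")   # ||C X - X C||_F, zero iff C X = X C
```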
Given the eigendecomposition $C = R \Lambda R^T$, from the question *Nearest commuting matrix* it appears that a reasonable projection operator onto the space of matrices commuting with $C$ is $P_C(X) = R \, P_\Lambda(R^T X R) \, R^T$, where $$ [P_\Lambda(X)]_{ij} = \begin{cases} X_{ij}, & \lambda_i = \lambda_j, \\ 0, & \textrm{otherwise.} \end{cases} $$
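(Continuing the sketch above, this projection is straightforward to compute numerically; `commuting_projection` and the tolerance `tol` for deciding when two eigenvalues coincide are names I am introducing here.)

```python
def commuting_projection(X, C, tol=1e-10):
    lam, R = np.linalg.eigh(C)                        # C = R diag(lam) R^T
    Y = R.T @ X @ R                                    # move X into the eigenbasis of C
    mask = np.abs(lam[:, None] - lam[None, :]) < tol   # pattern of lambda_i == lambda_j
    return R @ (Y * mask) @ R.T                        # R P_Lambda(R^T X R) R^T
```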
With this in mind, my guess at a solution would be to apply this projection operator to the minimum-norm solution of the unconstrained least-squares problem, i.e. $$ X = P_C(B A^{\dagger}), $$ with $A^{\dagger}$ denoting the Moore–Penrose pseudoinverse of $A$. Alternatively, the solution $$ X = P_C(B A^T) $$ also seems reasonable (similar to the orthogonal Procrustes solution). That said, I don't know whether either of these is actually the minimizer among matrices commuting with $C$.
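(Again continuing the sketch, the two candidates can at least be compared numerically on a random instance, though such a comparison does not settle which, if either, is the constrained minimizer.)

```python
X1 = commuting_projection(B @ np.linalg.pinv(A), C)   # candidate P_C(B A^+)
X2 = commuting_projection(B @ A.T, C)                  # candidate P_C(B A^T)

for name, X in [("P_C(B A^+)", X1), ("P_C(B A^T)", X2)]:
    print(name, objective(X), commutator_norm(X))
```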
EDIT:
Building on user1551's excellent answer below, the solution for the case where $X$ is additionally constrained to be orthogonal can be expressed concisely as $$ X = R U V^T R^T, $$ where $U$ and $V$ come from the SVD $$ U \Sigma V^T = P_{\Lambda}(R^T B A^T R). $$ Pretty nifty.
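(A sketch of this orthogonal case, under the same assumptions and reusing the variables from the snippets above: take the SVD of $P_\Lambda(R^T B A^T R)$ in the eigenbasis of $C$ and recombine.)

```python
lam, R = np.linalg.eigh(C)                               # C = R diag(lam) R^T
mask = np.abs(lam[:, None] - lam[None, :]) < 1e-10       # lambda_i == lambda_j pattern
U, _, Vt = np.linalg.svd(mask * (R.T @ B @ A.T @ R))     # U Sigma V^T = P_Lambda(R^T B A^T R)
X_orth = R @ U @ Vt @ R.T                                 # X = R U V^T R^T

print(np.linalg.norm(X_orth @ X_orth.T - np.eye(n)))      # orthogonality check
print(commutator_norm(X_orth))                             # commutation check
```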