I think I nailed it.
We'll first reduce to the case $k=1$.
Consider an $n\times n$ matrix $A$ of determinant $1$. I'll show how to make its first row and first column equal $e_{1}$.
1st case: $a_{1,1}=0$.
• If $a_{1,j}=0$ for all $j>k$, then, since $\det A=1$ forces the first row to be nonzero and $a_{1,1}=0$ in this case, we must have $a_{1,j}\neq 0$ for some $1<j\le k$. Add the $j$'th column to the $(k+1)$'th. Now $a_{1,j}\neq 0$ for some $j>k$ (namely $j=k+1$).
• By subtracting appropriate multiples of that $j$'th column (with $j>k$ and $a_{1,j}\neq 0$) from the first $k$ columns, I can guarantee $a_{1,1}=1$ and $a_{1,2}=a_{1,3}=\cdots=a_{1,k}=0$.
• By subtracting appropriate multiples of the 1st column from the last $n-k$ columns, I can guarantee $a_{1,k+1}=\cdots=a_{1,n}=0$.
2nd case: $a_{1,1}\neq0$. I can subtract appropriate multiples of the 1st column from the last $n-k$ columns and make $a_{1,k+1},\cdots,a_{1,n-1}$ all equal $0$ and $a_{1,n}=1$. Now, by subtracting appropriate multiples of the last column from columns $2$ through $k$, I can make $a_{1,2},\cdots,a_{1,k}$ all equal $0$. Next, subtracting $a_{1,1}-1$ times the last column from the 1st column makes $a_{1,1}=1$ (this is where $a_{1,n}=1$ is used). Finally, it remains to make $a_{1,n}$ equal $0$, which is done by subtracting the 1st column from the last.
So the first row is now $e_{1}$. To make the first column $e_{1}$ too, just subtract appropriate multiples of the 1st row from the rest of the rows; this doesn't disturb the first row.
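As a quick illustration (this is only my own sketch, not part of the argument), here is the $k=1$ version of this reduction in Python/SymPy. The helpers `add_col`, `add_row` and `reduce_first_row_and_col` are names I made up; every step adds a multiple of the 1st column to another column, or of another column to the 1st (and likewise for rows), i.e. it only multiplies by matrices of the form $X_{1,j}^{\lambda}$ or $X_{j,1}^{\lambda}$. The bookkeeping for columns $2,\dots,k$ in the general case is omitted.

```python
import sympy as sp

def add_col(A, src, dst, lam):
    # Right-multiply A by X_{src+1, dst+1}^lam: add lam * (column src) to column dst.
    A[:, dst] = A[:, dst] + lam * A[:, src]

def add_row(A, src, dst, lam):
    # Left-multiply A by X_{dst+1, src+1}^lam: add lam * (row src) to row dst.
    A[dst, :] = A[dst, :] + lam * A[src, :]

def reduce_first_row_and_col(A):
    """Bring the first row and column of a determinant-1 matrix to e_1 (the k = 1 case)."""
    n = A.shape[0]
    if A[0, 0] == 0:
        # 1st case: det A = 1, so some a_{1,j} with j > 1 is nonzero.
        j = next(j for j in range(1, n) if A[0, j] != 0)
        add_col(A, j, 0, (1 - A[0, 0]) / A[0, j])        # make a_{1,1} = 1
    else:
        # 2nd case: set a_{1,n} = 1 using column 1, then a_{1,1} = 1 using column n.
        add_col(A, 0, n - 1, (1 - A[0, n - 1]) / A[0, 0])
        add_col(A, n - 1, 0, 1 - A[0, 0])
    for j in range(1, n):                                # clear the rest of the first row
        add_col(A, 0, j, -A[0, j])
    for i in range(1, n):                                # clear the rest of the first column
        add_row(A, 0, i, -A[i, 0])

A = sp.Matrix([[0, 1, 0], [-1, 0, 0], [2, 3, 1]])        # det = 1 and a_{1,1} = 0
assert A.det() == 1
reduce_first_row_and_col(A)
assert A[0, :] == sp.Matrix([[1, 0, 0]]) and A[:, 0] == sp.Matrix([1, 0, 0])
assert A.det() == 1                                      # column/row additions preserve det
```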
Now ignore the 1st row and 1st column and continue in the same fashion: eventually the first $k$ rows become $e_{1},\cdots,e_{k}$, and similarly for the first $k$ columns. Ignoring the first $k-1$ rows and first $k-1$ columns, what's left is precisely the situation of the case $k=1$, so the problem is indeed reduced to that case.
The case $k=1$:
So we'll now assume $k=1$, and that I've done the operations I've described above. This means I have a matrix of the form
$
\begin{pmatrix}1 & 0 & \cdots & 0\\
0 & * & \cdots & *\\
\vdots & \vdots & \ddots & \vdots\\
0 & * & \cdots & *
\end{pmatrix}=\begin{pmatrix}1 & \vec{0}\\
\vec{0} & A
\end{pmatrix}
$
where $A$ is an $(n-1)\times(n-1)$ matrix of determinant $1$. I need to show that I can generate this matrix using only $X_{1,j}^{\lambda}$ and $X_{j,1}^{\lambda}$. Note that:
$\begin{pmatrix}1 & \vec{0}\\
\vec{0} & A
\end{pmatrix}\begin{pmatrix}1 & \vec{0}\\
\vec{0} & B
\end{pmatrix}=\begin{pmatrix}1 & \vec{0}\\
\vec{0} & AB
\end{pmatrix}
$
Since $SL_{n-1}(\mathbb{R})$ is generated by the $X_{i,j}^{\lambda}$ (see the link in the description of the problem), it is enough to show how to express the corresponding block matrices using $X_{1,j}^{\lambda}$ and $X_{j,1}^{\lambda}$ only:
$\begin{pmatrix}1 & \vec{0}\\
\vec{0} & X_{i,j}^{\lambda}
\end{pmatrix}=X_{1,j+1}^{-\lambda}X_{i+1,1}^{1}X_{1,j+1}^{\lambda}X_{i+1,1}^{-1}$
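As a sanity check (purely a verification sketch on my side, not part of the proof), this identity can be confirmed symbolically in SymPy; `X(i, j, l, size)` below is just my shorthand for the elementary matrix $I+l\,E_{i,j}$ of the given size:

```python
import sympy as sp

lam = sp.Symbol('lambda')
n = 5                                    # ambient SL_n; the block sits in SL_{n-1}

def X(i, j, l, size):
    # The elementary matrix X_{i,j}^l = I + l * E_{i,j} (1-based indices, i != j).
    M = sp.eye(size)
    M[i - 1, j - 1] = l
    return M

for i in range(1, n):                    # indices of X_{i,j}^lambda inside SL_{n-1}
    for j in range(1, n):
        if i == j:
            continue
        embedded = sp.diag(1, X(i, j, lam, n - 1))       # the block matrix (1, 0; 0, X_{i,j}^lambda)
        product = (X(1, j + 1, -lam, n) * X(i + 1, 1, 1, n)
                   * X(1, j + 1, lam, n) * X(i + 1, 1, -1, n))
        assert (product - embedded).expand() == sp.zeros(n, n)
print("identity holds for all i != j, n =", n)
```

Multiplying out by hand gives the same conclusion: since $i\neq j$, the products $E_{1,j+1}E_{i+1,1}$ vanish, and the four factors collapse to $I+\lambda E_{i+1,j+1}$, which is exactly $\begin{pmatrix}1 & \vec{0}\\ \vec{0} & X_{i,j}^{\lambda}\end{pmatrix}$.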