In addition, the subproblems are now nonlinear least squares problems
in the factor matrices.
When λ = 1, the adaptive least squares problem
is equivalent to the total least squares problem
In this section, we demonstrate how to adapt the LSQR algorithm of Paige and Saunders for solving the low-order least squares problem
which is just the problem of determining the correct value μ for the Tikhonov least squares problem
such that the discrepancy principle holds with equality.
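As a minimal sketch of this idea (not the adapted LSQR implementation itself, and with illustrative function names and bisection bracket), each Tikhonov problem can be solved through the stacked system while a scalar search adjusts μ until the residual norm matches the discrepancy level δ:

```python
import numpy as np

def tikhonov_solve(A, b, mu):
    """Solve min ||A x - b||^2 + mu^2 ||x||^2 via the stacked system."""
    n = A.shape[1]
    A_aug = np.vstack([A, mu * np.eye(n)])
    b_aug = np.concatenate([b, np.zeros(n)])
    return np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]

def discrepancy_mu(A, b, delta, mu_lo=1e-8, mu_hi=1e4, tol=1e-10):
    """Bisect (on a log scale) for mu such that ||A x_mu - b|| = delta.
    The residual norm increases monotonically with mu, so the crossing
    is unique whenever delta lies between the minimal residual and ||b||."""
    for _ in range(200):
        mu = np.sqrt(mu_lo * mu_hi)
        r = np.linalg.norm(A @ tikhonov_solve(A, b, mu) - b)
        if r < delta:
            mu_lo = mu   # residual too small: regularize more
        else:
            mu_hi = mu
        if mu_hi - mu_lo < tol * mu_hi:
            break
    return mu
```

The efficient variant discussed in the text avoids refactoring the stacked matrix for every trial μ; this sketch only shows the fixed point being sought.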
The key to an efficient implementation is to update an orthogonal factorization of the coefficient matrix in the least squares problem
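A minimal sketch of one such factorization update, assuming the common case of appending a row aᵀ to a matrix with known factorization A = QR: the new triangular factor is obtained by zeroing the appended row against R with Givens rotations. The function name is an assumption, and the update of Q is omitted for brevity; this is not necessarily the update used in the text.

```python
import numpy as np

def qr_append_row(R, a):
    """Given the triangular factor R of A = QR, return the triangular
    factor of [A; a^T] by applying Givens rotations that zero a against
    the rows of R (Q is not updated in this sketch)."""
    R, a = R.copy().astype(float), a.copy().astype(float)
    n = R.shape[0]
    for i in range(n):
        r = np.hypot(R[i, i], a[i])      # rotation radius (assumed nonzero)
        c, s = R[i, i] / r, a[i] / r
        Ri = R[i, :].copy()
        R[i, :] = c * Ri + s * a          # rotate row i of R with a
        a = -s * Ri + c * a               # a[i] becomes exactly zero
    return R
```

Because the rotations are orthogonal, the updated factor satisfies R'ᵀR' = AᵀA + aaᵀ, which is what an O(n²) update buys compared with a full O(mn²) refactorization.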
This decomposition of the problem suggests that we search first for a solution of the reduced least squares problem
, that is, determine a suitable size n, the matrix V^⊥, and the vector y; we have effectively removed the linear parameters and now need to solve a nonlinear least squares problem
to find α.
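A minimal sketch of this elimination, using an illustrative two-term model y ≈ c₀ + c₁·exp(−αt) that is an assumption, not the model of the text: the inner linear least squares solve removes the linear coefficients, leaving a residual that depends on the nonlinear parameter α alone, which an outer one-dimensional search then minimizes.

```python
import numpy as np

def reduced_residual(alpha, t, y):
    """Eliminate the linear coefficients c in y ~ Phi(alpha) c by solving
    the inner linear least squares problem; the returned residual norm
    depends only on the nonlinear parameter alpha."""
    Phi = np.column_stack([np.ones_like(t), np.exp(-alpha * t)])
    c = np.linalg.lstsq(Phi, y, rcond=None)[0]   # inner linear solve
    return np.linalg.norm(Phi @ c - y)

# Outer minimization over alpha (a crude grid search, for illustration only;
# in practice a Gauss-Newton-type method would be used here).
t = np.linspace(0.0, 5.0, 50)
y = 2.0 + 3.0 * np.exp(-1.3 * t)                 # synthetic noise-free data
alphas = np.linspace(0.1, 3.0, 300)
alpha_best = alphas[np.argmin([reduced_residual(a, t, y) for a in alphas])]
```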
(2) is replaced by an upper triangular least squares problem
, which can be solved immediately.
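"Solved immediately" here means back substitution on the triangular system; a minimal sketch (function name assumed for illustration):

```python
import numpy as np

def back_substitution(R, c):
    """Solve R x = c for upper triangular, nonsingular R by back
    substitution, working upward from the last unknown."""
    n = R.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (c[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x
```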
Variable projection leads to the nonlinear least squares problem
In this case, an iterative method such as LSQR  is applied to the least squares problem
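As a minimal illustration of this step, SciPy's implementation of LSQR can be applied to a dense least squares problem (the problem data below are assumptions; the text's problem is the one defined by its own equations):

```python
import numpy as np
from scipy.sparse.linalg import lsqr

# Illustrative overdetermined system; LSQR needs only matrix-vector
# products, which is why it suits large sparse problems.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 10))
b = rng.standard_normal(100)

x_lsqr = lsqr(A, b, atol=1e-12, btol=1e-12)[0]      # iterative solution
x_direct = np.linalg.lstsq(A, b, rcond=None)[0]     # direct reference
```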
This method amounts to truncating the singular value decomposition of the coefficient matrix A in such a way that the smallest singular values of A are discarded, and then solving the modified least squares problem
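A minimal sketch of this truncated SVD (TSVD) solution, keeping the k largest singular values and discarding the rest (the function name is an illustrative assumption):

```python
import numpy as np

def tsvd_solve(A, b, k):
    """Truncated SVD solution of min ||A x - b||: keep the k largest
    singular triplets of A and invert only those, discarding the
    smallest singular values."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])
```

With k equal to the full rank this reproduces the ordinary least squares solution; smaller k regularizes by suppressing the directions associated with small singular values.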
Matrix formulation of the least squares problem
and its solution.
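A minimal sketch of the standard matrix formulation min‖Ax − b‖₂ and its solution via a thin QR factorization, which avoids the squared condition number incurred by the normal equations AᵀAx = Aᵀb (function name assumed for illustration):

```python
import numpy as np

def ls_solve_qr(A, b):
    """Solve min ||A x - b||_2 for full-column-rank A via the thin QR
    factorization A = Q R, reducing the problem to the triangular
    system R x = Q^T b."""
    Q, R = np.linalg.qr(A)          # thin QR: Q is m x n, R is n x n
    return np.linalg.solve(R, Q.T @ b)
```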