# The Differentiation of Pseudo-Inverses and Nonlinear Least Squares Problems Whose Variables Separate

G. H. Golub and V. Pereyra
SIAM Journal on Numerical Analysis
Vol. 10, No. 2 (Apr., 1973), pp. 413-432
Stable URL: http://www.jstor.org/stable/2156365
Page Count: 20

## Abstract

For given data $(t_i, y_i)$, $i = 1, \cdots, m$, we consider the least squares fit of nonlinear models of the form $\eta(\mathbf{a}, \mathbf{\alpha}; t) = \sum^n_{j = 1} a_j \varphi_j(\mathbf{\alpha}; t),\quad \mathbf{a} \in \mathscr{R}^n, \mathbf{\alpha} \in \mathscr{R}^k.$ For this purpose we study the minimization of the nonlinear functional $r(\mathbf{a}, \mathbf{\alpha}) = \sum^m_{i = 1} (y_i - \eta(\mathbf{a}, \mathbf{\alpha}; t_i))^2$. It is shown that by defining the matrix $\{\Phi(\mathbf{\alpha})\}_{i,j} = \varphi_j(\mathbf{\alpha}; t_i)$ and the modified functional $r_2(\mathbf{\alpha}) = \|\mathbf{y} - \Phi(\mathbf{\alpha})\Phi^+(\mathbf{\alpha})\mathbf{y}\|^2_2$, it is possible to optimize first with respect to the parameters $\mathbf{\alpha}$, and then to obtain, a posteriori, the optimal parameters $\hat{\mathbf{a}}$. The matrix $\Phi^+(\mathbf{\alpha})$ is the Moore-Penrose generalized inverse of $\Phi(\mathbf{\alpha})$. We develop formulas for the Fréchet derivative of orthogonal projectors associated with $\Phi(\mathbf{\alpha})$ and also for $\Phi^+(\mathbf{\alpha})$, under the hypothesis that $\Phi(\mathbf{\alpha})$ is of constant (though not necessarily full) rank. Detailed algorithms are presented which make extensive use of well-known reliable linear least squares techniques, and numerical results and comparisons are given. These results are generalizations of those of H. D. Scolnik [20] and Guttman, Pereyra and Scolnik [9].
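The separable structure described in the abstract can be illustrated with a minimal numerical sketch. The basis functions and data below are purely hypothetical (two decaying exponentials, not an example from the paper), and the nonlinear minimization uses a generic off-the-shelf solver rather than the paper's specialized variable projection algorithm; the point is only the two-stage structure: minimize $r_2(\mathbf{\alpha})$ over the nonlinear parameters $\mathbf{\alpha}$ alone, then recover the linear parameters $\hat{\mathbf{a}} = \Phi^+(\hat{\mathbf{\alpha}})\mathbf{y}$ a posteriori.

```python
import numpy as np
from scipy.optimize import least_squares

def phi(alpha, t):
    # Columns are the basis functions phi_j(alpha; t_i).
    # Hypothetical choice: two decaying exponentials.
    return np.column_stack([np.exp(-alpha[0] * t), np.exp(-alpha[1] * t)])

def projected_residual(alpha, t, y):
    # Residual vector of r2(alpha): y - Phi(alpha) Phi^+(alpha) y,
    # computed via a stable linear least squares solve rather than
    # forming the pseudo-inverse explicitly.
    P = phi(alpha, t)
    a, *_ = np.linalg.lstsq(P, y, rcond=None)
    return y - P @ a

# Synthetic data (illustrative): y_i = eta(a, alpha; t_i) + noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 50)
true_a = np.array([2.0, -1.0])
true_alpha = np.array([0.5, 2.0])
y = phi(true_alpha, t) @ true_a + 0.01 * rng.standard_normal(t.size)

# Stage 1: optimize r2 over the nonlinear parameters alpha only.
sol = least_squares(projected_residual, x0=[0.3, 1.5], args=(t, y))
alpha_hat = sol.x

# Stage 2: recover the linear parameters a posteriori, a = Phi^+(alpha) y.
a_hat, *_ = np.linalg.lstsq(phi(alpha_hat, t), y, rcond=None)
```

Eliminating $\mathbf{a}$ this way reduces the search space from $n + k$ parameters to $k$, which is the practical appeal of the separable formulation.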
