least squares

(redirected from Sums of squares)

least squares (lēst skwārz)
A principle of estimation invented by Gauss in which the estimates of a set of parameters in a statistical model are the quantities that minimize the sum of squared differences between the observed values of the dependent variable and the values predicted by the model.
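As a concrete illustration of the principle, here is a minimal sketch in Python (using NumPy and invented data; the straight-line model and variable names are only for illustration) that obtains least-squares estimates by minimizing the sum of squared residuals:

```python
import numpy as np

# Hypothetical observations of a dependent variable y at values x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Design matrix for a straight-line model y = b0 + b1*x
X = np.column_stack([np.ones_like(x), x])

# Least-squares estimates minimize sum((y - X @ beta)**2);
# np.linalg.lstsq solves this directly.
beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)

residuals = y - X @ beta
sse = np.sum(residuals**2)  # the minimized sum of squared differences
print(beta, sse)
```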
References in periodicals archive
The mean and standard deviation of the residual sums of squares of the drinking water samples were calculated, and the mean for the background water samples was then subtracted from the residual sum of squares of each unknown test sample.
The sum of squares of the interaction $\alpha\beta$ is obtained from the difference between the sum of squares of the treatments from the double factorial and the sums of squares of those factors, as $SQ_{\alpha\beta} = \sum_{i,j}^{a,b} \frac{AB_{ij}^{2}}{cJ} - C_{0} - SQ_{\alpha} - SQ_{\beta}$, where $AB_{ij}$ is the total of the plots receiving $\alpha_{i}$ and $\beta_{j}$.
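Read as the standard factorial decomposition, this formula subtracts the correction term and the two main-effect sums of squares from the scaled sum of squared cell totals. A rough Python sketch under that reading (the cell totals, replication count r, and all names are assumptions for illustration, not values from the cited study):

```python
import numpy as np

# Hypothetical cell totals AB[i, j]: total of the r observations
# receiving level i of factor alpha and level j of factor beta.
AB = np.array([[12.0, 15.0, 11.0],
               [14.0, 18.0, 13.0]])
r = 4                       # assumed number of observations per cell
N = AB.size * r             # total number of observations
G = AB.sum()                # grand total
C = G**2 / N                # correction term

# Main-effect sums of squares from the marginal totals
SS_alpha = np.sum(AB.sum(axis=1)**2) / (AB.shape[1] * r) - C
SS_beta  = np.sum(AB.sum(axis=0)**2) / (AB.shape[0] * r) - C

# Interaction: scaled sum of squared cell totals minus correction and main effects
SS_alpha_beta = np.sum(AB**2) / r - C - SS_alpha - SS_beta
print(SS_alpha, SS_beta, SS_alpha_beta)
```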
Determination coefficients of these functions are very high (R² = 0.83–0.99), relative errors are comparatively high (δ = 0.00–0.23), but the sums of squares are comparatively small (ΣΔ² = 0.13–1.99).
Using this model, hybrid sums of squares were partitioned into sources of variation due to females, males, and the female x male interaction.
Ordinarily, it is convenient to deal with the quantities known as mean squares instead of sums of squares. The mean squares are obtained by dividing each sum of squares by the corresponding degrees of freedom.
Note that if we add the treatment and error sums of squares, they equal the total sum of squares (Equation 9.5).
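The two preceding excerpts can be illustrated together with a small Python sketch (invented one-way data; this is not the cited book's Equation 9.5): the treatment and error sums of squares add up to the total, and each mean square is its sum of squares divided by its degrees of freedom:

```python
import numpy as np

# Hypothetical one-way layout: three treatment groups
groups = [np.array([4.1, 5.0, 4.7]),
          np.array([6.2, 5.8, 6.5]),
          np.array([5.1, 4.9, 5.4])]

all_obs = np.concatenate(groups)
grand_mean = all_obs.mean()

# Treatment (between-group) and error (within-group) sums of squares
ss_treatment = sum(len(g) * (g.mean() - grand_mean)**2 for g in groups)
ss_error = sum(((g - g.mean())**2).sum() for g in groups)
ss_total = ((all_obs - grand_mean)**2).sum()
assert np.isclose(ss_treatment + ss_error, ss_total)  # the partition identity

# Mean squares = sums of squares divided by degrees of freedom
df_treatment = len(groups) - 1
df_error = len(all_obs) - len(groups)
ms_treatment = ss_treatment / df_treatment
ms_error = ss_error / df_error
print(ms_treatment, ms_error)
```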
Partitioning of multivariate sums of squares in a multifactorial linear model, which replaces traditional squared straight-line distances with squared dissimilarities, has been described by Pillar and Orloci (1996).
Briefly, the randomization procedure generates an expected frequency distribution of the treatment sum of squares under the null hypothesis of no treatment effect by repeatedly (n = 5000) assigning plants at random to two treatment groups and calculating the sums of squares. The significance level of the test is the proportion of expected sums of squares that are less than the experimentally observed sum of squares (Adams and Anthony, 1996).
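A minimal Python sketch of that kind of randomization test, assuming two hypothetical treatment groups, using the between-group (treatment) sum of squares as the statistic and taking the conventional proportion of randomized values at least as large as the observed one as the significance level; the data and group sizes are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical plant responses under two treatments
treat_a = np.array([3.2, 4.1, 3.8, 4.4, 3.9])
treat_b = np.array([5.0, 4.8, 5.5, 5.1, 4.6])
pooled = np.concatenate([treat_a, treat_b])
n_a = len(treat_a)

def treatment_ss(x, y):
    """Between-group (treatment) sum of squares for two groups."""
    grand = np.concatenate([x, y]).mean()
    return len(x) * (x.mean() - grand)**2 + len(y) * (y.mean() - grand)**2

observed = treatment_ss(treat_a, treat_b)

# Build the randomization distribution by reassigning plants at random
n_perm = 5000
perm_ss = np.empty(n_perm)
for k in range(n_perm):
    shuffled = rng.permutation(pooled)
    perm_ss[k] = treatment_ss(shuffled[:n_a], shuffled[n_a:])

p_value = np.mean(perm_ss >= observed)
print(observed, p_value)
```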
Using these notations, the ANOVA table including source of variation, degrees of freedom, sums of squares, and mean squares is readily constructed for the s-level hierarchical population structure (Table 1).
TABLE 4. F values from three separate GLM analyses incorporating a quality dimension variable and order of market entry

Source of variation                    Product quality   Service quality   Image quality
Overall model                               6.75(*)           8.48(**)        11.64(**)
Type III sums of squares:
  Quality dimension(1)                     15.89(**)         15.77(**)        30.23(**)
  Market entry                              2.17              1.69             0.17
  Quality dimension x Market entry          0.26              3.06             0.02
By expanding the respective sums of squares, we obtain $(M_x - M_y)^2 \ge 0$, where $M_x$ and $M_y$ are the means of the two groups into which the population has been split.
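The excerpt omits the algebra behind the inequality. One common route, assuming the usual two-group decomposition with group sizes $n_x$ and $n_y$ (notation assumed here, not taken from the original), is that the between-group sum of squares expands to a nonnegative multiple of the squared difference of means:

$$
n_x\,(M_x - \bar{M})^2 + n_y\,(M_y - \bar{M})^2
  = \frac{n_x\,n_y}{n_x + n_y}\,(M_x - M_y)^2 \;\ge\; 0,
\qquad \bar{M} = \frac{n_x M_x + n_y M_y}{n_x + n_y}.
$$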