Gower, John C.; Le Roux, Niël J.; Gardner-Lubbe, Sugnet (2022).
DOI: https://doi.org/10.1007/s00362-021-01275-8
Abstract
Indscal models consider symmetric matrices $\mathbf{S}_k \approx \mathbf{G}\mathbf{W}_k\mathbf{G}'$ for $k = 1, \dots, K$, where $\mathbf{G}$ is a compromise matrix termed the group average and $\mathbf{W}_k$ is a diagonal matrix of weights given by the $k$th individual to the $R$, specified in advance, columns of $\mathbf{G}$; non-negative weights are preferred and usually imposed. We propose a new two-phase alternating least squares (ALS) algorithm, which emphasizes the two main components (group average and weighting parameters) of the Indscal model and specifically helps with the interpretation of the model. Furthermore, it has thrown new light on the properties of the converged solution that would be satisfied by any algorithm minimizing the basic Indscal criterion $\sum_{k=1}^{K} \lVert \mathbf{S}_k - \mathbf{G}\mathbf{W}_k\mathbf{G}' \rVert^2$, where the minimization is over $\mathbf{G}$ and the $\mathbf{W}_k$. The new algorithm has also proved to be a useful tool in unravelling the algebraic understanding of the role played by parameter constraints and their interpretation in variants of the Indscal model. The proposed analysis focuses on Indscal, but the approach may be of more widespread interest, especially in the field of multidimensional data analysis. A major issue is that simultaneous least-squares estimates of the parameters may be found without imposing constraints. However, the group average and individual weighting parameters cannot be estimated uniquely without imposing some subjective constraint that could encourage misleading interpretations. We encourage the use of linear constraints on the $\mathbf{W}_k$, as they enable a comparison of the weights obtained (i) within a group and (ii) between the same item drawn from two or more groups. However, it is easy to exchange one system of constraints for another in a post- or pre-analysis. The new two-phase ALS algorithm (i) computes, for fixed $\mathbf{G}$, the weights $\mathbf{W}_k$ subject to the constraints, and then (ii) keeping the $\mathbf{W}_k$ fixed, updates $\mathbf{G}$. At convergence, the estimates of $\mathbf{G}$ and the $\mathbf{W}_k$ apply to all algorithms that minimize the Indscal criterion. Furthermore, we show that only at convergence does an analysis-of-variance property hold, on the demarcation region between over- and under-fitting.
When the analysis-of-variance property is valid, its validity extends over the whole matrix domain, over trace operations, and to individual matrix elements. The optimization process is unusual in that optima and local optima occur on edges, in what seems to be closely related to Heywood cases in factor analysis.
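The two-phase scheme described in the abstract can be sketched in code; this is a minimal illustration under assumed notation ($\mathbf{G}$ for the group average, diagonal weights $\mathbf{W}_k$), not the authors' implementation. In particular, the clip-at-zero handling of non-negativity and the CANDECOMP-style surrogate update of $\mathbf{G}$ are simplifications chosen for brevity, and `exchange_constraints` is a hypothetical helper showing how the scale indeterminacy lets one system of constraints be swapped for another after fitting.

```python
import numpy as np

def indscal_als(S_list, R, n_iter=200, seed=0):
    """Sketch of a two-phase ALS fit of the Indscal model S_k ~ G W_k G'.

    Phase (i): for fixed G, each individual's diagonal weights solve a small
    linear least-squares problem (clipped at zero here -- a crude stand-in
    for a proper non-negative least-squares step).
    Phase (ii): for fixed weights, G is updated with a CANDECOMP-style
    linear surrogate in which the second copy of G is held fixed.
    """
    n, K = S_list[0].shape[0], len(S_list)
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((n, R))
    W = np.ones((K, R))                      # row k holds diag(W_k)

    for _ in range(n_iter):
        # Phase (i): normal equations A w = b with A[r,s] = (g_r' g_s)^2
        # and b[r] = g_r' S_k g_r, then clip to keep weights non-negative.
        A = (G.T @ G) ** 2
        for k, S in enumerate(S_list):
            b = np.einsum('ir,ij,jr->r', G, S, G)
            w = np.linalg.lstsq(A, b, rcond=None)[0]
            W[k] = np.clip(w, 0.0, None)

        # Phase (ii): solve [S_1 ... S_K] ~ G_new [W_1 G' ... W_K G'] for G_new.
        T = np.hstack(S_list)                              # n x (K n)
        C = np.hstack([(G * W[k]).T for k in range(K)])    # R x (K n)
        G = T @ np.linalg.pinv(C)

    return G, W

def exchange_constraints(G, W, d):
    """Re-express a fitted solution under a diagonal rescaling D = diag(d):
    G -> G D and W_k -> D^{-1} W_k D^{-1} leave every fit G W_k G' unchanged."""
    return G * d, W / d ** 2
```

Because $\mathbf{G}\mathbf{W}_k\mathbf{G}'$ is invariant under the rescaling in `exchange_constraints`, the fitted values are unchanged while the weights are re-expressed; this is the sense in which one system of constraints can be exchanged for another in a post-analysis.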