131 research outputs found
My Early Interactions with Jan and Some of His Lost Papers
It has been over 40 years since I got to know Jan. This period almost entirely overlaps my career as a psychometrician. During these years, I have had many contacts with him. This paper reviews some of my early interactions, focusing on the following topics: (1) an episode surrounding the inception of the ALSOS project, and (2) Jan's unpublished (and some lost) notes and papers that I cherished and quoted in my work, including (2a) the ELEGANT algorithm for squared distance scaling, (2b) the INDISCAL method for nonmetric multidimensional scaling (MDS), and (2c) notes on DEDICOM.
A new family of constrained principal component analysis (CPCA)
Abstract Several decompositions of the orthogonal projector P_X = X(X′X)^−X′ are proposed with a prospect of their use in constrained principal component analysis (CPCA). In CPCA, the main data matrix X is first decomposed into several additive components by the row-side and/or column-side predictor variables G and H. The decomposed components are then subjected to singular value decomposition (SVD) to explore structures within the components. Unlike the previous proposal, the current proposal ensures that the decomposed parts are columnwise orthogonal and stay inside the column space of X. Mathematical properties of the decompositions and their data analytic implications are investigated. Extensions to regularized PCA are also envisaged, considering analogous decompositions of ridge operators.
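A minimal sketch of the basic CPCA idea described above (external analysis by a projector, then internal analysis by SVD) might look as follows. The data, dimensions, and variable names here are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 50, 6, 2
X = rng.standard_normal((n, p))   # main data matrix (illustrative)
G = rng.standard_normal((n, k))   # row-side predictor variables (illustrative)

# Orthogonal projector onto the column space of G: P_G = G(G'G)^- G'
PG = G @ np.linalg.pinv(G.T @ G) @ G.T

# External analysis: additive decomposition X = P_G X + (I - P_G) X
X_G = PG @ X          # part of X explained by G
X_res = X - X_G       # part of X unexplained by G

# Internal analysis: SVD of each component to explore its structure
U1, s1, V1t = np.linalg.svd(X_G, full_matrices=False)
U2, s2, V2t = np.linalg.svd(X_res, full_matrices=False)

# The decomposition is exact, and the two parts are columnwise orthogonal
print(np.allclose(X, X_G + X_res))      # True
print(np.allclose(X_G.T @ X_res, 0))    # True
```

The columnwise orthogonality follows because P_G is symmetric and idempotent, so X′P_G(I − P_G)X = 0.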
On the PLS algorithm for multiple regression (PLS1)
Abstract Partial least squares (PLS) was first introduced by Wold in the mid-1960s as a heuristic algorithm to solve linear least squares (LS) problems. No optimality property of the algorithm was known then. Since then, however, a number of interesting properties have been established about the PLS algorithm for regression analysis (called PLS1). This paper shows that the PLS estimator for a specific dimensionality S is a kind of constrained LS estimator confined to a Krylov subspace of dimensionality S. Links to the Lanczos bidiagonalization and conjugate gradient methods are also discussed from a somewhat different perspective from previous authors.
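The Krylov-subspace characterization mentioned above can be checked numerically: the PLS1 coefficients at S components coincide with the LS solution restricted to the Krylov subspace spanned by X′y, (X′X)X′y, …, (X′X)^{S−1}X′y. The sketch below uses a standard deflation-based PLS1 implementation and illustrative random data; it is a demonstration of the known equivalence, not the paper's own code:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, S = 40, 5, 2
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)

# --- PLS1 via the standard deflation (NIPALS-style) algorithm ---
Xk = X.copy()
W, P, q = [], [], []
for _ in range(S):
    w = Xk.T @ y                    # weight vector
    w /= np.linalg.norm(w)
    t = Xk @ w                      # score vector
    P.append(Xk.T @ t / (t @ t))    # X loading
    q.append(y @ t / (t @ t))       # y loading
    W.append(w)
    Xk = Xk - np.outer(t, P[-1])    # deflate X
W, P, q = np.column_stack(W), np.column_stack(P), np.array(q)
beta_pls = W @ np.linalg.solve(P.T @ W, q)

# --- Constrained LS confined to the Krylov subspace K_S(X'X, X'y) ---
A, v = X.T @ X, X.T @ y
K = np.column_stack([np.linalg.matrix_power(A, s) @ v for s in range(S)])
Q, _ = np.linalg.qr(K)              # orthonormal basis, for stability
a, *_ = np.linalg.lstsq(X @ Q, y, rcond=None)
beta_krylov = Q @ a

print(np.allclose(beta_pls, beta_krylov))   # True: the estimators coincide
```

Orthonormalizing the Krylov basis with QR before solving the LS problem avoids the severe ill-conditioning of the raw power basis.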
Matrices with special reference to applications in psychometrics
Abstract Multidimensional scaling, item response theory, and factor analysis may be considered three major contributions of psychometricians to statistics. Matrix theory played an important role in early developments of these techniques. Unfortunately, nonlinear models are currently very prevalent in these areas. Still, one can identify several areas of psychometrics where matrix algebra plays a prominent role. They include analysis of asymmetric square tables, multiway data analysis, reduced-rank regression analysis, and multiple-set (T-set) canonical correlation analysis, among others. In this article we review some of the important matrix results in these areas and suggest future studies.
Professor Haruo Yanai and multivariate analysis
The late Professor Yanai contributed to many fields ranging from aptitude diagnostics, epidemiology, and nursing to psychometrics and statistics. This paper reviews some of his accomplishments in multivariate analysis through his collaborative work with the present author, along with some untold episodes about the inception of key ideas underlying the work. The various topics covered include constrained principal component analysis, extensions of Khatri's lemma, the Wedderburn–Guttman theorem, ridge operators, generalized constrained canonical correlation analysis, and causal inference. A common thread running through all of them is projectors and singular value decomposition, which are the main subject matter of a recent monograph by Yanai, Takeuchi, and Takane [60].
- …