33 research outputs found

    A Fortran IV Program Generalizing the Schönemann-Carroll Matrix Fitting Algorithm To Monotone and Linear Fitting of Configurations

    Full text link
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/66700/2/10.1177_001316447403400117.pd

    Some boundary conditions for a monotone analysis of symmetric matrices

    Full text link
    This paper gives a rigorous and greatly simplified proof of Guttman's theorem for the least upper-bound dimensionality of arbitrary real symmetric matrices S, where the points embedded in a real Euclidean space subtend distances which are strictly monotone with the off-diagonal elements of S. A comparable and more easily proven theorem for the vector model is also introduced. At most n - 2 dimensions are required to reproduce the order information for both the distance and vector models, and this is true for any choice of real indices, whether they define a metric space or not. If ties exist in the matrices to be analyzed, then greatest lower bounds are specifiable when degenerate solutions are to be avoided. These theorems have relevance to current developments in nonmetric techniques for the monotone analysis of data matrices.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/45729/1/11336_2005_Article_BF02291398.pd
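
    A worked restatement of the upper-bound claim summarized above may help fix the idea; the symbols n, m, s_ij and d are assumed notation for this sketch, not quoted from the paper:

    \[
      S = S^{\top} \in \mathbb{R}^{n \times n}
      \;\Longrightarrow\;
      \exists\, x_1, \dots, x_n \in \mathbb{R}^{m},\; m \le n - 2,
      \;\text{such that } d(x_i, x_j) \text{ is strictly monotone in } s_{ij}\ (i \ne j).
    \]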

    Book review

    Full text link
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/45718/1/11336_2005_Article_BF02289749.pd

    What weight should weights have in individual differences scaling?

    Full text link
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/43554/1/11135_2004_Article_BF00144137.pd

    On evaluating the equivalency of alternative MDS representations

    Full text link
    A typical question in MDS is whether two alternative configurations that are both acceptable in terms of data fit may be considered “practically the same”. To answer such questions on the equivalency of MDS solutions, Lingoes & Borg (1983) have recently proposed a quasi-statistical decision strategy that allows one to take various features of the situation into account. This paper adds another important piece of information to this approach: for the Lingoes-Borg decision criterion R, we compute what proportion of R-values would be greater or less than the observed coefficient if one were to consider all possible alternative distance sets within certain bounds, where these bounds are defined by the observed fit coefficients for the two alternative MDS solutions, the limits of acceptability for such fit coefficients, and the way the observed MDS configurations are interrelated.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/43557/1/11135_2004_Article_BF00227428.pd
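
    As a rough illustration of the proportion idea described above, the sketch below simulates alternative distance sets within given bounds and reports the share of a fit coefficient exceeding the observed value. The coefficient used here is a generic placeholder, not the actual Lingoes-Borg R, and all names and the sampling scheme are assumptions.

    # Illustration only: `fit_r` is a generic placeholder coefficient, NOT the
    # Lingoes-Borg R; the uniform sampling within bounds is an assumption.
    import numpy as np

    def fit_r(d_obs, d_alt):
        # Placeholder fit coefficient: squared correlation of two distance sets.
        return np.corrcoef(d_obs, d_alt)[0, 1] ** 2

    def proportion_exceeding(observed, d_obs, lower, upper, n_sim=1000, seed=0):
        # Share of simulated coefficients greater than the observed one, with
        # alternative distance sets drawn uniformly within [lower, upper].
        rng = np.random.default_rng(seed)
        sims = np.array([fit_r(d_obs, rng.uniform(lower, upper, size=d_obs.shape))
                         for _ in range(n_sim)])
        return float(np.mean(sims > observed))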

    An alternative approach to confirmatory inference and geometric models

    Full text link
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/43556/1/11135_2004_Article_BF00221555.pd

    Alternative measures of fit for the Schönemann-Carroll matrix fitting algorithm

    Full text link
    In connection with a least-squares solution for fitting one matrix, A, to another, B, under optimal choice of a rigid motion and a dilation, Schönemann and Carroll suggested two measures of fit: a raw measure, e, and a refined similarity measure, e_s, which is symmetric. Both measures share the weakness of depending upon the norm of the target matrix, B, e.g., e(A, kB) ≠ e(A, B) for k ≠ 1. Therefore, both measures are useless for answering questions of the type: “Does A fit B better than A fits C?”. In this note two new measures of fit are suggested which do not depend upon the norms of A and B, which are (0, 1)-bounded, and which, therefore, provide meaningful answers for comparative analyses.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/45731/1/11336_2005_Article_BF02291666.pd
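
    A minimal sketch of one way to build a scale-free, (0, 1)-bounded fit measure, in the spirit of the note's goal though not its exact formulas: normalize both matrices before comparing them, so that the measure is unchanged when either matrix is rescaled.

    # Illustrative only: this is NOT the specific measure proposed in the note,
    # just a demonstration of removing the dependence on the target's norm.
    import numpy as np

    def scale_free_misfit(A, B):
        # Scale each matrix to unit Frobenius norm, then compare.
        A_unit = A / np.linalg.norm(A)
        B_unit = B / np.linalg.norm(B)
        # The squared distance between two unit-norm matrices lies in [0, 4].
        return float(np.sum((A_unit - B_unit) ** 2) / 4.0)

    # Property: scale_free_misfit(A, k * B) == scale_free_misfit(A, B) for k > 0.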

    Louis E. Guttman (1916–1987)

    Full text link
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/45742/1/11336_2005_Article_BF02294129.pd

    A direct approach to individual differences scaling using increasingly complex transformations

    Full text link
    A family of models for the representation and assessment of individual differences for multivariate data is embodied in a hierarchically organized and sequentially applied procedure called PINDIS. The two principal models used for directly fitting individual configurations to some common or hypothesized space are the dimensional salience and perspective models. By systematically increasing the complexity of transformations one can better determine the validities of the various models and assess the patterns and commonalities of individual differences. PINDIS sheds some new light on the interpretability and general applicability of the dimension weighting approach implemented by the commonly used INDSCAL procedure.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/45738/1/11336_2005_Article_BF02293810.pd
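
    The dimension-weighting ("dimensional salience") step can be illustrated with a small least-squares sketch: an individual's configuration is approximated by stretching the dimensions of a common space. This illustrates the general idea only, not the PINDIS hierarchy of transformations; the symbols Z, X and w are assumptions.

    # Illustration of fitting per-dimension weights of a common space Z to an
    # individual configuration X; NOT the actual PINDIS procedure.
    import numpy as np

    def fit_dimension_weights(Z, X):
        # Least-squares diagonal weights w such that Z @ diag(w) approximates X;
        # each column (dimension) can be fitted independently.
        return np.sum(Z * X, axis=0) / np.sum(Z * Z, axis=0)

    def weighted_loss(Z, X, w):
        # Residual sum of squares of the weighted common space against X.
        return float(np.sum((Z * w - X) ** 2))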

    A model and algorithm for multidimensional scaling with external constraints on the distances

    Full text link
    A method for externally constraining certain distances in multidimensional scaling configurations is introduced and illustrated. The approach defines an objective function which is a linear composite of the loss function of the point configuration X relative to the proximity data P and the loss of X relative to a pseudo-data matrix R. The matrix R is set up such that the side constraints to be imposed on X's distances are expressed by the relations among R's numerical elements. One then uses a double-phase procedure with relative penalties on the loss components to generate a constrained solution X. Various possibilities for constructing actual MDS algorithms are conceivable: the major classes are defined by the specification of metric or nonmetric loss for data and/or constraints, and by the various possibilities for partitioning the matrices P and R. Further generalizations are introduced by substituting R by a set of R matrices, R_i, i = 1, ..., r, which opens the way for formulating overlapping constraints as, e.g., in patterns that are both row- and column-conditional at the same time.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/45739/1/11336_2005_Article_BF02293597.pd
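
    The composite objective described above can be sketched as a weighted sum of two loss terms, one against the data P and one against the pseudo-data matrix R that carries the constraints. The simple raw stress and single penalty weight used below are assumptions for illustration, not the paper's exact loss functions or double-phase procedure.

    # Sketch of a composite loss L(X) = loss(X, P) + penalty * loss(X, R); the
    # specific stress form and penalty handling are illustrative assumptions.
    import numpy as np
    from scipy.spatial.distance import pdist, squareform

    def raw_stress(D, T, mask=None):
        # Sum of squared differences between distances D and targets T,
        # optionally restricted to the constrained cells given by mask.
        diff = (D - T) ** 2
        if mask is not None:
            diff = diff * mask
        return float(np.sum(diff))

    def composite_loss(X, P, R, penalty, R_mask=None):
        # Distances of the current configuration X, then the weighted composite.
        D = squareform(pdist(X))
        return raw_stress(D, P) + penalty * raw_stress(D, R, R_mask)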