
    Mixture of Bilateral-Projection Two-dimensional Probabilistic Principal Component Analysis

    Probabilistic principal component analysis (PPCA) is built upon a global linear mapping, which is insufficient to model complex data variation. This paper proposes a mixture of bilateral-projection probabilistic principal component analysis models (mixB2DPPCA) for 2D data. With multiple components in the mixture, the model can be viewed as a soft clustering algorithm and is capable of modeling data with complex structure. A Bayesian inference scheme based on the variational EM (Expectation-Maximization) approach is proposed for learning the model parameters. Experiments on several publicly available databases show that mixB2DPPCA achieves lower reconstruction errors and higher recognition rates than existing PCA-based algorithms.
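    The "global linear mapping" this abstract contrasts against is classical PPCA, which has a closed-form maximum-likelihood solution (Tipping & Bishop). As a point of reference, here is a minimal NumPy sketch of that baseline; the function names and posterior-mean reconstruction are illustrative, not taken from the paper:

    ```python
    import numpy as np

    def fit_ppca(X, q):
        """Closed-form maximum-likelihood PPCA.
        X: (n, d) data matrix; q: latent dimension.
        Returns the mean, factor loadings W, and noise variance sigma^2."""
        n, d = X.shape
        mu = X.mean(axis=0)
        Xc = X - mu
        # Eigendecomposition of the sample covariance, largest eigenvalues first
        S = Xc.T @ Xc / n
        evals, evecs = np.linalg.eigh(S)
        order = np.argsort(evals)[::-1]
        evals, evecs = evals[order], evecs[:, order]
        # ML noise variance: average of the discarded eigenvalues
        sigma2 = evals[q:].mean()
        # Principal axes scaled by the retained signal variance
        W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
        return mu, W, sigma2

    def reconstruct(X, mu, W, sigma2):
        """Reconstruct X through the latent space using the posterior mean E[z|x]."""
        q = W.shape[1]
        M = W.T @ W + sigma2 * np.eye(q)
        Z = np.linalg.solve(M, W.T @ (X - mu).T).T
        return Z @ W.T + mu
    ```

    The mixture model in the paper replaces this single global mapping with several bilateral (two-sided) projections whose responsibilities are learned by variational EM, which is what lets it capture multi-modal structure that one linear subspace cannot.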

    Machine Analysis of Facial Expressions

    No abstract

    Machine Understanding of Human Behavior

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next-generation computing, which we will call human computing, should be about anticipatory user interfaces that are human-centered and built for humans based on human models. They should transcend the traditional keyboard and mouse to include natural, human-like interactive functions, including understanding and emulating certain human behaviors such as affective and social signaling. This article discusses a number of components of human behavior, how they might be integrated into computers, and how far we are from realizing the front end of human computing, that is, from enabling computers to understand human behavior.

    An automatic data system for vibration modal tuning and evaluation

    A digitally based automatic modal tuning and analysis system, developed to provide an operational capability beginning at 0.1 hertz, is described. The elements of the system, which provides unique control features, maximum operator visibility, and rapid data reduction and documentation, are briefly described, and the operational flow is discussed to illustrate the full range of capabilities and the flexibility of application. The successful application of the system to a modal survey of the Skylab payload is described. Information about the Skylab test article, coincident-quadrature analysis of modal response data, orthogonality, and damping calculations is included in the appendixes. Recommendations for future application of the system are also made.

    Philosophy Enters the Optics Laboratory: Bell's Theorem and its First Experimental Tests (1965-1982)

    This paper deals with the ways that the issue of completing quantum mechanics was brought into laboratories and became a topic in mainstream quantum optics. It focuses on the period between 1965, when Bell published what we now call Bell's theorem, and 1982, when Aspect published the results of his experiments. I argue that what was considered good physics after Aspect's experiments had once been considered by many a philosophical matter rather than a scientific one, and that the path from philosophy to physics required a change in the physics community's attitude about the status of the foundations of quantum mechanics.
    Comment: 57 pages, accepted by Studies in History and Philosophy of Modern Physics