131 research outputs found

    Emotional expressions reconsidered: challenges to inferring emotion from human facial movements

    It is commonly assumed that a person’s emotional state can be readily inferred from his or her facial movements, typically called emotional expressions or facial expressions. This assumption influences legal judgments, policy decisions, national security protocols, and educational practices; guides the diagnosis and treatment of psychiatric illness, as well as the development of commercial applications; and pervades everyday social interactions as well as research in other scientific fields such as artificial intelligence, neuroscience, and computer vision. In this article, we survey examples of this widespread assumption, which we refer to as the common view, and we then examine the scientific evidence that tests this view, focusing on the six most popular emotion categories used by consumers of emotion research: anger, disgust, fear, happiness, sadness, and surprise. The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more than what would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation. Furthermore, similar configurations of facial movements variably express instances of more than one emotion category. In fact, a given configuration of facial movements, such as a scowl, often communicates something other than an emotional state. Scientists agree that facial movements convey a range of information and are important for social communication, emotional or otherwise. But our review suggests an urgent need for research that examines how people actually move their faces to express emotions and other social information in the variety of contexts that make up everyday life, as well as careful study of the mechanisms by which people perceive instances of emotion in one another. We make specific research recommendations that will yield a more valid picture of how people move their faces to express emotions and how they infer emotional meaning from facial movements in situations of everyday life. This research is crucial to provide consumers of emotion research with the translational information they require.

    GANimation: one-shot anatomically consistent facial animation

    Recent advances in generative adversarial networks (GANs) have shown impressive results for the task of facial expression synthesis. The most successful architecture is StarGAN (Choi et al. in CVPR, 2018), which conditions the GAN's generation process on images of a specific domain, namely a set of images of people sharing the same expression. While effective, this approach can only generate a discrete number of expressions, determined by the content and granularity of the dataset. To address this limitation, in this paper we introduce a novel GAN conditioning scheme based on action unit (AU) annotations, which describe in a continuous manifold the anatomical facial movements defining a human expression. Our approach allows controlling the magnitude of activation of each AU and combining several of them. Additionally, we propose a weakly supervised strategy to train the model that only requires images annotated with their activated AUs, and exploit a novel self-learned attention mechanism that makes our network robust to changing backgrounds, lighting conditions, and occlusions. Extensive evaluation shows that our approach goes beyond competing conditional generators, both in its capability to synthesize a much wider range of expressions ruled by anatomically feasible muscle movements and in its capacity to deal with images in the wild. The code of this work is publicly available at https://github.com/albertpumarola/GANimation.
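    The following is a minimal PyTorch sketch of the two ideas named in the abstract, conditioning a generator on a continuous action-unit vector and blending its output through a self-learned attention mask; the layer sizes, the 17-dimensional AU vector, and the blending rule are illustrative assumptions, not the authors' released implementation (which is at the GitHub link above).

    import torch
    import torch.nn as nn

    class AUConditionedGenerator(nn.Module):
        """Toy generator conditioned on a continuous action-unit (AU) vector."""
        def __init__(self, au_dim: int = 17, img_channels: int = 3):
            super().__init__()
            # Features from the input image concatenated with the AU vector,
            # which is broadcast to a per-pixel conditioning map.
            self.backbone = nn.Sequential(
                nn.Conv2d(img_channels + au_dim, 64, 7, padding=3), nn.ReLU(inplace=True),
                nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            )
            # Two heads: a color/appearance map and a single-channel attention mask.
            self.color_head = nn.Sequential(nn.Conv2d(64, img_channels, 7, padding=3), nn.Tanh())
            self.attn_head = nn.Sequential(nn.Conv2d(64, 1, 7, padding=3), nn.Sigmoid())

        def forward(self, image: torch.Tensor, target_aus: torch.Tensor) -> torch.Tensor:
            b, _, h, w = image.shape
            # Broadcast the target AU activations over the spatial grid.
            au_map = target_aus.view(b, -1, 1, 1).expand(b, target_aus.size(1), h, w)
            feats = self.backbone(torch.cat([image, au_map], dim=1))
            color = self.color_head(feats)   # proposed appearance change
            attn = self.attn_head(feats)     # where to apply it
            # The attention mask blends the synthesized change with the untouched
            # input, which is what gives robustness to backgrounds and occlusions.
            return attn * color + (1.0 - attn) * image

    # Example: push a face toward an expression given by 17 continuous AU activations.
    gen = AUConditionedGenerator()
    face = torch.rand(1, 3, 128, 128)
    aus = torch.rand(1, 17)
    edited = gen(face, aus)   # same shape as the input image

    In the full model the generator is trained adversarially, weakly supervised with AU annotations only, as described above; the sketch shows just the conditioning and attention-based blending path.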

    Using the information embedded in the testing sample to break the limits caused by the small sample size in microarray-based classification

    Background: Microarray-based tumor classification is characterized by a very large number of features (genes) and a small number of samples. In such cases, statistical techniques cannot determine which genes are correlated with each tumor type. A popular solution is to use a subset of pre-specified genes. However, molecular variations are generally correlated with a large number of genes, and a gene that shows no individual correlation with a disease may still be informative in combination with other genes. Results: In this paper, we propose a new classification strategy that can reduce the effect of over-fitting without the need to pre-select a small subset of genes. Our solution works by taking advantage of the information embedded in the testing samples. We note that a well-defined classification algorithm works best when the data are properly labeled; hence, our classification algorithm discriminates all samples best when the testing sample is assumed to belong to its correct class. We compare our solution with several well-known alternatives for tumor classification on a variety of publicly available data sets. Our approach consistently leads to better classification results. Conclusion: Studies indicate that thousands of samples may be required to extract useful statistical information from microarray data. Herein, it is shown that this problem can be circumvented by using the information embedded in the testing samples.
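    As a concrete illustration of the strategy, the sketch below tries each candidate label for the testing sample in turn, refits a classifier on the augmented data, and keeps the label under which all samples are discriminated best; the choice of classifier (logistic regression) and the confidence-based scoring rule are assumptions made for the example, not the exact procedure of the paper.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def transductive_predict(X_train, y_train, x_test):
        """Return the candidate label under which a refit classifier
        discriminates all samples (training + test) best."""
        best_label, best_score = None, -np.inf
        for candidate in np.unique(y_train):
            # Add the test sample to the training set with the candidate label.
            X_aug = np.vstack([X_train, x_test[None, :]])
            y_aug = np.append(y_train, candidate)
            clf = LogisticRegression(max_iter=1000).fit(X_aug, y_aug)
            # Score how confidently the refit model separates every sample under
            # this labeling: mean log-probability assigned to the chosen labels.
            proba = clf.predict_proba(X_aug)
            cols = np.searchsorted(clf.classes_, y_aug)
            score = np.log(proba[np.arange(len(y_aug)), cols]).mean()
            if score > best_score:
                best_label, best_score = candidate, score
        return best_label

    # Toy usage with many features and few samples, as in microarray data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 500))        # 20 samples, 500 "genes"
    y = np.array([0] * 10 + [1] * 10)
    X[y == 1, :5] += 2.0                  # a handful of informative genes
    print(transductive_predict(X[:-1], y[:-1], X[-1]))

    The cost of the scheme is one refit per candidate class per test sample, which is usually acceptable at the small sample sizes the abstract describes.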

    Cell-cycle inhibition and immune microenvironment in breast cancer treated with ribociclib and letrozole or chemotherapy

    In this study, we performed genomic analyses of cell cycle and tumor microenvironment changes during and after ribociclib and letrozole or chemotherapy in the CORALLEEN trial. A total of 106 women with untreated PAM50-defined Luminal B early breast cancer were randomly assigned to receive neoadjuvant ribociclib and letrozole or standard-of-care chemotherapy. Ki67 immunohistochemistry, tumor-infiltrating lymphocyte quantification, and RNA sequencing were performed on tissue biopsies obtained pre-treatment and on day 14 of treatment, and on tumor specimens from surgical resection. Results showed that at surgery, Ki67 and the PAM50 proliferation scores were lower after ribociclib compared to chemotherapy. However, consistent reactivation of tumor cell proliferation from day 14 to surgery was only observed in the ribociclib arm. In tumors with complete cell cycle arrest (CCCA) at surgery, PAM50 proliferation scores were lower in the ribociclib arm compared to chemotherapy (p < 0.001), whereas the opposite was observed with tumor cellularity (p = 0.002). Gene expression signatures (GES) associated with antigen-presenting cells (APCs) and innate immune system activity showed increased expression post-chemotherapy but decreased expression post-ribociclib. Interferon-associated GES had decreased expression with CCCA and increased expression with non-CCCA. Our findings suggest that while both treatment strategies decreased proliferation, the depth and the patterns over time differed by treatment arm. Immunologically, ribociclib was associated with downregulation of GES related to APCs and the innate immune system in Luminal B tumors, contrary to existing preclinical data. Further studies are needed to understand the effect of CDK4/6 inhibition on the tumor cells and microenvironment, an effect which may vary according to tumor subtype.