152 research outputs found

    Recognizing Induced Emotions of Movie Audiences: Are Induced and Perceived Emotions the Same?

    Predicting the emotional response of movie audiences to affective movie content is a challenging task in affective computing. Previous work has focused on using audiovisual movie content to predict movie induced emotions. However, the relationship between the audience’s perceptions of the affective movie content (perceived emotions) and the emotions evoked in the audience (induced emotions) remains unexplored. In this work, we address the relationship between perceived and induced emotions in movies, and identify features and modelling approaches effective for predicting movie induced emotions. First, we extend the LIRIS-ACCEDE database by annotating perceived emotions in a crowd-sourced manner, and find that perceived and induced emotions are not always consistent. Second, we show that dialogue events and aesthetic highlights are effective predictors of movie induced emotions. In addition to movie-based features, we also study physiological and behavioural measurements of audiences. Our experiments show that induced emotion recognition can benefit from including temporal context and from including multimodal information. Our study bridges the gap between affective content analysis and induced emotion prediction.
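    The abstract notes that induced emotion recognition benefits from temporal context and multimodal information. As a hedged illustration of those two ideas only (not the paper's actual model), the sketch below combines per-segment scores from two hypothetical modalities by weighted late fusion, then smooths the fused sequence with a moving average to inject temporal context; the modality names, weights, and toy scores are all assumptions for the example.

    ```python
    # Illustrative late-fusion + temporal-smoothing sketch.
    # Each modality is assumed to yield one valence score per movie segment;
    # the modality names, weights, and values below are hypothetical.

    def late_fusion(modality_scores, weights):
        """Weighted average of per-modality score sequences, segment by segment."""
        n = len(next(iter(modality_scores.values())))
        total_weight = sum(weights.values())
        return [
            sum(weights[m] * modality_scores[m][i] for m in modality_scores) / total_weight
            for i in range(n)
        ]

    def moving_average(scores, window=3):
        """Causal moving average over preceding segments (simple temporal context)."""
        smoothed = []
        for i in range(len(scores)):
            lo = max(0, i - window + 1)
            chunk = scores[lo:i + 1]
            smoothed.append(sum(chunk) / len(chunk))
        return smoothed

    # Toy per-segment scores for two modalities (hypothetical values).
    audio = [0.2, 0.4, 0.8, 0.6]    # e.g. audiovisual content features
    physio = [0.0, 0.2, 0.6, 0.8]   # e.g. audience physiological signals

    fused = late_fusion({"audio": audio, "physio": physio},
                        {"audio": 0.6, "physio": 0.4})
    smoothed = moving_average(fused)
    ```

    Late fusion keeps each modality's model independent, which is convenient when modalities have different sampling rates or missing data; the smoothing window stands in for the richer temporal models the abstract alludes to.
    
    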

    Three FLOWERING LOCUS T-like genes function as potential florigens and mediate photoperiod response in sorghum

    Sorghum is a typical short-day (SD) plant, and its use in grain or biomass production in temperate regions depends on flowering time control, yet the underlying molecular mechanism of floral transition in sorghum is poorly understood. Here we characterized sorghum FLOWERING LOCUS T (SbFT) genes to establish a molecular road map for mechanistic understanding. Out of 19 PEBP genes, SbFT1, SbFT8 and SbFT10 were identified as potential candidates for encoding florigens using multiple approaches. Phylogenetic analysis revealed that SbFT1 clusters with the rice Hd3a subclade, while SbFT8 and SbFT10 cluster with the maize ZCN8 subclade. These three genes are expressed in the leaf at the initiation of floral transition; they are expressed early in grain sorghum genotypes but late in sweet and forage sorghum genotypes, are induced by SD treatment in photoperiod-sensitive genotypes, are cooperatively repressed by the classical sorghum maturity loci, interact with sorghum 14-3-3 proteins, and activate flowering in transgenic Arabidopsis plants, together suggesting florigenic potential in sorghum. SD induction of these three genes in sensitive genotypes is fully reversed by 1 wk of long-day treatment, and yet some aspects of the SD treatment may still make a small contribution to flowering in long days, indicating a complex photoperiod response mediated by SbFT genes.

    Recognizing Induced Emotions of Movie Audiences From Multimodal Information

    Recognizing emotional reactions of movie audiences to affective movie content is a challenging task in affective computing. Previous research on induced emotion recognition has mainly focused on using audio-visual movie content. Nevertheless, the relationship between the perceptions of the affective movie content (perceived emotions) and the emotions evoked in the audiences (induced emotions) remains unexplored. In this work, we studied the relationship between perceived and induced emotions of movie audiences. Moreover, we investigated multimodal modelling approaches to predict movie induced emotions from movie content based features, as well as physiological and behavioral reactions of movie audiences. To carry out analysis of induced and perceived emotions, we first extended an existing database for movie affect analysis by annotating perceived emotions in a crowd-sourced manner. We find that perceived and induced emotions are not always consistent with each other. In addition, we show that perceived emotions, movie dialogues, and aesthetic highlights are discriminative for movie induced emotion recognition, in addition to spectators’ physiological and behavioral reactions. Our experiments also revealed that induced emotion recognition can benefit from including temporal information and performing multimodal fusion. Moreover, our work investigated in depth the gap between affective content analysis and induced emotion recognition by gaining insight into the relationships between aesthetic highlights, induced emotions, and perceived emotions.

    Atmospheric River Tracking Method Intercomparison Project (ARTMIP): project goals and experimental design

    The Atmospheric River Tracking Method Intercomparison Project (ARTMIP) is an international collaborative effort to understand and quantify the uncertainties in atmospheric river (AR) science that arise from the choice of detection algorithm alone. Currently, there are many AR identification and tracking algorithms in the literature with a wide range of techniques and conclusions. ARTMIP strives to provide the community with information on different methodologies and provide guidance on the most appropriate algorithm for a given science question or region of interest. All ARTMIP participants will implement their detection algorithms on a specified common dataset for a defined period of time. The project is divided into two phases: Tier 1 will utilize the Modern-Era Retrospective analysis for Research and Applications, version 2 (MERRA-2) reanalysis from January 1980 to June 2017 and will be used as a baseline for all subsequent comparisons. Participation in Tier 1 is required. Tier 2 will be optional and include sensitivity studies designed around specific science questions, such as reanalysis uncertainty and climate change. High-resolution reanalysis and/or model output will be used wherever possible. Proposed metrics include AR frequency, duration, intensity, and precipitation attributable to ARs. Here, we present the ARTMIP experimental design, timeline, project requirements, and a brief description of the variety of methodologies in the current literature. We also present results from our 1-month proof-of-concept trial run designed to illustrate the utility and feasibility of the ARTMIP project.
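    Two of the metrics the abstract proposes, AR frequency and duration, can be computed once a detection algorithm has produced a boolean flag per reanalysis time step. The sketch below is an illustrative assumption about how such summary statistics might be tallied for a single grid cell; it is not ARTMIP's specified procedure, and the toy flag series is invented for the example.

    ```python
    # Hedged sketch: summary metrics over a boolean AR-detection time series
    # for one grid cell (one flag per reanalysis time step). The function
    # names and the toy series are illustrative, not from the project.

    def ar_frequency(flags):
        """Fraction of time steps flagged as AR conditions."""
        return sum(flags) / len(flags)

    def ar_event_durations(flags):
        """Lengths (in time steps) of consecutive AR-flagged runs (events)."""
        durations, run = [], 0
        for flagged in flags:
            if flagged:
                run += 1
            elif run:
                durations.append(run)
                run = 0
        if run:  # close a run that reaches the end of the record
            durations.append(run)
        return durations

    # Toy detection output: 1 = AR conditions detected at that time step.
    flags = [0, 1, 1, 1, 0, 0, 1, 1, 0, 0]
    freq = ar_frequency(flags)
    events = ar_event_durations(flags)
    ```

    Different detection algorithms would disagree on the flag series itself, which is exactly the algorithm-dependent uncertainty ARTMIP is designed to quantify; metrics like these make that disagreement comparable across methods.
    
    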