
    Affect-based information retrieval

    One of the main challenges Information Retrieval (IR) systems face nowadays originates from the semantic gap problem: the semantic difference between a user’s query representation and the internal representation of an information item in a collection. The gap is further widened when the user is driven by an ill-defined information need, often the result of an anomaly in his/her current state of knowledge. The formulated search queries, which are submitted to the retrieval systems to locate relevant items, produce poor results that do not address the users’ information needs. To deal with information-need uncertainty, IR systems have in the past employed a range of relevance feedback (RF) techniques, varying from explicit to implicit. The first category necessitates the communication of explicit relevance judgments in return for better query reformulations and recommendations of relevant results; however, this comes at the expense of users’ cognitive resources and introduces an additional layer of complexity to the search process. Implicit feedback techniques, on the other hand, infer what is relevant from observations of user search behaviour, disengaging users from the cognitive burden of document rating and relevance assessment. Both categories of RF techniques, however, determine topical relevance with respect to the cognitive and situational levels of interaction, failing to acknowledge the importance of emotions in cognition and decision making. In this thesis I investigate the role of emotions in the information seeking process and develop affective feedback techniques for interactive IR. This novel feedback framework aims to aid the search process and facilitate a more natural and meaningful interaction. I develop affective models that determine topical relevance based on information gathered from various sensory channels, and enhance their performance using personalisation techniques. Furthermore, I present an operational video retrieval system that employs affective feedback to enrich user profiles and offers meaningful recommendations of unseen videos. The use of affective feedback as a surrogate for the information need is formalised as the Affective Model of Browsing, a cognitive model that motivates the use of evidence extracted from the psycho-somatic mobilisation that occurs during cognitive appraisal. Finally, I address some of the ethical and privacy issues that arise from the social-emotional interaction between users and computer systems. The study involves questionnaire data gathered over three user studies, from 74 participants of different educational backgrounds, ethnicities and search experience. The results show that affective feedback is a promising area of research and can improve many aspects of the information seeking process, such as indexing, ranking and recommendation. Eventually, relevance inferences obtained from affective models may provide a more robust and personalised form of feedback, allowing us to deal more effectively with issues such as the semantic gap.
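    The thesis's own models are not reproduced here, but the core idea of treating affective feedback as an implicit relevance signal that is blended with a topical score can be sketched roughly as follows. The function names (affective_relevance, rerank), the logistic mapping and the weighting are illustrative assumptions, not the author's implementation.

        import numpy as np

        def affective_relevance(features: np.ndarray, weights: np.ndarray) -> float:
            """Map sensor-derived affective features (e.g. arousal and valence
            estimates) to a pseudo relevance probability with a logistic model."""
            return float(1.0 / (1.0 + np.exp(-features @ weights)))

        def rerank(results, affect_scores, alpha=0.7):
            """Blend the engine's topical score with the affect-derived score;
            `results` is a list of (doc_id, topical_score) pairs."""
            blended = [(doc, alpha * topical + (1 - alpha) * affect_scores[doc])
                       for doc, topical in results]
            return sorted(blended, key=lambda pair: pair[1], reverse=True)

        # Hypothetical usage: two documents with affect inferred from sensor features.
        w = np.array([0.8, -0.3])
        affect = {"d1": affective_relevance(np.array([0.6, 0.1]), w),
                  "d2": affective_relevance(np.array([0.1, 0.9]), w)}
        print(rerank([("d1", 0.42), ("d2", 0.55)], affect))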

    EEG Based Emotion Identification Using Unsupervised Deep Feature Learning

    Capturing the user’s emotional state is an emerging way to obtain implicit relevance feedback in information retrieval (IR). Recently, EEG-based emotion recognition has drawn increasing attention. However, a key challenge is the effective learning of useful features from EEG signals. In this paper, we present our ongoing work on using a Deep Belief Network (DBN) to automatically extract high-level features from raw EEG signals. Our preliminary experiment on the DEAP dataset shows that the learned features perform comparably to manually engineered features for emotion recognition.
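    The paper itself publishes no code, but the pipeline it describes, unsupervised feature learning on EEG followed by a supervised classifier, can be approximated with off-the-shelf tools. A minimal sketch under stated assumptions: synthetic arrays stand in for preprocessed DEAP trials, and a single BernoulliRBM layer is a simplified stand-in for a full DBN (stacking more RBM steps would mimic greedy layer-wise pre-training).

        import numpy as np
        from sklearn.neural_network import BernoulliRBM
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import Pipeline
        from sklearn.preprocessing import MinMaxScaler

        # Synthetic stand-in for preprocessed EEG trials: 128 trials x 1,024 features.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(128, 1024))
        y = rng.integers(0, 2, size=128)          # e.g. low vs. high arousal

        model = Pipeline([
            ("scale", MinMaxScaler()),            # RBMs expect inputs in [0, 1]
            ("rbm", BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=20)),
            ("clf", LogisticRegression(max_iter=1000)),
        ])
        model.fit(X, y)
        print("training accuracy:", model.score(X, y))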

    Exploring Peripheral Physiology as a Predictor of Perceived Relevance in Information Retrieval

    Peripheral physiological signals, as obtained using electrodermal activity and facial electromyography over the corrugator supercilii muscle, are explored as indicators of perceived relevance in information retrieval tasks. An experiment with 40 participants is reported, in which these physiological signals are recorded while participants perform information retrieval tasks. Appropriate feature engineering is defined, and the feature space is explored. The results indicate that features in the window of 4 to 6 seconds after the relevance judgment for electrodermal activity, and from 1 second before to 2 seconds after the relevance judgment for corrugator supercilii activity, are associated with the users’ perceived relevance of information items. A classifier verified the predictive power of the features and showed up to a 14% improvement in predicting relevance. Our research can help the design of intelligent user interfaces for information retrieval that can detect the user’s perceived relevance from physiological signals and complement or replace conventional relevance feedback.
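    The event-locked windows reported above map directly onto simple feature extraction. A minimal sketch, assuming a 32 Hz sampling rate and synthetic single-trial signals; the sampling rate, function names and data are placeholders rather than values taken from the paper.

        import numpy as np

        FS = 32  # assumed sampling rate in Hz

        def window_mean(signal, event_idx, start_s, end_s):
            """Average a physiological signal in a window defined relative to the
            relevance-judgment event (negative start = before the event)."""
            lo = event_idx + int(start_s * FS)
            hi = event_idx + int(end_s * FS)
            return float(signal[max(lo, 0):hi].mean())

        def features(eda, emg, event_idx):
            return [
                window_mean(eda, event_idx, 4.0, 6.0),   # EDA 4-6 s after the judgment
                window_mean(emg, event_idx, -1.0, 2.0),  # corrugator -1 to +2 s
            ]

        # Hypothetical single trial.
        rng = np.random.default_rng(1)
        eda, emg = rng.normal(size=600), rng.normal(size=600)
        print(features(eda, emg, event_idx=300))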

    Extracting Relevance and Affect Information from Physiological Text Annotation

    We present physiological text annotation, which refers to the practice of associating physiological responses with text content in order to infer characteristics of the user’s information needs and affective responses. Text annotation is a laborious task, and implicit feedback has been studied as a way to collect annotations without requiring any explicit action from the user. Previous work has explored behavioral signals, such as clicks or dwell time, to automatically infer annotations, while physiological signals have mostly been explored for image or video content. We report on two experiments in which physiological text annotation is studied, first to 1) indicate perceived relevance and then to 2) indicate affective responses of the users. The first experiment tackles the user’s perception of the relevance of an information item, which is fundamental to revealing the user’s information needs. The second experiment is then aimed at revealing the user’s affective responses towards a relevant text document. Results show that physiological user signals are associated with relevance and affect. In particular, electrodermal activity (EDA) was found to be different when users read relevant content than when they read irrelevant content, and was found to be lower when reading texts with negative emotional content than when reading texts with neutral content. Together, the experiments show that physiological text annotation can provide valuable implicit inputs for personalized systems. We discuss how our findings help design personalized systems that can annotate digital content using human physiology without the need for any explicit user interaction.
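    A minimal sketch of how such an analysis and annotation step might look, assuming synthetic per-episode EDA means and an arbitrary decision threshold; the numbers, the threshold and the annotate helper are illustrative, not values or code from the paper.

        import numpy as np
        from scipy.stats import ttest_ind

        rng = np.random.default_rng(2)
        # Mean EDA level per reading episode (arbitrary units), split by whether
        # the participant was reading relevant or irrelevant text.
        eda_relevant   = rng.normal(loc=0.55, scale=0.1, size=30)
        eda_irrelevant = rng.normal(loc=0.45, scale=0.1, size=30)

        t, p = ttest_ind(eda_relevant, eda_irrelevant)
        print(f"t = {t:.2f}, p = {p:.4f}")

        def annotate(doc_id, eda_mean, threshold=0.5):
            """Attach an implicit annotation to a document; the threshold is a
            placeholder, not a value reported in the paper."""
            return {"doc": doc_id, "implicit_relevance": eda_mean > threshold}

        print(annotate("news_article_17", float(eda_relevant.mean())))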

    Implicit Interaction with Textual Information using Physiological Signals

    Implicit interaction refers to human-computer interaction techniques that do not require active engagement from the users. Instead, the user is passively monitored while performing a computer task, and the data gathered is used to infer implicit measures as inputs to the system. Among the multiple applications for implicit interaction, collecting user feedback on information content is one that has increasingly been investigated. As the amount of available information increases, traditional methods that rely on the users' explicit input become less feasible. As measurement devices become less intrusive, physiological signals arise as a valid approach for generating implicit measures when users interact with information. These signals have mostly been investigated in response to audio-visual content, while it is still unclear how to use physiological signals for implicit interaction with textual information. This dissertation contributes to the body of knowledge by studying physiological signals for implicit interaction with textual information. The research targets three main research areas: a) physiology for implicit relevance measures, b) physiology for implicit affect measures, and c) physiology for real-time implicit interaction. Together, these provide understanding not only of what type of implicit measures can be extracted from physiological signals of users interacting with textual information, but also of how these can be used in real time as part of fully integrated interactive information systems. The first research area targets perceived relevance, as the most noteworthy underlying property regarding the user's interaction with information items. Two experimental studies are presented that evaluate the potential of brain activity, electrodermal activity, and facial muscle activity as candidate measures to infer relevance from textual information. The second research area targets affective reactions of the users. The thesis presents two experimental studies that target brain activity, electrodermal activity, and cardiovascular activity to indicate users' affective responses to textual information. The third research area focuses on demonstrating how these measures can be used in a closed interactive loop. The dissertation reports on two systems that use physiological signals to generate implicit measures that capture the user's responses to textual information. The systems demonstrate real-time generation of implicit physiological measures, as well as information recommendation on the basis of implicit physiological measures. This thesis advances the understanding of how physiological signals can be implemented for implicit interaction in information systems. The work calls for researchers and practitioners to consider the use of physiological signals as implicit inputs for improved information delivery and personalization.
    Implicit interaction refers to human-computer interaction techniques that do not demand the user's attention. Instead, the system gathers information about the user passively and uses it as operational input. For example, when we write a message (explicit interaction), the system recognises a typing error we have made and automatically corrects the misspelled word (implicit interaction). Implicit interaction thus opens up new interaction channels without burdening the user at all. With the development of measurement devices, implicit interaction can also draw on physiological signals such as brain responses and cardiovascular reactions. Analysing these signals reveals information about the user's interests and emotions with respect to the content presented by the computer, giving the system better means to respond to the user's needs. The purpose of this dissertation is to study users' physiological signals, to gather information about their reactions and opinions regarding text-based information, and to use these signals and data to enable implicit interaction. Specifically, the aims are to study a) the ability of physiological signals to indicate how interesting the user finds the text they are reading, b) the usefulness of physiological signals for predicting what kinds of emotional reactions (e.g. amusement) texts evoke in the reader, and c) the usefulness of physiological signals in real-time implicit interaction. The results show that physiological signals offer a workable solution for real-time implicit interaction with text-based content. The main message of the findings to the research community and practitioners is that, as implicit inputs, physiological signals ease the flow of information and improve personalisation in human-computer interaction.
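    As a very rough illustration of the closed interactive loop described above, the following sketch accumulates an implicit measure per topic and recommends the topic with the most evidence. Every name and value is hypothetical; a real system would replace read_implicit_measure with an actual real-time physiological pipeline.

        import random

        def read_implicit_measure() -> float:
            """Placeholder for a real-time physiological pipeline; here it simply
            returns a random pseudo-relevance score in [0, 1]."""
            return random.random()

        def recommend(candidates, profile):
            """Recommend the candidate topic with the most accumulated implicit evidence."""
            return max(candidates, key=lambda c: profile.get(c, 0.0))

        profile = {}
        shown = ["sports", "politics", "science", "culture"]
        for topic in shown:                       # the user reads one item per topic
            profile[topic] = profile.get(topic, 0.0) + read_implicit_measure()
        print("next recommendation:", recommend(shown, profile))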

    Moderating effects of self-perceived knowledge in a relevance assessment task: an EEG study

    Relevance assessment, a crucial aspect of Human-Computer Information Retrieval (HCIR), denotes how well retrieved information meets the user’s information need (IN). Recently, user-centred research has benefited from the employment of brain imaging, which has contributed to our understanding of relevance assessment and associated cognitive processes. However, the effect of contextual aspects, such as the searcher’s self-perceived knowledge (SPK), on relevance assessment and its underlying neurocognitive processes has not been studied. This work investigates the impact of users’ SPK about a topic (i.e. ‘knowledgeable’ vs. ‘not knowledgeable’) on relevance assessments (i.e. ‘relevant’ vs. ‘non-relevant’). To do so, using electroencephalography (EEG), we measured the neural activity of twenty-five participants while they provided relevance assessments during a question-and-answering (Q/A) task. In the analysis, we considered the effects of SPK and, specifically, how it modulates the brain activity underpinning relevance judgements. Data-driven analysis revealed significant event-related potential differences (P300/CPP, N400, LPC) that were modulated by searchers’ SPK in the context of relevance assessment. We speculate that SPK affects distinct cognitive processes associated with attention, semantic integration and categorisation, memory, and decision formation that underpin the formation of relevance assessments. Our findings are an important step toward a better understanding of the role users’ SPK plays during relevance assessment.
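    The ERP contrasts underlying such an analysis reduce to event-locked epoching and condition-wise averaging. A minimal numpy sketch under stated assumptions: a 250 Hz sampling rate, synthetic EEG, and hand-picked event indices; a real analysis would use a dedicated EEG toolbox and statistical testing.

        import numpy as np

        FS = 250  # assumed EEG sampling rate in Hz

        def epoch(eeg, events, tmin=-0.2, tmax=0.8):
            """Cut fixed-length epochs (channels x samples) around each event onset."""
            lo, hi = int(tmin * FS), int(tmax * FS)
            return np.stack([eeg[:, e + lo:e + hi] for e in events])

        def erp(epochs):
            """Average epochs to obtain the event-related potential per channel."""
            return epochs.mean(axis=0)

        rng = np.random.default_rng(3)
        eeg = rng.normal(size=(32, 60 * FS))                 # 32 channels, 60 s of data
        relevant_events = [1000, 4000, 8000]
        nonrelevant_events = [2500, 6000, 11000]
        # The difference wave is where components such as the P300/CPP, N400 and
        # LPC would be inspected.
        diff = erp(epoch(eeg, relevant_events)) - erp(epoch(eeg, nonrelevant_events))
        print(diff.shape)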

    Exploring the dynamics of the biocybernetic loop in physiological computing

    Physiological computing is a highly multidisciplinary emerging field in which the spread of results across several application areas and disciplines creates a challenge of combining the lessons learned from various studies. The thesis comprises diverse publications that together create a privileged position for contributing to a common understanding of the roles and uses of physiological computing systems, the generalizability of results across application areas, the theoretical grounding of the field (as with the various ways the psychophysiological states of the user can be modeled), and the emerging data analysis approaches from the domain of machine learning. The core of physiological computing systems has been built around the concept of the biocybernetic loop, aimed at providing real-time adaptation to the cognitions, motivations, and emotions of the user. However, the traditional concept of the biocybernetic loop has been both self-regulatory and immediate; that is, the system adapts to the user immediately. The thesis presents an argument that this is too narrow a view of physiological computing, and it explores scenarios wherein the physiological signals are used not only to adapt to the user but to aid system developers in designing better systems, as well as to aid other users of the system. The thesis includes eight case studies designed to answer three research questions: 1) what are the various dynamics the biocybernetic loop can display, 2) how do the changes in loop dynamics affect the way the user is represented and modeled, and 3) how do the choices of loop dynamics and user representations affect the selection of machine learning methods and approaches? To answer these questions, an analytical model for physiological computing is presented that divides each physiological computing system into five separate layers. The thesis presents three main findings corresponding to the three research questions. Firstly, the case studies show that physiological computing extends beyond the simple real-time self-regulatory loop. Secondly, the selected user representations seem to correlate with the type of loop dynamics. Finally, the case studies show that the machine learning approaches are implemented at the level of feature generation and are used when the loop diverges from the traditional real-time and self-regulatory dynamics into systems where the adaptation happens in the future.
    Traditional human-computer interaction is highly asymmetric: the computer can present complex audiovisual information to the human, whereas the human's communication towards the machine is limited to the keyboard and mouse. Likewise, although the human can obtain information about the computer's internal state, such as memory and processor load, the computer has no corresponding means of examining the human's internal states, such as emotions. Measuring human physiological signals in real time can solve both of these problems: in addition to the keyboard and mouse, the computer receives a large amount of information about the human's cognitive and affective states. For example, by measuring heart rate or skin conductance, the computer can infer whether the user is currently aroused or relaxed. Such real-time use of physiological signals in human-machine interaction has been studied successfully in many different contexts: driver fatigue can be measured and the driver warned when necessary, measuring computer game players makes it possible to tune the game's difficulty level, and a smartwatch can react to the user's stress by suggesting a relaxation exercise. What these cases have in common is that the user's physiological signals are used in real time to adapt the system to that same user's needs. Such real-time adaptation of a system on the basis of the user's physiological signals is called the biocybernetic loop. The biocybernetic loop has traditionally been defined as adapting the system according to a single user's momentary physiological response. The purpose of this dissertation is to study how the dynamics of the biocybernetic loop can be extended both in space (can the loop encompass several users) and in time (can the idea of the loop also work outside real time). In particular, the focus is on how changes in the loop dynamics affect the details of its implementation: should the user be modelled in different ways, and are certain types of loops better suited to machine learning than to so-called hand-crafted solutions. The dissertation includes eight user studies that reflect the behaviour of the biocybernetic loop in different contexts. The studies show that the biocybernetic loop can also be used as part of system design, when the results of physiological measurements are directed to system developers, and to help other users of the system in recommender systems, where the implicit feedback given by one user is exploited when recommending items to other users.
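    The contrast between the traditional immediate, self-regulatory loop and the extended dynamics argued for above can be sketched in a few lines. The functions, the difficulty heuristic and the arousal thresholds are illustrative assumptions, not the thesis's analytical model or its five layers.

        from collections import deque
        import random

        def sense() -> float:
            """Stand-in for a real-time physiological arousal estimate in [0, 1]."""
            return random.random()

        def realtime_loop(steps=5):
            """Classic self-regulatory loop: adapt the system to the same user immediately."""
            difficulty = 0.5
            for _ in range(steps):
                arousal = sense()
                if arousal < 0.3:
                    difficulty += 0.1          # user seems under-stimulated
                elif arousal > 0.7:
                    difficulty -= 0.1          # user seems overloaded
                print(f"arousal={arousal:.2f} -> difficulty={difficulty:.2f}")

        def deferred_loop(steps=5):
            """Extended dynamics: measurements are only logged, to be analysed later
            by developers or used to help other users (e.g. in a recommender system)."""
            log = deque(maxlen=1000)
            for _ in range(steps):
                log.append(sense())
            return log

        realtime_loop()
        print("samples logged for later use:", len(deferred_loop()))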

    Engaged or Frustrated? Disambiguating Engagement and Frustration in Search

    One of the primary ways researchers have characterized engagement is by an increase in search actions. Another possibility is that, instead of experiencing increased engagement, people who click and query frequently are actually frustrated; several studies have shown that frustration is also characterized by increases in clicking and querying behaviors. This research seeks to illuminate the differences in search behavior between participants who are engaged and those who are frustrated, and to investigate the effect of task interest on engagement and frustration. To accomplish this, a laboratory experiment was conducted with 40 participants. Participants completed four tasks and responded to questionnaires that measured their engagement, frustration, and stress. Participants were asked to rank eight topics based on interest, and were given their two most interesting and two least interesting tasks. Poor search result quality was introduced to induce frustration during their most interesting and least interesting tasks. This study found that physiological signals hold some promise for disambiguating engagement and frustration, but this depends on the time frame and manner in which they are examined. Frustrated participants had significantly more skin conductance responses during the task, while engaged participants had greater increases in skin conductance during the first 60 seconds of the task. Significant main and interaction effects for interest and frustration were found for heart rate in the window analysis, indicating that heart rate fluctuations over time can be most effective in distinguishing engagement from frustration. The multilevel modeling of engagement and frustration confirmed this, showing that interest contributed significantly to the model of skin conductance, while frustration contributed significantly to the model of heart rate. This study also found that interest had a significant effect on engagement, while the frustration manipulation effectively induced frustration. Frustration also had a significant effect on self-reported stress. Participants exhibited increases in search actions such as clicks and scrolls during periods of both engagement and frustration, but regression analysis showed that scrolls, clicks on documents, and SERP clicks were most predictive of a frustrating episode. A significant main effect for interest was found for time between queries, indicating that this could be a useful signal of engagement. A model including the physiological signals and search behaviors showed that physiological signals aided in the prediction of engagement and frustration. Findings of this research provide insight into the utility of physiological signals in distinguishing emotional states, as well as evidence about the relationship among search actions, engagement, and frustration. These findings also increase our understanding of the role emotions play in search behavior and how information about a searcher’s emotional state can be used to improve the search experience.
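    The claim that physiological signals add predictive value beyond search behaviors can be illustrated by comparing a behavior-only model with a combined one. A minimal sketch on synthetic data; the feature set, labels and classifier are placeholders, not the study's actual multilevel models or results.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(4)
        n = 160  # hypothetical task segments
        # Columns: SCR count, mean heart-rate change, scrolls, document clicks,
        # SERP clicks, time between queries (all synthetic stand-ins).
        X = rng.normal(size=(n, 6))
        y = rng.integers(0, 2, size=n)  # 1 = frustrated segment, 0 = engaged

        behaviour_only = cross_val_score(LogisticRegression(max_iter=1000), X[:, 2:], y, cv=5)
        combined       = cross_val_score(LogisticRegression(max_iter=1000), X,        y, cv=5)
        print("behaviour-only accuracy:", behaviour_only.mean())
        print("behaviour + physiology accuracy:", combined.mean())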

    Affective and Implicit Tagging using Facial Expressions and Electroencephalography.

    Recent years have seen an explosion of user-generated, untagged multimedia data, generating a need for efficient search and retrieval of this data. The predominant method for content-based tagging is manual annotation. Consequently, automatic tagging is currently the subject of intensive research. However, it is clear that the process will not be fully automated in the foreseeable future. We propose to involve the user and investigate methods for implicit tagging, wherein users' responses to the multimedia content are analysed in order to generate descriptive tags. We approach this problem through the modalities of facial expressions and EEG signals. We investigate tag validation and affective tagging using EEG signals. The former relies on the detection of event-related potentials triggered in response to the presentation of invalid tags alongside multimedia material. We demonstrate significant differences in users' EEG responses for valid versus invalid tags, and present results towards single-trial classification. For affective tagging, we propose methodologies to map EEG signals onto the valence-arousal space and perform both binary classification and regression into this space. We apply these methods in a real-time affective recommendation system. We also investigate the analysis of facial expressions for implicit tagging. This relies on a dynamic texture representation using non-rigid registration, which we first evaluate on the problem of facial action unit recognition. We present results on well-known datasets (with both posed and spontaneous expressions) comparable to the state of the art in the field. Finally, we present a multi-modal approach that fuses both modalities for affective tagging. We perform classification in the valence-arousal space based on these modalities and present results for both feature-level and decision-level fusion. We demonstrate an improvement in the results when using both modalities, suggesting that the modalities contain complementary information.
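    The difference between feature-level and decision-level fusion mentioned above can be sketched compactly. A minimal example on synthetic EEG and facial features with a plain logistic regression; the features, labels and classifier are assumptions for illustration, not the thesis's actual representations or fusion schemes.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(5)
        n = 200
        eeg_feats  = rng.normal(size=(n, 40))   # e.g. band-power features
        face_feats = rng.normal(size=(n, 20))   # e.g. dynamic-texture descriptors
        valence    = rng.integers(0, 2, size=n) # binary high/low valence label

        # Feature-level fusion: concatenate modalities before a single classifier.
        early = LogisticRegression(max_iter=1000).fit(
            np.hstack([eeg_feats, face_feats]), valence)

        # Decision-level fusion: one classifier per modality, then average the
        # predicted probabilities.
        eeg_clf  = LogisticRegression(max_iter=1000).fit(eeg_feats, valence)
        face_clf = LogisticRegression(max_iter=1000).fit(face_feats, valence)
        fused = (eeg_clf.predict_proba(eeg_feats)[:, 1]
                 + face_clf.predict_proba(face_feats)[:, 1]) / 2

        print("feature-level training accuracy:",
              early.score(np.hstack([eeg_feats, face_feats]), valence))
        print("decision-level training accuracy:", float(((fused > 0.5) == valence).mean()))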