
    Emotional expressions reconsidered: challenges to inferring emotion from human facial movements

    It is commonly assumed that a person’s emotional state can be readily inferred from his or her facial movements, typically called emotional expressions or facial expressions. This assumption influences legal judgments, policy decisions, national security protocols, and educational practices; guides the diagnosis and treatment of psychiatric illness, as well as the development of commercial applications; and pervades everyday social interactions as well as research in other scientific fields such as artificial intelligence, neuroscience, and computer vision. In this article, we survey examples of this widespread assumption, which we refer to as the common view, and we then examine the scientific evidence that tests this view, focusing on the six most popular emotion categories used by consumers of emotion research: anger, disgust, fear, happiness, sadness, and surprise. The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more often than would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation. Furthermore, similar configurations of facial movements variably express instances of more than one emotion category. In fact, a given configuration of facial movements, such as a scowl, often communicates something other than an emotional state. Scientists agree that facial movements convey a range of information and are important for social communication, emotional or otherwise. But our review suggests an urgent need for research that examines how people actually move their faces to express emotions and other social information in the variety of contexts that make up everyday life, as well as careful study of the mechanisms by which people perceive instances of emotion in one another.
We make specific research recommendations that will yield a more valid picture of how people move their faces to express emotions and how they infer emotional meaning from facial movements in situations of everyday life. This research is crucial to provide consumers of emotion research with the translational information they require.

    Smart Pain Assessment tool for critically ill patients unable to communicate: Early stage development of a medical device

    Critically ill patients often experience pain during their treatment, but because of patients’ lowered ability to communicate, pain assessment may be challenging. The aim of the study was to develop the concept of the Smart Pain Assessment tool, based on Internet of Things technology, for critically ill patients who are unable to communicate their pain. The study describes two phases of the early-stage development of the Smart Pain Assessment tool within a medical device development framework. The initiation Phase I consists of a scoping review, conducted to explore the potential of Internet of Things technology in basic nursing care. In the formulation Phase II, the prototype of the Smart Pain Assessment tool was tested and the concept was evaluated for feasibility. The prototype was tested with healthy participants (n=31) during experimental pain, measuring pain-related physiological variables and the activity of five facial muscles. The variables were combined using machine learning to create a model for pain prediction. The feasibility of the concept was evaluated in focus group interviews with critical care nurses (n=20) as potential users of the device. The literature review suggests that the development of Internet of Things-based innovations in basic nursing care is diverse but still in its early stages. The prototype was able to identify experimental pain and classify its intensity as mild or moderate/severe with 83% accuracy. In addition, three of the five facial muscles tested were recognised as providing the most pain-related information. According to critical care nurses, the Smart Pain Assessment tool could be used to ensure pain assessment, but it needs to be integrated into an existing patient monitoring and information system, and the reliability of the data provided by the device needs to be assessable for nurses.
Based on the results of this study, it is possible to detect and classify the intensity of experimental pain automatically using an Internet of Things-based device. The prototype of the device should be further developed and tested in clinical trials, involving the users at each stage of development to ensure clinical relevance and a user-centric design.
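The classification step described above — combining pain-related physiological variables and facial-muscle activity into a model that labels pain as mild or moderate/severe — can be illustrated with a minimal sketch. The abstract does not name the features or the learning algorithm, so the feature names (heart rate, skin conductance, three EMG channels) and the nearest-centroid classifier below are illustrative assumptions, not the study's method.

```python
# Hypothetical sketch: classify pain intensity from physiological and
# facial-EMG features. Feature names and training values are invented;
# the study's actual features and model are not specified in the abstract.
import math

# Each sample: [heart_rate, skin_conductance, emg_1, emg_2, emg_3]
# Labels: 0 = mild pain, 1 = moderate/severe pain (the two classes reported)
TRAIN = [
    ([72.0, 2.1, 0.10, 0.08, 0.05], 0),
    ([75.0, 2.4, 0.12, 0.09, 0.06], 0),
    ([90.0, 4.8, 0.35, 0.30, 0.20], 1),
    ([95.0, 5.2, 0.40, 0.33, 0.22], 1),
]

def centroids(samples):
    """Compute the mean feature vector for each class label."""
    sums, counts = {}, {}
    for x, y in samples:
        counts[y] = counts.get(y, 0) + 1
        sums[y] = [a + b for a, b in zip(sums.get(y, [0.0] * len(x)), x)]
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(x, cents):
    """Assign the class whose centroid is nearest in Euclidean distance."""
    return min(cents, key=lambda y: math.dist(x, cents[y]))

cents = centroids(TRAIN)
print(predict([74.0, 2.2, 0.11, 0.08, 0.05], cents))  # prints 0 (mild-like sample)
print(predict([93.0, 5.0, 0.38, 0.31, 0.21], cents))  # prints 1 (moderate/severe-like)
```

In practice a model of this kind would be trained on windowed signal features from many participants and validated against self-reported pain scores; the 83% accuracy reported in the study refers to its own prototype and data, not to this sketch.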

    2011 Annual Research Symposium Abstract Book

    2011 annual volume of abstracts for science research projects conducted by students at Trinity College

    Ethical issues in labour care in Sri Lanka


    Facial Expression Rendering in Medical Training Simulators: Current Status and Future Directions

    Recent technological advances in robotic sensing and actuation methods have prompted the development of a range of new medical training simulators with multiple feedback modalities. Learning to interpret the facial expressions of a patient during medical examinations or procedures has been one of the key focus areas in medical training. This paper reviews the facial expression rendering systems in medical training simulators that have been reported to date. Facial expression rendering approaches in other domains are also summarized to incorporate the knowledge from those works into developing systems for medical training simulators. Classifications and comparisons of medical training simulators with facial expression rendering are presented, and important design features, merits and limitations are outlined. Medical educators, students and developers are identified as the three key stakeholders involved with these systems, and their considerations and needs are presented. Physical-virtual (hybrid) approaches provide multimodal feedback, render facial expressions accurately, and can simulate patients of different ages, genders and ethnic groups, making them more versatile than purely virtual or physical systems. The overall findings of this review and the proposed future directions will benefit researchers interested in initiating or developing such facial expression rendering systems in medical training simulators. This work was supported by the Robopatient project funded by the EPSRC Grant No EP/T00519X/

    8th Annual Research Week - Event Proceedings

    8th Annual Research Week