4,063 research outputs found

    Behavioral state detection of newborns based on facial expression analysis

    Prematurely born infants are observed in a Neonatal Intensive Care Unit (NICU) for medical treatment. While their vital body functions are continuously monitored, the incubator is covered by a blanket for medical reasons. This prevents visual observation of the newborns for most of the day, even though facial expression is known to give valuable information about the presence of discomfort. This prompted the authors to develop a prototype automated video survey system that detects discomfort in newborn babies by analyzing their facial expression. Since only a reliable and situation-independent system is useful, we focus on robustness against non-ideal viewpoints and lighting conditions. Our proposed algorithm automatically segments the face from the background and localizes the eye, eyebrow and mouth regions. Based on measurements in these regions, a hierarchical classifier discriminates between the behavioral states sleep, awake and cry. We evaluated the prototype on recordings of three healthy newborns and show that the algorithm operates with approximately 95% accuracy. Small changes in viewpoint and lighting conditions are tolerated, but the algorithm fails under a major reduction in light or a viewpoint far from frontal. © 2009 Springer Berlin Heidelberg
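    A minimal sketch of the hierarchical (two-stage) classification idea this abstract describes: a first split on the eye region separates sleep from non-sleep, and a second split on mouth and eyebrow measurements separates cry from the remaining states. The feature names, thresholds, and decision rules below are illustrative assumptions, not the authors' actual measurements.

```python
# Sketch of a two-stage behavioral-state classifier (sleep / awake / cry).
# Features and thresholds are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class FaceFeatures:
    eye_openness: float    # 0 = fully closed, 1 = fully open (assumed scale)
    mouth_openness: float  # normalized mouth-region measurement (assumed)
    brow_distance: float   # eyebrow-to-eye distance; lowered brows when crying (assumed)

def looks_like_cry(f: FaceFeatures) -> bool:
    # Wide-open mouth plus lowered brows taken as the cry signature here.
    return f.mouth_openness > 0.6 and f.brow_distance < 0.3

def classify_state(f: FaceFeatures) -> str:
    # Stage 1: eye region separates sleep from non-sleep.
    if f.eye_openness < 0.2:
        # Infants often cry with eyes squeezed shut, so re-check here.
        return "cry" if looks_like_cry(f) else "sleep"
    # Stage 2: among eyes-open states, separate quiet-awake from cry.
    return "cry" if looks_like_cry(f) else "awake"

if __name__ == "__main__":
    print(classify_state(FaceFeatures(0.1, 0.1, 0.8)))  # -> sleep
    print(classify_state(FaceFeatures(0.9, 0.8, 0.2)))  # -> cry
```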

    Differences between uni- and multidimensional scales for assessing pain in term newborn infants at the bedside

    OBJECTIVES: This study sought to determine the level of agreement between behavioral and multidimensional pain assessment scales in term newborn infants subjected to an acute nociceptive stimulus. METHODS: This cross-sectional study was performed on 400 healthy term newborns who received an intramuscular injection of vitamin K during the first 6 hours of life. Two behavioral pain scales (the Neonatal Facial Coding System and the Behavioral Indicators of Infant Pain) and one multidimensional tool (the Premature Infant Pain Profile) were applied by a single observer before the procedure, during cleansing, during injection and two minutes after injection. The Cochran Q, McNemar and kappa tests were used to compare the presence and degree of agreement between the three scales. The Hotelling T2 test was used to compare the groups of newborns for which the scales agreed or disagreed. A generalized linear regression was used to compare the results of the Neonatal Facial Coding System and the Behavioral Indicators of Infant Pain across the four study time points. RESULTS: The neonates studied had a gestational age of 39±1 weeks, a birth weight of 3169±316 g and a postnatal age of 67±45 minutes. During the stimulus procedure, 80% of the newborns exhibited pain behaviors according to the Neonatal Facial Coding System and the Behavioral Indicators of Infant Pain, and 70% experienced pain according to the Premature Infant Pain Profile (p<0.001). The frequencies of pain detection using the Behavioral Indicators of Infant Pain and the Neonatal Facial Coding System were similar. The characteristics of the neonates were not associated with the level of agreement between the scales. CONCLUSION: The Neonatal Facial Coding System and the Behavioral Indicators of Infant Pain behavioral scales are more sensitive for identifying pain in healthy term newborn infants than the multidimensional Premature Infant Pain Profile scale.
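    A minimal sketch, in Python, of the scale-agreement analysis this abstract names: Cochran's Q across the three scales, McNemar for a pairwise comparison, and kappa for chance-corrected agreement. The binary arrays are synthetic placeholders (1 = pain detected), not the study's data.

```python
import numpy as np
from statsmodels.stats.contingency_tables import cochrans_q, mcnemar
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)
n = 400  # newborns assessed, matching the study design

# Hypothetical binary pain classifications from the three scales.
nfcs = rng.integers(0, 2, n)  # Neonatal Facial Coding System
biip = rng.integers(0, 2, n)  # Behavioral Indicators of Infant Pain
pipp = rng.integers(0, 2, n)  # Premature Infant Pain Profile

# Cochran's Q: do the three scales detect pain at the same rate?
q = cochrans_q(np.column_stack([nfcs, biip, pipp]))
print(f"Cochran Q = {q.statistic:.2f}, p = {q.pvalue:.3f}")

# McNemar: pairwise comparison of two scales on the same newborns,
# built from the 2x2 table of concordant/discordant classifications.
table = np.array([
    [np.sum((nfcs == 1) & (pipp == 1)), np.sum((nfcs == 1) & (pipp == 0))],
    [np.sum((nfcs == 0) & (pipp == 1)), np.sum((nfcs == 0) & (pipp == 0))],
])
print(mcnemar(table, exact=False, correction=True))

# Cohen's kappa: chance-corrected agreement between two scales.
print(f"kappa(NFCS, BIIP) = {cohen_kappa_score(nfcs, biip):.2f}")
```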

    Assessment and detection of pain in noncommunicative severely brain-injured patients.

    Detecting pain in severely brain-injured patients recovering from coma represents a real challenge. Patients with disorders of consciousness are unable to communicate their feelings and potential perception of pain consistently or reliably. However, recent studies suggest that patients in a minimally conscious state can experience pain to some extent. Pain monitoring in these patients is hence of medical and ethical importance. In this article, we focus on the possible use of behavioral scales for the assessment and detection of pain in noncommunicative patients.

    Atypical eye contact in autism: Models, mechanisms and development

    An atypical pattern of eye contact behaviour is one of the most significant symptoms of Autism Spectrum Disorder (ASD). Recent empirical advances have revealed the developmental, cognitive and neural basis of atypical eye contact behaviour in ASD. We review different models and advance a new ‘fast-track modulator model’. Specifically, we propose that atypical eye contact processing in ASD originates in the lack of influence from a subcortical face and eye contact detection route, which is hypothesized to modulate eye contact processing and guide its emergent specialization during development.

    Cultural modulation of face and gaze scanning in young children

    Previous research has demonstrated that the way human adults look at others’ faces is modulated by their cultural background, but very little is known about how such a culture-specific pattern of face gaze develops. The current study investigated the role of cultural background in the development of face scanning in young children between the ages of 1 and 7 years, and its modulation by the eye gaze direction of the face. British and Japanese participants’ eye movements were recorded while they observed faces moving their eyes towards or away from the participants. British children fixated more on the mouth, whereas Japanese children fixated more on the eyes, replicating the results with adult participants. No cultural differences were observed in the differential responses to direct and averted gaze. The results suggest that different patterns of face scanning exist between cultures from the first years of life, but that the differential scanning of direct and averted gaze associated with different cultural norms develops later in life.
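    A minimal sketch of the kind of area-of-interest (AOI) analysis such eye-tracking studies rely on: the share of total fixation time that falls in the eye versus mouth regions of a face. The AOI coordinates and fixation triples below are hypothetical.

```python
# Proportion of looking time per facial AOI from raw fixations.
from typing import Dict, List, Tuple

# AOIs as (x_min, y_min, x_max, y_max) in screen pixels (assumed layout).
AOIS = {
    "eyes":  (300, 200, 500, 280),
    "mouth": (340, 380, 460, 440),
}

def aoi_proportions(fixations: List[Tuple[float, float, float]]) -> Dict[str, float]:
    """fixations: (x, y, duration_ms) triples; returns the share of total
    fixation time spent in each AOI."""
    total = sum(d for _, _, d in fixations) or 1.0
    shares = {}
    for name, (x0, y0, x1, y1) in AOIS.items():
        in_aoi = sum(d for x, y, d in fixations if x0 <= x <= x1 and y0 <= y <= y1)
        shares[name] = in_aoi / total
    return shares

print(aoi_proportions([(400, 240, 300), (410, 250, 200), (400, 400, 250)]))
# -> {'eyes': 0.667, 'mouth': 0.333}: a higher 'eyes' share corresponds to
# the Japanese-style pattern, a higher 'mouth' share to the British-style one.
```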

    Facial expression of pain: an evolutionary account.

    This paper proposes that human expression of pain in the presence or absence of caregivers, and the detection of pain by observers, arises from evolved propensities. The function of pain is to demand attention and prioritise escape, recovery, and healing; where others can help achieve these goals, effective communication of pain is required. Evidence is reviewed of a distinct and specific facial expression of pain from infancy to old age, consistent across stimuli, and recognizable as pain by observers. Voluntary control over amplitude is incomplete, and observers can better detect pain that the individual attempts to suppress than pain that is amplified or simulated. In many clinical and experimental settings, the facial expression of pain is incorporated with verbal and nonverbal vocal activity, posture, and movement in an overall category of pain behaviour. Clinicians assume this behaviour to be under operant control of social contingencies such as sympathy, caregiving, and practical help; thus, strong facial expression is presumed to constitute an attempt to manipulate these contingencies by amplification of the normal expression. Operant formulations support skepticism about the presence or extent of pain, judgments of malingering, and sometimes the withholding of caregiving and help. To the extent that pain expression is influenced by environmental contingencies, however, "amplification" could equally plausibly constitute the release of suppression according to evolved contingent propensities that guide behaviour. Pain has been largely neglected in the evolutionary literature and the literature on expression of emotion, but an evolutionary account can generate improved assessment of pain and reactions to it.

    A unified coding strategy for processing faces and voices

    Both faces and voices are rich in socially relevant information, which humans are remarkably adept at extracting, including a person's identity, age, gender, affective state, and personality. Here, we review accumulating evidence from behavioral, neuropsychological, electrophysiological, and neuroimaging studies suggesting that the cognitive and neural processing mechanisms engaged by perceiving faces or voices are highly similar, despite the very different nature of their sensory input. This similarity likely facilitates the multi-modal integration of facial and vocal information during everyday social interactions. These findings emphasize a parsimonious principle of cerebral organization, where similar computational problems in different modalities are solved using similar solutions.