124 research outputs found

    The Limits of Organized Employer-Employee Relations in Non-Union Facilities: Some New Evidence of Flexibility

    Get PDF

    From Short-Term Tolerance to Long-Term Recognition in Human Visual Memory

    Get PDF
    Humans have a remarkable ability to recognize visual objects following limited exposure and despite changes at the image level. How humans acquire this ability remains a mystery, and it is one area in which artificial intelligence has yet to match human performance. I sought to understand this fundamental cognitive ability by leveraging theories and methods from multiple fields. In particular, I examined how rules guiding the perception of objects in visual working memory assist in the construction of visual long-term memories. In four experiments, I reveal that our expectations for how objects move in the world are used to learn and integrate object information into visual long-term memory. Next, I further examined how aspects of memory over the short term may actually be features used to construct appropriately constrained representations over the long term. I demonstrate in three experiments that visual working memory is highly tolerant to variability at test, so that it can act as a venue for integrating information into long-term memory. Finally, I moved past investigating memory following singular experiences to understand how our memories change over repeated exposures. I discovered across three experiments that the initial quality of an experience, as well as the amount of time between repeated encounters, affects our ability to integrate and remember objects we encounter multiple times. This work contributes to our understanding of the growth process of visual memory, and attempts to form bridges between the traditionally disparate fields of vision scientists studying object perception, neuroscientists studying long-term memory, and engineers designing artificial-intelligence recognition systems.

    Hypertriglyceridemic Waist Phenotype Predicts Increased Visceral Fat in Subjects With Type 2 Diabetes

    Get PDF
    OBJECTIVE: Greater accumulation of visceral fat is strongly linked to risk of cardiovascular disease. However, elevated waist circumference by itself does not always identify individuals with increased visceral fat. RESEARCH DESIGN AND METHODS: We examined 375 subjects with type 2 diabetes from the CHICAGO cohort for the presence of the hypertriglyceridemic waist phenotype (waist circumference >90 cm in men or >85 cm in women, in conjunction with a plasma triglyceride concentration ≥177 mg/dl) to determine its usefulness for identifying subjects with increased amounts of visceral fat. We divided subjects into three groups: group 1 (low waist circumference and low triglycerides; waist circumference ≤90 cm in men or ≤85 cm in women and triglycerides <177 mg/dl, n = 18), group 2 (high waist circumference and low triglycerides; waist circumference >90 cm in men or >85 cm in women and triglycerides <177 mg/dl, n = 230), and group 3 (high waist circumference and high triglycerides; waist circumference >90 cm in men or >85 cm in women and triglycerides ≥177 mg/dl, n = 127). RESULTS: Subjects in group 3 had significantly higher visceral fat (P < 0.0001), A1C (P < 0.01), and coronary artery calcium (P < 0.05) compared with group 2, despite similar age, BMI, and waist circumference. The relationship of the phenotype to atherosclerosis, however, was attenuated by adjustment for HDL cholesterol, triglyceride-rich lipoprotein cholesterol, apolipoprotein B, or LDL particle number. CONCLUSIONS: The presence of the hypertriglyceridemic waist phenotype in subjects with type 2 diabetes identifies a subset with a greater degree of visceral adiposity. This subset also has a greater degree of subclinical atherosclerosis that may be related to the proatherogenic lipoprotein changes.
    Funding: Takeda Global Research and Development; National Institutes of Health (DK 71711); University of Illinois at Chicago.
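    For orientation, the grouping rule quoted above can be written as a small helper. This is only an illustrative sketch of the cut-offs reported in the abstract (90 cm / 85 cm waist circumference, 177 mg/dl triglycerides); the function name and return convention are hypothetical and not taken from the study.

        # Illustrative only: applies the group definitions quoted in the abstract.
        def htg_waist_group(waist_cm: float, triglycerides_mg_dl: float, is_male: bool) -> int:
            """Return 1, 2, or 3 following the grouping used in the study (0 otherwise)."""
            waist_cutoff = 90 if is_male else 85           # cm, sex-specific cut-off
            high_waist = waist_cm > waist_cutoff
            high_tg = triglycerides_mg_dl >= 177           # mg/dl
            if not high_waist and not high_tg:
                return 1   # low waist circumference, low triglycerides
            if high_waist and not high_tg:
                return 2   # high waist circumference, low triglycerides
            if high_waist and high_tg:
                return 3   # high waist circumference, high triglycerides (hypertriglyceridemic waist)
            # Low waist with high triglycerides falls outside the three groups reported.
            return 0

        print(htg_waist_group(95, 200, is_male=True))      # 3: hypertriglyceridemic waist phenotype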

    Take an Emotion Walk: Perceiving Emotions from Gaits Using Hierarchical Attention Pooling and Affective Mapping

    Full text link
    We present an autoencoder-based semi-supervised approach to classify perceived human emotions from walking styles obtained from videos or motion-captured data and represented as sequences of 3D poses. Given the motion of each joint in the pose at each time step extracted from 3D pose sequences, we hierarchically pool these joint motions in a bottom-up manner in the encoder, following the kinematic chains in the human body. We also constrain the latent embeddings of the encoder to contain the space of psychologically motivated affective features underlying the gaits. We train the decoder to reconstruct the motions per joint per time step in a top-down manner from the latent embeddings. For the annotated data, we also train a classifier to map the latent embeddings to emotion labels. Our semi-supervised approach achieves a mean average precision of 0.84 on the Emotion-Gait benchmark dataset, which contains both labeled and unlabeled gaits collected from multiple sources. We outperform current state-of-the-art algorithms for both emotion recognition and action recognition from 3D gaits by 7%--23% in absolute terms. More importantly, we improve the average precision by 10%--50% in absolute terms on classes that each make up less than 25% of the labeled part of the Emotion-Gait benchmark dataset.
    Comment: In proceedings of the 16th European Conference on Computer Vision, 2020. 18 pages, 5 figures.
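    The hierarchical pooling and semi-supervised training described above can be sketched in a few lines of PyTorch. The joint grouping into five kinematic chains, the layer sizes, and the four emotion classes below are illustrative assumptions, not the released Emotion-Gait code, and a single pose per sample is used instead of a full sequence for brevity.

        # Minimal sketch: bottom-up pooling of joint motions along assumed kinematic
        # chains in the encoder, a top-down decoder, and a classifier on the latents.
        import torch
        import torch.nn as nn

        # Hypothetical grouping of 16 joints into five chains
        # (spine/head, left arm, right arm, left leg, right leg).
        CHAINS = [[0, 1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12], [13, 14, 15]]

        class HierarchicalPoolEncoder(nn.Module):
            def __init__(self, joint_dim=3, chain_dim=32, latent_dim=64):
                super().__init__()
                # One small MLP per chain pools that chain's joint motions.
                self.chain_mlps = nn.ModuleList(
                    nn.Sequential(nn.Linear(joint_dim * len(c), chain_dim), nn.ReLU())
                    for c in CHAINS
                )
                # A body-level layer pools the chain embeddings into one latent vector.
                self.body = nn.Linear(chain_dim * len(CHAINS), latent_dim)

            def forward(self, joints):                        # joints: (batch, 16, 3)
                chain_feats = [
                    mlp(joints[:, c, :].flatten(1)) for mlp, c in zip(self.chain_mlps, CHAINS)
                ]
                return self.body(torch.cat(chain_feats, dim=1))  # (batch, latent_dim)

        class GaitAutoencoder(nn.Module):
            def __init__(self, n_joints=16, joint_dim=3, latent_dim=64, n_classes=4):
                super().__init__()
                self.encoder = HierarchicalPoolEncoder(joint_dim, latent_dim=latent_dim)
                self.decoder = nn.Sequential(                 # top-down reconstruction
                    nn.Linear(latent_dim, 128), nn.ReLU(),
                    nn.Linear(128, n_joints * joint_dim),
                )
                self.classifier = nn.Linear(latent_dim, n_classes)

            def forward(self, joints):
                z = self.encoder(joints)
                recon = self.decoder(z).view(joints.shape)
                return recon, self.classifier(z), z

        # Semi-supervised loss: reconstruction on all gaits,
        # cross-entropy only where emotion labels exist.
        model = GaitAutoencoder()
        joints = torch.randn(8, 16, 3)
        labels = torch.tensor([0, 2, -1, 1, -1, 3, -1, 0])    # -1 marks unlabeled gaits
        recon, logits, _ = model(joints)
        loss = nn.functional.mse_loss(recon, joints)
        mask = labels >= 0
        if mask.any():
            loss = loss + nn.functional.cross_entropy(logits[mask], labels[mask])
        loss.backward()

    The reconstruction term uses every gait, while the classification term is applied only to samples that carry an emotion label, which is what makes the training semi-supervised in this sketch.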

    Visual Behavior, Pupil Dilation, and Ability to Identify Emotions From Facial Expressions After Stroke

    Full text link
    Social cognition is the innate human ability to interpret the emotional state of others from contextual verbal and non-verbal information, and to self-regulate accordingly. Facial expressions are one of the most relevant sources of non-verbal communication, and their interpretation has been extensively investigated in the literature using both behavioral and physiological measures, such as those derived from visual behavior and pupil responses. The decoding of facial expressions of emotion is performed by conscious and unconscious cognitive processes that involve a complex brain network, which can be damaged after cerebrovascular accidents. A diminished ability to identify facial expressions of emotion has been reported after stroke and has traditionally been attributed to impaired emotional processing. While this may be true, an alteration in visual behavior after brain injury could also contribute negatively to this ability. This study investigated the accuracy, distribution of responses, visual behavior, and pupil dilation of individuals with stroke while they identified emotional facial expressions. Our results corroborated impaired performance after stroke and showed decreased attention to the eyes, evidenced by shorter fixation times and fewer fixations in this area compared with healthy subjects, together with comparable pupil dilation. The differences in visual behavior reached statistical significance for some emotions when individuals with stroke and impaired performance were compared with healthy subjects, but not when individuals post-stroke with comparable performance were considered. This performance dependence of visual behavior, although not determinant, might indicate that altered visual behavior is a negatively contributing factor for emotion recognition from facial expressions.
    This study was funded by Conselleria de Educacion, Cultura y Deporte of Generalitat Valenciana of Spain (Project SEJI/2019/017) and Universitat Politecnica de Valencia (Grant PAID-10-18). Maza, A.; Moliner, B.; Ferri, J.; Llorens Rodríguez, R. (2020). Visual Behavior, Pupil Dilation, and Ability to Identify Emotions From Facial Expressions After Stroke. Frontiers in Neurology, 10:1-12. https://doi.org/10.3389/fneur.2019.01415
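    The gaze and pupil measures mentioned in the abstract (number and duration of fixations on the eye region, pupil dilation) can be computed from fixation data with a few lines of code. The sketch below is a hypothetical illustration with an assumed area-of-interest box and baseline value; it is not the study's actual analysis pipeline.

        # Illustrative only: count and time fixations in an assumed "eyes" AOI and
        # report mean pupil diameter relative to an assumed baseline.
        from dataclasses import dataclass

        @dataclass
        class Fixation:
            x: float           # gaze position in screen pixels
            y: float
            duration_ms: float
            pupil_mm: float    # mean pupil diameter during the fixation

        def eyes_aoi_metrics(fixations, aoi=(300, 150, 700, 300), baseline_mm=3.2):
            """Return (fixation count in AOI, total dwell time in ms, pupil dilation vs. baseline)."""
            x0, y0, x1, y1 = aoi
            in_aoi = [f for f in fixations if x0 <= f.x <= x1 and y0 <= f.y <= y1]
            dwell = sum(f.duration_ms for f in in_aoi)
            dilation = (sum(f.pupil_mm for f in fixations) / len(fixations) - baseline_mm
                        if fixations else 0.0)
            return len(in_aoi), dwell, dilation

        # Example: two fixations on the eye region, one elsewhere.
        fixs = [Fixation(420, 210, 260, 3.4), Fixation(510, 230, 310, 3.5), Fixation(480, 620, 200, 3.3)]
        print(eyes_aoi_metrics(fixs))   # (2, 570, ~0.2)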

    Subclinical atherosclerosis: what it is, what it means and what we can do about it

    Get PDF
    Atherosclerosis is a chronic, progressive, inflammatory disease with a long asymptomatic phase. Disease progression can eventually lead to acute cardiovascular events such as myocardial infarction, unstable angina pectoris and sudden cardiac death. While the disease is still in a subclinical stage, however, its presence can be identified by several methods, including coronary angiography, intravascular ultrasonography, B-mode ultrasonography, computed tomography and magnetic resonance imaging. Based on the results of imaging studies, statin therapy can slow, halt or even reverse the progression of atherosclerotic disease, depending on the intensity of treatment. Whether to screen and treat patients for subclinical atherosclerosis remains controversial. Although reduction of atheromatous plaque burden has not yet been definitively correlated with significant decreases in risk for acute coronary events in asymptomatic patients, statin therapy contributes significantly to the risk reduction observed in clinical trials in patients with and without overt coronary disease.

    Current directions in visual working memory research: An introduction and emerging insights

    Get PDF
    Visual working memory (VWM) is a core construct in the cognitive (neuro-)sciences, assumed to serve as a hub for information exchange and thus to support a multitude of cognitive functions related to processing visual information. Here, we give an introduction to key terms and paradigms and an overview of ongoing debates in the field, to which the articles collected in this Special Issue on 'Current Directions in Visual Working Memory Research' contribute. Our aim is to extract, from this overview, some 'emerging' theoretical insights concerning questions such as the optimal way to examine VWM, which types of mental representations contribute to performance on VWM tasks, and how VWM keeps features from the same object together and apart from features of concurrently maintained objects (the binding problem).