10 research outputs found

    A system for modeling social traits in realistic faces with artificial intelligence

    Full text link
    Humans have developed a particularly refined perceptual capacity to process faces and extract information from facial features. Using this capacity, we make attributions such as personality, intelligence or trustworthiness based on facial appearance, and these attributions often have a strong impact on social behavior across domains. Faces therefore play a central role in our relationships with other people and in our everyday decisions. With the popularization of the Internet, people take part in many kinds of virtual interactions, from social experiences such as games, dating or communities, to professional activities such as e-commerce, e-learning, e-therapy or e-health. These virtual interactions create a need for faces that represent the actual people interacting in the digital world: hence the concept of the avatar. Avatars are used to represent users in different scenarios and scopes, from personal life to professional situations. In all these cases, the avatar's appearance may affect not only other people's opinions and perceptions, but also the user's self-perception, influencing their own attitude and behavior. In fact, avatars are often employed to elicit impressions or emotions through non-verbal expressions, and they can improve online interactions or serve educational or therapeutic purposes. The ability to generate realistic-looking avatars that elicit a desired set of social impressions is therefore a novel and valuable tool with applications in a wide range of fields. This thesis proposes a novel method for generating realistic-looking faces with an associated social profile comprising 15 different impressions. To this end, several partial objectives were accomplished.
    First, facial features were extracted from a database of real faces and grouped by appearance in an automatic and objective manner using dimensionality reduction and clustering techniques. This yielded a taxonomy that makes it possible to codify faces systematically and objectively according to the obtained clusters. Moreover, the proposed method is not restricted to facial features and could be extended to automatically group any other kind of image by appearance. Second, the relationships between the different facial features and the social impressions were identified. This reveals how much a given facial feature influences the perception of a given social impression, allowing the designer to focus on the most important feature or features when creating faces with a sought social perception. Third, an image-editing method was implemented to generate a completely new, realistic face from just a face definition using the aforementioned facial-feature taxonomy. Finally, a system to generate realistic faces with an associated social-trait profile was developed, fulfilling the main objective of the thesis. The main novelty of this work lies in the ability to work with several trait dimensions at a time on realistic faces. In contrast with previous works that use noisy images, or cartoon-like or synthetic faces, the system developed in this thesis generates realistic-looking faces at the desired levels of fifteen impressions: Afraid, Angry, Attractive, Babyface, Disgusted, Dominant, Feminine, Happy, Masculine, Prototypical, Sad, Surprised, Threatening, Trustworthy and Unusual. The promising results will allow further investigation of how to model social perception in faces using a completely new approach.
    Fuentes Hurtado, FJ. (2018). A system for modeling social traits in realistic faces with artificial intelligence [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/101943
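The first objective above, grouping facial features by appearance with dimensionality reduction and clustering, can be sketched as follows. This is a minimal illustration, not the thesis's actual pipeline: the data is random, and the component and cluster counts are arbitrary stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for cropped facial-feature images (e.g. 92 noses of 32x32 px),
# flattened to row vectors; a real run would load them from a face database.
X = rng.random((92, 32 * 32))

# 1) Dimensionality reduction (PCA via SVD): project each image onto the
#    top-k principal components of the centred data.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 10
Z = Xc @ Vt[:k].T  # (92, k) low-dimensional embedding

# 2) Clustering (plain k-means) groups the embedded features by appearance.
def kmeans(Z, n_clusters=5, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = Z[rng.choice(len(Z), n_clusters, replace=False)]
    for _ in range(n_iter):
        dists = np.linalg.norm(Z[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for c in range(n_clusters):
            if (labels == c).any():
                centers[c] = Z[labels == c].mean(axis=0)
    return labels

labels = kmeans(Z)
# Each feature image is now codified by its cluster id: an objective,
# appearance-based taxonomy of that facial feature.
```

Because the procedure only looks at pixel appearance, the same two steps could be reused to build a taxonomy for any other kind of image, as the abstract notes.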

    EvoDeep: A new evolutionary approach for automatic Deep Neural Networks parametrisation

    Full text link
    [EN] Deep Neural Networks (DNNs) have become a powerful and extremely popular mechanism, widely used to solve problems of varied complexity thanks to their ability to fit models to complex non-linear problems. Despite these well-known benefits, DNNs are complex learning models whose parametrisation and architecture are usually designed by hand. This paper proposes a new Evolutionary Algorithm, named EvoDeep, devoted to evolving the parameters and the architecture of a DNN in order to maximise its classification accuracy while maintaining a valid sequence of layers. The model is tested against a widely used dataset of handwritten digit images. The experiments performed using this dataset show that the Evolutionary Algorithm is able to select the parameters and the DNN architecture appropriately, achieving 98.93% accuracy in the best run. (C) 2017 Elsevier Inc. All rights reserved.
    This work has been co-funded by the research projects EphemeCH (TIN2014-56494-C4-4-P) and DeepBio (TIN2017-85727-C4-3-P) of the Spanish Ministry of Economy and Competitiveness and the European Regional Development Fund (FEDER), by the Justice Programme of the European Union (2014-2020), 723180 - RiskTrack - JUST-2015-JCOO-AG/JUST-2015-JCOO-AG-1, and by CAM grant S2013/ICE-3095 (CIBERDINE: Cybersecurity, Data and Risks). The contents of this publication are the sole responsibility of their authors and can in no way be taken to reflect the views of the European Commission.
    Martín, A.; Lara-Cabrera, R.; Fuentes-Hurtado, FJ.; Naranjo Ornedo, V.; Camacho, D. (2018). EvoDeep: A new evolutionary approach for automatic Deep Neural Networks parametrisation. Journal of Parallel and Distributed Computing. 117:180-191. https://doi.org/10.1016/j.jpdc.2017.09.006
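A toy version of the evolutionary loop the abstract describes might look like this. The genome encoding and the fitness function are illustrative stand-ins, not EvoDeep's actual operators: a real run would build and train each candidate DNN and use its validation accuracy as fitness, which is far too slow to inline here.

```python
import random

random.seed(0)
WIDTHS = [16, 32, 64, 128]  # assumed menu of layer widths

def random_genome():
    # A genome is a sequence of layer widths (always a valid layer sequence).
    return [random.choice(WIDTHS) for _ in range(random.randint(2, 5))]

def fitness(genome):
    # Stand-in objective: reward funnel-shaped architectures of moderate depth.
    # In EvoDeep this would be the trained network's classification accuracy.
    monotone = sum(1 for a, b in zip(genome, genome[1:]) if a >= b)
    return monotone - abs(len(genome) - 4)

def mutate(genome):
    g = genome[:]
    g[random.randrange(len(g))] = random.choice(WIDTHS)
    return g

def crossover(a, b):
    cut = random.randint(1, min(len(a), len(b)) - 1)
    return a[:cut] + b[cut:]  # single-point cut keeps the sequence valid

pop = [random_genome() for _ in range(20)]
for generation in range(30):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:5]  # elitism: the best architectures survive unchanged
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(15)]

best = max(pop, key=fitness)
```

The key design point mirrored from the abstract is that mutation and crossover are constrained so every offspring remains a valid sequence of layers, so fitness evaluation never has to reject malformed networks.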

    The Influence of Each Facial Feature on How We Perceive and Interpret Human Faces

    Full text link
    [EN] Facial information is processed by our brain in such a way that we immediately make judgments about, for example, attractiveness or masculinity, or interpret the personality traits or moods of other people. The appearance of each facial feature has an effect on our perception of facial traits. This research addresses the problem of measuring the size of these effects for five facial features (eyes, eyebrows, nose, mouth, and jaw). Our proposal is a mixed feature-based and image-based approach that allows judgments to be made on complete real faces in the categorization tasks, rather than on the synthetic, noisy, or partial faces that can bias the assessment. Each facial feature is automatically classified by its global appearance using principal component analysis. Using this procedure, we establish a reduced set of relevant specific attributes (each one describing a complete facial feature) to characterize faces. In this way, a more direct link can be established between perceived facial traits and what people intuitively consider an eye, an eyebrow, a nose, a mouth, or a jaw. A set of 92 male faces was classified using this procedure, and the results were related to their scores in 15 perceived facial traits. We show that the relevant features greatly depend on what we are trying to judge. Globally, the eyes have the greatest effect. However, other facial features are more relevant for some judgments, such as the mouth for happiness and femininity or the nose for dominance.
    This study was carried out using the Chicago Face Database developed at the University of Chicago by Debbie S. Ma, Joshua Correll, and Bernd Wittenbrink.
    Diego-Mas, JA.; Fuentes-Hurtado, FJ.; Naranjo Ornedo, V.; Alcañiz Raya, ML. (2020). The Influence of Each Facial Feature on How We Perceive and Interpret Human Faces. i-Perception. 11(5):1-18. https://doi.org/10.1177/2041669520961123
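The study above quantifies how strongly each facial feature's appearance class affects a perceived trait. One standard way to express such an effect size is eta-squared, the share of rating variance explained by group membership. The sketch below is an assumption about the analysis style, computed on synthetic data; the variable names and numbers are not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-ins: each of 92 faces gets an appearance class for one
# feature (e.g. which of 4 "eye" clusters it belongs to) and a trait score.
eye_cluster = rng.integers(0, 4, size=92)            # appearance class per face
ratings = rng.normal(size=92) + 0.8 * eye_cluster    # trait scores per face

def eta_squared(groups, scores):
    """Between-group sum of squares over total sum of squares (one-way layout)."""
    grand = scores.mean()
    ss_total = ((scores - grand) ** 2).sum()
    ss_between = sum(
        scores[groups == g].size * (scores[groups == g].mean() - grand) ** 2
        for g in np.unique(groups)
    )
    return ss_between / ss_total

effect = eta_squared(eye_cluster, ratings)
# effect is in [0, 1]; larger values mean this feature's appearance explains
# more of the variation in the perceived trait.
```

Repeating this for each of the five features and each of the 15 traits would yield the kind of feature-by-trait relevance table the abstract summarises (eyes dominant overall, mouth for happiness, nose for dominance).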

    Evolutionary Computation for Modelling Social Traits in Realistic Looking Synthetic Faces

    Full text link
    [EN] Human faces play a central role in our lives. Thanks to our behavioural capacity to perceive faces, how a face looks in a painting, a movie, or an advertisement can dramatically influence what we feel about it and what emotions it elicits. Facial information is processed by our brain in such a way that we immediately make judgements such as attractiveness or masculinity, or interpret the personality traits or moods of other people. Because of the importance of appearance-driven judgements of faces, this has become a major focus not only for psychological research, but also for neuroscientists, artists, engineers, and software developers. New technologies can now create realistic-looking synthetic faces that are used in art, online activities, advertising, or movies. However, there is no method to generate virtual faces that convey desired sensations to observers. In this work, we present a genetic-algorithm-based procedure to create realistic faces by combining facial features in the adequate relative positions. A model of how observers will perceive a face based on its features' appearances and relative positions was developed and used as the fitness function of the algorithm. The model is able to predict 15 facial social traits related to aesthetics, moods, and personality. The proposed procedure was validated by comparing its results with the opinions of human observers. This procedure is useful not only for creating characters for artistic purposes, but also for online activities, advertising, surgery, or criminology.
    Fuentes-Hurtado, FJ.; Diego-Mas, JA.; Naranjo Ornedo, V.; Alcañiz Raya, ML. (2018). Evolutionary Computation for Modelling Social Traits in Realistic Looking Synthetic Faces. Complexity. 1-16. https://doi.org/10.1155/2018/9270152
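The search the abstract outlines, a genetic algorithm whose fitness is a learned perception model, can be sketched roughly as follows. Everything concrete here is an assumption for illustration: a random linear matrix stands in for the trained trait model, and a face is reduced to a vector of feature-cluster choices.

```python
import numpy as np

rng = np.random.default_rng(1)
N_FEATURES, N_CLASSES, N_TRAITS = 5, 4, 15   # illustrative sizes

# Stand-in for the learned perception model: maps a one-hot face encoding
# to 15 predicted social-trait scores.
W = rng.normal(size=(N_FEATURES * N_CLASSES, N_TRAITS))

def predict_traits(face):
    onehot = np.zeros(N_FEATURES * N_CLASSES)
    onehot[np.arange(N_FEATURES) * N_CLASSES + face] = 1.0
    return onehot @ W

target = rng.normal(size=N_TRAITS)  # the desired social-trait profile

def fitness(face):
    # Higher is better: negative distance to the target profile.
    return -np.linalg.norm(predict_traits(face) - target)

pop = [rng.integers(0, N_CLASSES, N_FEATURES) for _ in range(30)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        child = np.where(rng.random(N_FEATURES) < 0.5, a, b)  # uniform crossover
        if rng.random() < 0.3:                                # point mutation
            child[rng.integers(N_FEATURES)] = rng.integers(N_CLASSES)
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)  # face definition closest to the target profile
```

The final face definition would then be rendered into an actual image, which is the step the paper's image-editing pipeline performs and this sketch omits.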

    Congreso Internacional de Responsabilidad Social Apuestas para el desarrollo regional.

    Get PDF
    Congreso Internacional de Responsabilidad Social: apuestas para el desarrollo regional [1st edition / Nov. 6-7, 2019, Bogotá D.C.]
    The International Congress on Social Responsibility, "Apuestas para el Desarrollo Regional" ("Commitments to Regional Development"), was held on 6 and 7 November 2019 in Bogotá D.C. as an academic and research event led by the Corporación Universitaria Minuto de Dios - UNIMINUTO - Rectoría Cundinamarca. Its aim was to foster new paradigms and to disseminate renewed knowledge about Social Responsibility, a purpose adopted institutionally as an ethical and political stance that shapes teaching, research, and social outreach, and whose central goal is to promote "a conscious and critical awareness of problematic situations, both in communities and in the country, as well as the acquisition of competencies oriented towards the promotion of, and commitment to, integral human and social development" (UNIMINUTO, 2014). This stance of critical conscience and social awareness, together with the experience gained through coordinated work with other academic institutions and directly with communities, led to the event's central objective: reflection among the different stakeholder groups and the management of their impacts, concrete elements that raised the audience's awareness of the role to be assumed in favour of social responsibility as a sound contribution to regional development and, in turn, to the strengthening of the Sustainable Development Goals.


    Development of an anisotropic model for atrial electrophysiological simulation

    Existing methods for acquiring cardiac potentials are limited: the electrocardiogram, the intracavitary navigation system, and body surface potential mapping (BSPM) cannot produce a complete potential map of the heart. This project develops an anisotropic atrial model with which the evolution of the potential can be simulated and observed at any point of the atria and at any instant in time, making it possible to study both normal and pathological atrial behaviour in greater detail.
    Fuentes Hurtado, FJ. (2013). Desarrollo de un modelo anisotrópico para la simulación electrofisiológica auricular. http://hdl.handle.net/10251/33059

    Human Embryology: a laboratory practice manual

    A manual covering the contents of the practical part of the Integrative Teaching Unit (Unidad Didáctica Integradora, UDI) on Embryology at the Academic Units of Dentistry and Human Medicine of the Universidad Autónoma de Zacatecas.

    Prevalence of SARS-CoV-2 in Spain (ENE-COVID): a nationwide, population-based seroepidemiological study


    Evolution over Time of Ventilatory Management and Outcome of Patients with Neurologic Disease

    OBJECTIVES: To describe the changes in ventilator management over time in patients with neurologic disease at ICU admission and to estimate factors associated with 28-day hospital mortality. DESIGN: Secondary analysis of three prospective, observational, multicenter studies. SETTING: Cohort studies conducted in 2004, 2010, and 2016. PATIENTS: Adult patients who received mechanical ventilation for more than 12 hours. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Among the 20,929 patients enrolled, we included 4,152 (20%) patients mechanically ventilated due to different neurologic diseases. Hemorrhagic stroke and brain trauma were the pathologies most commonly associated with the need for mechanical ventilation. Although volume-cycled ventilation remained the preferred mode, there was a significant (p < 0.001) increase in the use of pressure support ventilation. The proportion of patients receiving a protective lung ventilation strategy increased over time (47% in 2004, 63% in 2010, and 65% in 2016; p < 0.001), as did the duration of protective ventilation strategies (406, 523, and 585 days per 1,000 mechanical ventilation days in 2004, 2010, and 2016, respectively; p < 0.001). There were no differences in ICU length of stay, ICU mortality, or hospital mortality from 2004 to 2016. Independent risk factors for 28-day mortality were age greater than 75 years, Simplified Acute Physiology Score II greater than 50, the occurrence of organ dysfunction within the first 48 hours after brain injury, and specific neurologic diseases such as hemorrhagic stroke, ischemic stroke, and brain trauma. CONCLUSIONS: Over the years, more lung-protective ventilatory strategies have been implemented in neurologic patients, with no effect on pulmonary complications or on survival. We identified several prognostic factors for mortality, including advanced age, severity of disease, organ dysfunction, and the etiology of the neurologic disease.