
    How the marketing research affects the improvement in the dental doctor-patient relation

    The relation between provider and customer in the services area, particularly in medicine, represents a fundamental desideratum. This type of relation derives from a two-way involvement of both parties across the entire marketing mix. New marketing strategies based on effective relational models can only be built through an ample, long-term investigation of the mechanisms underlying the customer’s perception of quality and the coordinates of the relationship with the provider. The article aims to investigate the mechanism leading to customer retention in the case of dental offices, from the perspective of both customers and providers. The authors conducted in-depth interview-based qualitative research, which identified and highlighted the extent to which marketing activity, as seen from the perspective of specific principles and scientific methodology, is implemented in dental offices in Bucharest. The research also focused on the perception of specialists and dental office/clinic managers or owners regarding the concept of customer retention, the elements that could lead to keeping customers, and the image of the ideal office from the perspective of services adjusted to consumers.

    By the sound of it: an ERP investigation of human action sound processing in 7-month-old infants

    Recent evidence suggests that human adults perceive human action sounds as a distinct category from human vocalizations, environmental, and mechanical sounds, activating different neural networks (Engel et al., 2009; Lewis et al., 2011). Yet, little is known about the development of such specialization. Using event-related potentials (ERP), this study investigated neural correlates of 7-month-olds’ processing of human action (HA) sounds in comparison to human vocalizations (HV), environmental (ENV), and mechanical (MEC) sounds. Relative to the other categories, HA sounds led to increased positive amplitudes between 470 and 570 ms post-stimulus onset at left anterior temporal locations, while HV led to increased negative amplitudes at more posterior temporal locations in both hemispheres. Collectively, human-produced sounds (HA + HV) led to significantly different response profiles compared to non-living sound sources (ENV + MEC) at parietal and frontal locations in both hemispheres. Overall, by 7 months of age, human action sounds are differentially processed in the brain, consistent with a dichotomy for processing living versus non-living things. This provides novel evidence regarding the typical categorical processing of socially relevant sounds.
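    The window-averaged measure reported here (increased positivity between 470 and 570 ms post-stimulus) is, in generic ERP practice, a mean amplitude computed over a fixed latency window. A minimal NumPy sketch of that generic computation follows; the array shapes, names, and sampling parameters are illustrative assumptions, not the authors' pipeline:

        import numpy as np

        # Hypothetical inputs: `epochs` holds baseline-corrected ERP epochs,
        # shaped (n_trials, n_channels, n_samples), sampled at `sfreq` Hz,
        # with the epoch starting `tmin` seconds before stimulus onset.
        def mean_window_amplitude(epochs, sfreq, tmin, win=(0.470, 0.570)):
            """Mean amplitude per channel in a post-stimulus latency window."""
            start = int(round((win[0] - tmin) * sfreq))
            stop = int(round((win[1] - tmin) * sfreq))
            erp = epochs.mean(axis=0)                # average over trials
            return erp[:, start:stop].mean(axis=1)   # one value per channel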

    Three-year-olds’ rapid facial electromyographic responses to emotional facial expressions and body postures

    Rapid Facial Reactions (RFRs) to observed emotional expressions are proposed to be involved in a wide array of socioemotional skills, from empathy to social communication. Two of the most persuasive theoretical accounts propose that RFRs rely either on motor resonance mechanisms or on more complex mechanisms involving affective processes. Previous studies demonstrated that the presentation of facial and bodily expressions can generate rapid changes in the muscle activity of adults and school-age children. However, to date, there is little to no evidence for emotional RFRs from infancy to preschool age. To investigate whether RFRs are driven by motor mimicry or could also result from emotional appraisal processes, we recorded facial electromyographic (EMG) activation from the zygomaticus major and frontalis medialis muscles during the presentation of static facial and bodily expressions of emotion (i.e., happiness, anger, fear, and neutral) in 3-year-old children. Results showed no specific EMG activation in response to bodily emotion expressions. However, observing others’ happy faces led to increased activation of the zygomaticus major and decreased activation of the frontalis medialis, while observing angry faces elicited the opposite pattern of activation. This study suggests that RFRs are the result of complex mechanisms in which both affective processes and motor resonance may play an important role.
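    RFR studies of this kind typically quantify a muscle's response as rectified EMG activity relative to a pre-stimulus baseline. A minimal sketch of such a generic measure follows; the function name, window lengths, and percent-change metric are assumptions for illustration, not this study's analysis:

        import numpy as np

        # Hypothetical sketch: rectified, baseline-corrected EMG response
        # for one muscle site (e.g., zygomaticus major) after stimulus onset.
        def emg_percent_change(signal, sfreq, onset_s, base_s=0.5, resp_s=1.0):
            """Percent change of rectified EMG versus the pre-stimulus baseline."""
            onset = int(onset_s * sfreq)
            rect = np.abs(signal)                    # full-wave rectification
            base = rect[onset - int(base_s * sfreq):onset].mean()
            resp = rect[onset:onset + int(resp_s * sfreq)].mean()
            return 100.0 * (resp - base) / base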

    The eyes know it: Toddlers' visual scanning of sad faces is predicted by their theory of mind skills

    The current research explored toddlers’ gaze fixation during a scene showing a person expressing sadness after a ball is stolen from her. The relation between the duration of gaze fixation on different parts of the person’s sad face (e.g., eyes, mouth) and theory of mind skills was examined. Eye-tracking data indicated that before the actor experienced the negative event, toddlers divided their fixation equally between the actor’s happy face and other distracting objects, but looked longer at the face after the ball was stolen and she expressed sadness. The strongest predictor of increased focus on the sad face versus other elements of the scene was toddlers’ ability to predict others’ emotional reactions when outcomes fulfilled (happiness) or failed to fulfill (sadness) desires, whereas toddlers’ visual perspective-taking skills predicted their more specific focusing on the actor’s eyes and, for boys only, mouth. Furthermore, gender differences emerged in toddlers’ fixation on parts of the scene. Taken together, these findings suggest that top-down processes are involved in the scanning of emotional facial expressions in toddlers.
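    Fixation-duration comparisons of this kind generally reduce to the share of gaze samples landing inside an area of interest (AOI) such as the face, eyes, or mouth. A minimal sketch of that generic computation follows; the rectangular AOI and all names are illustrative assumptions, not the authors' method:

        import numpy as np

        # Hypothetical sketch: fraction of gaze samples falling inside a
        # rectangular area of interest (AOI), e.g., the actor's eyes or mouth.
        def aoi_dwell_proportion(gaze_xy, aoi):
            """gaze_xy: (n_samples, 2) screen coordinates; aoi: (x0, y0, x1, y1)."""
            x, y = gaze_xy[:, 0], gaze_xy[:, 1]
            inside = (x >= aoi[0]) & (x <= aoi[2]) & (y >= aoi[1]) & (y <= aoi[3])
            return inside.mean()   # fraction of samples on the AOI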

    Coherent emotional perception from body expressions and the voice

    Perceiving emotion from multiple modalities enhances an individual’s perceptual sensitivity. This allows more accurate judgments of others’ emotional states, which is crucial to appropriate social interaction. Body expressions are known to convey emotional messages effectively, although fewer studies have examined how this information is combined with auditory cues. The present study used event-related potentials (ERP) to investigate the interaction between emotional body expressions and vocalizations. We also examined emotional congruency between auditory and visual information to determine how a preceding visual context influences later auditory processing. Consistent with prior findings, a reduced N1 amplitude was observed in the audiovisual condition compared to an auditory-only condition. While this component was not sensitive to modality congruency, the P2 was sensitive to emotionally incompatible audiovisual pairs. Further, the direction of these congruency effects, in terms of facilitation or suppression, differed depending on the preceding context. Overall, the results indicate a functionally dissociated mechanism underlying two stages of emotional processing, whereby the N1 is involved in cross-modal processing, whereas the P2 is related to assessing a unified perceptual content. These data also indicate that emotion integration can be affected by the specific emotion presented.
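    A congruency effect like the P2 finding is conventionally summarised as a difference wave: the incongruent-minus-congruent ERP, averaged within the component's latency window. A minimal sketch under assumed parameters follows (the 150-250 ms window and array layout are illustrative, not taken from this study):

        import numpy as np

        # Hypothetical sketch: congruency effect as a difference wave,
        # summarised in a P2-like latency window for one channel.
        def congruency_effect(cong, incong, sfreq, tmin, win=(0.150, 0.250)):
            """cong/incong: (n_trials, n_samples) epochs; tmin < 0 is epoch start."""
            diff = incong.mean(axis=0) - cong.mean(axis=0)   # difference wave
            start = int(round((win[0] - tmin) * sfreq))
            stop = int(round((win[1] - tmin) * sfreq))
            return diff[start:stop].mean()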

    EgoActive: Integrated wireless wearable sensors for capturing infant egocentric auditory-visual statistics and autonomic nervous system function ‘in the wild’

    There have been sustained efforts toward using naturalistic methods in developmental science to measure infant behaviors in the real world from an egocentric perspective, because statistical regularities in the environment can shape, and be shaped by, the developing infant. However, there is no user-friendly and unobtrusive technology to densely and reliably sample life in the wild. To address this gap, we present the design, implementation, and validation of the EgoActive platform, which addresses limitations of existing wearable technologies for developmental research. EgoActive records the active infant’s egocentric perspective of the world via a miniature wireless head-mounted camera, concurrently with the infant’s physiological responses to this input via a lightweight, wireless ECG/acceleration sensor. We also provide software tools to facilitate data analyses. Our validation studies showed that the cameras and body sensors performed well. Families also reported that the platform was comfortable, easy to use and operate, and did not interfere with daily activities. The synchronized multi-modal data from the EgoActive platform can help tease apart complex processes that are important for child development, furthering our understanding of areas ranging from executive function to emotion processing and social learning.
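    A first step in working with such synchronized multi-modal data is aligning the two wireless streams on a shared clock, e.g., pairing each video frame with the nearest physiological sample. A minimal sketch of nearest-neighbour timestamp alignment follows; the names and approach are assumptions for illustration, not the EgoActive software tools:

        import numpy as np

        # Hypothetical sketch: align each video frame timestamp with the
        # nearest ECG sample from a concurrently recorded, timestamped stream.
        def align_to_frames(frame_ts, ecg_ts, ecg_values):
            """All timestamps in seconds on a shared clock; ecg_ts sorted."""
            idx = np.searchsorted(ecg_ts, frame_ts)
            idx = np.clip(idx, 1, len(ecg_ts) - 1)
            # Pick whichever neighbouring sample is closer in time.
            left_closer = (frame_ts - ecg_ts[idx - 1]) < (ecg_ts[idx] - frame_ts)
            idx = np.where(left_closer, idx - 1, idx)
            return ecg_values[idx]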

    Eight-month-old infants’ behavioural responses to peers’ emotions as related to the asymmetric frontal cortex activity

    Infants are sensitive to, and converge emotionally with, peers’ distress. It is unclear whether these responses extend to positive affect and whether observing peer emotions motivates infants’ behaviors. This study investigated 8-month-olds’ asymmetric frontal EEG activity during peers’ crying and laughter, and its relation to approach and withdrawal behaviors. Participants observed videos of an infant crying or laughing during two separate sessions. Frontal EEG alpha power was recorded during the first session, while infants’ behaviors and emotional expressions were recorded during the second. Facial and vocal expressions of affect suggest that infants converge emotionally with their peers’ distress and, to a certain extent, with their happiness. At the group level, the crying peer elicited right-lateralized frontal activity. However, infants with reduced right and increased left frontal activity in this situation were more likely to approach their peer. Overall, 8-month-olds did not show asymmetric frontal activity in response to peer laughter, but infants who tended to look longer at their happy peer were more likely to respond with left-lateralized frontal activity. The link between variations in left frontal activity and simple approach behaviors indicates the presence of a motivational dimension to infants’ responses to distressed peers.
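    Asymmetric frontal activity of this kind is conventionally indexed as the difference in log alpha power between homologous right and left frontal electrodes; because alpha power is inversely related to cortical activation, higher values indicate relatively greater left frontal (approach-related) activity. A minimal sketch of that standard index follows; the electrode pairing, 6-9 Hz infant alpha band, and Welch parameters are assumptions, not this study's pipeline:

        import numpy as np
        from scipy.signal import welch

        # Hypothetical sketch of a frontal asymmetry index:
        # ln(right alpha power) - ln(left alpha power).
        def frontal_asymmetry(left, right, sfreq, band=(6.0, 9.0)):
            """left/right: 1-D EEG from homologous frontal sites (e.g., F3/F4)."""
            def band_power(x):
                f, pxx = welch(x, fs=sfreq, nperseg=int(2 * sfreq))
                mask = (f >= band[0]) & (f <= band[1])
                return np.trapz(pxx[mask], f[mask])  # integrate PSD over band
            return np.log(band_power(right)) - np.log(band_power(left))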