147 research outputs found

    A comparison of the development of audiovisual integration in children with autism spectrum disorders and typically developing children

    This study aimed to investigate the development of audiovisual integration in children with Autism Spectrum Disorder (ASD). Audiovisual integration was measured using the McGurk effect in children with ASD aged 7–16 years and typically developing children (control group) matched approximately for age, sex, nonverbal ability and verbal ability. Results showed that the children with ASD were delayed in visual accuracy and audiovisual integration compared to the control group. However, in the audiovisual integration measure, children with ASD appeared to ‘catch up’ with their typically developing peers at the older age ranges. The suggestion that children with ASD show a deficit in audiovisual integration which diminishes with age has clinical implications for those assessing and treating these children.

    Atypical audiovisual speech integration in infants at risk for autism

    The language difficulties often seen in individuals with autism might stem from an inability to integrate audiovisual information, a skill important for language development. We investigated whether 9-month-old siblings of older children with autism, who are at an increased risk of developing autism, are able to integrate audiovisual speech cues. We used an eye-tracker to record where infants looked when shown a screen displaying two faces of the same model, one articulating /ba/ and the other /ga/, with one face congruent with the syllable sound being presented simultaneously and the other face incongruent. This method was successful in showing that infants at low risk can integrate audiovisual speech: they looked for the same amount of time at the mouths in both the fusible visual /ga/ – audio /ba/ and the congruent visual /ba/ – audio /ba/ displays, indicating that the auditory and visual streams fuse into a McGurk-type syllabic percept in the incongruent condition. It also showed that low-risk infants could perceive a mismatch between auditory and visual cues: they looked longer at the mouth in the mismatched, non-fusible visual /ba/ – audio /ga/ display compared with the congruent visual /ga/ – audio /ga/ display, demonstrating that they perceive an uncommon, and therefore interesting, speech-like percept when looking at the incongruent mouth (repeated-measures ANOVA, displays × fusion/mismatch conditions interaction: F(1,16) = 17.153, p = 0.001). The looking behaviour of high-risk infants did not differ according to the type of display, suggesting difficulties in matching auditory and visual information (repeated-measures ANOVA, displays × conditions interaction: F(1,25) = 0.09, p = 0.767), in contrast to low-risk infants (repeated-measures ANOVA, displays × conditions × low/high-risk groups interaction: F(1,41) = 4.466, p = 0.041). In some cases this reduced ability might lead to the poor communication skills characteristic of autism.
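    The interaction statistics above come from a display × condition repeated-measures ANOVA on looking times. The sketch below, using hypothetical long-format data and illustrative column names (not the authors' code), shows how such an analysis could be set up with statsmodels' AnovaRM.

```python
# Minimal sketch of a display x condition repeated-measures ANOVA on
# mouth-looking times; the data and column names are placeholders.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
rows = []
for infant in [f"infant_{i}" for i in range(18)]:
    for display in ["congruent", "incongruent"]:
        for condition in ["fusible", "mismatch"]:
            rows.append({
                "infant": infant,
                "display": display,
                "condition": condition,
                "mouth_looking_ms": rng.normal(1500, 300),  # placeholder data
            })
df = pd.DataFrame(rows)

# Two-way within-subject ANOVA: display x (fusion/mismatch) condition.
res = AnovaRM(df, depvar="mouth_looking_ms", subject="infant",
              within=["display", "condition"]).fit()
print(res.anova_table)  # F and p values for main effects and the interaction
```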

    Exposure to Non-Steroidal Anti-Inflammatory Drugs during Pregnancy and the Risk of Selected Birth Defects: A Prospective Cohort Study

    BACKGROUND: Since use of non-steroidal anti-inflammatory drugs (NSAIDs) during pregnancy is common, small increases in the risk of birth defects may have significant implications for public health. Results of human studies on the teratogenic risks of NSAIDs are inconsistent. Therefore, we evaluated the risk of selected birth defects after prenatal exposure to prescribed and over-the-counter NSAIDs. METHODS AND FINDINGS: We used data on 69,929 women enrolled in the Norwegian Mother and Child Cohort Study between 1999 and 2006. Data on NSAID exposure were available from a self-administered questionnaire completed around gestational week 17. Information on pregnancy outcome was obtained from the Medical Birth Registry of Norway. Only birth defects suspected to be associated with NSAID exposure based upon proposed teratogenic mechanisms and previous studies were included in the multivariable logistic regression analyses. A total of 3,023 women used NSAIDs in gestational weeks 0-12 and 64,074 women did not report NSAID use in early pregnancy. No associations were observed between overall exposure to NSAIDs during pregnancy and the selected birth defects separately or as a group (adjusted odds ratio 0.7, 95% confidence interval 0.4-1.1). Associations between maternal use of specific types of NSAIDs and the selected birth defects were not found either, although an increased risk was seen for septal defects and exposure to multiple NSAIDs based on small numbers (2 exposed cases; crude odds ratio 3.9, 95% confidence interval 0.9-15.7). CONCLUSIONS: Exposure to NSAIDs during the first 12 weeks of gestation does not seem to be associated with an increased risk of the selected birth defects. However, due to the small numbers of NSAID-exposed infants for the individual birth defect categories, increases in the risks of specific birth defects could not be excluded.
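    The adjusted odds ratios and confidence intervals reported above come from multivariable logistic regression. As a rough illustration only, the sketch below fits such a model on hypothetical data (variable names like nsaid_exposed are assumptions, not the study's) and extracts an adjusted OR with its 95% CI.

```python
# Minimal sketch of a multivariable logistic regression for a binary birth
# defect outcome, adjusted for covariates; all data here are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "birth_defect": np.random.binomial(1, 0.03, 5000),
    "nsaid_exposed": np.random.binomial(1, 0.05, 5000),
    "maternal_age": np.random.normal(30, 5, 5000),
    "smoking": np.random.binomial(1, 0.2, 5000),
})

model = smf.logit("birth_defect ~ nsaid_exposed + maternal_age + smoking",
                  data=df).fit(disp=False)

or_est = np.exp(model.params["nsaid_exposed"])            # adjusted odds ratio
ci_low, ci_high = np.exp(model.conf_int().loc["nsaid_exposed"])
print(f"adjusted OR = {or_est:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```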

    The significance of tumour microarchitectural features in breast cancer prognosis: a digital image analysis

    BACKGROUND: As only a minor portion of the information present in histological sections is accessible by eye, recognition and quantification of complex patterns and relationships among constituents relies on digital image analysis. In this study, our working hypothesis was that, with the application of digital image analysis technology, visually unquantifiable breast cancer microarchitectural features can be rigorously assessed and tested as prognostic parameters for invasive breast carcinoma of no special type. METHODS: Digital image analysis was performed using public domain software (ImageJ) on tissue microarrays from a cohort of 696 patients, and validated with a commercial platform (Visiopharm). Quantified features included elements defining tumour microarchitecture, with emphasis on the extent of the tumour-stroma interface. The differential prognostic impact of tumour nest microarchitecture in the four immunohistochemical surrogates for molecular classification was analysed. Prognostic parameters included axillary lymph node status, breast cancer-specific survival, and time to distant metastasis. Associations of each feature with prognostic parameters were assessed using logistic regression and Cox proportional models adjusting for age at diagnosis, grade, and tumour size. RESULTS: An arrangement in numerous small nests was associated with axillary lymph node involvement. The association was stronger in luminal tumours (odds ratio (OR) = 1.39, p = 0.003 for a 1-SD increase in nest number, OR = 0.75, p = 0.006 for mean nest area). Nest number was also associated with survival (hazard ratio (HR) = 1.15, p = 0.027), but total nest perimeter was the parameter most significantly associated with survival in luminal tumours (HR = 1.26, p = 0.005). In the relatively small cohort of triple-negative tumours, mean circularity showed an association with time to distant metastasis (HR = 1.71, p = 0.027) and survival (HR = 1.8, p = 0.02). CONCLUSIONS: We propose that tumour arrangement in a few large nests indicates a decreased metastatic potential. By contrast, organisation in numerous small nests provides the tumour with increased metastatic potential to regional lymph nodes. An outstretched pattern in small nests bestows tumours with a tendency for decreased breast cancer-specific survival. Although further validation studies are required before the argument for routine quantification of microarchitectural features is established, our approach is consistent with the demand for cost-effective methods for triaging breast cancer patients who are more likely to benefit from chemotherapy.
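    The study quantified nest-level features with ImageJ and validated them with Visiopharm; the sketch below is not that pipeline, but shows one way the named features (nest number, mean nest area, total nest perimeter, circularity) could be computed from a binary tumour mask using scikit-image. The input file and thresholding step are placeholders.

```python
# Minimal sketch: label tumour nests in a (hypothetical) TMA core image and
# compute nest number, mean area, total perimeter, and mean circularity.
import numpy as np
from skimage import io, filters, measure

img = io.imread("tma_core.png", as_gray=True)   # hypothetical input image
mask = img < filters.threshold_otsu(img)        # crude tumour/stroma split

labels = measure.label(mask)                    # connected components = nests
props = measure.regionprops(labels)

areas = np.array([p.area for p in props])
perims = np.array([p.perimeter for p in props])
circularity = 4 * np.pi * areas / np.maximum(perims, 1) ** 2

print("nest number:", len(props))
print("mean nest area:", areas.mean())
print("total nest perimeter:", perims.sum())
print("mean circularity:", circularity.mean())
```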

    No rapid audiovisual recalibration in adults on the autism spectrum

    Autism spectrum disorders (ASD) are characterized by difficulties in social cognition, but are also associated with atypicalities in sensory and perceptual processing. Several groups have reported that autistic individuals show reduced integration of socially relevant audiovisual signals, which may contribute to the higher-order social and cognitive difficulties observed in autism. Here we use a newly devised technique to study instantaneous adaptation to audiovisual asynchrony in autism. Autistic and typical participants were presented with sequences of brief visual and auditory stimuli, varying in asynchrony over a wide range, from 512 ms auditory-lead to 512 ms auditory-lag, and judged whether they seemed to be synchronous. Typical adults showed strong adaptation effects, with trials preceded by an auditory-lead requiring more auditory-lead to seem simultaneous, and vice versa. However, autistic observers showed little or no adaptation, although their simultaneity curves were as narrow as those of the typical adults. This result supports recent Bayesian models that predict reduced adaptation effects in autism. As rapid audiovisual recalibration may be fundamental for the optimisation of speech comprehension, recalibration problems could render language processing more difficult in autistic individuals, hindering social communication.
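    Rapid recalibration is typically measured as a shift in the point of subjective simultaneity (PSS) depending on the asynchrony of the preceding trial. The sketch below, using synthetic response curves rather than the authors' data, illustrates that logic by fitting a Gaussian simultaneity function and comparing the fitted PSS after auditory-lead versus auditory-lag trials.

```python
# Minimal sketch of a rapid-recalibration analysis on synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def simultaneity(soa, amp, pss, width):
    """Gaussian-shaped probability of reporting 'synchronous'."""
    return amp * np.exp(-((soa - pss) ** 2) / (2 * width ** 2))

soas = np.linspace(-512, 512, 17)          # SOA in ms, negative = auditory lead

def fit_pss(p_sync):
    popt, _ = curve_fit(simultaneity, soas, p_sync, p0=[0.9, 0.0, 150.0])
    return popt[1]                         # fitted PSS in ms

# Hypothetical proportions of 'synchronous' responses per SOA,
# split by the asynchrony of the preceding trial.
p_after_lead = simultaneity(soas, 0.9, -30.0, 150.0)   # PSS pulled toward lead
p_after_lag = simultaneity(soas, 0.9, 30.0, 150.0)     # PSS pulled toward lag

shift = fit_pss(p_after_lag) - fit_pss(p_after_lead)
print(f"rapid recalibration effect (PSS shift): {shift:.1f} ms")
```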

    Context Modulation of Facial Emotion Perception Differed by Individual Difference

    Background: Certain facial configurations are believed to be associated with distinct affective meanings (i.e. basic facial expressions), and such associations are common across cultures (i.e. universality of facial expressions). Recently, however, many studies have suggested that various types of contextual information, rather than the facial configuration itself, are an important factor in facial emotion perception. Methodology/Principal Findings: To examine systematically how contextual information influences individuals’ facial emotion perception, the present study directly estimated observers’ perceptual thresholds for detecting negative facial expressions via a forced-choice psychophysical procedure using faces embedded in various emotional contexts. We additionally measured individual differences in affective information-processing tendency (BIS/BAS) as a possible factor that may determine the extent to which contextual information is used in facial emotion perception. Contextual information was found to influence observers’ perceptual thresholds for facial emotion. Importantly, individuals’ affective information-processing tendencies modulated the extent to which they incorporated context information into their facial emotion perceptions. Conclusions/Significance: The findings of this study suggest that facial emotion perception depends not only on facial configuration but also on the context in which the face appears. This contextual influence differed with individual differences in affective information-processing tendency.
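    As a rough illustration of the forced-choice threshold estimation described above (not the study's actual procedure or stimuli), the sketch below fits a logistic psychometric function to hypothetical proportion-correct data and reads off a detection threshold.

```python
# Minimal sketch: estimate a 2AFC detection threshold from synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def psychometric(intensity, alpha, beta, guess=0.5):
    """2AFC logistic: performance rises from chance (0.5) toward 1."""
    return guess + (1 - guess) / (1 + np.exp(-(intensity - alpha) / beta))

intensity = np.linspace(0.05, 1.0, 10)        # morph level of the expression
# Hypothetical proportion correct in one emotional context.
p_correct = psychometric(intensity, alpha=0.4, beta=0.08) \
            + np.random.default_rng(1).normal(0, 0.02, intensity.size)

(alpha_hat, beta_hat), _ = curve_fit(psychometric, intensity, p_correct,
                                     p0=[0.5, 0.1])
print(f"estimated threshold (alpha, ~75% correct): {alpha_hat:.2f}")
```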

    An Estimate of Avian Mortality at Communication Towers in the United States and Canada

    Avian mortality at communication towers in the continental United States and Canada is an issue of pressing conservation concern. Previous estimates of this mortality have been based on limited data and have not included Canada. We compiled a database of communication towers in the continental United States and Canada and estimated avian mortality per tower with a regression relating avian mortality to tower height. This equation was derived from 38 tower studies for which mortality data were available and corrected for sampling effort, search efficiency, and scavenging where appropriate. Although most studies document mortality at guyed towers with steady-burning lights, we accounted for lower mortality at towers without guy wires or steady-burning lights by adjusting estimates based on published studies. The resulting estimate of mortality at towers is 6.8 million birds per year in the United States and Canada. Bootstrapped subsampling indicated that the regression was robust to the choice of studies included, and a comparison of multiple regression models showed that incorporating sampling, scavenging, and search efficiency adjustments improved model fit. Estimating total avian mortality is only a first step in developing an assessment of the biological significance of mortality at communication towers for individual species or groups of species. Nevertheless, our estimate can be used to evaluate this source of mortality, develop subsequent per-species mortality estimates, and motivate policy action.
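    The core estimation step described above is a regression of corrected per-tower mortality on tower height, applied to every tower in the database, with bootstrapped subsampling of the studies. The sketch below reproduces that logic on entirely synthetic numbers; it is not the authors' model or data.

```python
# Minimal sketch: regress per-tower mortality on height, predict for all
# towers, sum, and bootstrap over studies. All numbers are synthetic.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical corrected per-tower mortality estimates from 38 studies.
study_height_m = rng.uniform(50, 500, 38)
study_mortality = 5 + 0.4 * study_height_m + rng.normal(0, 20, 38)

# Heights of all towers in a hypothetical US/Canada tower database.
tower_heights = rng.uniform(30, 600, 10_000)

def total_mortality(heights_obs, mortality_obs, heights_all):
    slope, intercept = np.polyfit(heights_obs, mortality_obs, 1)
    return np.clip(intercept + slope * heights_all, 0, None).sum()

point_estimate = total_mortality(study_height_m, study_mortality, tower_heights)

# Bootstrap over studies to gauge robustness of the total.
boot = []
for _ in range(1000):
    idx = rng.integers(0, 38, 38)
    boot.append(total_mortality(study_height_m[idx], study_mortality[idx],
                                tower_heights))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"total mortality: {point_estimate:,.0f} (bootstrap 95% {lo:,.0f}-{hi:,.0f})")
```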

    Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance ("Someone migged the pazing") uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with the greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.
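    As an illustration of the gaze-congruency analysis described above (hypothetical fixation data and column names, not the authors' pipeline), the sketch below aggregates dwell time on prosody-matching versus non-matching faces within the three time windows.

```python
# Minimal sketch: mean dwell time on congruent vs incongruent faces per window.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 400
fixations = pd.DataFrame({
    "participant": rng.integers(1, 32, n),
    "onset_ms": rng.uniform(0, 5000, n),
    "duration_ms": rng.uniform(100, 600, n),
    "face_matches_prosody": rng.integers(0, 2, n).astype(bool),
})

bins = [0, 1250, 2500, 5000]
fixations["window"] = pd.cut(fixations["onset_ms"], bins,
                             labels=["0-1250", "1250-2500", "2500-5000"])

dwell = (fixations
         .groupby(["participant", "window", "face_matches_prosody"],
                  observed=True)["duration_ms"].sum()
         .reset_index()
         .groupby(["window", "face_matches_prosody"],
                  observed=True)["duration_ms"].mean())
print(dwell)   # mean dwell time: prosody-matching vs non-matching faces
```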