
    How Easy is it to Read the Minds of People with Autism Spectrum Disorder?

    How well can neurotypical adults interpret mental states in people with ASD? Reactions of ‘targets’ (ASD and neurotypical) to four events were video-recorded and then shown to neurotypical participants, whose task was to identify which event the target had experienced. In Study 1, participants were more successful for neurotypical than for ASD targets. In Study 2, participants rated ASD targets as equally expressive as neurotypical targets for three of the events, while in Study 3 participants gave different verbal descriptions of the reactions of ASD and neurotypical targets. It thus seems that people with ASD react differently, but not less expressively, to events. Because neurotypical individuals are ineffective in interpreting the behaviour of those with ASD, this could contribute to the social difficulties in ASD.

    Assessing the long-term effectiveness of cladribine vs. placebo in the relapsing-remitting multiple sclerosis CLARITY randomized controlled trial and CLARITY extension using treatment switching adjustment methods

    Objectives: Treatment switching adjustment methods are often used to adjust for switching in oncology randomized controlled trials (RCTs). In this exploratory analysis, we apply these methods to adjust for treatment changes in the setting of an RCT followed by an extension study in relapsing-remitting multiple sclerosis. Methods: The CLARITY trial evaluated cladribine tablets versus placebo over 96 weeks. In the 96-week CLARITY Extension, patients who received placebo in CLARITY received cladribine tablets; patients who received cladribine tablets in CLARITY were re-randomized to placebo or cladribine tablets. Endpoints were time to first qualifying relapse (FQR) and time to 3- and 6-month confirmed disability progression (3mCDP, 6mCDP). We aimed to compare the effectiveness of cladribine tablets with placebo over CLARITY and the extension. The rank-preserving structural failure time model (RPSFTM) and the iterative parameter estimation (IPE) algorithm were used to estimate what would have happened if patients had received placebo in both CLARITY and the extension, versus patients who received cladribine tablets and switched to placebo. To gauge whether the treatment effect waned after the 96 weeks of CLARITY, we compared hazard ratios (HRs) from the adjustment analysis with HRs from CLARITY. Results: The RPSFTM yielded an HR of 0.48 (95% confidence interval [CI] 0.36-0.62) for FQR, 0.62 (95% CI 0.46-0.84) for 3mCDP, and 0.62 (95% CI 0.44-0.88) for 6mCDP. IPE algorithm results were similar. CLARITY HRs were 0.44 (95% CI 0.34-0.58), 0.60 (95% CI 0.41-0.87) and 0.58 (95% CI 0.40-0.83) for FQR, 3mCDP and 6mCDP, respectively. Conclusions: Treatment switching adjustment methods are applicable in non-oncology settings. Adjusted CLARITY plus CLARITY Extension HRs were similar to the CLARITY HRs, demonstrating significant treatment benefits associated with cladribine tablets versus placebo.
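The core of the RPSFTM adjustment described above is the counterfactual-time relation U = T_off + e^ψ·T_on, with ψ chosen so that counterfactual times are balanced across randomized arms. A minimal sketch with hypothetical data; it ignores censoring and recensoring, and uses a crude mean-balance grid search in place of the usual log-rank g-estimation:

```python
import numpy as np

def counterfactual_time(t_off, t_on, psi):
    """RPSFTM counterfactual untreated event time: U = T_off + exp(psi) * T_on."""
    return t_off + np.exp(psi) * t_on

def estimate_psi(t_off, t_on, arm, grid=np.linspace(-2, 2, 401)):
    """Grid-search psi so counterfactual mean times are balanced across arms.
    A toy stand-in for g-estimation; real analyses test independence of U and
    arm with a rank test and handle censoring/recensoring."""
    best_psi, best_gap = 0.0, np.inf
    for psi in grid:
        u = counterfactual_time(t_off, t_on, psi)
        gap = abs(u[arm == 1].mean() - u[arm == 0].mean())
        if gap < best_gap:
            best_psi, best_gap = psi, gap
    return best_psi
```

With synthetic data in which treatment stretches time on treatment by e^{-ψ}, the grid search recovers ψ to within sampling noise.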

    An innovative approach to modelling the optimal treatment sequence for patients with relapsing–remitting multiple sclerosis: implementation, validation, and impact of the decision-making approach

    Introduction An innovative computational model was developed to address challenges regarding the evaluation of treatment sequences in patients with relapsing–remitting multiple sclerosis (RRMS) through the concept of a ‘virtual’ physician who observes and assesses patients over time. We describe the implementation and validation of the model, then apply this framework as a case study to determine the impact of different decision-making approaches on the optimal sequence of disease-modifying therapies (DMTs) and associated outcomes. Methods A patient-level discrete event simulation (DES) was used to model heterogeneity in disease trajectories and outcomes. The evaluation of DMT options was implemented through a Markov model representing the patient’s disease; outcomes included lifetime costs and quality of life. The DES and Markov models underwent internal and external validation. Analyses of the optimal treatment sequence for each patient were based on several decision-making criteria. These treatment sequences were compared to current treatment guidelines. Results Internal validation indicated that model outcomes for natural history were consistent with the input parameters used to inform the model. Costs and quality of life outcomes were successfully validated against published reference models. Whereas each decision-making criterion generated a different optimal treatment sequence, cladribine tablets were the only DMT common to all treatment sequences. By choosing treatments on the basis of minimising disease progression or number of relapses, it was possible to improve on current treatment guidelines; however, these treatment sequences were more costly. Maximising cost-effectiveness resulted in the lowest costs but was also associated with the worst outcomes. Conclusions The model was robust in generating outcomes consistent with published models and studies. It was also able to identify optimal treatment sequences based on different decision criteria. This innovative modelling framework has the potential to simulate individual patient trajectories in the current treatment landscape and may be useful for treatment switching and treatment positioning decisions in RRMS.
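A Markov disease model of the kind described above can be sketched as a simple cohort model accumulating discounted lifetime costs and quality of life. All states, transition probabilities, costs, and utilities below are illustrative placeholders, not values from the study:

```python
import numpy as np

# Hypothetical 3-state model: 0 = RRMS, 1 = progressed disease, 2 = dead.
P = np.array([[0.90, 0.08, 0.02],   # annual transition probabilities (illustrative)
              [0.00, 0.93, 0.07],
              [0.00, 0.00, 1.00]])
cost = np.array([20000.0, 35000.0, 0.0])   # annual cost per state (illustrative)
qaly = np.array([0.75, 0.50, 0.0])         # annual utility per state (illustrative)

def run_markov(P, cost, qaly, years=40, disc=0.035):
    """Advance a cohort through the Markov model, accumulating discounted
    costs and quality-adjusted life years (QALYs)."""
    state = np.array([1.0, 0.0, 0.0])      # whole cohort starts in RRMS
    total_cost = total_qaly = 0.0
    for t in range(years):
        d = 1.0 / (1.0 + disc) ** t        # discount factor for cycle t
        total_cost += d * state @ cost
        total_qaly += d * state @ qaly
        state = state @ P                  # advance the cohort one annual cycle
    return total_cost, total_qaly
```

Comparing these totals across candidate DMT sequences is what lets a decision rule (minimise progression, minimise relapses, maximise cost-effectiveness) rank the sequences.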

    Onset of Superfluidity in 4He Films Adsorbed on Disordered Substrates

    We have studied 4He films adsorbed in two porous glasses, aerogel and Vycor, using high-precision torsional oscillator and DC calorimetry techniques. Our investigation focused on the onset of superfluidity at low temperatures as the 4He coverage is increased. Torsional oscillator measurements of the 4He-aerogel system were used to determine the superfluid density of films with transition temperatures as low as 20 mK. Heat capacity measurements of the 4He-Vycor system probed the excitation spectrum of both non-superfluid and superfluid films for temperatures down to 10 mK. Both sets of measurements suggest that the critical coverage for the onset of superfluidity corresponds to a mobility edge in the chemical potential, so that the onset transition is the bosonic analog of a superconductor-insulator transition. The superfluid density measurements, however, are not in agreement with the scaling theory of an onset transition from a gapless, Bose glass phase to a superfluid. The heat capacity measurements show that the non-superfluid phase is better characterized as an insulator with a gap. Comment: 15 pages (RevTeX), 21 figures (PostScript)

    Relic Neutrino Absorption Spectroscopy

    Resonant annihilation of extremely high-energy cosmic neutrinos on big-bang relic anti-neutrinos (and vice versa) into Z-bosons leads to sizable absorption dips in the neutrino flux to be observed at Earth. The high-energy edges of these dips are fixed, via the resonance energies, by the neutrino masses alone. Their depths are determined by the cosmic neutrino background density, by the cosmological parameters determining the expansion rate of the universe, and by the large redshift history of the cosmic neutrino sources. We investigate the possibility of determining the existence of the cosmic neutrino background within the next decade from a measurement of these absorption dips in the neutrino flux. As a by-product, we study the prospects to infer the absolute neutrino mass scale. We find that, with the presently planned neutrino detectors (ANITA, Auger, EUSO, OWL, RICE, and SalSA) operating in the relevant energy regime above 10^{21} eV, relic neutrino absorption spectroscopy becomes a realistic possibility. It requires, however, the existence of extremely powerful neutrino sources, which should be opaque to nucleons and high-energy photons to evade present constraints. Furthermore, the neutrino mass spectrum must be quasi-degenerate to optimize the dip, which implies m_{nu} >~ 0.1 eV for the lightest neutrino. With a second generation of neutrino detectors, these demanding requirements can be relaxed considerably. Comment: 19 pages, 26 figures, REVTeX
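The quoted energy regime follows directly from the resonance condition: a cosmic neutrino annihilating on a relic neutrino of mass m_ν hits the Z pole at E_res = M_Z²/(2 m_ν). A quick numerical check:

```python
M_Z = 91.1876e9          # Z boson mass in eV

def e_resonance(m_nu_ev):
    """Resonance energy for nu + nubar -> Z on a relic neutrino of mass
    m_nu (in eV): E_res = M_Z^2 / (2 m_nu), returned in eV."""
    return M_Z**2 / (2.0 * m_nu_ev)
```

For a quasi-degenerate spectrum with m_ν ≈ 0.1 eV this gives E_res ≈ 4 × 10^{22} eV, consistent with the abstract's "above 10^{21} eV" regime; heavier m_ν lowers E_res, which is why the dip positions encode the absolute mass scale.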

    Heavy quarkonium: progress, puzzles, and opportunities

    A golden age for heavy quarkonium physics dawned a decade ago, initiated by the confluence of exciting advances in quantum chromodynamics (QCD) and an explosion of related experimental activity. The early years of this period were chronicled in the Quarkonium Working Group (QWG) CERN Yellow Report (YR) in 2004, which presented a comprehensive review of the status of the field at that time and provided specific recommendations for further progress. However, the broad spectrum of subsequent breakthroughs, surprises, and continuing puzzles could only be partially anticipated. Since the release of the YR, the BESII program concluded only to give birth to BESIII; the B-factories and CLEO-c flourished; quarkonium production and polarization measurements at HERA and the Tevatron matured; and heavy-ion collisions at RHIC have opened a window on the deconfinement regime. All these experiments leave legacies of quality, precision, and unsolved mysteries for quarkonium physics, and therefore beg for continuing investigations. The plethora of newly-found quarkonium-like states unleashed a flood of theoretical investigations into new forms of matter such as quark-gluon hybrids, mesonic molecules, and tetraquarks. Measurements of the spectroscopy, decays, production, and in-medium behavior of c\bar{c}, b\bar{b}, and b\bar{c} bound states have been shown to validate some theoretical approaches to QCD and highlight lack of quantitative success for others. The intriguing details of quarkonium suppression in heavy-ion collisions that have emerged from RHIC have elevated the importance of separating hot- and cold-nuclear-matter effects in quark-gluon plasma studies. This review systematically addresses all these matters and concludes by prioritizing directions for ongoing and future efforts. Comment: 182 pages, 112 figures. Editors: N. Brambilla, S. Eidelman, B. K. Heltsley, R. Vogt. Section Coordinators: G. T. Bodwin, E. Eichten, A. D. Frawley, A. B. Meyer, R. E. Mitchell, V. Papadimitriou, P. Petreczky, A. A. Petrov, P. Robbe, A. Vair

    Scale-free static and dynamical correlations in melts of monodisperse and Flory-distributed homopolymers: A review of recent bond-fluctuation model studies

    It has been assumed until very recently that all long-range correlations are screened in three-dimensional melts of linear homopolymers on distances beyond the correlation length ξ characterizing the decay of the density fluctuations. Summarizing simulation results obtained by means of a variant of the bond-fluctuation model with finite monomer excluded-volume interactions and topology-violating local and global Monte Carlo moves, we show that, due to an interplay of the chain connectivity and the incompressibility constraint, both static and dynamical correlations arise on distances r ≫ ξ. These correlations are scale-free and, surprisingly, do not depend explicitly on the compressibility of the solution. Both monodisperse and (essentially) Flory-distributed equilibrium polymers are considered. Comment: 60 pages, 49 figures
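For context, the classic bond-fluctuation model (Carmesin–Kremer) restricts bonds on the cubic lattice to 108 vectors drawn from six symmetry classes; the variant used in the reviewed studies may differ in details, but the standard bond set can be enumerated as follows:

```python
from itertools import permutations, product

# The six symmetry classes of allowed bond vectors in the classic
# bond-fluctuation model on the simple cubic lattice.
CLASSES = [(2, 0, 0), (2, 1, 0), (2, 1, 1), (2, 2, 1), (3, 0, 0), (3, 1, 0)]

def bond_vectors():
    """Enumerate all allowed bond vectors: every permutation of each class,
    with every sign combination (the set removes duplicates from zeros)."""
    vecs = set()
    for c in CLASSES:
        for p in permutations(c):
            for signs in product([1, -1], repeat=3):
                vecs.add(tuple(s * x for s, x in zip(signs, p)))
    return vecs
```

The 108-vector set allows bond lengths to fluctuate between 2 and √10 lattice units, which is what gives the model its name.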

    Pharmacognostical Sources of Popular Medicine To Treat Alzheimer’s Disease


    Team dynamics in emergency surgery teams: results from a first international survey

    Background: Emergency surgery represents a unique context. Trauma teams are often multidisciplinary and need to operate under extreme stress and time constraints, sometimes with no awareness of the trauma’s causes or the patient’s personal and clinical information. In this perspective, the dynamics of how trauma teams function are fundamental to ensuring the best performance and outcomes. Methods: An online survey was conducted among World Society of Emergency Surgery members in early 2021. 402 fully completed questionnaires on the topics of knowledge translation dynamics and tools, non-technical skills, and difficulties in teamwork were collected. Data were analyzed using the software R and reported following the Checklist for Reporting Results of Internet E-Surveys (CHERRIES). Results: Findings highlight that several surgeons are still unsure about the meaning and potential of knowledge translation and its mechanisms. Tools like training, clinical guidelines, and non-technical skills are recognized and used in clinical practice. Others, like patient and stakeholder engagement, are hardly implemented, despite their increasing importance in the modern healthcare scenario. Several difficulties in working as a team are described, including lack of time, communication, training, trust, and ego. Discussion: Scientific societies should take the lead in offering training and support on the abovementioned topics. Dedicated educational initiatives, practical cases and experiences, and workshops and symposia may mitigate the difficulties highlighted by the survey’s participants, boosting the performance of emergency teams. Additional investigation of the survey results and its characteristics may lead to further, more specific suggestions and potential solutions.

    Integrating sequence and array data to create an improved 1000 Genomes Project haplotype reference panel

    A major use of the 1000 Genomes Project (1000GP) data is genotype imputation in genome-wide association studies (GWAS). Here we develop a method to estimate haplotypes from low-coverage sequencing data that can take advantage of single-nucleotide polymorphism (SNP) microarray genotypes on the same samples. First the SNP array data are phased to build a backbone (or 'scaffold') of haplotypes across each chromosome. We then phase the sequence data 'onto' this haplotype scaffold. This approach can take advantage of relatedness between sequenced and non-sequenced samples to improve accuracy. We use this method to create a new 1000GP haplotype reference set for use by the human genetic community. Using a set of validation genotypes at SNP and bi-allelic indel sites, we show that these haplotypes have lower genotype discordance and improved imputation performance into downstream GWAS samples, especially at low-frequency variants. © 2014 Macmillan Publishers Limited. All rights reserved.
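Reference-panel imputation of the kind evaluated above rests on copying alleles from well-matching reference haplotypes. A toy nearest-neighbour sketch of that idea (production tools use haplotype hidden Markov models such as Li–Stephens; the function name and data here are hypothetical):

```python
import numpy as np

def impute(target, ref_haps):
    """Fill missing alleles (coded -1) in a target haplotype by copying from
    the reference haplotype that best matches at the observed sites.
    A toy nearest-neighbour stand-in for HMM-based imputation."""
    target = np.asarray(target)
    obs = target != -1                                   # mask of typed sites
    dists = [(h[obs] != target[obs]).sum() for h in ref_haps]
    best = ref_haps[int(np.argmin(dists))]               # closest reference
    out = target.copy()
    out[~obs] = best[~obs]                               # copy missing alleles
    return out
```

A larger, more accurate reference panel makes a close match more likely, which is why the improved 1000GP panel helps most at low-frequency variants.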