
    GRFS and CRFS in alternative donor hematopoietic cell transplantation for pediatric patients with acute leukemia.

    We report graft-versus-host disease (GVHD)-free relapse-free survival (GRFS) (a composite end point of survival without grade III-IV acute GVHD [aGVHD], systemic therapy-requiring chronic GVHD [cGVHD], or relapse) and cGVHD-free relapse-free survival (CRFS) among pediatric patients with acute leukemia (n = 1613) who underwent transplantation with 1 antigen-mismatched (7/8) bone marrow (BM; n = 172) or umbilical cord blood (UCB; n = 1441). Multivariate analysis was performed using Cox proportional hazards models. To account for multiple testing, P < .01 for the donor/graft variable was considered statistically significant. Clinical characteristics were similar between UCB and 7/8 BM recipients: most had acute lymphoblastic leukemia (62%), 64% received total body irradiation-based conditioning, and 60% received anti-thymocyte globulin or alemtuzumab. Methotrexate-based GVHD prophylaxis was more common with 7/8 BM (79%) than with UCB (15%), in which mycophenolate mofetil was commonly used. The univariate estimates of GRFS and CRFS were 22% (95% confidence interval [CI], 16-29) and 27% (95% CI, 20-34), respectively, with 7/8 BM and 33% (95% CI, 31-36) and 38% (95% CI, 35-40), respectively, with UCB (P < .001). In multivariate analysis, 7/8 BM vs UCB had similar GRFS (hazard ratio [HR], 1.12; 95% CI, 0.87-1.45; P = .39), CRFS (HR, 1.06; 95% CI, 0.82-1.38; P = .66), overall survival (HR, 1.07; 95% CI, 0.80-1.44; P = .66), and relapse (HR, 1.44; 95% CI, 1.03-2.02; P = .03). However, the 7/8 BM group had a significantly higher risk for grade III-IV aGVHD (HR, 1.70; 95% CI, 1.16-2.48; P = .006) compared with the UCB group. UCB and 7/8 BM groups had similar outcomes, as measured by GRFS and CRFS. However, given the higher risk for grade III-IV aGVHD, UCB might be preferred for patients lacking matched donors. © 2019 American Society of Hematology. All rights reserved.
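    GRFS is a composite end point: a patient's event time is the earliest of grade III-IV aGVHD, systemic therapy-requiring cGVHD, relapse, or death, with event-free patients censored at last follow-up. A minimal sketch of that construction, plus a from-scratch Kaplan-Meier estimator of the kind underlying the univariate survival estimates; all names and patient data here are hypothetical, not the study's.

```python
# Minimal sketch (hypothetical data) of GRFS as a composite end point,
# plus a from-scratch Kaplan-Meier estimator. Times are in months;
# None means the component event was not observed.

def grfs_time(agvhd34, cgvhd_tx, relapse, death, followup):
    """Composite event time: earliest of grade III-IV aGVHD, systemic
    therapy-requiring cGVHD, relapse, or death; otherwise censored at
    last follow-up. Returns (time, event_observed)."""
    events = [t for t in (agvhd34, cgvhd_tx, relapse, death) if t is not None]
    if events:
        return min(events), True
    return followup, False

def kaplan_meier(data):
    """data: list of (time, event_observed). Returns survival-curve
    steps as [(event_time, survival_probability)]."""
    data = sorted(data)
    n_at_risk, surv, steps, i = len(data), 1.0, [], 0
    while i < len(data):
        t, deaths, n_here = data[i][0], 0, 0
        while i < len(data) and data[i][0] == t:
            n_here += 1
            deaths += data[i][1]  # bool counts as 1 when event observed
            i += 1
        if deaths:
            surv *= 1 - deaths / n_at_risk
            steps.append((t, surv))
        n_at_risk -= n_here  # events and censorings both leave the risk set
    return steps
```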

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. 
The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I2=98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I2=92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I2=94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I2=98%) than in other migrant groups (6·6%, 1·8-11·3; I2=92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I2=96%) than in migrants in hospitals (24·3%, 16·1-32·6; I2=98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration. FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
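    The pooled prevalences above come from random-effects models. The review does not name its estimator, so the classic DerSimonian-Laird method below is an assumption, and the study proportions and sample sizes are illustrative only, not the review's extracted data.

```python
# Sketch of DerSimonian-Laird random-effects pooling of prevalence
# estimates. Estimator choice is an assumption (the review only says
# "random-effects models"); inputs below are illustrative.
import math

def dersimonian_laird(props, ns):
    """props: per-study prevalences; ns: per-study sample sizes.
    Returns (pooled prevalence, (95% CI low, high), I^2 in %)."""
    k = len(props)
    variances = [p * (1 - p) / n for p, n in zip(props, ns)]
    w = [1 / v for v in variances]                      # fixed-effect weights
    p_fixed = sum(wi * pi for wi, pi in zip(w, props)) / sum(w)
    q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, props))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                  # between-study variance
    w_re = [1 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * pi for wi, pi in zip(w_re, props)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2
```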

    Eye Size at Birth in Prosimian Primates: Life History Correlates and Growth Patterns

    BACKGROUND: Primates have large eyes relative to head size, which profoundly influence the ontogenetic emergence of facial form. However, growth of the primate eye is only understood in a narrow taxonomic perspective, with information biased toward anthropoids. METHODOLOGY/PRINCIPAL FINDINGS: We measured eye and bony orbit size in perinatal prosimian primates (17 strepsirrhine taxa and Tarsius syrichta) to infer the extent of prenatal as compared to postnatal eye growth. In addition, multiple linear regression was used to detect relationships of relative eye and orbit diameter to life history variables. ANOVA was used to determine if eye size differed according to activity pattern. In most of the species, eye diameter at birth measures more than half of that for adults. Two exceptions include Nycticebus and Tarsius, in which more than half of eye diameter growth occurs postnatally. Ratios of neonate/adult eye and orbit diameters indicate prenatal growth of the eye is actually more rapid than that of the orbit. For example, mean neonatal transverse eye diameter is 57.5% of the adult value (excluding Nycticebus and Tarsius), compared to 50.8% for orbital diameter. If Nycticebus is excluded, relative gestation age has a significant positive correlation with relative eye diameter in strepsirrhines, explaining 59% of the variance in relative transverse eye diameter. No significant differences were found among species with different activity patterns. CONCLUSIONS/SIGNIFICANCE: The primate developmental strategy of relatively long gestations is probably tied to an extended period of neural development, and this principle appears to apply to eye growth as well. Our findings indicate that growth rates of the eye and bony orbit are disassociated, with eyes growing faster prenatally, and the growth rate of the bony orbit exceeding that of the eyes after birth. Some well-documented patterns of orbital morphology in adult primates, such as the enlarged orbits of nocturnal species, mainly emerge during postnatal development.

    Visual Scan Paths and Recognition of Facial Identity in Autism Spectrum Disorder and Typical Development

    Background: Previous research suggests that many individuals with autism spectrum disorder (ASD) have impaired facial identity recognition, and also exhibit abnormal visual scanning of faces. Here, two hypotheses accounting for an association between these observations were tested: i) better facial identity recognition is associated with increased gaze time on the Eye region; ii) better facial identity recognition is associated with increased eye-movements around the face. Methodology and Principal Findings: Eye-movements of 11 children with ASD and 11 age-matched typically developing (TD) controls were recorded whilst they viewed a series of faces, and then completed a two alternative forced-choice recognition memory test for the faces. Scores on the memory task were standardized according to age. In both groups, there was no evidence of an association between the proportion of time spent looking at the Eye region of faces and age-standardized recognition performance, thus the first hypothesis was rejected. However, the 'Dynamic Scanning Index' - which was incremented each time the participant saccaded into and out of one of the core-feature interest areas - was strongly associated with age-standardized face recognition scores in both groups, even after controlling for various other potential predictors of performance. Conclusions and Significance: In support of the second hypothesis, results suggested that increased saccading between core-features was associated with more accurate face recognition ability, both in typical development and ASD. Causal directions of this relationship remain undetermined.
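    The 'Dynamic Scanning Index' is described operationally: a counter incremented whenever a saccade crosses into or out of a core-feature interest area. A minimal sketch under that reading; the region labels and the handling of feature-to-feature transitions are assumptions, not the paper's exact scoring rule.

```python
# Sketch of a 'Dynamic Scanning Index' as described in the abstract.
# CORE_FEATURES labels and the feature-to-feature counting rule are
# assumptions for illustration only.
CORE_FEATURES = {"left_eye", "right_eye", "nose", "mouth"}

def dynamic_scanning_index(fixation_regions):
    """fixation_regions: ordered interest-area labels, one per fixation
    (fixations outside all core features labelled e.g. 'other').
    Counts saccades that enter and/or leave a core-feature area."""
    dsi = 0
    for prev, curr in zip(fixation_regions, fixation_regions[1:]):
        if prev == curr:
            continue  # refixation within the same region: no transition
        if prev in CORE_FEATURES or curr in CORE_FEATURES:
            dsi += 1  # the saccade crossed a core-feature boundary
    return dsi
```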

    Natural Variation of Model Mutant Phenotypes in Ciona intestinalis

    BACKGROUND: The study of ascidians (Chordata, Tunicata) has made a considerable contribution to our understanding of the origin and evolution of basal chordates. To provide further information to support forward genetics in Ciona intestinalis, we used a combination of natural variation and neutral population genetics as an approach for the systematic identification of new mutations. In addition to the significance of developmental variation for phenotype-driven studies, this approach also has important implications for evolutionary and population biology. METHODOLOGY/PRINCIPAL FINDINGS: Here, we report a preliminary survey for naturally occurring mutations in three geographically interconnected populations of C. intestinalis. The influence of historical, geographical and environmental factors on the distribution of abnormal phenotypes was assessed by means of 12 microsatellites. We identified 37 possible mutant loci with stereotyped defects in embryonic development that segregate in a way typical of recessive alleles. Local populations were found to differ in genetic organization and frequency distribution of phenotypic classes. CONCLUSIONS/SIGNIFICANCE: Natural genetic polymorphism of C. intestinalis constitutes a valuable source of phenotypes for studying embryonic development in ascidians. Correlating genetic structure and the occurrence of abnormal phenotypes is a crucial focus for understanding the selective forces that shape natural finite populations, and may provide important insights into the evolutionary mechanisms that generate animal diversity.

    Grid Cells, Place Cells, and Geodesic Generalization for Spatial Reinforcement Learning

    Reinforcement learning (RL) provides an influential characterization of the brain's mechanisms for learning to make advantageous choices. An important problem, though, is how complex tasks can be represented in a way that enables efficient learning. We consider this problem through the lens of spatial navigation, examining how two of the brain's location representations—hippocampal place cells and entorhinal grid cells—are adapted to serve as basis functions for approximating value over space for RL. Although much previous work has focused on these systems' roles in combining upstream sensory cues to track location, revisiting these representations with a focus on how they support this downstream decision function offers complementary insights into their characteristics. Rather than localization, the key problem in learning is generalization between past and present situations, which may not match perfectly. Accordingly, although neural populations collectively offer a precise representation of position, our simulations of navigational tasks verify the suggestion that RL gains efficiency from the more diffuse tuning of individual neurons, which allows learning about rewards to generalize over longer distances given fewer training experiences. However, work on generalization in RL suggests the underlying representation should respect the environment's layout. In particular, although it is often assumed that neurons track location in Euclidean coordinates (that a place cell's activity declines “as the crow flies” away from its peak), the relevant metric for value is geodesic: the distance along a path, around any obstacles. We formalize this intuition and present simulations showing how Euclidean, but not geodesic, representations can interfere with RL by generalizing inappropriately across barriers. 
Our proposal that place and grid responses should be modulated by geodesic distances suggests novel predictions about how obstacles should affect spatial firing fields, which provides a new viewpoint on data concerning both spatial codes.
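    The Euclidean-versus-geodesic distinction is easy to make concrete: on a gridworld with a barrier, two states can be close "as the crow flies" yet far along any traversable path, so a Euclidean basis function would generalize value straight across the wall. A small sketch with a hypothetical layout, using breadth-first search for the geodesic distance:

```python
# Sketch contrasting Euclidean with geodesic (along-path) distance on a
# small gridworld. The layout is hypothetical; '#' cells form a barrier.
from collections import deque
import math

GRID = [
    "......",
    "..#...",
    "..#...",
    "..#...",
    "......",
]

def geodesic(start, goal):
    """4-connected shortest-path length around obstacles, via BFS."""
    rows, cols = len(GRID), len(GRID[0])
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        (r, c), d = frontier.popleft()
        if (r, c) == goal:
            return d
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and GRID[nr][nc] != "#" and (nr, nc) not in seen):
                seen.add((nr, nc))
                frontier.append(((nr, nc), d + 1))
    return math.inf  # goal unreachable

def euclidean(a, b):
    """Straight-line distance, ignoring obstacles ('as the crow flies')."""
    return math.hypot(a[0] - b[0], a[1] - b[1])
```

For cells (2, 1) and (2, 3), on opposite sides of the wall, the Euclidean distance is 2.0 but the geodesic distance is 6: a value function represented with Euclidean tuning would leak reward information across the barrier.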

    Optic chiasm in the species of order Clupeiformes, family Clupeidae: Optic chiasm of Spratelloides gracilis shows an opposite laterality to that of Etrumeus teres

    In most teleost fishes, the optic nerves decussate completely as they project to the mesencephalic region. Examination of the decussation pattern of 25 species from 11 different orders in Pisces revealed that each species shows a specific chiasmic type. In 11 species out of the 25, laterality of the chiasmic pattern was not determined; in half of the individuals examined, the left optic nerve ran dorsally to the right optic nerve, while in the other half, the right optic nerve was dorsal. In eight other species the optic nerves from both eyes branched into several bundles at the chiasmic point, and intercalated to form a complicated decussation pattern. In the present study we report our finding that Spratelloides gracilis, of the order Clupeiformes, family Clupeidae, shows a particular laterality of decussation: the left optic nerve ran dorsally to the right (n = 200/202). In contrast, Etrumeus teres, of the same order and family, had a strong preference for the opposite (complementary) chiasmic pattern to that of S. gracilis (n = 59/59), revealing that these two species display opposite left–right optic chiasm patterning. None of the other Clupeiformes species we investigated showed a left–right preference in the decussation pattern. We conclude that the opposite laterality of the optic chiasms of these two closely related species, S. gracilis and E. teres, enables investigation of species-specific laterality in fishes of symmetric shapes.
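    As an illustration of how lopsided the 200-of-202 split is, an exact binomial test against a 50:50 chance pattern can be computed in pure Python. This is only a demonstration, not necessarily the statistical analysis the authors performed.

```python
# Illustrative only: an exact binomial test of the 200-of-202 left-dorsal
# split against a 50:50 chance pattern (not necessarily the authors' test).
from math import comb

def binomial_p_at_least(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Two-sided p-value for S. gracilis (left nerve dorsal in 200 of 202 fish);
# with p = 0.5 the distribution is symmetric, so doubling the tail is exact.
p_two_sided = min(1.0, 2 * binomial_p_at_least(200, 202))
```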

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.

    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

    Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). 
The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001). Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication.

    Finishing the euchromatic sequence of the human genome

    The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome including studies of gene number, birth and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.