
    The Prevalence and Risk Factors for Pneumococcal Colonization of the Nasopharynx among Children in Kilifi District, Kenya

    BACKGROUND: Pneumococcal conjugate vaccines (PCV) reduce nasopharyngeal carriage of vaccine-serotype pneumococci but increase carriage of non-vaccine serotypes. We studied the epidemiology of carriage among children 3-59 months old before vaccine introduction in Kilifi, Kenya. METHODS: In a rolling cross-sectional study from October 2006 to December 2008 we approached 3570 healthy children selected at random from the population register of the Kilifi Health and Demographic Surveillance System and 134 HIV-infected children registered at a specialist clinic. A single nasopharyngeal swab was transported in STGG medium and cultured on gentamicin blood agar. A single colony of pneumococcus was serotyped by the Quellung reaction. RESULTS: Families of 2840 children in the population-based sample and 99 in the HIV-infected sample consented to participate; carriage prevalence was 65.8% (95% CI 64.0-67.5%) and 76% (95% CI 66-84%) in the two samples, respectively. Carriage prevalence declined progressively with age, from 79% at 6-11 months to 51% at 54-59 months (p<0.0005). Carriage was positively associated with coryza (odds ratio 2.63, 95% CI 2.12-3.25) and cough (1.55, 95% CI 1.26-1.91), and negatively associated with recent antibiotic use (0.53, 95% CI 0.34-0.81). In total, 53 different serotypes were identified, and 42% of isolates were of serotypes contained in the 10-valent PCV. Common serotypes declined in prevalence with age while less common serotypes did not. CONCLUSION: Carriage prevalence in children was high, serotypes were diverse, and the majority of strains were of serotypes not represented in the 10-valent PCV. Vaccine introduction in Kenya will provide a natural test of virulence for the many circulating non-vaccine serotypes.
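    As a quick sanity check, the quoted prevalence interval for the population-based sample can be approximated with a normal (Wald) confidence interval; note the carrier count below is inferred from the reported 65.8% of 2840 children, not taken from the paper, and the authors may have used a different interval method.

```python
import math

def prevalence_ci(carriers, n, z=1.96):
    """Point estimate and Wald 95% CI for a prevalence."""
    p = carriers / n
    se = math.sqrt(p * (1 - p) / n)  # normal-approximation standard error
    return p, p - z * se, p + z * se

# ~65.8% of 2840 children implies roughly 1869 carriers (inferred, illustrative)
p, lo, hi = prevalence_ci(1869, 2840)
print(f"{p:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # close to the quoted 65.8% (64.0-67.5%)
```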

    Is analysing the nitrogen use at the plant canopy level a matter of choosing the right optimization criterion?

    Optimization theory in combination with canopy modeling is potentially a powerful tool for evaluating the adaptive significance of photosynthesis-related plant traits. Yet its successful application has been hampered by a lack of agreement on the appropriate optimization criterion. Here we review how models based on different types of optimization criteria have been used to analyze traits—particularly N reallocation and leaf area indices—that determine photosynthetic nitrogen-use efficiency at the canopy level. By far the most commonly used approach is static-plant simple optimization (SSO). SSO makes two simplifying assumptions: (1) plant traits are considered optimal when they maximize whole-stand daily photosynthesis, ignoring competitive interactions between individuals; and (2) plants are treated as static, ignoring canopy dynamics (the production and loss of leaves, and the reallocation and uptake of nitrogen) and the respiration of nonphotosynthetic tissue. Recent studies have addressed either the former problem through the application of evolutionary game theory (EGT) or the latter by applying dynamic-plant simple optimization (DSO), and have made considerable progress in our understanding of plant photosynthetic traits. However, we argue that future model studies should focus on combining these two approaches. We also point out that field observations can fit predictions from two models based on very different optimization criteria. In order to enhance our understanding of the adaptive significance of photosynthesis-related plant traits, there is thus an urgent need for experiments that test underlying optimization criteria and competing hypotheses about underlying mechanisms of optimization.
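    A toy sketch of the SSO criterion described above, under purely illustrative assumptions (Beer's-law light attenuation through discrete leaf layers, and a saturating per-layer photosynthetic response to leaf N); it is not any published canopy model, only a demonstration of "maximize whole-canopy photosynthesis for a fixed total N, ignoring competition and dynamics":

```python
import math

def canopy_photosynthesis(n_alloc, k=0.5, i0=1.0):
    """Whole-canopy photosynthesis for an N allocation across layers
    (layer 0 at the top); assumed forms are illustrative only."""
    total = 0.0
    for layer, n in enumerate(n_alloc):
        light = i0 * math.exp(-k * layer)  # Beer's-law light attenuation
        total += light * n / (n + 0.5)     # saturating response to leaf N
    return total

def sso_allocate(total_n, steps=50):
    """Coarse grid search over 3-layer N splits (SSO: static plant,
    whole-stand objective, no competition)."""
    best, best_alloc = -1.0, None
    for i in range(steps + 1):
        for j in range(steps + 1 - i):
            a = [total_n * i / steps, total_n * j / steps,
                 total_n * (steps - i - j) / steps]
            p = canopy_photosynthesis(a)
            if p > best:
                best, best_alloc = p, a
    return best_alloc
```

    Under these assumptions the optimum places the most N in the best-lit top layer, mirroring the well-known prediction that optimal leaf N should track the canopy light gradient.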

    Prognostic model to predict postoperative acute kidney injury in patients undergoing major gastrointestinal surgery based on a national prospective observational cohort study.

    Background: Acute illness, existing co-morbidities and the surgical stress response can all contribute to postoperative acute kidney injury (AKI) in patients undergoing major gastrointestinal surgery. The aim of this study was to develop, prospectively, a pragmatic prognostic model to stratify patients according to their risk of developing AKI after major gastrointestinal surgery. Methods: This prospective multicentre cohort study included consecutive adults undergoing elective or emergency gastrointestinal resection, liver resection or stoma reversal in 2-week blocks over a continuous 3-month period. The primary outcome was the rate of AKI within 7 days of surgery. Bootstrap stability was used to select clinically plausible risk factors into the model, which was then validated internally by bootstrapping. Results: A total of 4544 patients were included across 173 centres in the UK and Ireland. The overall rate of AKI was 14·2 per cent (646 of 4544) and the 30-day mortality rate was 1·8 per cent (84 of 4544). Stage 1 AKI was significantly associated with 30-day mortality (unadjusted odds ratio 7·61, 95 per cent c.i. 4·49 to 12·90; P < 0·001), with increasing odds of death with each AKI stage. Six variables were selected for inclusion in the prognostic model: age, sex, ASA grade, preoperative estimated glomerular filtration rate, planned open surgery and preoperative use of either an angiotensin-converting enzyme inhibitor or an angiotensin receptor blocker. Internal validation demonstrated good model discrimination (c-statistic 0·65). Discussion: Following major gastrointestinal surgery, AKI occurred in one in seven patients. This preoperative prognostic model identified patients at high risk of postoperative AKI. Validation in an independent data set is required to ensure generalizability.
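    The reported discrimination (c-statistic 0·65) is a pairwise concordance: the probability that a randomly chosen patient who developed AKI was assigned a higher predicted risk than one who did not. A minimal sketch with toy data (not the study's model or cohort):

```python
def c_statistic(risks, outcomes):
    """Concordance (c-statistic): probability that a randomly chosen
    case (outcome 1) has a higher predicted risk than a non-case (0);
    ties count as half-concordant."""
    pos = [r for r, y in zip(risks, outcomes) if y == 1]
    neg = [r for r, y in zip(risks, outcomes) if y == 0]
    pairs = concordant = ties = 0
    for a in pos:
        for b in neg:
            pairs += 1
            if a > b:
                concordant += 1
            elif a == b:
                ties += 1
    return (concordant + 0.5 * ties) / pairs

# Toy example: predicted AKI risks and observed outcomes (illustrative only)
risks = [0.9, 0.8, 0.2, 0.1]
outcomes = [1, 0, 1, 0]
print(c_statistic(risks, outcomes))  # → 0.75
```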

    Cancer Biomarker Discovery: The Entropic Hallmark

    Background: It is a commonly accepted belief that cancer cells modify their transcriptional state during the progression of the disease. We propose that the progression of cancer cells towards malignant phenotypes can be efficiently tracked using high-throughput technologies that follow the gradual changes observed in the gene expression profiles, by employing Shannon's mathematical theory of communication. Methods based on information theory can then quantify the divergence of cancer cells' transcriptional profiles from those of normally appearing cells of the originating tissues. The relevance of the proposed methods can be evaluated using microarray datasets available in the public domain, but the method is in principle applicable to other high-throughput methods. Methodology/Principal Findings: Using melanoma and prostate cancer datasets, we illustrate how Shannon entropy and the Jensen-Shannon divergence can be employed to trace the transcriptional changes that accompany progression of the disease. We establish how the variations of these two measures correlate with established biomarkers of cancer progression. The information-theoretic measures allow us to identify novel biomarkers for both progressive and relatively more sudden transcriptional changes leading to malignant phenotypes. At the same time, the methodology was able to validate a large number of genes and processes that seem to be implicated in the progression of melanoma and prostate cancer. Conclusions/Significance: We thus present a quantitative guiding rule, a new unifying hallmark of cancer: the cancer cell's transcriptome changes lead to measurable observed transitions of normalized Shannon entropy values (as measured by high-throughput technologies). At the same time, tumor cells increment their divergence from the normal tissue profile, increasing their disorder via the creation of states that we might not directly measure. This unifying hallmark allows us, via the Jensen-Shannon divergence, to identify the arrow of time of the processes from the gene expression profiles, and helps to map the phenotypical and molecular hallmarks of specific cancer subtypes. The deep mathematical basis of the approach allows us to suggest that this principle may be of general applicability to other diseases.
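    The two information-theoretic quantities the abstract relies on can be sketched for expression profiles normalized to sum to 1 (illustrative only, not the authors' pipeline):

```python
import math

def normalized_entropy(p):
    """Shannon entropy of a normalized expression profile,
    scaled to [0, 1] by dividing by log2 of the number of genes."""
    h = -sum(x * math.log2(x) for x in p if x > 0)
    return h / math.log2(len(p))

def jensen_shannon(p, q):
    """Jensen-Shannon divergence between two profiles (base 2, in bits);
    symmetric, bounded by [0, 1], zero iff the profiles coincide."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    def kl(a, b):
        return sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Illustrative profiles: a "flat" transcriptome has maximal entropy
print(normalized_entropy([0.25, 0.25, 0.25, 0.25]))  # → 1.0
```

    Tracking these values across samples ordered by disease stage is what lets the divergence from the normal-tissue profile serve as an "arrow of time".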

    Measurement and interpretation of same-sign W boson pair production in association with two jets in pp collisions at √s = 13 TeV with the ATLAS detector

    This paper presents the measurement of fiducial and differential cross sections for both the inclusive and electroweak production of a same-sign W-boson pair in association with two jets (W±W±jj) using 139 fb−1 of proton-proton collision data recorded at a centre-of-mass energy of √s = 13 TeV by the ATLAS detector at the Large Hadron Collider. The analysis is performed by selecting two same-charge leptons, electrons or muons, and at least two jets with large invariant mass and a large rapidity difference. The measured fiducial cross sections for electroweak and inclusive W±W±jj production are 2.92 ± 0.22 (stat.) ± 0.19 (syst.) fb and 3.38 ± 0.22 (stat.) ± 0.19 (syst.) fb, respectively, in agreement with Standard Model predictions. The measurements are used to constrain anomalous quartic gauge couplings by extracting 95% confidence level intervals on dimension-8 operators. A search for doubly charged Higgs bosons H±± that are produced in vector-boson fusion processes and decay into a same-sign W boson pair is performed. The largest deviation from the Standard Model occurs for an H±± mass near 450 GeV, with a global significance of 2.5 standard deviations.
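    The quoted statistical and systematic uncertainties are conventionally combined in quadrature when a single total uncertainty is wanted (this assumes they are independent; the quadrature sum below is the standard convention, not a number quoted in the abstract):

```python
import math

# Electroweak W±W±jj cross section: 2.92 ± 0.22 (stat.) ± 0.19 (syst.) fb
stat, syst = 0.22, 0.19
total = math.sqrt(stat**2 + syst**2)  # quadrature sum of independent errors
print(f"total uncertainty ≈ {total:.2f} fb")  # → ≈ 0.29 fb
```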

    Combination of searches for heavy spin-1 resonances using 139 fb−1 of proton-proton collision data at √s = 13 TeV with the ATLAS detector

    A combination of searches for new heavy spin-1 resonances decaying into different pairings of W, Z, or Higgs bosons, as well as directly into leptons or quarks, is presented. The data sample used corresponds to 139 fb−1 of proton-proton collisions at √s = 13 TeV collected during 2015–2018 with the ATLAS detector at the CERN Large Hadron Collider. Analyses selecting quark pairs (qq, bb, , and tb) or third-generation leptons (τν and ττ) are included in such a combination for the first time. A simplified model predicting a spin-1 heavy vector-boson triplet is used. Cross-section limits are set at the 95% confidence level and are compared with predictions for the benchmark model. These limits are also expressed in terms of constraints on couplings of the heavy vector-boson triplet to quarks, leptons, and the Higgs boson. The complementarity of the various analyses increases the sensitivity to new physics, and the resulting constraints are stronger than those from any individual analysis considered. The data exclude a heavy vector-boson triplet with mass below 5.8 TeV in a weakly coupled scenario, below 4.4 TeV in a strongly coupled scenario, and up to 1.5 TeV in the case of production via vector-boson fusion.

    Avian movements in a modern world - cognitive challenges

    Different movement patterns have evolved in response to predictable and unpredictable variation in the environment: migration is an adaptation to predictable environments, nomadism to unpredictable environments, and partial migration to a mixture of predictable and unpredictable conditions. Alongside these movement patterns, different cognitive abilities have evolved; these are reviewed and discussed here in relation to an organism's ability to respond to largely unpredictable environmental change caused by climate and human activity, and are linked to population trends. In brief, migrants combine a reliance on memory, a low propensity to explore, and high avoidance of environmental change, which, together with their overall small brain sizes, results in low flexibility to respond to unpredictable environmental change. In line with this, many migrants have negative population trends. In contrast, while nomads may use their memory to find suitable habitats, they can counteract the negative effects of finding such habitats disturbed through large-scale exploratory movements and by paying attention to environmental cues; they also show little avoidance of environmental change. Their population trends are largely stable or increasing, indicating an ability to cope with climate and human-induced change. Cognitive abilities in partial migrants are little investigated, but the available evidence indicates attention to environmental cues coupled with high exploratory tendencies, allowing a flexible response to unpredictable environmental change. Indeed, their population trends are mainly stable or increasing. In conclusion, cognitive abilities have evolved in conjunction with different movement patterns and affect an organism's ability to adapt to rapid human-induced changes in the environment.

    Identifying the deficiencies of current diagnostic criteria for neurofibromatosis 2 using databases of 2777 individuals with molecular testing

    Purpose: We evaluated deficiencies in existing diagnostic criteria for neurofibromatosis 2 (NF2). Methods: Two large databases, of individuals fulfilling NF2 criteria (n = 1361) and of those tested for NF2 variants with criteria short of diagnosis (n = 1416), were interrogated. We assessed the proportion meeting each diagnostic criterion with constitutional or mosaic NF2 variants and the positive predictive value (PPV) with regard to a definite diagnosis. Results: There was no evidence for the usefulness of the old criteria "glioma" or "neurofibroma." "Ependymoma" had 100% PPV and a high level of confirmed NF2 diagnosis (67.7%). Those with bilateral vestibular schwannoma (VS) alone aged ≥60 years had the lowest confirmation rate (6.6%) and a reduced PPV (80%). A sibling as the only affected first-degree relative, without an affected parent, had 0% PPV. All three individuals with unilateral VS and an affected sibling were proven not to have NF2. The biggest overlap was with LZTR1-associated schwannomatosis: in this category, seven individuals with unilateral VS plus ≥2 nondermal schwannomas reduced the PPV to 67%. Conclusions: The present study confirms important deficiencies in the NF2 diagnostic criteria. The term "glioma" should be dropped and replaced by "ependymoma"; similarly, "neurofibroma" should be removed. Dropping "sibling" from first-degree relatives should be considered, and testing of LZTR1 should be recommended for unilateral VS.
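    PPV as used above is the fraction of individuals meeting a criterion who truly have NF2. A minimal sketch with hypothetical counts (the per-criterion true/false-positive counts are not all given in the abstract):

```python
def ppv(true_pos, false_pos):
    """Positive predictive value of a diagnostic criterion:
    confirmed diagnoses / all criterion-positive individuals."""
    return true_pos / (true_pos + false_pos)

# Hypothetical example: of 9 criterion-positive individuals, 6 confirmed
print(f"{ppv(6, 3):.0%}")  # → 67%
```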

    Data Descriptor: A global multiproxy database for temperature reconstructions of the Common Era

    Reproducible climate reconstructions of the Common Era (1 CE to present) are key to placing industrial-era warming into the context of natural climatic variability. Here we present a community-sourced database of temperature-sensitive proxy records from the PAGES2k initiative. The database gathers 692 records from 648 locations, including all continental regions and major ocean basins. The records are from trees, ice, sediment, corals, speleothems, documentary evidence, and other archives. They range in length from 50 to 2000 years, with a median of 547 years, while temporal resolution ranges from biweekly to centennial. Nearly half of the proxy time series are significantly correlated with HadCRUT4.2 surface temperature over the period 1850-2014. Global temperature composites show a remarkable degree of coherence between high- and low-resolution archives, with broadly similar patterns across archive types, terrestrial versus marine locations, and screening criteria. The database is suited to investigations of global and regional temperature variability over the Common Era, and is shared in the Linked Paleo Data (LiPD) format, including serializations in Matlab, R and Python.
    Since the pioneering work of D'Arrigo and Jacoby1-3, as well as Mann et al.4,5, temperature reconstructions of the Common Era have become a key component of climate assessments6-9. Such reconstructions depend strongly on the composition of the underlying network of climate proxies10, and it is therefore critical for the climate community to have access to a community-vetted, quality-controlled database of temperature-sensitive records stored in a self-describing format. The Past Global Changes (PAGES) 2k consortium, a self-organized, international group of experts, recently assembled such a database, and used it to reconstruct surface temperature over continental-scale regions11 (hereafter, 'PAGES2k-2013'). This data descriptor presents version 2.0.0 of the PAGES2k proxy temperature database (Data Citation 1). It augments the PAGES2k-2013 collection of terrestrial records with marine records assembled by the Ocean2k working group at centennial12 and annual13 time scales. In addition to these previously published data compilations, this version includes substantially more records, extensive new metadata, and validation. Furthermore, the selection criteria for records included in this version are applied more uniformly and transparently across regions, resulting in a more cohesive data product. This data descriptor describes the contents of the database and the criteria for inclusion, and quantifies the relation of each record with instrumental temperature. In addition, the paleotemperature time series are summarized as composites to highlight the most salient decadal- to centennial-scale behaviour of the dataset and check mutual consistency between paleoclimate archives. We provide extensive Matlab code to probe the database: processing, filtering and aggregating it in various ways to investigate temperature variability over the Common Era. The unique approach to data stewardship and code-sharing employed here is designed to enable an unprecedented scale of investigation of the temperature history of the Common Era, by the scientific community and citizen-scientists alike.
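    "Significantly correlated with HadCRUT4.2" refers to correlating each proxy time series with overlapping instrumental temperatures. A minimal Pearson-correlation sketch on synthetic annual values (not database records; the PAGES2k screening also accounts for autocorrelation, which this sketch omits):

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length annual series,
    e.g. a proxy record and instrumental temperature."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Synthetic values: a temperature-sensitive proxy tracking warming
temps = [13.6, 13.7, 13.9, 14.0, 14.3, 14.5]   # °C, illustrative
proxy = [0.10, 0.14, 0.18, 0.19, 0.27, 0.30]   # arbitrary proxy units
print(round(pearson_r(temps, proxy), 2))        # strong positive correlation
```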

    Search for dark photons in rare Z boson decays with the ATLAS detector

    A search for events with a dark photon produced in association with a dark Higgs boson via rare decays of the standard model Z boson is presented, using 139 fb−1 of √s = 13 TeV proton-proton collision data recorded by the ATLAS detector at the Large Hadron Collider. The dark Higgs boson decays into a pair of dark photons, and at least two of the three dark photons must each decay into a pair of electrons or muons, resulting in at least two same-flavor opposite-charge lepton pairs in the final state. The data are found to be consistent with the background prediction, and upper limits are set on the dark photon's coupling to the dark Higgs boson times the kinetic mixing between the standard model photon and the dark photon, αDϵ², in the dark photon mass range of [5, 40] GeV, except for the Υ mass window [8.8, 11.1] GeV. This search explores new parameter space not previously excluded by other experiments.