
    Evidence for a Positive Cosmological Constant from Flows of Galaxies and Distant Supernovae

    Recent observations of high-redshift supernovae seem to suggest that the global geometry of the Universe may be affected by a `cosmological constant', which acts to accelerate the expansion rate with time. But these data by themselves still permit an open universe of low mass density and no cosmological constant. Here we derive an independent constraint on the lower bound to the mass density, based on deviations of galaxy velocities from a smooth universal expansion. This constraint rules out a low-density open universe with a vanishing cosmological constant, and together the two favour a nearly flat universe in which the contributions from mass density and the cosmological constant are comparable. This type of universe, however, seems to require a degree of fine tuning of the initial conditions that is in apparent conflict with `common wisdom'. Comment: 8 pages, 1 figure. Slightly revised version. Letter to Nature.
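    Schematically, the combination works as follows: the galaxy-flow analysis sets a lower bound on the matter density parameter, while the supernova data constrain the matter density and the cosmological constant jointly; together they favour a nearly flat universe with a positive cosmological constant. A minimal sketch of that logic (the symbol \Omega_{\min} is an illustrative placeholder, not a value quoted from the paper):

```latex
% Illustrative sketch only: \Omega_m and \Omega_\Lambda are the matter and
% cosmological-constant densities in units of the critical density.
\Omega_m \gtrsim \Omega_{\min} \quad (\text{galaxy flows}), \qquad
\Omega_m + \Omega_\Lambda \approx 1 \quad (\text{jointly with the supernova data})
\;\Longrightarrow\; \Omega_\Lambda \approx 1 - \Omega_m > 0 .
```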

    EMBASE search strategies for identifying methodologically sound diagnostic studies for use by clinicians and researchers

    BACKGROUND: Accurate diagnosis by clinicians is the cornerstone of decision making for recommending clinical interventions. The current best evidence from research concerning diagnostic tests changes unpredictably as science advances. Both clinicians and researchers need dependable access to published evidence concerning diagnostic accuracy. Bibliographic databases such as EMBASE provide the most widely available entrée to this literature. The objective of this study was to develop search strategies that optimize the retrieval of methodologically sound diagnostic studies from EMBASE for use by clinicians. METHODS: An analytic survey was conducted, comparing hand searches of 55 journals with retrievals from EMBASE for 4,843 candidate search terms and 6,574 combinations. All articles were rated using purpose and quality indicators, and clinically relevant diagnostic accuracy articles were categorized as 'pass' or 'fail' according to explicit criteria for scientific merit. Candidate search strategies were run in EMBASE, the retrievals being compared with the hand search data. The proposed search strategies were treated as "diagnostic tests" for sound studies and the manual review of the literature was treated as the "gold standard." The sensitivity, specificity, precision and accuracy of the search strategies were calculated. RESULTS: Of the 433 articles about diagnostic tests, 97 (22.4%) met basic criteria for scientific merit. Combinations of search terms reached peak sensitivities of 100% with specificity at 70.4%. Compared with best single terms, best multiple terms increased sensitivity for sound studies by 8.2% (absolute increase), but decreased specificity (absolute decrease 6%) when sensitivity was maximized. When terms were combined to maximize specificity, the single term "specificity.tw." (specificity of 98.2%) outperformed combinations of terms. CONCLUSION: Empirically derived search strategies combining indexing terms and textwords can achieve high sensitivity and specificity for retrieving sound diagnostic studies from EMBASE. These search filters will enhance the searching efforts of clinicians.
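    The performance measures reported above follow from a simple 2x2 cross-classification of the hand-search (gold standard) verdicts against what a candidate search strategy retrieves. A minimal sketch in Python; the article-ID sets used in the example are hypothetical, not the study's data:

```python
def filter_performance(retrieved: set, sound: set, all_articles: set) -> dict:
    """Treat a search strategy as a diagnostic test against a hand-search gold standard.

    retrieved    -- article IDs returned by the candidate EMBASE search strategy
    sound        -- article IDs judged methodologically sound by the hand search
    all_articles -- every article ID in the hand-searched journals
    """
    tp = len(retrieved & sound)                 # sound articles the search found
    fn = len(sound - retrieved)                 # sound articles the search missed
    fp = len(retrieved - sound)                 # retrieved articles that were not sound
    tn = len(all_articles - retrieved - sound)  # non-sound articles correctly not retrieved
    return {
        "sensitivity": tp / (tp + fn),          # recall of sound studies
        "specificity": tn / (tn + fp),          # proportion of non-sound articles excluded
        "precision":   tp / (tp + fp),          # positive predictive value of the retrieval
        "accuracy":    (tp + tn) / (tp + fn + fp + tn),
    }

# Hypothetical illustration only:
all_ids = set(range(1000))
sound_ids = set(range(97))                             # e.g. the hand-search 'pass' articles
retrieved_ids = set(range(80)) | set(range(500, 560))  # articles matched by one candidate strategy
print(filter_performance(retrieved_ids, sound_ids, all_ids))
```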

    Chapter 4: Effective Search Strategies for Systematic Reviews of Medical Tests

    This article discusses techniques that are appropriate when developing search strategies for systematic reviews of medical tests. This includes general advice for searching for systematic reviews and issues specific to systematic reviews of medical tests. Diagnostic search filters are currently not sufficiently developed for use when searching for systematic reviews. Instead, authors should construct a highly sensitive search strategy that uses both controlled vocabulary and text words. A comprehensive search should include multiple databases and sources of grey literature. A list of subject-specific databases is included in this article.
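    As a concrete illustration of combining controlled vocabulary with text words into a sensitive Boolean query, consider the sketch below; the terms, field tags, and topic block are hypothetical placeholders, not a validated filter from this chapter:

```python
# Hypothetical sketch: assemble a sensitive Boolean query by OR-ing together
# controlled-vocabulary terms and free-text words, then AND-ing a topic block.
# The terms below are placeholders, not a recommended or validated filter.
controlled_vocabulary = ['"Sensitivity and Specificity"[MeSH]', '"Diagnosis"[MeSH]']
text_words = ["sensitivit*", "specificit*", "diagnos*", '"predictive value*"']

def sensitive_block(vocab_terms, free_text_terms):
    """OR together every term so that any single match retrieves the record."""
    return "(" + " OR ".join(vocab_terms + free_text_terms) + ")"

topic_block = '("medical test*" OR "diagnostic test*")'  # review-specific topic terms
query = f"{topic_block} AND {sensitive_block(controlled_vocabulary, text_words)}"
print(query)
```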

    The identification of informative genes from multiple datasets with increasing complexity

    Background In microarray data analysis, factors such as data quality, biological variation, and the increasingly multi-layered nature of more complex biological systems complicate the modelling of regulatory networks that can represent and capture the interactions among genes. We believe that the use of multiple datasets derived from related biological systems leads to more robust models. Therefore, we developed a novel framework for modelling regulatory networks that involves training and evaluation on independent datasets. Our approach includes the following steps: (1) ordering the datasets based on their level of noise and informativeness; (2) selection of a Bayesian classifier with an appropriate level of complexity by evaluation of predictive performance on independent datasets; (3) comparing the different gene selections and the influence of increasing the model complexity; (4) functional analysis of the informative genes. Results In this paper, we identify the most appropriate model complexity using cross-validation and independent test set validation for predicting gene expression in three published datasets related to myogenesis and muscle differentiation. Furthermore, we demonstrate that models trained on simpler datasets can be used to identify interactions among genes and to select the most informative ones. We also show that these models can explain the myogenesis-related genes (genes of interest) significantly better than others (P < 0.004), since the improvement in their rankings is much more pronounced. Finally, after further evaluating our results on synthetic datasets, we show that our approach outperforms a concordance method by Lai et al. in identifying informative genes from multiple datasets with increasing complexity, whilst additionally modelling the interactions among genes. Conclusions We show that Bayesian networks derived from simpler controlled systems have better performance than those trained on datasets from more complex biological systems. Further, we find that genes from the pool of differentially expressed genes that are highly predictive and consistent across independent datasets are more likely to be fundamentally involved in the biological process under study. We conclude that networks trained on simpler controlled systems, such as in vitro experiments, can be used to model and capture interactions among genes in more complex datasets, such as in vivo experiments, where these interactions would otherwise be concealed by a multitude of other ongoing events.
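    The model-selection step described above can be sketched in a few lines of Python. The sketch below uses scikit-learn as a stand-in for the authors' implementation: candidate models of increasing complexity (here, growing numbers of top-ranked genes fed to a naive Bayes classifier rather than the paper's Bayesian networks) are compared by cross-validation on a simpler training dataset and then checked on an independent dataset. All data and names are hypothetical:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical stand-ins for two related expression datasets:
# a simpler (e.g. in vitro) training set and an independent (e.g. in vivo) test set.
X_train, y_train = rng.normal(size=(60, 500)), rng.integers(0, 2, 60)
X_test, y_test = rng.normal(size=(40, 500)), rng.integers(0, 2, 40)

best_k, best_cv = None, -np.inf
for k in (5, 10, 25, 50, 100):  # increasing model complexity = more selected genes
    model = make_pipeline(SelectKBest(f_classif, k=k), GaussianNB())
    cv = cross_val_score(model, X_train, y_train, cv=5).mean()
    if cv > best_cv:
        best_k, best_cv = k, cv

# Validate the chosen complexity on the independent dataset.
final = make_pipeline(SelectKBest(f_classif, k=best_k), GaussianNB()).fit(X_train, y_train)
print(f"chosen k={best_k}, cv accuracy={best_cv:.2f}, "
      f"independent test accuracy={final.score(X_test, y_test):.2f}")
```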

    A systematic review of techniques and interventions for improving adherence to inclusion and exclusion criteria during enrolment into randomised controlled trials

    Background: Enrolment of patients into a randomised controlled trial (RCT) in violation of key inclusion or exclusion criteria may lead to excess avoidable harm. The purpose of this paper was to systematically identify and review techniques and interventions proven to prevent or avoid inappropriate enrolment of patients into RCTs. Methods: EMBASE, MEDLINE, the Cochrane Database of Systematic Reviews, the Cochrane Methodology Register, online abstract repositories, and conference websites were searched. Experts were contacted and bibliographies of retrieved papers hand-searched. The search cut-off date was 31 August 2009. Results: No primary publications were found. We identified one study in the grey literature (conference abstracts and presentations) reporting the results of an evaluation of the effectiveness of an intervention designed to prevent or avoid inappropriate enrolment of patients into an RCT. In the context of a multicentre trial, use of a dummy enrolment run-in phase was shown to reduce enrolment errors significantly (P < 0.001), from 16.1% during the run-in phase to < 1% after trial initiation. Conclusions: Our systematic search yielded only one technique or intervention shown to improve adherence to eligibility criteria during enrolment into RCTs. Given the potential harm involved in recruiting patients into a clinical trial in violation of key eligibility criteria, future research is needed to better inform those conducting clinical trials of how best to prevent enrolment errors.

    Nitrogen and sulphur management: challenges for organic sources in temperate agricultural systems

    A current global trend towards intensification or specialization of agricultural enterprises has been accompanied by increasing public awareness of associated environmental consequences. Air and water pollution from losses of nutrients, such as nitrogen (N) and sulphur (S), is a major concern. Governments have initiated extensive regulatory frameworks, including various land use policies, in an attempt to control or reduce the losses. This paper presents an overview of critical input and loss processes affecting N and S for temperate climates, and provides some background to the discussion in subsequent papers evaluating specific farming systems. Management effects on potential gaseous and leaching losses, the lack of synchrony between supply of nutrients and plant demand, and options for optimizing the efficiency of N and S use are reviewed. Integration of inorganic and organic fertilizer inputs and the equitable re-distribution of nutrients from manure are discussed. The paper concludes by highlighting a need for innovative research that is also targeted to practical approaches for reducing N and S losses, and improving the overall synchrony between supply and demand.

    Evaluating the impact of MEDLINE filters on evidence retrieval: study protocol

    Background: Rather than searching the entire MEDLINE database, clinicians can perform searches on a filtered set of articles where relevant information is more likely to be found. Members of our team previously developed two types of MEDLINE filters. The 'methods' filters help identify clinical research of high methodological merit. The 'content' filters help identify articles in the discipline of renal medicine. We will now test the utility of these filters for physician MEDLINE searching. Hypothesis: When a physician searches MEDLINE, we hypothesize that the use of filters will increase the number of relevant articles retrieved (increase 'recall', also called sensitivity) and decrease the number of non-relevant articles retrieved (increase 'precision', also called positive predictive value), compared to the performance of a physician's search unaided by filters. Methods: We will survey a random sample of 100 nephrologists in Canada to obtain the MEDLINE search that they would first perform themselves for a focused clinical question. Each question we provide to a nephrologist will be based on the topic of a recently published, well-conducted systematic review. We will examine the performance of a physician's unaided MEDLINE search. We will then apply a total of eight filter combinations to the search (filters used in isolation or in combination). We will calculate the recall and precision of each search. The filter combinations that most improve on unaided physician searches will be identified and characterized. Discussion: If these filters improve search performance, physicians will be able to search MEDLINE for renal evidence more effectively, in less time, and with less frustration. Additionally, our methodology can be used as a proof of concept for the evaluation of search filters in other disciplines.
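    The planned recall/precision comparison can be expressed compactly in code. Below is a minimal sketch in which the studies included in a reference systematic review serve as the relevant set; all article-ID sets are hypothetical, and the filter combinations merely stand in for the eight combinations described in the protocol:

```python
# Hypothetical sketch of the planned evaluation: recall and precision of an
# unaided physician search versus filter-augmented searches, judged against the
# studies included in a reference systematic review. All IDs are made up.
relevant = {101, 102, 103, 104, 105}          # articles included in the systematic review

searches = {
    "unaided":           {101, 103, 200, 201, 202, 203},
    "unaided + methods": {101, 103, 104, 200},
    "unaided + content": {101, 102, 103, 201, 202},
    "unaided + both":    {101, 102, 103, 104},
}

for name, retrieved in searches.items():
    hits = retrieved & relevant
    recall = len(hits) / len(relevant)                             # sensitivity
    precision = len(hits) / len(retrieved) if retrieved else 0.0   # positive predictive value
    print(f"{name:20s} recall={recall:.2f} precision={precision:.2f}")
```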

    A structured review of long-term care demand modelling

    Long-term care (LTC) represents a substantial proportion of healthcare spending across the globe. Its main aim is to assist individuals suffering from one or more chronic illnesses, disabilities or cognitive impairments to carry out activities associated with daily living. Shifts in several economic, demographic and social factors have raised concerns surrounding the sustainability of current systems of LTC. Substantial effort has been put into modelling the LTC demand process itself so as to increase understanding of the factors driving demand for LTC and its related services. Furthermore, such modelling efforts have also been used to plan the operation and future composition of the LTC system itself. The main aim of this paper is to provide a structured review of the literature surrounding LTC demand modelling and its industrial applications, whilst highlighting potential directions for future research.