
    Lost in Translation: Piloting a Novel Framework to Assess the Challenges in Translating Scientific Uncertainty From Empirical Findings to WHO Policy Statements.

    BACKGROUND: Calls for evidence-informed public health policy, with implicit promises of greater program effectiveness, have intensified recently. The methods to produce such policies are not self-evident, requiring a conciliation of values and norms between policy-makers and evidence producers. In particular, the translation of uncertainty from empirical research findings, especially issues of statistical variability and generalizability, is a persistent challenge because of the incremental nature of research and the iterative cycle of advancing knowledge and implementation. This paper aims to assess how the concept of uncertainty is considered and acknowledged in World Health Organization (WHO) policy recommendations and guidelines. METHODS: We selected four WHO policy statements published between 2008 and 2013, regarding maternal and child nutrient supplementation, infant feeding, heat action plans, and malaria control, to represent topics with a spectrum of available evidence bases. Each of these four statements was analyzed using a novel framework to assess the treatment of statistical variability and generalizability. RESULTS: WHO currently provides substantial guidance on addressing statistical variability through GRADE (Grading of Recommendations Assessment, Development, and Evaluation) ratings for precision and consistency in its guideline documents. Accordingly, our analysis showed that policy-informing questions were addressed by systematic reviews and representations of statistical variability (e.g., with numeric confidence intervals). In contrast, the presentation of contextual or "background" evidence regarding etiology or disease burden showed little consideration for this variability. Moreover, generalizability or "indirectness" was uniformly neglected, with little explicit consideration of study settings or subgroups. CONCLUSION: We found non-uniform treatment of statistical variability and neglect of other factors that may contribute to uncertainty regarding recommendations, including the state of the evidence informing background questions (prevalence, mechanisms, or burden and distribution of health problems), generalizability, alternate interventions, and additional outcomes not captured by systematic review. These other factors often form a basis for providing policy recommendations, particularly in the absence of a strong evidence base for intervention effects. Consequently, they should also be subject to stringent and systematic evaluation criteria. We suggest that more effort is needed to systematically acknowledge (1) when evidence is missing, conflicting, or equivocal, (2) what normative considerations were also employed, and (3) how additional evidence may be accrued.

    Evaluation of Effects of Wastewater Treatment Discharge on Estuarine Water Quality

    This report marks the completion of a two-year project focused on observed and estimated effects of wastewater treatment facilities (WWTFs) on estuarine water quality within the New Hampshire (NH) Seacoast region. This study was designed and carried out to help the NH Department of Environmental Services (NHDES) and NH Estuaries Project (NHEP) evaluate the effects of WWTF effluent quality on bacterial and nutrient concentrations in New Hampshire’s estuarine waters, as well as to help NHDES/NHEP identify related WWTF infrastructure problems. An extensive database of bacterial and nutrient concentrations in effluent collected post-disinfection from 9 NH WWTFs and 2 Maine WWTFs that discharge into the Great Bay and Hampton/Seabrook estuaries was developed. The data were used to determine ratios between different bacterial indicators in WWTF effluent, estimates of in-stream bacterial concentrations following effluent discharge to receiving waters, and estimates of nutrient loading from selected WWTFs.
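
    A minimal sketch of how an in-stream bacterial concentration estimate of this kind is commonly made, assuming a simple flow-weighted (complete-mixing) mass balance; the flows, concentrations, and function name below are hypothetical and are not taken from the report:

        # Sketch: flow-weighted mixing estimate of in-stream bacterial
        # concentration below a WWTF outfall. All values are hypothetical.
        def in_stream_concentration(c_effluent, q_effluent, c_upstream, q_upstream):
            """Concentration after complete mixing (use consistent units per pair)."""
            return (c_effluent * q_effluent + c_upstream * q_upstream) / (q_effluent + q_upstream)

        # Example: post-disinfection effluent discharging into a larger receiving flow.
        mixed = in_stream_concentration(
            c_effluent=50.0,   # CFU/100 mL in effluent
            q_effluent=2.0,    # effluent flow, MGD
            c_upstream=10.0,   # CFU/100 mL upstream of the outfall
            q_upstream=40.0,   # receiving-water flow, MGD
        )
        print(f"estimated in-stream concentration: {mixed:.1f} CFU/100 mL")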

    Money Growth, Inflation, and Causality (Empirical Evidence for Pakistan, 1973-1985)

    This paper uses the Granger direct test to evaluate the causal relationship between growth in money supply and inflation in Pakistan. The historical period investigated extends from 1973 to 1985. The results of the test show that money growth had a significant impact on inflation during the period considered. In addition, there is some evidence at hand showing that inflation, too, affected money growth over the 1973-1985 period. The empirical issue of the impact of money supply on the rate of inflation continues to be a much debated topic. For example, Turnovsky and Wohar (1984) do not find any identifiable relationship between money supply and prices over the 1929-1978 period in the U.S., while Benderly and Zwick (1985) find money supply affecting prices in the U.S. over the 1955-1982 period. Jones and Uri (1986) also find evidence of money supply influencing the price level in the U.S. during the 1953-1984 period. Studies for other countries, e.g. Driscoll, Ford, and Mullineux (1985) for the U.K., similarly report conflicting results about the relationship between money supply and prices. While there are numerous empirical studies that have examined the causal relationship between money supply and prices in developed countries, there are also several recent studies that have addressed this particular issue for developing countries. In one such study, Aghevli and Khan (1978) use the Haugh-Pierce test to investigate the causal relationship between money growth and inflation in Brazil, Colombia, the Dominican Republic, and Thailand. The results of the tests show a feedback or bidirectional causality between money and inflation in all four developing countries over the 1964-1974 period.
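
    For readers unfamiliar with this style of causality testing, a minimal sketch of a Granger causality test on two series follows; it uses the grangercausalitytests routine from statsmodels rather than the Granger direct test applied in the paper, and the data are synthetic placeholders, not the Pakistani series:

        # Sketch: does money growth Granger-cause inflation?
        # Synthetic data; the paper's 1973-1985 Pakistani series are not reproduced here.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(0)
        money_growth = rng.normal(10.0, 3.0, 40)
        inflation = 0.5 * np.roll(money_growth, 1) + rng.normal(0.0, 1.0, 40)
        data = pd.DataFrame({"inflation": inflation, "money_growth": money_growth})

        # Null hypothesis: money growth does NOT Granger-cause inflation.
        # The second column is tested as a potential cause of the first.
        results = grangercausalitytests(data[["inflation", "money_growth"]], maxlag=2)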

    Does money matter in inflation forecasting?

    This paper provides the most comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two non-linear techniques, namely, recurrent neural networks and kernel recursive least squares regression - techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were non-linear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Keywords: Forecasting; Inflation (Finance); Monetary theory.
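
    As an illustration of the kernel-based approach, here is a minimal sketch of a non-linear autoregressive forecast of inflation compared against a naive random walk; it uses batch kernel ridge regression as a stand-in for the paper's kernel recursive least squares, and the series, lags, and hyperparameters are hypothetical:

        # Sketch: kernel autoregressive inflation forecast vs. a naive random walk.
        import numpy as np
        from sklearn.kernel_ridge import KernelRidge

        rng = np.random.default_rng(1)
        inflation = 2.0 + np.cumsum(rng.normal(0.0, 0.2, 200))  # synthetic series

        p = 4  # autoregressive lags
        X = np.column_stack([inflation[i:len(inflation) - p + i] for i in range(p)])
        y = inflation[p:]
        train, test = slice(0, 150), slice(150, None)

        model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=0.5)
        model.fit(X[train], y[train])

        kernel_mse = np.mean((model.predict(X[test]) - y[test]) ** 2)
        rw_mse = np.mean((X[test][:, -1] - y[test]) ** 2)  # random walk: forecast = last observation
        print(f"kernel AR MSE: {kernel_mse:.4f}   random walk MSE: {rw_mse:.4f}")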

    Error tolerance in an NMR Implementation of Grover's Fixed-Point Quantum Search Algorithm

    We describe an implementation of Grover's fixed-point quantum search algorithm on a nuclear magnetic resonance (NMR) quantum computer, searching for either one or two matching items in an unsorted database of four items. In this new algorithm the target state (an equally weighted superposition of the matching states) is a fixed point of the recursive search operator, and so the algorithm always moves towards the desired state. The effects of systematic errors in the implementation are briefly explored. Comment: 5 pages, RevTeX4, including three figures. Changes made at the request of referees; now in press at Phys Rev.
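
    The defining property of the fixed-point search is that each level of the recursion maps the failure probability eps to eps cubed, so the state moves monotonically towards the target. A minimal numerical sketch of that recursion (illustrative only, not the NMR experiment) is:

        # Sketch: failure probability under the fixed-point (pi/3) recursion,
        # which maps eps -> eps**3 at each level.
        def failure_probability(eps0: float, levels: int) -> float:
            eps = eps0
            for _ in range(levels):
                eps = eps ** 3
            return eps

        # Example: one matching item in a four-item database (initial success 1/4).
        for level in range(4):
            print(f"recursion depth {level}: failure probability = "
                  f"{failure_probability(0.75, level):.6f}")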

    Control theory for principled heap sizing

    We propose a new, principled approach to adaptive heap sizing based on control theory. We review current state-of-the-art heap sizing mechanisms, as deployed in Jikes RVM and HotSpot. We then formulate heap sizing as a control problem, apply and tune a standard controller algorithm, and evaluate its performance on a set of well-known benchmarks. We find that our controller adapts the heap size more responsively than existing mechanisms. This responsiveness allows tighter virtual machine memory footprints while preserving target application throughput, which is ideal for both embedded and utility computing domains. In short, we argue that formal, systematic approaches to memory management should be replacing ad hoc heuristics as the discipline matures. Control-theoretic heap sizing is one such systematic approach.
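
    As a concrete illustration of the control formulation, here is a minimal sketch of a proportional-integral controller that resizes the heap to track a target garbage-collection overhead; the gains, target, and bounds are hypothetical and are not the controller or tuning used in the paper:

        # Sketch: PI controller driving observed GC overhead towards a target
        # by resizing the heap. All constants are illustrative.
        class HeapSizeController:
            def __init__(self, target_gc_overhead=0.05, kp=200.0, ki=50.0,
                         min_heap_mb=32, max_heap_mb=4096):
                self.target = target_gc_overhead        # desired fraction of time in GC
                self.kp, self.ki = kp, ki               # proportional and integral gains
                self.min_heap, self.max_heap = min_heap_mb, max_heap_mb
                self.integral = 0.0

            def next_heap_size(self, current_heap_mb, observed_gc_overhead):
                error = observed_gc_overhead - self.target  # too much GC -> grow the heap
                self.integral += error
                adjustment = self.kp * error + self.ki * self.integral
                return max(self.min_heap, min(self.max_heap, current_heap_mb + adjustment))

        # Example: overhead above target pushes the heap larger, then settles.
        controller = HeapSizeController()
        heap = 256.0
        for overhead in (0.12, 0.09, 0.06, 0.05):
            heap = controller.next_heap_size(heap, overhead)
            print(f"observed overhead {overhead:.2f} -> heap {heap:.0f} MB")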

    Experimental Heat-Bath Cooling of Spins

    Algorithmic cooling (AC) is a method to purify quantum systems, such as ensembles of nuclear spins, or cold atoms in an optical lattice. When applied to spins, AC produces ensembles of highly polarized spins, which enhance the signal strength in nuclear magnetic resonance (NMR). According to this cooling approach, spin-half nuclei in a constant magnetic field are considered as bits, or more precisely, quantum bits, in a known probability distribution. Algorithmic steps on these bits are then translated into specially designed NMR pulse sequences using common NMR quantum computation tools. The algorithmic cooling of spins is achieved by alternately combining reversible, entropy-preserving manipulations (borrowed from data compression algorithms) with selective reset, the transfer of entropy from selected spins to the environment. In theory, applying algorithmic cooling to sufficiently large spin systems may produce polarizations far beyond the limits due to conservation of Shannon entropy. Here, only selective reset steps are performed, hence we prefer to call this process "heat-bath" cooling, rather than algorithmic cooling. We experimentally implement here two consecutive steps of selective reset that transfer entropy from two selected spins to the environment. We performed such cooling experiments with commercially-available labeled molecules, on standard liquid-state NMR spectrometers. Our experiments yielded polarizations that bypass Shannon's entropy-conservation bound, so that the entire spin-system was cooled. This paper was initially submitted in 2005, first to Science and then to PNAS, and includes additional results from subsequent years (e.g. for resubmission in 2007). The Postscriptum includes more details. Comment: 20 pages, 8 figures, replaces quant-ph/051115
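
    The entropy bookkeeping behind this claim can be sketched with a short calculation: reversible manipulations conserve the total Shannon entropy of the spins, whereas a selective reset exports entropy to the environment and so can lower it. The polarizations below are illustrative placeholders, not the experimental values:

        # Sketch: Shannon entropy of spin-1/2 nuclei before and after resetting
        # two spins to a colder (more polarized) heat bath. Numbers are illustrative.
        import math

        def spin_entropy(eps: float) -> float:
            """Binary entropy of a spin-1/2 with polarization bias eps."""
            p_up, p_down = (1 + eps) / 2, (1 - eps) / 2
            return -sum(p * math.log2(p) for p in (p_up, p_down) if p > 0)

        initial = [0.01, 0.01, 0.01]       # three weakly polarized spins
        after_reset = [0.01, 0.04, 0.04]   # two spins reset to the bath polarization

        print("total entropy before:", sum(spin_entropy(e) for e in initial))
        print("total entropy after: ", sum(spin_entropy(e) for e in after_reset))
        # The decrease is possible only because entropy was exported to the
        # environment; unitary (entropy-preserving) steps alone cannot do this.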

    Innate Immune Recognition of Francisella Tularensis: Activation of Type-I Interferons and the Inflammasome

    Francisella tularensis is an intracellular pathogen that can cause severe disease in a wide range of mammalian hosts. Primarily residing in host macrophages, F. tularensis escapes phagosomal degradation, and replicates in the macrophage cytosol. The macrophage uses a series of pattern recognition receptors to detect conserved microbial molecules from invading pathogens, and initiates an appropriate host response. In the cytosol, F. tularensis is recognized by the inflammasome, a multiprotein complex responsible for the activation of the cysteine protease caspase-1. Caspase-1 activation leads to processing and release of proinflammatory cytokines and host cell death. Here we review recent work on the molecular mechanisms of inflammasome activation by F. tularensis, and its consequences both in vitro and in vivo. Finally, we discuss the coordination between the inflammasome and other cytosolic host responses, and the evidence for F. tularensis virulence factors that suppress inflammasome activation.