
    Mortality in Dutch hospitals: Trends in time, place and cause of death after admission for myocardial infarction and stroke. An observational study

    <p>Abstract</p> <p>Background</p> <p>Patterns in time, place and cause of death can have an important impact on calculated hospital mortality rates. The objective is to quantify these patterns following myocardial infarction and stroke admissions in Dutch hospitals during the period 1996–2003, and to compare trends in the commonly used 30-day in-hospital mortality rates with other types of mortality rates which use more extensive follow-up in time and place of death.</p> <p>Methods</p> <p>Discharge data for all Dutch admissions for the index conditions (1996–2003) were linked to the death certification registry. Then, mortality rates within the first 30, 90 and 365 days following admission were analyzed for deaths occurring within and outside hospitals.</p> <p>Results</p> <p>Most deaths within a year after admission occurred within 30 days (60–70%). No significant trends in this distribution of deaths over time were observed. Significant trends in the distribution over place of death were observed for both conditions. For myocardial infarction, the proportion of deaths after transfer to another hospital doubled over 1996–2003. For stroke, a significant rise in the proportion of deaths outside hospital was found. For MI, the proportion of deaths attributed to a circulatory disease has fallen significantly over time. Seven types of hospital mortality indicators, differing in scope and observation period, all show a drop in hospital mortality for both MI and stroke over the period 1996–2003.
For stroke, the observed absolute reduction in death rate increases over the first year after admission; for MI, the observed drop in 365-day overall mortality almost equals the observed drop in 30-day in-hospital mortality over 1996–2003.</p> <p>Conclusion</p> <p>Changes in the timing, place and causes of death following admissions for myocardial infarction and stroke have important implications for the definitions of in-hospital and post-admission mortality rates as measures of hospital performance. Although necessary for understanding mortality patterns over time, including in mortality rates those deaths which occur outside hospitals or after longer periods following the index admission remains debatable: such extended rates may not reflect actual hospital performance but probably mirror transfer, efficiency, and other health care policies.</p>
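The indicators compared above amount to simple window-based counts over linked admission and death records. A minimal sketch, with hypothetical record fields and toy data (the actual discharge-registry linkage is far richer):

```python
# Hedged sketch: fixed-window mortality rates from linked admission/death
# records, in the spirit of the indicators described above.
# The record layout and example data are hypothetical.
from datetime import date

def mortality_rate(records, window_days, in_hospital_only=False):
    """Share of admissions followed by death within `window_days`."""
    deaths = 0
    for adm_date, death_date, died_in_hospital in records:
        if death_date is None:
            continue  # patient survived the follow-up period
        within = (death_date - adm_date).days <= window_days
        if within and (died_in_hospital or not in_hospital_only):
            deaths += 1
    return deaths / len(records)

records = [
    (date(2003, 1, 1), date(2003, 1, 10), True),   # died in hospital, day 9
    (date(2003, 1, 1), date(2003, 3, 15), False),  # died at home, day 73
    (date(2003, 1, 1), None, False),               # survived the year
    (date(2003, 1, 1), date(2003, 2, 5), True),    # died in hospital, day 35
]

print(mortality_rate(records, 30, in_hospital_only=True))  # 0.25
print(mortality_rate(records, 90))                         # 0.75
```

Varying `window_days` and the in-hospital restriction reproduces the kind of divergence between 30-day in-hospital rates and broader 365-day overall rates that the study quantifies.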

    Sample size requirements to detect the effect of a group of genetic variants in case-control studies

    <p>Abstract</p> <p>Background</p> <p>Because common diseases are caused by complex interactions among many genetic variants along with environmental risk factors, very large sample sizes are usually needed to detect such effects in case-control studies. Nevertheless, many genetic variants act in well-defined biologic systems or metabolic pathways. Therefore, a reasonable first step may be to detect the effect of a group of genetic variants before assessing specific variants.</p> <p>Methods</p> <p>We present a simple method for determining approximate sample sizes required to detect the average joint effect of a group of genetic variants in a case-control study for multiplicative models.</p> <p>Results</p> <p>For a range of reasonable numbers of genetic variants, the sample size requirements for the test statistic proposed here are generally not larger than those needed for assessing marginal effects of individual variants, and actually decline with an increasing number of genetic variants in the group in many of the situations considered.</p> <p>Conclusion</p> <p>When a significant effect of the group of genetic variants is detected, subsequent multiple tests could be conducted to detect which individual genetic variants and their combinations are associated with disease risk. When testing for an effect size in a group of genetic variants, one can use our global test described in this paper, because the sample size required to detect an effect size in the group is comparatively small. Our method could be viewed as a screening tool for assessing groups of genetic variants involved in pathogenesis and etiology of common complex human diseases.</p>
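For intuition on the magnitudes involved, a standard two-group sample-size calculation for comparing carrier frequencies between cases and controls can be sketched as below. This is the classical normal-approximation formula for two proportions, not the paper's group-test statistic; the frequencies and alpha/power settings are illustrative:

```python
# Hedged sketch: classical sample size per group for detecting a
# difference in carrier frequency between cases and controls.
# Not the paper's method; values are illustrative only.
from math import ceil
from statistics import NormalDist

def n_per_group(p_case, p_control, alpha=0.05, power=0.8):
    """Normal-approximation sample size per group for two proportions."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided critical value
    z_b = NormalDist().inv_cdf(power)           # power quantile
    var = p_case * (1 - p_case) + p_control * (1 - p_control)
    return ceil((z_a + z_b) ** 2 * var / (p_case - p_control) ** 2)

print(n_per_group(0.3, 0.2))  # 291 per group for a 0.3 vs 0.2 difference
```

A larger effect (e.g. 0.4 vs 0.2) requires far fewer subjects, which is the general intuition behind testing an aggregated group effect before individual variants.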

    Liquid-gas phase transition in nuclear multifragmentation

    The equation of state of nuclear matter suggests that at suitable beam energies the disassembling hot system formed in heavy ion collisions will pass through a liquid-gas coexistence region. Searching for signatures of the phase transition has been a very important focal point of experimental endeavours in heavy ion collisions in the last fifteen years. Simultaneously, theoretical models have been developed to provide information about the equation of state and reaction mechanisms consistent with the experimental observables. This article is a review of this endeavour. Comment: 63 pages, 27 figures, submitted to Adv. Nucl. Phys. Some typos corrected, minor text change.

    Realizing the promise of population biobanks: a new model for translation

    The promise of science lies in expectations of its benefits to societies and is matched by expectations of a return on the significant public investment in that science. In this paper, we undertake a methodological analysis of the science of biobanking and a sociological analysis of translational research in relation to biobanking. As part of global and local endeavours to translate raw biomedical evidence into practice, biobanks aim to provide a platform for generating new scientific knowledge to inform development of new policies, systems and interventions to enhance the public’s health. Effectively translating scientific knowledge into routine practice, however, involves more than good science. Although biobanks undoubtedly provide a fundamental resource for both clinical and public health practice, their potentiating ontology—that their outputs are perpetually a promise of scientific knowledge generation—renders translation rather less straightforward than drug discovery and treatment implementation. Biobanking science, therefore, provides a perfect counterpoint against which to test the bounds of translational research. We argue that translational research is a contextual and cumulative process: one that is necessarily dynamic and interactive and involves multiple actors. We propose a new multidimensional model of translational research which enables us to imagine a new paradigm: one that takes us from bench to bedside to backyard and beyond; that is, one attentive to the social and political context of translational science, and cognisant of all the players in that process, be they researchers, health professionals, policy makers, industry representatives, members of the public or research participants, amongst others.

    Differential analysis for high density tiling microarray data

    <p>Abstract</p> <p>Background</p> <p>High density oligonucleotide tiling arrays are an effective and powerful platform for conducting unbiased genome-wide studies. The <it>ab initio </it>probe selection method employed in tiling arrays is unbiased, and thus ensures consistent sampling across coding and non-coding regions of the genome. These arrays are increasingly being used to study the associated processes of transcription, transcription factor binding, chromatin structure and their association. Studies of differential expression and/or regulation provide critical insight into the mechanics of transcription and regulation that occur during the developmental program of a cell. The time-course experiment, which comprises an <it>in-vivo </it>system and the proposed analyses, is used to determine if annotated and un-annotated portions of the genome manifest a coordinated differential response to the induced developmental program.</p> <p>Results</p> <p>We have proposed a novel approach, based on a piece-wise function, to analyze genome-wide differential response. This enables segmentation of the response based on protein-coding and non-coding regions; for genes the methodology also partitions differential response with a 5' versus 3' versus intra-genic bias.</p> <p>Conclusion</p> <p>The algorithm, built upon the framework of Significance Analysis of Microarrays, uses a generalized logic to define regions/patterns of coordinated differential change. By not adhering to the gene-centric paradigm, discordant differential expression patterns between exons and introns have been identified at an FDR of less than 12 percent. A co-localization of differential binding between RNA Polymerase II and tetra-acetylated histone has been quantified at a p-value < 0.003; it is most significant at the 5' end of genes, at a p-value < 10<sup>-13</sup>. 
The prototype R code has been made available as supplementary material [see Additional file <supplr sid="S1">1</supplr>].</p> <suppl id="S1"> <title> <p>Additional file 1</p> </title> <text> <p>gsam_prototypercode.zip. File archive comprising the prototype R code for the gSAM implementation, including a readme and examples.</p> </text> <file name="1471-2105-8-359-S1.zip"> <p>Click here for file</p> </file> </suppl>
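The SAM framework that gSAM builds on rests on a moderated difference statistic: the per-probe group difference is damped by a small constant s0 so that probes with tiny variance do not produce inflated scores. A minimal sketch with hypothetical intensity values (the published prototype is in R; this Python version only illustrates the statistic, not the gSAM segmentation logic):

```python
# Hedged sketch of a SAM-style moderated statistic for one probe:
# difference of group means divided by (pooled standard error + s0).
# The intensity values and s0 below are hypothetical.
from math import sqrt
from statistics import mean, stdev

def sam_d(group_a, group_b, s0=0.1):
    """Moderated difference statistic for one probe."""
    n_a, n_b = len(group_a), len(group_b)
    pooled = sqrt(((n_a - 1) * stdev(group_a) ** 2 +
                   (n_b - 1) * stdev(group_b) ** 2) / (n_a + n_b - 2))
    s = pooled * sqrt(1 / n_a + 1 / n_b)  # standard error of the difference
    return (mean(group_b) - mean(group_a)) / (s + s0)

d = sam_d([1.0, 1.1, 0.9], [2.0, 2.1, 1.9])
print(round(d, 2))  # 5.51
```

Without the s0 term this is an ordinary two-sample t-statistic; the damping is what keeps near-zero-variance probes from dominating genome-wide rankings.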

    Bayesian versus frequentist statistical inference for investigating a one-off cancer cluster reported to a health department

    Background. The problem of silent multiple comparisons is one of the most difficult statistical problems faced by scientists. It is a particular problem for investigating a one-off cancer cluster reported to a health department because any one of hundreds, or possibly thousands, of neighbourhoods, schools, or workplaces could have reported a cluster, which could have been for any one of several types of cancer or any one of several time periods. Methods. This paper contrasts the frequentist approach with a Bayesian approach for dealing with silent multiple comparisons in the context of a one-off cluster reported to a health department. Two published cluster investigations were re-analysed using the Dunn-Sidak method to adjust frequentist p-values and confidence intervals for silent multiple comparisons. Bayesian methods were based on the Gamma distribution. Results. Bayesian analysis with non-informative priors produced results similar to the frequentist analysis, and suggested that both clusters represented a statistical excess. In the frequentist framework, the statistical significance of both clusters was extremely sensitive to the number of silent multiple comparisons, which can only ever be a subjective "guesstimate". The Bayesian approach is also subjective: whether there is an apparent statistical excess depends on the specified prior. Conclusion. In cluster investigations, the frequentist approach is just as subjective as the Bayesian approach, but the Bayesian approach is less ambitious in that it treats the analysis as a synthesis of data and personal judgements (possibly poor ones), rather than as objective reality. Bayesian analysis is (arguably) a useful tool to support complicated decision-making, because it makes the uncertainty associated with silent multiple comparisons explicit.
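The two analyses contrasted above can be sketched on a toy cluster: the Dunn-Sidak correction shrinks the per-comparison alpha as the assumed number of silent comparisons k grows, while the Bayesian route puts a Gamma prior on the cluster's rate and reports the posterior probability of an excess. The observed/expected counts, k, and the prior parameters below are all hypothetical:

```python
# Hedged sketch of both approaches on a toy cluster (observed = 10 cases
# where 3 were expected). All numbers, including k and the prior, are
# hypothetical illustrations, not the paper's re-analysed data.
import random
from math import exp, factorial

def sidak_alpha(alpha, k):
    """Per-comparison alpha after Dunn-Sidak adjustment for k comparisons."""
    return 1 - (1 - alpha) ** (1 / k)

def poisson_sf(observed, expected):
    """Frequentist one-sided p-value: P(X >= observed) for Poisson(expected)."""
    return 1 - sum(expected ** i * exp(-expected) / factorial(i)
                   for i in range(observed))

def posterior_excess_prob(observed, expected, a0=0.5, b0=0.001, n=100_000):
    """Monte Carlo P(true rate > expected) under a Gamma(a0, b0) prior.

    Conjugacy: Poisson likelihood + Gamma prior -> Gamma posterior.
    """
    random.seed(0)
    a, b = a0 + observed, b0 + 1.0  # posterior shape and rate
    draws = (random.gammavariate(a, 1 / b) for _ in range(n))
    return sum(d > expected for d in draws) / n

print(sidak_alpha(0.05, 1000))      # per-test alpha shrinks to ~5e-5
print(poisson_sf(10, 3))            # unadjusted p-value, ~0.001
print(posterior_excess_prob(10, 3)) # posterior probability of an excess
```

The sketch makes the paper's point concrete: with k = 1000 silent comparisons the unadjusted p-value of about 0.001 is no longer below the Sidak-adjusted threshold, while the Bayesian answer shifts instead with the choice of prior.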

    Heterochronic faecal transplantation boosts gut germinal centres in aged mice

    Ageing is a complex multifactorial process associated with a plethora of disorders, which contribute significantly to morbidity worldwide. One of the organs significantly affected by age is the gut. Age-dependent changes of the gut-associated microbiome have been linked to increased frailty and systemic inflammation. This change in microbial composition with age occurs in parallel with a decline in function of the gut immune system; however, it is not clear whether there is a causal link between the two. Here we report that the defective germinal centre reaction in Peyer’s patches of aged mice can be rescued by faecal transfers from younger adults into aged mice and by immunisations with cholera toxin, without affecting germinal centre reactions in peripheral lymph nodes. This demonstrates that the poor germinal centre reaction in aged animals is not irreversible, and that it is possible to improve this response in older individuals by providing appropriate stimuli.

    Rapidity and Centrality Dependence of Proton and Anti-proton Production from Au+Au Collisions at sqrt(sNN) = 130 GeV

    We report on the rapidity and centrality dependence of proton and anti-proton transverse mass distributions from Au+Au collisions at sqrt(sNN) = 130 GeV as measured by the STAR experiment at RHIC. Our results are from the rapidity and transverse momentum range of |y| < 0.5 and 0.35 < p_t < 1.00 GeV/c. For both protons and anti-protons, transverse mass distributions become more convex from peripheral to central collisions, demonstrating characteristics of collective expansion. The measured rapidity distributions and the mean transverse momenta versus rapidity are flat within |y| < 0.5. Comparisons of our data with results from model calculations indicate that, in order to obtain a consistent picture of the proton (anti-proton) yields and transverse mass distributions, the possibility of pre-hadronic collective expansion may have to be taken into account. Comment: 4 pages, 3 figures, 1 table, submitted to PR

    Azimuthal anisotropy and correlations at large transverse momenta in p+p and Au+Au collisions at \sqrt{s_{NN}} = 200 GeV

    Results on high transverse momentum charged particle emission with respect to the reaction plane are presented for Au+Au collisions at \sqrt{s_{NN}} = 200 GeV. Two- and four-particle correlation results are presented, as well as a comparison of azimuthal correlations in Au+Au collisions to those in p+p at the same energy. Elliptic anisotropy, v_2, is found to reach its maximum at p_t \sim 3 GeV/c, then decrease slowly and remain significant up to p_t \approx 7--10 GeV/c. Stronger suppression is found in the back-to-back high-p_t particle correlations for particles emitted out-of-plane compared to those emitted in-plane. The centrality dependence of v_2 at intermediate p_t is compared to simple models based on jet quenching. Comment: 4 figures. Published version as PRL 93, 252301 (2004).
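The elliptic anisotropy quoted above is, in its simplest form, the second Fourier coefficient of the azimuthal distribution relative to the reaction plane, v_2 = <cos 2(phi - Psi)>. A toy sketch with synthetic angles and a known plane (a real analysis must estimate the event plane from the data and correct for its resolution, which this ignores):

```python
# Hedged sketch: elliptic flow v2 = <cos 2(phi - Psi)> from a toy event
# with a KNOWN reaction plane. Angles are synthetic, drawn by rejection
# sampling from dN/dphi proportional to 1 + 2*v2*cos(2*(phi - psi)).
import random
from math import cos, pi

def v2_known_plane(phis, psi):
    """Elliptic flow coefficient with respect to a known reaction plane."""
    return sum(cos(2 * (phi - psi)) for phi in phis) / len(phis)

random.seed(1)
true_v2, psi = 0.1, 0.0
phis = []
while len(phis) < 50_000:
    phi = random.uniform(0, 2 * pi)
    # accept phi with probability proportional to the target density
    if random.uniform(0, 1 + 2 * true_v2) < 1 + 2 * true_v2 * cos(2 * (phi - psi)):
        phis.append(phi)

print(round(v2_known_plane(phis, psi), 2))  # recovers the input v2 of ~0.1
```

With 50,000 sampled particles the estimator recovers the generated v_2 to a few parts in a thousand; the hard part in practice is the event-plane estimation and resolution correction, not this average.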

    Automated Ensemble Modeling with modelMaGe: Analyzing Feedback Mechanisms in the Sho1 Branch of the HOG Pathway

    In systems biology, uncertainty about biological processes translates into alternative mathematical model candidates. Here, the goal is to generate, fit and discriminate several candidate models that represent different hypotheses for the feedback mechanisms responsible for downregulating the response of the Sho1 branch of the yeast high osmolarity glycerol (HOG) signaling pathway after initial stimulation. Implementing and testing these candidate models by hand is a tedious and error-prone task. Therefore, we automatically generated a set of candidate models of the Sho1 branch with the tool modelMaGe. These candidate models are automatically documented, can readily be simulated and can be fitted automatically to data. A ranking of the models with respect to parsimonious data representation is provided, enabling discrimination between candidate models and the biological hypotheses underlying them. We conclude that a previously published model fitted spurious effects in the data. Moreover, the discrimination analysis suggests that the reported data do not support the conclusion that a desensitization mechanism leads to the rapid attenuation of Hog1 signaling in the Sho1 branch of the HOG pathway. The data rather support a model where an integrator feedback shuts down the pathway. This conclusion is also supported by dedicated experiments that can be predicted exclusively by those models that include an integrator feedback.
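Model discrimination of the kind described typically ranks candidates by a parsimony criterion such as AIC, trading goodness of fit against parameter count. A minimal sketch; the model names, residual sums of squares, and parameter counts below are hypothetical, not the paper's fitted values:

```python
# Hedged sketch: ranking candidate models by AIC for least-squares fits.
# The fit results (RSS, data points, parameter counts) are hypothetical.
from math import log

def aic_rss(rss, n_points, n_params):
    """AIC for a least-squares fit (up to an additive constant)."""
    return n_points * log(rss / n_points) + 2 * n_params

# model name -> (residual sum of squares, data points, free parameters)
fits = {
    "desensitization": (4.1, 40, 8),
    "integrator_feedback": (3.2, 40, 7),
    "no_feedback": (9.8, 40, 5),
}

ranking = sorted(fits, key=lambda m: aic_rss(*fits[m]))
print(ranking[0])  # integrator_feedback wins: lower RSS with fewer parameters
```

The penalty term 2*n_params is what lets a slightly worse-fitting but simpler model outrank an over-parameterized one that is fitting noise, which mirrors the conclusion about the previously published model fitting spurious effects.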