
    Calculating confidence intervals for impact numbers

    BACKGROUND: Standard effect measures such as risk difference and attributable risk are frequently used in epidemiological studies and public health research to describe the effect of exposures. Recently, so-called impact numbers have been proposed, which express the population impact of exposures in the form of specific person or case numbers. To describe estimation uncertainty, it is necessary to calculate confidence intervals for these new effect measures. In this paper, we present methods to calculate confidence intervals for the new impact numbers for cohort studies. METHODS: Besides the exposure impact number (EIN), which is equivalent to the well-known number needed to treat (NNT), two other impact numbers are considered: the case impact number (CIN) and the exposed cases impact number (ECIN), which describe the number of cases (CIN) and the number of exposed cases (ECIN) with an outcome among whom one case is attributable to the exposure. The CIN and ECIN are the reciprocals of the population attributable risk (PAR) and the attributable fraction among the exposed (AF(e)), respectively. Thus, confidence intervals for these impact numbers can be calculated by inverting and exchanging the confidence limits of the PAR and AF(e). EXAMPLES: We considered a British and a Japanese cohort study that investigated the association between smoking and death from coronary heart disease (CHD) and between smoking and stroke, respectively. We used the reported death and disease rates and calculated impact numbers with corresponding 95% confidence intervals. In the British study, the CIN was 6.46, i.e., on average, of any 6 to 7 persons who died of CHD, one case was attributable to smoking, with a corresponding 95% confidence interval of [3.84, 20.36]. For the exposed cases, an ECIN of 2.64 with 95% confidence interval [1.76, 5.29] was obtained. In the Japanese study, the CIN was 6.67, i.e., on average, of the 6 to 7 persons who had a stroke, one case was attributable to smoking, with a corresponding 95% confidence interval of [3.80, 27.27]. For the exposed cases, an ECIN of 4.89 with 95% confidence interval of [2.86, 16.67] was obtained. CONCLUSION: The consideration of impact numbers in epidemiological analyses provides additional information and aids the interpretation of study results, e.g. in public health research. In practical applications, it is necessary to describe estimation uncertainty. We have shown that confidence intervals for the new impact numbers can be calculated by means of known methods for attributable risk measures. Therefore, estimated impact numbers should always be complemented by appropriate confidence intervals.
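    The inversion step described above translates directly into code. The following Python snippet is a minimal sketch (not the authors' code): given a point estimate and confidence limits for an attributable-risk measure such as the PAR or AF(e), it returns the corresponding impact number and confidence interval by taking reciprocals and exchanging the limits; the PAR figures in the example are back-calculated from the British results quoted above and serve only as an illustration.

```python
def impact_number_ci(ar_estimate, ar_lower, ar_upper):
    """Convert an attributable-risk estimate and its confidence limits into an
    impact number (e.g. CIN = 1/PAR, ECIN = 1/AF_e) and its confidence interval."""
    point = 1.0 / ar_estimate
    # Taking reciprocals reverses the ordering, so the limits are exchanged.
    lower = 1.0 / ar_upper
    upper = 1.0 / ar_lower
    return point, (lower, upper)

# Illustrative values back-calculated from the British cohort results above
# (PAR ~ 0.155 with 95% CI [0.049, 0.260]):
cin, (cin_lo, cin_hi) = impact_number_ci(0.155, 0.049, 0.260)
print(f"CIN = {cin:.2f}, 95% CI = [{cin_lo:.2f}, {cin_hi:.2f}]")
```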

    Conserving the Stage: Climate Change and the Geophysical Underpinnings of Species Diversity

    Conservationists have proposed methods for adapting to climate change that assume species distributions are primarily explained by climate variables. The key idea is to use the understanding of species–climate relationships to map corridors and to identify regions of faunal stability or high species turnover. An alternative approach is to adopt an evolutionary timescale and ask ultimately what factors control total diversity, so that over the long run the major drivers of total species richness can be protected. Within a single climatic region, the temperate area encompassing all of the Northeastern U.S. and Maritime Canada, we hypothesized that geologic factors may take precedence over climate in explaining diversity patterns. If geophysical diversity does drive regional diversity, then conserving geophysical settings may offer an approach to conservation that protects diversity under both current and future climates. Here we tested how well geology predicts the species diversity of 14 US states and three Canadian provinces, using a comprehensive new spatial dataset. Results of linear regressions of species diversity on all possible combinations of 23 geophysical and climatic variables indicated that four geophysical factors (the number of geological classes, latitude, elevation range, and the amount of calcareous bedrock) predicted species diversity with certainty (adj. R2 = 0.94). To confirm the species–geology relationships, we ran an independent test using 18,700 location points for 885 rare species and found that 40% of the species were restricted to a single geology. Moreover, each geology class supported 5–95 endemic species, and chi-square tests confirmed that calcareous bedrock and extreme elevations had significantly more rare species than expected by chance (P<0.0001), strongly corroborating the regression model. Our results suggest that protecting geophysical settings will conserve the stage for current and future biodiversity and may be a robust alternative to species-level predictions.
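    As a rough illustration of the model-selection step described above (regressing species diversity on all possible combinations of candidate predictors and ranking the models by adjusted R2), the sketch below uses statsmodels; the data frame and column names are hypothetical and this is not the authors' code. With 23 candidate variables an exhaustive search over all subsets is still computationally feasible, but the sketch caps the subset size for brevity.

```python
from itertools import combinations
import statsmodels.api as sm

def best_subset(df, response, candidates, max_size=4):
    """Fit an OLS model on every predictor subset up to max_size and return the
    subset with the highest adjusted R-squared."""
    y = df[response]
    best_vars, best_adj_r2 = None, -float("inf")
    for k in range(1, max_size + 1):
        for subset in combinations(candidates, k):
            X = sm.add_constant(df[list(subset)])
            fit = sm.OLS(y, X).fit()
            if fit.rsquared_adj > best_adj_r2:
                best_vars, best_adj_r2 = subset, fit.rsquared_adj
    return best_vars, best_adj_r2

# Hypothetical usage (column names are illustrative, not from the actual dataset):
# predictors = ["n_geology_classes", "latitude", "elevation_range", "calcareous_bedrock"]
# subset, adj_r2 = best_subset(states_df, "species_diversity", predictors)
```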

    When One Size Does Not Fit All: A Simple Statistical Method to Deal with Across-Individual Variations of Effects

    In science, it is a common experience to discover that although the investigated effect is very clear in some individuals, statistical tests are not significant because the effect is null or even opposite in other individuals. Indeed, t-tests, ANOVAs, and linear regressions compare the average effect with respect to its inter-individual variability, so that they can fail to detect a factor that has a high effect in many individuals (with respect to the intra-individual variability). In such paradoxical situations, statistical tools are at odds with the researcher’s aim to uncover any factor that affects individual behavior, and not only those with stereotypical effects. In order to go beyond the reductive and sometimes illusory description of the average behavior, we propose a simple statistical method: applying a Kolmogorov-Smirnov test to assess whether the distribution of p-values provided by individual tests is significantly biased towards zero. Using Monte-Carlo studies, we assess the power of this two-step procedure with respect to RM ANOVA and multilevel mixed-effect analyses, and probe its robustness when individual data violate the assumptions of normality and homoscedasticity. We find that the method is powerful and robust even with small sample sizes for which multilevel methods reach their limits. In contrast to existing methods for combining p-values, the Kolmogorov-Smirnov test has a unique resistance to outlier individuals: it cannot yield significance based on a high effect in one or two exceptional individuals, which allows valid population inferences to be drawn. The simplicity and ease of use of our method facilitate the identification of factors that would otherwise be overlooked because they affect individual behavior in significant but variable ways, and its power and reliability with small sample sizes (<30–50 individuals) make it a tool of choice in exploratory studies.
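    A minimal Python sketch of the proposed two-step procedure is given below; the per-individual test (a paired t-test here) and the simulated data are assumptions made for illustration, but the second step follows the description above: test whether the individual p-values depart from the Uniform(0, 1) distribution expected under the null hypothesis, in the direction of an excess of small p-values.

```python
import numpy as np
from scipy import stats

def population_effect_test(data_per_individual):
    """data_per_individual: list of (condition_a, condition_b) paired samples, one entry per subject."""
    # Step 1: one test per individual (a paired t-test is used here for illustration).
    pvals = [stats.ttest_rel(a, b).pvalue for a, b in data_per_individual]
    # Step 2: under the global null, p-values are Uniform(0, 1); an excess of small
    # p-values pushes their empirical CDF above the uniform CDF, hence alternative='greater'.
    return stats.kstest(pvals, "uniform", alternative="greater")

# Simulated example: a modest effect in most, but not all, individuals.
rng = np.random.default_rng(0)
data = [(rng.normal(0.3 if i % 4 else -0.1, 1.0, 20), rng.normal(0.0, 1.0, 20))
        for i in range(12)]
print(population_effect_test(data))
```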

    The relationship between alcohol use and dementia in adults aged more than 60 years: a combined analysis of prospective, individual-participant data from 15 international studies

    Aim: To synthesize international findings on the alcohol–dementia relationship, including representation from low- and middle-income countries. Methods: Individual participant data meta-analysis of 15 prospective epidemiological cohort studies from countries situated in six continents. Cox regression investigated the dementia risk associated with alcohol use in older adults aged over 60 years. Additional analyses assessed the alcohol–dementia relationship in the sample stratified by sex and by continent. Participants included 24 478 community-dwelling individuals without a history of dementia at baseline and with at least one follow-up dementia assessment. The main outcome measure was all-cause dementia as determined by clinical interview. Results: At baseline, the mean age across studies was 71.8 years (standard deviation = 7.5, range = 60–102 years), 14 260 (58.3%) were female and 13 269 (54.2%) were current drinkers. During 151 636 person-years of follow-up, there were 2124 incident cases of dementia (14.0 per 1000 person-years). When compared with abstainers, the risk for dementia was lower in occasional [hazard ratio (HR) = 0.78; 95% confidence interval (CI) = 0.68–0.89], light–moderate (HR = 0.78; 95% CI = 0.70–0.87) and moderate–heavy drinkers (HR = 0.62; 95% CI = 0.51–0.77). There was no evidence of differences between life-time abstainers and former drinkers in terms of dementia risk (HR = 0.98; 95% CI = 0.81–1.18). In dose–response analyses, moderate drinking of up to 40 g/day was associated with a lower risk of dementia when compared with life-time abstaining. Among current drinkers, there was no consistent evidence for differences in terms of dementia risk. Results were similar when the sample was stratified by sex. When analysed at the continent level, there was considerable heterogeneity in the alcohol–dementia relationship. Conclusions: Abstinence from alcohol appears to be associated with an increased risk for all-cause dementia. Among current drinkers, there appears to be no consistent evidence to suggest that the amount of alcohol consumed in later life is associated with dementia risk.
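    For readers who want to set up an analysis of this general shape, the sketch below fits a Cox model with drinking categories dummy-coded against an abstainer reference group using the lifelines package; the package choice, column names, and covariates are assumptions made for illustration and do not reproduce the study's actual analysis.

```python
import pandas as pd
from lifelines import CoxPHFitter

def fit_alcohol_dementia_model(df: pd.DataFrame) -> CoxPHFitter:
    """df columns (hypothetical): 'followup_years', 'dementia' (0/1), 'age',
    'sex' (coded 0/1), and one 0/1 indicator per drinking category, with
    abstainers as the implicit reference group."""
    covariates = ["age", "sex", "occasional", "light_moderate", "moderate_heavy"]
    cph = CoxPHFitter()
    cph.fit(df[["followup_years", "dementia"] + covariates],
            duration_col="followup_years", event_col="dementia")
    return cph

# cph = fit_alcohol_dementia_model(cohort_df)
# cph.print_summary()   # hazard ratios are reported as exp(coef)
```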

    From DNA sequence to application: possibilities and complications

    The development of sophisticated genetic tools during the past 15 years has facilitated a tremendous increase in fundamental and application-oriented knowledge of lactic acid bacteria (LAB) and their bacteriophages. This knowledge relates both to the assignment of open reading frames (ORFs) and to the function of non-coding DNA sequences. Comparison of the complete nucleotide sequences of several LAB bacteriophages has revealed that their chromosomes have a fixed, modular structure, each module having a set of genes involved in a specific phase of the bacteriophage life cycle. LAB bacteriophage genes and DNA sequences have been used for the construction of temperature-inducible gene expression systems, gene-integration systems, and bacteriophage defence systems. The functions of several LAB open reading frames and transcriptional units have been identified and characterized in detail. Many of these could find practical applications, such as induced lysis of LAB to enhance cheese ripening and re-routing of carbon fluxes for the production of a specific amino acid enantiomer. More knowledge has also become available concerning the function and structure of non-coding DNA positioned at or in the vicinity of promoters. In several cases the mRNA produced from this DNA contains a transcriptional terminator-antiterminator pair, in which the antiterminator can be stabilized either by uncharged tRNA or by interaction with a regulatory protein, thus preventing formation of the terminator so that mRNA elongation can proceed. Evidence has accumulated showing that in LAB, too, carbon catabolite repression is mediated by specific DNA elements in the vicinity of promoters governing the transcription of catabolic operons. Although some biological barriers have yet to be overcome, the vast body of scientific information presently available allows the construction of tailor-made genetically modified LAB. Today, it appears that societal constraints rather than biological hurdles impede the use of genetically modified LAB.

    Temporal changes in frequency of severe hypoglycemia treated by emergency medical services in types 1 and 2 diabetes:a population-based data-linkage cohort study

    Background: Almost 20 years ago, the frequencies of severe hypoglycemia requiring emergency medical treatment were reported in people with types 1 and 2 diabetes in the Tayside region of Scotland. With subsequent improvements in the treatment of diabetes, concurrent with changes in the provision of emergency medical care, a decline in the frequency of severe hypoglycemia could be anticipated. The present population-based data-linkage cohort study aimed to ascertain whether a temporal change has occurred in the incidence rates of hypoglycemia requiring emergency medical services in people with types 1 and 2 diabetes. Methods: The study population comprised all people with diabetes in Tayside, Scotland over the period 1 January 2011 to 31 December 2012. Patients' data from different healthcare sources were linked anonymously to measure the incidence rates of hypoglycemia requiring emergency medical services, which include treatment by ambulance staff and in hospital emergency departments, and necessitated hospital admission. These were compared with data recorded in 1997–1998 in the same region. Results: In January 2011 to December 2012, 2029 people in Tayside had type 1 diabetes and 21,734 had type 2 diabetes, compared to 977 and 7678, respectively, in June 1997 to May 1998. In people with type 2 diabetes, the proportion treated with sulfonylureas had declined from 36.8 to 22.4% (p<0.001), while insulin treatment had increased from 11.7 to 18.7% (p<0.001). The incidence rate of hypoglycemia requiring emergency medical treatment had fallen significantly from 0.115 (95% CI: 0.094–0.136) to 0.082 (0.073–0.092) events per person per year in type 1 diabetes (p<0.001), and from 0.118 (0.095–0.141) to 0.037 (0.003–0.041) in insulin-treated type 2 diabetes (p=0.008). However, the absolute annual number of hypoglycemia events requiring emergency treatment was 1.4-fold higher. Conclusions: Although from 1998 to 2012 the incidences of hypoglycemia requiring emergency medical services appeared to have declined by a third in type 1 diabetes and by two thirds in insulin-treated type 2 diabetes, because the prevalence of diabetes was higher (2.7-fold), the number of severe hypoglycemia events requiring emergency medical treatment was greater.

    Social sciences research in neglected tropical diseases 2: A bibliographic analysis

    Background: There are strong arguments for social science and interdisciplinary research in the neglected tropical diseases. These diseases represent a rich and dynamic interplay between vector, host, and pathogen which occurs within social, physical and biological contexts. The overwhelming sense, however, is that neglected tropical diseases research is a biomedical endeavour largely excluding the social sciences. The purpose of this review is to provide a baseline for discussing the quantum and nature of the science that is being conducted, and the extent to which the social sciences are a part of that. Methods: A bibliographic analysis was conducted of neglected tropical diseases related research papers published over the past 10 years in biomedical and social sciences. The analysis had textual and bibliometric facets, and focussed on chikungunya, dengue, visceral leishmaniasis, and onchocerciasis. Results: There is substantial variation in the number of publications associated with each disease. The proportion of the research that is social science based appears remarkably consistent (<4%). A textual analysis, however, reveals a degree of misclassification by the abstracting service, where a surprising proportion of the "social sciences" research was pure clinical research. Much of the social sciences research also tends to be "handmaiden" research focused on the implementation of biomedical solutions. Conclusion: There is little evidence that scientists pay any attention to the complex social, cultural, biological, and environmental dynamic involved in human pathogenesis. There is little investigator-driven social science and a poor presence of interdisciplinary science. The research needs more sophisticated funders and priority setters who are not beguiled by uncritical biomedical promises.

    Phantom evaluation of a cardiac SPECT/VCT system that uses a common set of solid-state detectors for both emission and transmission scans

    We developed a cardiac SPECT system (X-ACT) with low dose volume CT transmission-based attenuation correction (AC). Three solid-state detectors are configured to form a triple-head system for emission scans and reconfigured to form a 69-cm field-of-view detector arc for transmission scans. A near mono-energetic transmission line source is produced from the collimated fluorescence x-ray emitted from a lead target when the target is illuminated by a narrow polychromatic x-ray beam from an x-ray tube. Transmission scans can be completed in 1 min with insignificant patient dose (deep dose equivalent <5 μSv). We used phantom studies to evaluate (1) the accuracy of the reconstructed attenuation maps, (2) the effect of AC on image uniformity, and (3) the effect of AC on defect contrast (DC). The phantoms we used included an ACR phantom, an anthropomorphic phantom with a uniform cardiac insert, and an anthropomorphic phantom with two defects in the cardiac insert. The reconstructed attenuation coefficient of water at 140 keV was .150 ± .003/cm in the uniform region of the ACR phantom, .151 ± .003/cm and .151 ± .002/cm in the liver and cardiac regions of the anthropomorphic phantom. The ACR phantom images with AC showed correction of the bowing effect due to attenuation in the images without AC (NC). The 17-segment scores of the images of the uniform cardiac insert were 78.3 ± 6.5 before and 87.9 ± 3.3 after AC (average ± standard deviation). The inferior-to-anterior wall ratio and the septal-to-lateral wall ratio were .99 and 1.16 before and 1.02 and 1.00 after AC. The DC of the two defects was .528 and .156 before and .628 and .173 after AC. The X-ACT system generated accurate attenuation maps with 1-minute transmission scans. AC improved image quality and uniformity over NC.

    Reversible Induction of Phantom Auditory Sensations through Simulated Unilateral Hearing Loss

    Tinnitus, a phantom auditory sensation, is associated with hearing loss in most cases, but it is unclear if hearing loss causes tinnitus. Phantom auditory sensations can be induced in normal-hearing listeners when they experience severe auditory deprivation such as confinement in an anechoic chamber, which can be regarded as somewhat analogous to a profound bilateral hearing loss. As this condition is relatively uncommon among tinnitus patients, induction of phantom sounds by a lesser degree of auditory deprivation could advance our understanding of the mechanisms of tinnitus. In this study, we therefore investigated the reporting of phantom sounds after continuous use of an earplug. Eighteen healthy volunteers with normal hearing wore a silicone earplug continuously in one ear for 7 days. The attenuation provided by the earplugs simulated a mild high-frequency hearing loss: mean attenuation increased from <10 dB at 0.25 kHz to >30 dB at 3 and 4 kHz. Fourteen of the 18 participants reported phantom sounds during earplug use. Eleven participants presented with stable phantom sounds on day 7 and underwent tinnitus spectrum characterization with the earplug still in place. The spectra showed that the phantom sounds were perceived predominantly as high-pitched, corresponding to the frequency range most affected by the earplug. In all cases, the auditory phantom disappeared when the earplug was removed, indicating a causal relation between auditory deprivation and phantom sounds. This relation matches the predictions of our computational model of tinnitus development, which proposes a possible mechanism by which a stabilization of neuronal activity through homeostatic plasticity in the central auditory system could lead to the development of a neuronal correlate of tinnitus when auditory nerve activity is reduced due to the earplug.

    An Iterative Jackknife Approach for Assessing Reliability and Power of fMRI Group Analyses

    For functional magnetic resonance imaging (fMRI) group activation maps, so-called second-level random effects approaches are commonly used, which are intended to be generalizable to the population as a whole. However, the reliability of a certain activation focus as a function of group composition or group size cannot directly be deduced from such maps. This question is of particular relevance when examining smaller groups (<20–27 subjects). The approach presented here tries to address this issue by iteratively excluding each subject from a group study and presenting the overlap of the resulting (reduced) second-level maps in a group percent overlap map. This allows one to judge where activation is reliable even upon excluding one, two, or three (or more) subjects, thereby also demonstrating the inherent variability that is still present in second-level analyses. Moreover, when progressively decreasing group size, foci of activation will become smaller and/or disappear; hence, the group size at which a given activation disappears can be considered to reflect the power necessary to detect this particular activation. Systematically exploiting this effect allows clusters to be ranked according to their observable effect size. The approach is tested using different scenarios from a recent fMRI study (children performing a “dual-use” fMRI task, n = 39), and the implications of this approach are discussed.
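    A compact sketch of the leave-one-out overlap computation described above is given below; it assumes first-level contrast values arranged as a subjects-by-voxels array and uses a simple voxel-wise one-sample t-test as the second-level model, which is an illustrative simplification rather than the authors' actual pipeline.

```python
import numpy as np
from scipy import stats

def jackknife_overlap(subject_maps, p_thresh=0.001):
    """subject_maps: array of shape (n_subjects, n_voxels) holding first-level contrast values.
    Returns the percentage of leave-one-out group analyses in which each voxel survives."""
    n_subjects = subject_maps.shape[0]
    surviving = np.zeros_like(subject_maps, dtype=bool)
    for i in range(n_subjects):
        reduced = np.delete(subject_maps, i, axis=0)   # drop one subject
        # Voxel-wise one-sample t-test across the remaining subjects.
        t, p = stats.ttest_1samp(reduced, popmean=0.0, axis=0)
        surviving[i] = (p < p_thresh) & (t > 0)
    return 100.0 * surviving.mean(axis=0)

# overlap_map = jackknife_overlap(first_level_maps)   # e.g. first_level_maps has shape (39, n_voxels)
```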