    Captivating thoughts: nocturnal pollution, imagination and the sleeping mind in the twelfth and thirteenth centuries

    Medieval attempts to understand nocturnal emissions – involuntary bodily excretions during sleep which were identified as morally ambiguous – became extensive explorations of the unique and problematic features of sleep and the mental state it produced. During the twelfth and thirteenth centuries, nocturnal pollutions became the object of an intensive scrutiny of sleep as a site of moral concern. Causal explanations often centred on human psychology, in particular the unusual status of the sleeping mind, in an attempt to understand the intricate ways in which mind, body and soul were uniquely bound together in sleep. The mental states before, during and after sleep were understood to interact with one another in complex ways which centred on questions of culpability and its lack. A comparison of medical, natural-philosophical, theological and canon law materials discussing nocturnal pollution reveals a preoccupation with the sleeper’s mind as exceptional, uncontrollable and problematic.

    A Foucauldian Analysis of ‘Troubled Families’

    The ‘Troubled Families Agenda’ (TFA), a national initiative launched by the UK Government in 2011, aimed to identify and work with families defined by the Government as ‘troubled’, in order to decrease their ‘anti-social behaviour’, help children back into school and support parents into employment. This research, undertaken from a social constructionist critical realist epistemological position, attempted to gain an understanding of the Government’s construction of ‘troubled families’, and to consider what ways of thinking about, and working with, families these constructions might have enabled and silenced. The dataset consisted of: the seven policy and guidance documents available on the Government’s TFA website; five speeches concerning the TFA made by leading politicians; and four parliamentary debate and Commons’ Select Committee report extracts. The dataset inclusion criteria required government policy documents and texts of speeches and debates to have been published between 6th March 2010 and 31st March 2013, and to refer to ‘troubled families’ more than twice. The analysis of this dataset was conducted using a discourse analytic approach, drawing on the work of Michel Foucault. Seven analytic steps were followed, which included repeated readings and coding of the texts. Four dominant governmental constructions of ‘troubled families’ were identified: ‘violent’, ‘workless’, and ‘helpless’ families that are ultimately a ‘costly waste of human productivity’. The Government seems to have presented the TFA as an innovative, benevolent social care agenda. However, at its root, the TFA appears to be driven by neo-liberal economic forces, intent on reducing the cost of families that may have a range of difficulties. The Government seems to have taken a reductive approach towards their construction of ‘troubled families’, allowing families to be produced as homogenous and less complex discursive objects. This has allowed the Government to set simple material outcomes for services to achieve with families that may have a range of complex difficulties. These outcomes neatly connect to the financial models underpinning the TFA, enabling the introduction of financial products, such as social impact bonds, which might allow private investors to exert influence upon the TFA services. The Government appears to be using families who may have a range of difficulties as vehicles to grow the social investment market. It is argued that this is likely to negatively impact the design of services, which might hinder social and health care professionals’ ability to work in a manner that will meet the complex needs of families. This research calls for the financial models that underpin services to be designed in the best interest of the service users, rather than that of investors and Government. This research also echoes calls for the perspectives and experiences of families with complex needs to be more effectively incorporated into the development of family initiatives, such as the TFA. Finally, this study encourages frontline workers and clinical psychologists to be more aware of the political forces and neo-liberal assumptions that are shaping the services in which they work, if effective forms of resistance are to be made possible.

    Bayesian Methods for Highly Correlated Exposures: an Application to Tap Water Disinfection By-Products and Spontaneous Abortion

    Highly correlated exposures are common in epidemiology. However, standard maximum likelihood techniques frequently fail to provide reliable estimates in the presence of highly correlated exposures. As a result, hierarchical regression methods are increasingly being used. Hierarchical regression places a prior distribution on the exposure-specific regression coefficients in order to stabilize estimates and incorporate prior knowledge. We examine three types of hierarchical models: semi-Bayes, fully-Bayes, and Dirichlet process priors. In the semi-Bayes approach, the prior mean and variance are treated as fixed constants chosen by the epidemiologist. An alternative is the fully-Bayes approach, which places hyperprior distributions on the mean and variance of the prior distribution to allow the data to inform about their values. Both of these approaches rely on a parametric specification for the exposure-specific coefficients. As a more flexible nonparametric option, one can use a Dirichlet process prior, which also serves to cluster exposures into groups, effectively reducing dimensionality. We examine the properties of these three models and compare their mean squared error in simulated datasets. We use these hierarchical models to examine the relationship between disinfection by-products and spontaneous abortion. Spontaneous abortion is a common pregnancy outcome, although relatively little is known about its causes. Previous research has generally indicated an increased risk of spontaneous abortion among those who consume higher amounts of disinfection by-products. Right from the Start is a large multi-center cohort study of women who were followed through early pregnancy. Disinfection by-product concentrations were measured each week during the study, allowing for more precise exposure measurement than previous epidemiologic studies. We focus our attention on the concentrations of 13 constituent disinfection by-products (4 trihalomethanes and 9 haloacetic acids), some of which are so highly correlated that conventional maximum likelihood estimates are unreliable. To allow simultaneous estimation of effects, we implement 4 Bayesian hierarchical models: semi-Bayes, fully-Bayes, Dirichlet process prior (DPP1), and Dirichlet process prior with a selection component (DPP2). Models that allowed prior parameters to be updated from the data tended to give far more precise coefficients and to be more robust to prior specification. The DPP1 and DPP2 models were in close agreement in estimating no effect of any constituent disinfection by-product on spontaneous abortion. The fully-Bayes model largely agreed with the DPP1 and DPP2 models but had less precision, while the semi-Bayes model provided the least precise estimates.
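
    To make the semi-Bayes vs fully-Bayes distinction concrete, here is a minimal sketch, assuming a binary outcome and a logistic model, written in PyMC. All names and data below are illustrative stand-ins, not the study's actual analysis, and the Dirichlet process variants are omitted for brevity.

```python
# Hypothetical sketch of semi-Bayes vs fully-Bayes hierarchical priors on
# 13 correlated exposure coefficients; data are simulated stand-ins, not
# the Right from the Start measurements.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n, p = 500, 13
X = rng.normal(size=(n, p))          # stand-in for correlated DBP exposures
y = rng.binomial(1, 0.3, size=n)     # stand-in binary outcome

with pm.Model() as semi_bayes:
    # Semi-Bayes: prior mean and variance are fixed constants chosen a priori
    beta = pm.Normal("beta", mu=0.0, sigma=0.35, shape=p)
    alpha = pm.Normal("alpha", mu=0.0, sigma=2.0)
    pm.Bernoulli("obs", logit_p=alpha + pm.math.dot(X, beta), observed=y)
    trace_semi = pm.sample(1000, tune=1000)

with pm.Model() as fully_bayes:
    # Fully-Bayes: hyperpriors let the data inform the prior mean and variance
    mu = pm.Normal("mu", mu=0.0, sigma=1.0)
    tau = pm.HalfNormal("tau", sigma=1.0)
    beta = pm.Normal("beta", mu=mu, sigma=tau, shape=p)
    alpha = pm.Normal("alpha", mu=0.0, sigma=2.0)
    pm.Bernoulli("obs", logit_p=alpha + pm.math.dot(X, beta), observed=y)
    trace_full = pm.sample(1000, tune=1000)
```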

    Intrarectal quinine for treating Plasmodium falciparum malaria: a systematic review

    BACKGROUND: In children with malaria caused by Plasmodium falciparum, quinine administered rectally may be easier to use and less painful than intramuscular or intravenous administration. The objective of this review was to compare the effectiveness of intrarectal with intravenous or intramuscular quinine for treating falciparum malaria. METHODS: All randomized and quasi-randomized controlled trials comparing intrarectal with intramuscular or intravenous quinine for treating people with falciparum malaria located through the following sources were included: Cochrane Infectious Diseases Group Specialized Register, CENTRAL, MEDLINE, EMBASE, LILACS and CINAHL. Trial quality was assessed and data, including adverse event data, were extracted. Dichotomous data were analysed using odds ratios and continuous data using weighted mean difference. RESULTS: Eight randomized controlled trials (1,247 children) fulfilled the inclusion criteria. The same principal investigator led seven of the trials. Five compared intrarectal with intravenous quinine, and six compared intrarectal with intramuscular treatment. No statistically significant difference was detected for death, parasite clearance by 48 hours and seven days, parasite and fever clearance time, coma recovery time, duration of hospitalization and time before drinking began. One trial (898 children) reported that intrarectal was less painful than intramuscular administration. CONCLUSION: No difference in the effect on parasites and clinical illness was detected for the use of intrarectal quinine compared with other routes, but most trials were small. Pain during application may be less with intrarectal quinine. Further larger trials, in patients with severe malaria and in adults, are required before the intrarectal route could be recommended.
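
    As a rough illustration of how dichotomous outcomes are pooled as odds ratios, the sketch below applies inverse-variance weighting on the log scale; the 2x2 counts are invented, not data from the trials in this review, and the review's exact weighting details may differ.

```python
# Illustrative inverse-variance pooling of odds ratios from 2x2 tables.
import numpy as np

def log_or_and_var(a, b, c, d):
    """Log odds ratio and its variance for one trial's 2x2 table
    (a = events, b = non-events in arm 1; c, d likewise in arm 2)."""
    return np.log((a * d) / (b * c)), 1/a + 1/b + 1/c + 1/d

tables = [(5, 95, 7, 93), (3, 47, 4, 46)]    # hypothetical trials
logs, variances = zip(*(log_or_and_var(*t) for t in tables))
weights = 1 / np.asarray(variances)
pooled = np.sum(weights * np.asarray(logs)) / weights.sum()
se = np.sqrt(1 / weights.sum())
print(f"pooled OR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se):.2f} to {np.exp(pooled + 1.96*se):.2f})")
```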

    A further critique of the analytic strategy of adjusting for covariates to identify biologic mediation

    BACKGROUND: Epidemiologic research is often devoted to etiologic investigation, and so techniques that may facilitate mechanistic inferences are attractive. Some of these techniques rely on rigid and/or unrealistic assumptions, making the biologic inferences tenuous. The methodology investigated here is effect decomposition: the contrast between effect measures estimated with and without adjustment for one or more variables hypothesized to lie on the pathway through which the exposure exerts its effect. This contrast is typically used to distinguish the exposure's indirect effect, through the specified intermediate variables, from its direct effect, transmitted via pathways that do not involve the specified intermediates. METHODS: We apply a causal framework based on latent potential response types to describe the limitations inherent in effect decomposition analysis. For simplicity, we assume three measured binary variables with monotonic effects and randomized exposure, and use difference contrasts as measures of causal effect. Previous authors showed that confounding between the intermediate and the outcome threatens the validity of the decomposition strategy, even if exposure is randomized. We define exchangeability conditions for absence of confounding of causal effects of exposure and intermediate, and generate two example populations in which the no-confounding conditions are satisfied. In one population we impose an additional prohibition against unit-level interaction (synergism). We evaluate the performance of the decomposition strategy against true values of the causal effects, as defined by the proportions of latent potential response types in the two populations. RESULTS: We demonstrate that even when there is no confounding, partition of the total effect into direct and indirect effects is not reliably valid. Decomposition is valid only with the additional restriction that the population contain no units in which exposure and intermediate interact to cause the outcome. This restriction implies homogeneity of causal effects across strata of the intermediate. CONCLUSIONS: Reliable effect decomposition requires not only absence of confounding, but also absence of unit-level interaction and use of linear contrasts as measures of causal effect. Epidemiologists should be wary of etiologic inference based on adjusting for intermediates, especially when using ratio effect measures or when absence of interacting potential response types cannot be confidently asserted.
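
    A toy simulation makes the failure concrete. The population below is invented (it is not one of the paper's example populations): exposure is randomized and there is no confounding, yet a synergistic response type makes the controlled direct effect differ across strata of the intermediate, violating the homogeneity the decomposition needs.

```python
# Unit-level potential outcomes with a synergistic (interacting) response type.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Potential mediator values M(x): exposure can only raise M (monotonicity)
M0 = rng.binomial(1, 0.3, n)
M1 = np.maximum(M0, rng.binomial(1, 0.4, n))

# Potential outcomes Y(x, m): 'base' units always get the outcome; 'synergy'
# units get it only when exposure AND mediator are both present
base = rng.binomial(1, 0.10, n).astype(bool)
synergy = rng.binomial(1, 0.15, n).astype(bool)

def Y(x, m):
    return (base | (synergy & (x == 1) & (m == 1))).astype(float)

total = Y(1, M1).mean() - Y(0, M0).mean()     # true total causal effect
cde_m0 = Y(1, 0).mean() - Y(0, 0).mean()      # controlled direct effect, M=0
cde_m1 = Y(1, 1).mean() - Y(0, 1).mean()      # controlled direct effect, M=1
print(f"total={total:.3f}  CDE(M=0)={cde_m0:.3f}  CDE(M=1)={cde_m1:.3f}")
# The direct effect is not homogeneous across strata of the intermediate,
# so 'total minus adjusted direct' does not recover the indirect effect.
```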

    (Errors in statistical tests)³

    In 2004, García-Berthou and Alcaraz published "Incongruence between test statistics and P values in medical papers," a critique of statistical errors that received a tremendous amount of attention. One of their observations was that the final reported digit of p-values in articles published in the journal Nature departed substantially from the uniform distribution that they suggested should be expected. In 2006, Jeng critiqued that critique, observing that the statistical analysis of those terminal digits had been based on comparing the actual distribution to a uniform continuous distribution, when digits obviously are discretely distributed. Jeng corrected the calculation and reported statistics that did not so clearly support the claim of a digit preference. However delightful it may be to read a critique of statistical errors in a critique of statistical errors, we nevertheless found several aspects of the whole exchange to be quite troubling, prompting our own meta-critique of the analysis.
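
    Jeng's correction amounts to testing observed terminal-digit counts against a discrete uniform distribution on the digits 0-9, for example with a chi-square goodness-of-fit test. The counts below are invented for illustration, not the Nature data.

```python
# Goodness-of-fit of terminal digits against a discrete uniform null.
from scipy.stats import chisquare

observed = [12, 9, 11, 10, 8, 13, 9, 10, 11, 7]   # counts of final digits 0-9
stat, p = chisquare(observed)    # default null: equal expected counts
print(f"chi-square = {stat:.2f}, p = {p:.3f}")
```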

    Association of Educational Attainment With Lifetime Risk of Cardiovascular Disease: The Atherosclerosis Risk in Communities Study

    Estimates of lifetime risk may help raise awareness of the extent to which educational inequalities are associated with risk of cardiovascular disease (CVD). The objective was to estimate lifetime risks of CVD according to categories of educational attainment. Participants were followed from 1987 through December 31, 2013. All CVD events (coronary heart disease, heart failure, and stroke) were confirmed by physician review and International Classification of Diseases codes. A total of 13 948 whites and African Americans who were 45 to 64 years old and free of CVD at baseline were included from 4 US communities (Washington County, Maryland; Forsyth County, North Carolina; Jackson, Mississippi; and suburbs of Minneapolis, Minnesota). The data analysis was performed from June 7 to August 31, 2016. The exposure was educational attainment. We used a life table approach to estimate lifetime risks of CVD from age 45 through 85 years according to educational attainment. We adjusted for competing risks of death from underlying causes other than CVD. The sample of 13 948 participants was 56% female and 27% African American. During 269 210 person-years of follow-up, we documented 4512 CVD events and 2401 non-CVD deaths. Educational attainment displayed an inverse dose-response relation with cumulative risk of CVD, which became evident in middle age, with the most striking gap between those not completing vs completing high school. In men, lifetime risks of CVD were 59.0% (95% CI, 54.0%-64.1%) for grade school, 52.5% (95% CI, 47.7%-56.8%) for high school education without graduation, 50.9% (95% CI, 47.3%-53.9%) for high school graduation, 47.2% (95% CI, 41.5%-52.5%) for vocational school, 46.4% (95% CI, 42.8%-49.6%) for college with or without graduation, and 42.2% (95% CI, 36.6%-47.0%) for graduate/professional school; in women, 50.8% (95% CI, 45.7%-55.8%), 49.3% (95% CI, 45.1%-53.1%), 36.3% (95% CI, 33.4%-39.1%), 32.2% (95% CI, 26.0%-37.3%), 32.8% (95% CI, 29.1%-35.9%), and 28.0% (95% CI, 21.9%-33.3%), respectively. Educational attainment was inversely associated with CVD even within categories of family income, income change, occupation, or parental educational level. More than 1 in 2 individuals with less than a high school education had a lifetime CVD event. Educational attainment was inversely associated with the lifetime risk of CVD, regardless of other important socioeconomic characteristics. Our findings emphasize the need for further efforts to reduce CVD inequalities related to educational disparities.
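
    The life table calculation can be sketched in a few lines: accumulate the CVD hazard among those still alive and event-free, while depleting the at-risk pool by both CVD and the competing risk of non-CVD death. The hazards below are invented, not the ARIC estimates.

```python
# Minimal cumulative-incidence sketch with a competing risk.
import numpy as np

ages = np.arange(45, 85)                         # one-year intervals
h_cvd = np.linspace(0.002, 0.030, ages.size)     # hypothetical CVD hazard
h_other = np.linspace(0.001, 0.040, ages.size)   # hypothetical non-CVD death hazard

surv = 1.0           # P(alive and CVD-free at start of interval)
lifetime_risk = 0.0
for hc, ho in zip(h_cvd, h_other):
    lifetime_risk += surv * hc         # first CVD events in this interval
    surv *= (1 - hc) * (1 - ho)        # deplete the pool by both causes
print(f"lifetime CVD risk, ages 45-85: {lifetime_risk:.1%}")
```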

    Observed and Expected Mortality in Cohort Studies

    Epidemiologists often compare the observed number of deaths in a cohort with the expected number of deaths, obtained by multiplying person-time accrued in the cohort by mortality rates for a reference population (ideally, a reference that represents the mortality rate in the cohort in the absence of exposure). However, if exposure is hazardous (or salutary), this calculation will not consistently estimate the number of deaths expected in the absence of exposure because exposure will have affected the distribution of person-time observed in the study cohort. While problems with interpretation of this standard calculation of expected counts were discussed more than 2 decades ago, these discussions had little impact on epidemiologic practice. The logic of counterfactuals may help clarify this topic as we revisit these issues. In this paper, we describe a simple way to consistently estimate the expected number of deaths in such settings, and we illustrate the approach using data from a cohort study of mortality among underground miners.
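
    The standard calculation the paper critiques is easy to state: expected deaths are stratum-specific person-years multiplied by reference mortality rates and summed, and the observed/expected ratio is the standardized mortality ratio. The numbers below are invented, not the miner-cohort data.

```python
# Conventional expected-deaths calculation (the one the paper critiques).
person_years = {"45-54": 12_000, "55-64": 9_000, "65-74": 4_500}
ref_rate = {"45-54": 0.004, "55-64": 0.011, "65-74": 0.028}  # deaths per person-year

expected = sum(person_years[a] * ref_rate[a] for a in person_years)
observed = 260
print(f"expected = {expected:.1f}, SMR = {observed / expected:.2f}")
# The paper's point: a hazardous exposure also changes the person-time the
# cohort actually accrues, so this 'expected' is not the death count that
# would have occurred in the absence of exposure.
```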

    Markov Chain Monte Carlo: an introduction for epidemiologists

    Markov Chain Monte Carlo (MCMC) methods are increasingly popular among epidemiologists. The reason for this may in part be that MCMC offers an appealing approach to handling some difficult types of analyses. Additionally, MCMC methods are those most commonly used for Bayesian analysis. However, epidemiologists are still largely unfamiliar with MCMC. They may lack familiarity either with the implementation of MCMC or with the interpretation of the resultant output. As with tutorials outlining the calculus behind maximum likelihood in previous decades, a simple description of the machinery of MCMC is needed. We provide an introduction to conducting analyses with MCMC, and show that, given the same data and under certain model specifications, the results of an MCMC simulation match those of methods based on standard maximum-likelihood estimation (MLE). In addition, we highlight examples of instances in which MCMC approaches to data analysis provide a clear advantage over MLE. We hope that this brief tutorial will encourage epidemiologists to consider MCMC approaches as part of their analytic tool-kit.
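
    As a minimal example of the machinery, the sketch below runs a random-walk Metropolis sampler for a binomial log-odds under a flat prior, so the posterior centres near the maximum-likelihood estimate, echoing the MLE comparison described above. Data and tuning constants are invented.

```python
# Toy random-walk Metropolis sampler for a binomial log-odds.
import numpy as np

rng = np.random.default_rng(42)
events, n = 30, 100                  # hypothetical data; MLE of p is 0.30

def log_post(theta):                 # flat prior on the log-odds scale
    p = 1.0 / (1.0 + np.exp(-theta))
    return events * np.log(p) + (n - events) * np.log(1 - p)

theta, draws = 0.0, []
for _ in range(20_000):
    proposal = theta + rng.normal(scale=0.3)          # random-walk step
    if np.log(rng.uniform()) < log_post(proposal) - log_post(theta):
        theta = proposal                              # accept
    draws.append(theta)

post_p = 1.0 / (1.0 + np.exp(-np.array(draws[2_000:])))  # drop burn-in
print(f"posterior mean of p = {post_p.mean():.3f} (MLE = {events / n:.3f})")
```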

    Translating Cochrane Reviews to Ensure that Healthcare Decision-Making is Informed by High-Quality Research Evidence

    Erik von Elm and colleagues discuss plans to increase access and global reach of Cochrane Reviews through translations into other languages. Please see later in the article for the Editors' Summary.