Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?
Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors depends on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and we recommend adopting a pragmatic, but scientifically better founded, approach to mixture risk assessment.
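The point about probabilistic multiplication can be made concrete with a small Monte Carlo sketch (not taken from the paper; the lognormal spreads below are assumed purely for illustration). Two sub-factors with the same median of 10 yield very different upper percentiles for the combined factor depending on the distributions chosen:

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Assumed, illustrative distributions for the toxicokinetic (TK) and
# toxicodynamic (TD) sub-factors; the default splits the factor of 100
# into 10 (TK) x 10 (TD). Both choices below have a median product of 100.
tk_narrow = rng.lognormal(mean=np.log(10), sigma=0.3, size=n)
td_narrow = rng.lognormal(mean=np.log(10), sigma=0.3, size=n)
tk_wide = rng.lognormal(mean=np.log(10), sigma=0.8, size=n)
td_wide = rng.lognormal(mean=np.log(10), sigma=0.8, size=n)

# The 95th percentile of the combined factor differs markedly with the
# assumed spread, which is why the choice of distribution matters.
print(np.percentile(tk_narrow * td_narrow, 95))  # roughly 200
print(np.percentile(tk_wide * td_wide, 95))      # roughly 640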
Climate structuring of Batrachochytrium dendrobatidis infection in the threatened amphibians of the northern Western Ghats, India
Batrachochytrium dendrobatidis (Bd) is a pathogen killing amphibians worldwide, yet its impact across much of Asia is poorly characterized. This study systematically surveyed amphibians for Bd across rocky plateaus in the northern section of the Western Ghats biodiversity hotspot, India, including the first surveys of the plateaus in the coastal region. These ecosystems offer an epidemiological model system because they are characterized by differing levels of connectivity, edaphic and climatic conditions, and anthropogenic stressors. One hundred and eighteen individuals of 21 species of Anura and Apoda were sampled on 13 plateaus ranging from 67 to 1179 m above sea level and from 15.89 to 17.92° North latitude. qPCR testing found 79% of species and 27% of individuals positive for Bd, including the first records of Bd in caecilians in India and in the Critically Endangered Xanthophryne tigerina and the Endangered Fejervarya cf. sahyadris. Mean site prevalence was 28.15%. Prevalence was 31.2% below the escarpment and 25.4% above it, whereas the intensity of infection (genomic equivalents, GE) showed the reverse pattern. Infection may be related to elevational temperature changes, thermal exclusion, inter-site connectivity and anthropogenic disturbance, and coastal plateaus may act as thermal refuges from Bd. Infected amphibians represented a wide range of ecological traits, raising interesting questions about transmission routes.
Clinical outcomes in typhoid fever: adverse impact of infection with nalidixic acid-resistant Salmonella typhi
BACKGROUND: Widespread use of fluoroquinolones has resulted in the emergence of Salmonella typhi strains with decreased susceptibility to fluoroquinolones. These strains are identifiable by their nalidixic acid resistance. We studied the impact of infection with nalidixic acid-resistant S. typhi (NARST) on clinical outcomes in patients with bacteriologically confirmed typhoid fever. METHODS: Clinical and laboratory features, fever clearance time and complications were prospectively studied in patients with blood culture-proven typhoid fever treated at a tertiary care hospital in north India from November 2001 to October 2003. Susceptibility to amoxycillin, co-trimoxazole, chloramphenicol, ciprofloxacin and ceftriaxone was tested by the disc diffusion method. Minimum inhibitory concentrations (MICs) of ciprofloxacin and ceftriaxone were determined by the E-test method. RESULTS: During the two-year period, 60 patients (age [mean ± SD]: 15 ± 9 years; males: 40 [67%]) were studied. All isolates were sensitive to ciprofloxacin and ceftriaxone by disc diffusion and MIC breakpoints. However, 11 patients had clinical failure of fluoroquinolone therapy. Infections with NARST isolates (47 [78%]) were significantly associated with a longer duration of fever at presentation (median [IQR]: 10 [7–15] vs. 4 [3–6] days; P < 0.001), a higher frequency of hepatomegaly (57% vs. 15%; P = 0.021), higher levels of aspartate aminotransferase (121 [66–235] vs. 73 [44–119] IU/L; P = 0.033), and a higher MIC of ciprofloxacin (0.37 ± 0.21 vs. 0.17 ± 0.14 μg/mL; P = 0.005), compared with infections with nalidixic acid-susceptible isolates. All 11 patients with complications were infected with NARST isolates. Total duration of illness was significantly longer in patients who developed complications than in those who did not (22 [14.8–32] vs. 12 [9.3–20.3] days; P = 0.011). Duration of prior antibiotic intake showed a strong positive correlation with the duration of fever at presentation (r = 0.61; P < 0.001) and with the total duration of illness (r = 0.53; P < 0.001). CONCLUSION: Typhoid fever caused by NARST infection is associated with poor clinical outcomes, probably due to delays in initiating appropriate antibiotic therapy. Fluoroquinolone breakpoints for S. typhi need to be redefined, and fluoroquinolones should no longer be used as first-line therapy if the prevalence of NARST is high.
Multiple and multidimensional life transitions in the context of life-limiting health conditions: a longitudinal study focusing on the perspectives of young adults, families and professionals
Background:
There is a dearth of literature investigating the life transitions of young adults (YAs) with life-limiting conditions, their families and professionals. The scant literature that is available has methodological limitations, including not listening to the voices of YAs, collecting data retrospectively, at a single time point or from only one group's perspective, and relying on single case studies. The aim of this study was to address the gaps identified in our literature review and to provide a clearer understanding of the multiple and multi-dimensional life transitions experienced by YAs and their significant others over a period of time.
Methods:
This qualitative study used a longitudinal design and data were collected using semi-structured interviews over a 6-month period at 3 time points. Participants included 12 YAs with life-limiting conditions and their nominated significant others (10 family members and 11 professionals). Data were analysed using a thematic analysis approach.
Results:
The life transitions of YAs and their significant others are complex; they experience multiple and multi-dimensional transitions across several domains. The findings challenge the notion that all life transitions are triggered by the health transitions of YAs, and they highlight environmental factors (attitudinal and systemic) that can be changed to facilitate smoother transitions in various aspects of their lives.
Conclusions:
This study makes a unique and significant contribution to the literature. It provides evidence and rich narratives to help policy makers and service providers change policies and practices so that they are in line with the needs of YAs with life-limiting conditions as they transition to adulthood. Families and professionals have specific training needs that have not yet been fully met.
Transitions in bacterial communities along the 2000 km salinity gradient of the Baltic Sea
Salinity is a major factor controlling the distribution of biota in aquatic systems, and most aquatic multicellular organisms are adapted to life in either saltwater or freshwater conditions. Consequently, the saltwater–freshwater mixing zones in coastal or estuarine areas are characterized by limited faunal and floral diversity. Although changes in diversity and declines in species richness in brackish waters are well documented in aquatic ecology, it is unknown to what extent this applies to bacterial communities. Here, we report the first detailed bacterial inventory from vertical profiles of 60 sampling stations distributed along the salinity gradient of the Baltic Sea, one of the world's largest brackish water environments, generated using 454 pyrosequencing of partial (400 bp) 16S rRNA genes. Within the salinity gradient, bacterial community composition changed at both broad and finer-scale phylogenetic levels. Analogous to faunal communities under brackish conditions, we identified a bacterial brackish water community comprising a diverse combination of freshwater and marine groups, along with populations unique to this environment. As water residence times in the Baltic Sea exceed 3 years, the observed bacterial community cannot be the result of simple mixing of fresh water and saltwater; rather, our study represents the first detailed description of an autochthonous brackish microbiome. In contrast to the decline in the diversity of multicellular organisms, no reduction in bacterial diversity under brackish conditions could be established. It is possible that the rapid adaptation rate of bacteria has enabled a variety of lineages to fill what for higher organisms remains a challenging and relatively unoccupied ecological niche.
A typhoid fever outbreak in a slum of South Dumdum municipality, West Bengal, India, 2007: Evidence for foodborne and waterborne transmission
Background:
In April 2007, a slum of South Dumdum municipality, West Bengal, reported an increase in fever cases. We investigated to identify the agent and the source, and to propose recommendations.
Methods:
We defined a suspected case of typhoid fever as the occurrence of fever for ≥ one week among residents of ward 1 of South Dumdum during February to May 2007. We searched for suspected cases in health care facilities and collected blood specimens. We described the outbreak by time, place and person. We compared probable cases (Widal titre ≥ 1:80) with neighbourhood-matched controls. We assessed the environment and collected water specimens.
Results:
We identified 103 suspected cases (attack rate: 74/10,000, highest in the 5–14 years age group, no deaths). Salmonella enterica serovar Typhi was isolated from one of four blood specimens, and 65 of 103 sera were Widal positive at ≥ 1:80. The outbreak started on 13 February, peaked twice, in the last week of March and the second week of April, and lasted until 27 April. Suspected cases clustered around three public taps. Among 65 probable cases and 65 controls, eating milk products from a sweet shop (matched odds ratio [MOR]: 6.2, 95% confidence interval [CI]: 2.4–16, population attributable fraction [PAF]: 53%) and drinking piped water (MOR: 7.3, 95% CI: 2.5–21, PAF: 52%) were associated with illness. The sweet shop food handler had suffered from typhoid in January. The pipelines of the intermittent, non-chlorinated water supply ran next to an open drain connected to the sewerage system, and water specimens showed faecal contamination.
Conclusion:
The investigation suggested that an initial foodborne outbreak of typhoid led to contamination of the water supply, resulting in a secondary, waterborne wave. We educated the food handler, repaired the pipelines and ensured chlorination of the water supply.
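As a reminder of how the population attributable fractions quoted above relate to the matched odds ratios, here is a minimal sketch of the standard case-based formula PAF = p_c (OR - 1) / OR, where p_c is the proportion of cases exposed. The exposure proportions below are illustrative assumptions, not values reported in the investigation:

def attributable_fraction(odds_ratio, prop_cases_exposed):
    # Case-based (Miettinen) formula, using the odds ratio as an
    # approximation of the relative risk in an outbreak case-control study.
    return prop_cases_exposed * (odds_ratio - 1.0) / odds_ratio

# Hypothetical exposure proportions among cases, chosen only to show the
# order of magnitude of the reported PAFs:
print(attributable_fraction(6.2, 0.63))  # about 0.53 (sweet shop milk products)
print(attributable_fraction(7.3, 0.60))  # about 0.52 (piped water)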
Sampling strategies to measure the prevalence of common recurrent infections in longitudinal studies
Background:
Measuring recurrent infections such as diarrhoea or respiratory infections in epidemiological studies is a methodological challenge. Problems in measuring the incidence of recurrent infections include the episode definition, recall error, and the logistics of close follow-up. Longitudinal prevalence (LP), the proportion of time ill estimated by repeated prevalence measurements, is an alternative to incidence as a measure of recurrent infections. In contrast to incidence, which usually requires continuous sampling, LP can be measured at intervals. This study explored how many more participants are needed with infrequent sampling to achieve the same study power as with frequent sampling.
Methods:
We developed a set of four empirical simulation models representing low- and high-risk settings with short or long episode durations. The models were used to evaluate different sampling strategies under different assumptions about recall period and recall error.
Results:
The models identified three major factors that influence sampling strategies: (1) the clustering of episodes in individuals; (2) the duration of episodes; and (3) the positive correlation between an individual's disease incidence and episode duration. Intermittent sampling (e.g. 12 times per year) often requires only a slightly larger sample size than continuous sampling, especially in cluster-randomized trials. The collection of period prevalence data can lead to highly biased effect estimates if the exposure variable is associated with episode duration. To maximize study power, recall periods of 3 to 7 days may be preferable to shorter periods, even if this leads to inaccuracy in the prevalence estimates.
Conclusion:
Choosing the optimal approach to measuring recurrent infections in epidemiological studies depends on the setting, the study objectives, the study design and budget constraints. Sampling at intervals can help make epidemiological studies and trials more efficient, valid and cost-effective.
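To illustrate the kind of comparison the study makes (a toy model with assumed parameters, not the authors' four empirical simulation models), one can simulate children with clustered episodes whose duration correlates with their individual rate, and compare longitudinal prevalence estimated from daily versus monthly sampling:

import numpy as np

rng = np.random.default_rng(1)
n_children, n_days = 200, 365

# Assumed model: individual episode rates drawn from a gamma distribution
# (clustering of episodes in individuals), with episode duration increasing
# with the individual rate (rate-duration correlation).
rates = rng.gamma(shape=1.5, scale=4.0 / 1.5, size=n_children) / 365.0  # episodes per day
ill = np.zeros((n_children, n_days), dtype=bool)
for i, rate in enumerate(rates):
    onsets = np.nonzero(rng.random(n_days) < rate)[0]
    duration = max(1, int(round(2 + 600 * rate)))  # longer episodes for high-rate children
    for day in onsets:
        ill[i, day:day + duration] = True

# Longitudinal prevalence (proportion of child-days ill): daily sampling
# versus intermittent sampling every 30 days.
print("daily sampling LP:  ", round(float(ill.mean()), 3))
print("monthly sampling LP:", round(float(ill[:, ::30].mean()), 3))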
Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: a systematic analysis for the Global Burden of Disease Study 2015
Background:
The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 provides an up-to-date synthesis of the evidence for risk factor exposure and the attributable burden of disease. By providing national and subnational assessments spanning the past 25 years, this study can inform debates on the importance of addressing risks in context.
Methods:
We used the comparative risk assessment framework developed for previous iterations of the Global Burden of Disease Study to estimate attributable deaths, disability-adjusted life-years (DALYs), and trends in exposure by age group, sex, year, and geography for 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2015. This study included 388 risk-outcome pairs that met World Cancer Research Fund-defined criteria for convincing or probable evidence. We extracted relative risk and exposure estimates from randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. We developed a metric that allows comparisons of exposure across risk factors: the summary exposure value. Using the counterfactual scenario of the theoretical minimum risk level, we estimated the portion of deaths and DALYs that could be attributed to a given risk. We decomposed trends in attributable burden into contributions from population growth, population age structure, risk exposure, and risk-deleted cause-specific DALY rates. We characterised risk exposure in relation to a Socio-demographic Index (SDI).
Findings:
Between 1990 and 2015, global exposure to unsafe sanitation, household air pollution, childhood underweight, childhood stunting, and smoking each decreased by more than 25%. Global exposure for several occupational risks, high body-mass index (BMI), and drug use increased by more than 25% over the same period. All risks jointly evaluated in 2015 accounted for 57·8% (95% CI 56·6–58·8) of global deaths and 41·2% (39·8–42·8) of DALYs. In 2015, the ten largest contributors to global DALYs among Level 3 risks were high systolic blood pressure (211·8 million [192·7 million to 231·1 million] global DALYs), smoking (148·6 million [134·2 million to 163·1 million]), high fasting plasma glucose (143·1 million [125·1 million to 163·5 million]), high BMI (120·1 million [83·8 million to 158·4 million]), childhood undernutrition (113·3 million [103·9 million to 123·4 million]), ambient particulate matter (103·1 million [90·8 million to 115·1 million]), high total cholesterol (88·7 million [74·6 million to 105·7 million]), household air pollution (85·6 million [66·7 million to 106·1 million]), alcohol use (85·0 million [77·2 million to 93·0 million]), and diets high in sodium (83·0 million [49·3 million to 127·5 million]). From 1990 to 2015, attributable DALYs declined for micronutrient deficiencies, childhood undernutrition, unsafe sanitation and water, and household air pollution; reductions in risk-deleted DALY rates rather than reductions in exposure drove these declines. Rising exposure contributed to notable increases in attributable DALYs from high BMI, high fasting plasma glucose, occupational carcinogens, and drug use. Environmental risks and childhood undernutrition declined steadily with SDI; low physical activity, high BMI, and high fasting plasma glucose increased with SDI. In 119 countries, metabolic risks, such as high BMI and fasting plasma glucose, contributed the most attributable DALYs in 2015. Regionally, smoking still ranked among the leading five risk factors for attributable DALYs in 109 countries; childhood underweight and unsafe sex remained primary drivers of early death and disability in much of sub-Saharan Africa.
Interpretation:
Declines in some key environmental risks have contributed to declines in critical infectious diseases. Some risks appear to be invariant to SDI. Increasing risks, including high BMI, high fasting plasma glucose, drug use, and some occupational exposures, contribute to rising burden from some conditions, but also provide opportunities for intervention. Some highly preventable risks, such as smoking, remain major causes of attributable DALYs, even as exposure is declining. Public policy makers need to pay attention to the risks that are increasingly major contributors to global burden.
Funding:
Bill & Melinda Gates Foundation
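The attributable burden figures above rest on population attributable fractions computed against a counterfactual exposure distribution. A generic, simplified form of that calculation (illustrative category probabilities and relative risks; not the study's actual continuous exposure models) looks like this:

def population_attributable_fraction(exposure_probs, relative_risks, counterfactual_probs):
    # PAF = (sum(P_i * RR_i) - sum(P'_i * RR_i)) / sum(P_i * RR_i),
    # comparing the observed exposure distribution P with a counterfactual P',
    # such as the theoretical minimum risk exposure level.
    observed = sum(p * rr for p, rr in zip(exposure_probs, relative_risks))
    counterfactual = sum(p * rr for p, rr in zip(counterfactual_probs, relative_risks))
    return (observed - counterfactual) / observed

# Made-up three-category exposure for a single risk-outcome pair:
paf = population_attributable_fraction(
    exposure_probs=[0.5, 0.3, 0.2],        # observed population distribution
    relative_risks=[1.0, 1.8, 3.0],        # relative risk per category
    counterfactual_probs=[1.0, 0.0, 0.0],  # everyone at the lowest-risk level
)
print(paf)  # about 0.39; attributable DALYs = PAF x total DALYs for the outcome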