Global, regional, and national comparative risk assessment of 84 behavioural, environmental and occupational, and metabolic risks or clusters of risks for 195 countries and territories, 1990–2017: a systematic analysis for the Global Burden of Disease Study 2017
Background: The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2017 comparative risk assessment (CRA) is a comprehensive approach to risk factor quantification that offers a useful tool for synthesising evidence on risks and risk–outcome associations. With each annual GBD study, we update the GBD CRA to incorporate improved methods, new risks and risk–outcome pairs, and new data on risk exposure levels and risk–outcome associations.
Methods: We used the CRA framework developed for previous iterations of GBD to estimate levels and trends in exposure, attributable deaths, and attributable disability-adjusted life-years (DALYs), by age group, sex, year, and location for 84 behavioural, environmental and occupational, and metabolic risks or groups of risks from 1990 to 2017. This study included 476 risk–outcome pairs that met the GBD study criteria for convincing or probable evidence of causation. We extracted relative risk and exposure estimates from 46 749 randomised controlled trials, cohort studies, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. Using the counterfactual scenario of theoretical minimum risk exposure level (TMREL), we estimated the portion of deaths and DALYs that could be attributed to a given risk. We explored the relationship between development and risk exposure by modelling the relationship between the Socio-demographic Index (SDI) and risk-weighted exposure prevalence and estimated expected levels of exposure and risk-attributable burden by SDI. Finally, we explored temporal changes in risk-attributable DALYs by decomposing those changes into six main component drivers of change as follows: (1) population growth; (2) changes in population age structures; (3) changes in exposure to environmental and occupational risks; (4) changes in exposure to behavioural risks; (5) changes in exposure to metabolic risks; and (6) changes due to all other factors, approximated as the risk-deleted death and DALY rates, where the risk-deleted rate is the rate that would be observed had we reduced the exposure levels to the TMREL for all risk factors included in GBD 2017.
Findings: In 2017, 34.1 million (95% uncertainty interval [UI] 33.3-35.0) deaths and 1.21 billion (1.14-1.28) DALYs were attributable to GBD risk factors. Globally, 61.0% (59.6-62.4) of deaths and 48.3% (46.3-50.2) of DALYs were attributed to the GBD 2017 risk factors. When ranked by risk-attributable DALYs, high systolic blood pressure (SBP) was the leading risk factor, accounting for 10.4 million (9.39-11.5) deaths and 218 million (198-237) DALYs, followed by smoking (7.10 million [6.83-7.37] deaths and 182 million [173-193] DALYs), high fasting plasma glucose (6.53 million [5.23-8.23] deaths and 171 million [144-201] DALYs), high body-mass index (BMI; 4.72 million [2.99-6.70] deaths and 148 million [98.6-202] DALYs), and short gestation for birthweight (1.43 million [1.36-1.51] deaths and 139 million [131-147] DALYs). In total, risk-attributable DALYs declined by 4.9% (3.3-6.5) between 2007 and 2017. In the absence of demographic changes (ie, population growth and ageing), changes in risk exposure and risk-deleted DALYs would have led to a 23.5% decline in DALYs during that period. Conversely, in the absence of changes in risk exposure and risk-deleted DALYs, demographic changes would have led to an 18.6% increase in DALYs during that period. The ratios of observed risk exposure levels to exposure levels expected on the basis of SDI (O/E ratios) increased globally for unsafe drinking water and household air pollution between 1990 and 2017. This result suggests that development is occurring more rapidly than are changes in the underlying risk structure in a population. Conversely, nearly universal declines in O/E ratios for smoking and alcohol use indicate that, for a given SDI, exposure to these risks is declining. In 2017, the leading Level 4 risk factor for age-standardised DALY rates was high SBP in four super-regions: central Europe, eastern Europe, and central Asia; north Africa and Middle East; south Asia; and southeast Asia, east Asia, and Oceania. The leading risk factor was smoking in the high-income super-region, high BMI in Latin America and the Caribbean, and unsafe sex in sub-Saharan Africa. O/E ratios for unsafe sex in sub-Saharan Africa were notably high, and those for alcohol use in north Africa and the Middle East were notably low.
Interpretation: By quantifying levels and trends in exposures to risk factors and the resulting disease burden, this assessment offers insight into where past policy and programme efforts might have been successful and highlights current priorities for public health action. Decreases in behavioural, environmental, and occupational risks have largely offset the effects of population growth and ageing, in relation to trends in absolute burden. Conversely, the combination of increasing metabolic risks and population ageing will probably continue to drive the increasing trends in non-communicable diseases at the global level, which presents both a public health challenge and an opportunity. We see considerable spatiotemporal heterogeneity in levels of risk exposure and risk-attributable burden. Although levels of development underlie some of this heterogeneity, O/E ratios show risks for which countries are overperforming or underperforming relative to their level of development. As such, these ratios provide a benchmarking tool to help to focus local decision making. Our findings reinforce the importance of both risk exposure monitoring and epidemiological research to assess causal connections between risks and health outcomes, and they highlight the usefulness of the GBD study in synthesising data to draw comprehensive and robust conclusions that help to inform good policy and strategic health planning.
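The core of the CRA attribution described above can be illustrated compactly. Below is a minimal sketch, in Python, of a population attributable fraction (PAF) for a categorical risk against a TMREL counterfactual; the exposure categories, relative risks, and death count are hypothetical placeholders, and the actual GBD machinery additionally handles continuous exposures, bias adjustment, and uncertainty propagation.

```python
import numpy as np

def paf_categorical(prevalence, relative_risk, tmrel_prevalence):
    """PAF = (sum_i P_i*RR_i - sum_i P'_i*RR_i) / sum_i P_i*RR_i,
    where P is the observed and P' the counterfactual (TMREL) distribution."""
    observed = np.dot(prevalence, relative_risk)
    counterfactual = np.dot(tmrel_prevalence, relative_risk)
    return (observed - counterfactual) / observed

# Hypothetical three-category exposure (e.g. low / medium / high):
prevalence = np.array([0.5, 0.3, 0.2])   # observed exposure distribution
tmrel = np.array([1.0, 0.0, 0.0])        # counterfactual: everyone at TMREL
rr = np.array([1.0, 1.4, 2.1])           # relative risk per category

paf = paf_categorical(prevalence, rr, tmrel)
deaths = 100_000                         # hypothetical cause-specific deaths
print(f"PAF = {paf:.3f}; attributable deaths = {paf * deaths:,.0f}")
```

Attributable DALYs follow the same pattern, and the six-way decomposition in the Methods contrasts such quantities across years while holding subsets of drivers fixed.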
Mapping 123 million neonatal, infant and child deaths between 2000 and 2017
Since 2000, many countries have achieved considerable success in improving child survival, but localized progress remains unclear. To inform efforts towards United Nations Sustainable Development Goal 3.2—to end preventable child deaths by 2030—we need consistently estimated data at the subnational level regarding child mortality rates and trends. Here we quantified, for the period 2000–2017, the subnational variation in mortality rates and number of deaths of neonates, infants and children under 5 years of age within 99 low- and middle-income countries using a geostatistical survival model. We estimated that 32% of children under 5 in these countries lived in districts that had attained rates of 25 or fewer child deaths per 1,000 live births by 2017, and that 58% of child deaths between 2000 and 2017 in these countries could have been averted in the absence of geographical inequality. This study enables the identification of high-mortality clusters, patterns of progress and geographical inequalities to inform appropriate investments and implementations that will help to improve the health of all populations.
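For intuition, the two headline quantities above can be sketched from district-level counts. The inputs below are hypothetical, and the "no geographical inequality" benchmark used here (every district matching the best observed rate) is our assumption; the paper's geostatistical survival model is far more involved.

```python
# District -> (under-5 deaths, live births); hypothetical counts.
districts = {
    "A": (1200, 60000),
    "B": (450, 30000),
    "C": (2500, 50000),
}

rates = {name: 1000 * d / b for name, (d, b) in districts.items()}  # per 1,000

# Share of live births in districts at or below the SDG 3.2 level of
# 25 deaths per 1,000 live births:
total_births = sum(b for _, b in districts.values())
share_at_target = sum(b for name, (_, b) in districts.items()
                      if rates[name] <= 25) / total_births

# Counterfactual deaths if every district matched the best observed rate:
best_rate = min(rates.values())
observed = sum(d for d, _ in districts.values())
counterfactual = best_rate * total_births / 1000
print(f"share at target: {share_at_target:.0%}; "
      f"deaths averted: {observed - counterfactual:,.0f}")
```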
A phase II trial of docetaxel and erlotinib as first-line therapy for elderly patients with androgen-independent prostate cancer
Background: Docetaxel is the standard first-line agent for the treatment of androgen-independent prostate cancer (AIPC). The combination of docetaxel with molecularly targeted therapies may offer the potential to increase the efficacy and decrease the toxicity of cytotoxic chemotherapy for prostate cancer. Previous studies demonstrate activation of the human epidermal growth factor receptor (EGFR) in prostate cancer. Erlotinib is a specific inhibitor of the tyrosine-kinase activity of EGFR. The goal of this study was to determine the anti-cancer activity of docetaxel combined with erlotinib for the treatment of elderly subjects with AIPC.
Methods: This is a multi-institutional Phase II study in patients with histologically confirmed adenocarcinoma of the prostate and age ≥65 years. Patients were required to have progressive disease despite androgen-deprivation therapy as determined by: (1) measurable lesions on cross-sectional imaging; (2) metastatic disease by radionuclide bone imaging; or (3) elevated prostate-specific antigen (PSA). Treatment cycles consisted of docetaxel 60 mg/m2 IV on day 1 and erlotinib 150 mg PO on days 1-21. Patients with responding or stable disease after 9 cycles were eligible to continue on erlotinib alone as maintenance therapy.
Results: Characteristics of the 22 patients enrolled included: median age 73.5 years (range, 65-80); median Karnofsky Performance Status 90 (range, 70-100); median hemoglobin 12.1 g/dl (range, 10.0-14.3); median PSA 218.3 ng/ml (range, 9-5754). A median of 6 treatment cycles were delivered per patient (range, 1-17). No objective responses were observed in the 8 patients with measurable lesions (0%, 95% CI 0-31%). Bone scan improvement and PSA decline were seen in 1 patient (5%, 95% CI 0.1-25%). Five of 22 patients experienced a ≥50% decline in PSA (23%, 95% CI 8-45%). Hematologic toxicity included grade 3 neutropenia in 9 patients and neutropenic fever in 2 patients. Common non-hematologic toxicities (≥ grade 3) included fatigue, anorexia, and diarrhea.
Conclusion: Docetaxel/erlotinib can be delivered safely in elderly patients with AIPC. Anti-cancer activity appears generally comparable to that of docetaxel monotherapy, while hematologic and non-hematologic toxicity may be increased over docetaxel monotherapy. Prospective randomized studies would be required to determine whether the toxicity of docetaxel plus erlotinib justifies its use in this setting. This study was supported by NIH Prostate SPORE P50 CA92131 to DBA and by the Phase One Foundation to MEG and DBA.
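The exact binomial (Clopper-Pearson) interval reproduces confidence bounds of the kind reported in the Results; whether the authors used precisely this method is an assumption. A minimal sketch:

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact two-sided binomial confidence interval via the beta distribution."""
    lo = 0.0 if k == 0 else beta.ppf(alpha / 2, k, n - k + 1)
    hi = 1.0 if k == n else beta.ppf(1 - alpha / 2, k + 1, n - k)
    return lo, hi

lo, hi = clopper_pearson(5, 22)  # >=50% PSA decline in 5 of 22 patients
print(f"23% (95% CI {100 * lo:.0f}-{100 * hi:.0f}%)")  # ~8-45%, as reported

# For 0 responses in 8 patients, a one-sided 95% upper bound reproduces
# the reported 0-31%:
print(f"0% (upper bound {100 * (1 - 0.05 ** (1 / 8)):.0f}%)")
```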
Demineralized Freeze-Dried Bovine Cortical Bone: Its Potential for Guided Bone Regeneration Membrane
Background. Bovine pericardium collagen membrane (BPCM) has been widely used in guided bone regeneration (GBR), but its manufacturing process usually requires chemical cross-linking to prolong its biodegradation. However, cross-linking of collagen fibrils is associated with poorer tissue integration and delayed vascular invasion. Objective. This study evaluated the potential of bovine cortical bone collagen membrane for GBR by evaluating its antigenicity potential, cytotoxicity, immune and tissue response, and biodegradation behavior. Material and Methods. The antigenicity potential of demineralized freeze-dried bovine cortical bone membrane (DFDBCBM) was assessed with histology-based anticellularity evaluation, while cytotoxicity was analyzed using the MTT assay. Immune response, tissue response, and biodegradation were evaluated by randomly implanting DFDBCBM and BPCM in the subcutaneous dorsum of rats. Samples were collected at 2, 5, and 7 days for the biocompatibility study and at 7, 14, 21, and 28 days for the tissue response and biodegradation study. Results. Histologically, DFDBCBM showed no retained cells; however, it showed some level of in vitro cytotoxicity. The in vivo study exhibited an increased immune response to DFDBCBM in the early healing phase; however, normal tissue response and degradation rates were observed up to 4 weeks after DFDBCBM implantation. Conclusion. Demineralized freeze-dried bovine cortical bone membrane showed potential for clinical application; however, its biocompatibility needs to be optimized to fulfill all requirements for a GBR membrane.
Impact of Protein Stability, Cellular Localization, and Abundance on Proteomic Detection of Tumor-Derived Proteins in Plasma
Tumor-derived, circulating proteins are potentially useful as biomarkers for detection of cancer, for monitoring of disease progression, regression and recurrence, and for assessment of therapeutic response. Here we interrogated how a protein's stability, cellular localization, and abundance affect its observability in blood by mass-spectrometry-based proteomics techniques. We performed proteomic profiling on tumors and plasma from two different xenograft mouse models. A statistical analysis of these data revealed protein properties indicative of the detection level in plasma. Though 20% of the proteins identified in plasma were tumor-derived, only 5% of the proteins observed in the tumor tissue were found in plasma. Both intracellular and extracellular tumor proteins were observed in plasma; however, after normalizing for tumor abundance, extracellular proteins were seven times more likely to be detected. Although proteins that were more abundant in the tumor were also more likely to be observed in plasma, the relationship was nonlinear: doubling the spectral count increased the detection rate by only 50%. Many secreted proteins, even those with relatively low spectral count, were observed in plasma, but few low-abundance intracellular proteins were observed. Proteins predicted to be stable by dipeptide composition were significantly more likely to be identified in plasma than less stable proteins. The number of tryptic peptides in a protein was not significantly related to the chance of a protein being observed in plasma. Quantitative comparison of large versus small tumors revealed that the abundance of proteins in plasma as measured by spectral count was associated with the tumor size, but the relationship was not one-to-one; a 3-fold decrease in tumor size resulted in a 16-fold decrease in protein abundance in plasma. This study provides quantitative support for a tumor-derived marker prioritization strategy that favors secreted and stable proteins over all but the most abundant intracellular proteins.
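The question posed above, namely how abundance, localization, and stability relate to the odds of plasma detection, is naturally framed as a logistic regression. The sketch below fits one on synthetic data whose effect sizes loosely echo the abstract (an extracellular odds multiplier near 7 and a sublinear abundance effect); it is illustrative only, not the authors' pipeline.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
log2_count = rng.uniform(0, 10, n)      # log2 spectral count in tumor
extracellular = rng.integers(0, 2, n)   # 1 = secreted/extracellular
stable = rng.integers(0, 2, n)          # 1 = stable by dipeptide composition

# True effects chosen to loosely echo the abstract: extracellular boosts
# the odds ~7x (exp(1.9)); abundance helps, but sublinearly on the raw
# count scale because it enters on the log2 scale.
logit = -4.0 + 0.4 * log2_count + 1.9 * extracellular + 0.8 * stable
detected = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([log2_count, extracellular, stable]))
fit = sm.Logit(detected, X).fit(disp=0)
print(fit.params)  # should recover roughly [-4.0, 0.4, 1.9, 0.8]
```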
Effects of pH, lactate, hematocrit and potassium level on the accuracy of continuous glucose monitoring (CGM) in pediatric intensive care unit
A Glycemia Risk Index (GRI) of Hypoglycemia and Hyperglycemia for Continuous Glucose Monitoring Validated by Clinician Ratings
Background: A composite metric for the quality of glycemia from continuous glucose monitor (CGM) tracings could be useful for assisting with basic clinical interpretation of CGM data.
Methods: We assembled a data set of 14-day CGM tracings from 225 insulin-treated adults with diabetes. Using a balanced incomplete block design, 330 clinicians who were highly experienced with CGM analysis and interpretation ranked the CGM tracings from best to worst quality of glycemia. We used principal component analysis and multiple regressions to develop a model to predict the clinician ranking based on seven standard metrics in an Ambulatory Glucose Profile: very low-glucose and low-glucose hypoglycemia; very high-glucose and high-glucose hyperglycemia; time in range; mean glucose; and coefficient of variation.
Results: The analysis showed that clinician rankings depend on two components, one related to hypoglycemia that gives more weight to very low glucose than to low glucose, and the other related to hyperglycemia that likewise gives greater weight to very high glucose than to high glucose. These two components should be calculated and displayed separately, but they can also be combined into a single Glycemia Risk Index (GRI) that corresponds closely to the clinician rankings of the overall quality of glycemia (r = 0.95). The GRI can be displayed graphically on a GRI Grid, with the hypoglycemia component on the horizontal axis and the hyperglycemia component on the vertical axis. Diagonal lines divide the graph into five zones (quintiles) corresponding to the best (0th to 20th percentile) through worst (81st to 100th percentile) overall quality of glycemia. The GRI Grid enables users to track sequential changes within an individual over time and to compare groups of individuals.
Conclusion: The GRI is a single-number summary of the quality of glycemia. Its hypoglycemia and hyperglycemia components provide actionable scores and a graphical display (the GRI Grid) that can be used by clinicians and researchers to determine the glycemic effects of prescribed and investigational treatments.
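For concreteness, a sketch of the GRI calculation follows. The abstract does not state the component weights; the values below follow our reading of the published formula and its 0-100 cap, and should be treated as assumptions.

```python
def glycemia_risk_index(vlow, low, high, vhigh):
    """vlow: % time <54 mg/dL; low: % 54-69; high: % 181-250; vhigh: % >250."""
    hypo = vlow + 0.8 * low     # very low glucose weighted more than low
    hyper = vhigh + 0.5 * high  # very high glucose weighted more than high
    return hypo, hyper, min(100.0, 3.0 * hypo + 1.6 * hyper)

# Hypothetical 14-day tracing: 1% very low, 3% low, 20% high, 5% very high
hypo, hyper, gri = glycemia_risk_index(vlow=1, low=3, high=20, vhigh=5)
print(f"hypo {hypo:.1f}, hyper {hyper:.1f}, GRI {gri:.1f}")  # 3.4, 15.0, 34.2
```

Plotting the hypoglycemia component on the horizontal axis and the hyperglycemia component on the vertical axis yields a point on the GRI Grid described above.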
Diabetes with Hypertension as Risk Factors for Adult Dengue Hemorrhagic Fever in a Predominantly Dengue Serotype 2 Epidemic: A Case Control Study
Dengue is a major vector-borne disease in the tropical and subtropical regions. An estimated 50 million infections occur per annum in over 100 countries. A severe form of dengue, characterized by bleeding and plasma leakage, known as dengue hemorrhagic fever (DHF), is estimated to occur in 1–5% of hospitalized cases. It can be fatal if unrecognized and not treated in a timely manner. Previous studies have found a number of risk factors for DHF. However, screening and clinical management strategies based on these risk factors may not be applicable to all populations and epidemics of different serotypes. In this study, we found a significant association between DHF and diabetes mellitus, and between DHF and diabetes mellitus with hypertension, during the epidemic of predominantly serotype 2 (years 2007 and 2008), but not during the epidemic of predominantly serotype 1 (year 2006). Diabetes mellitus and hypertension are prevalent in Singapore and most parts of South-East Asia, where dengue is endemic. Therefore, it is important to address the risk effect of these co-morbidities on the development of DHF so as to reduce morbidity and mortality. Our findings may have an impact on the screening and clinical management of dengue patients, when confirmed in more studies.
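The associations reported above come from a case-control design and are conventionally expressed as odds ratios. A minimal sketch with a Woolf (log-scale) 95% confidence interval, using hypothetical counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a: exposed cases, b: unexposed cases, c: exposed controls, d: unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # Woolf log-scale SE
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: diabetes among DHF cases vs non-DHF dengue controls
or_, lo, hi = odds_ratio_ci(a=30, b=170, c=15, d=285)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # ~3.35 (1.75-6.41)
```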
The Diabetes Technology Society Error Grid and Trend Accuracy Matrix for Glucose Monitors.
INTRODUCTION: An error grid compares measured versus reference glucose concentrations to assign clinical risk values to observed errors. Widely used error grids for blood glucose monitors (BGMs) have limited value because they do not also reflect clinical accuracy of continuous glucose monitors (CGMs).
METHODS: Diabetes Technology Society (DTS) convened 89 international experts in glucose monitoring to (1) smooth the borders of the Surveillance Error Grid (SEG) zones and create a user-friendly tool, the DTS Error Grid; (2) define five risk zones of clinical point accuracy (A-E) to be identical for BGMs and CGMs; (3) determine a relationship between DTS Error Grid percent in Zone A and mean absolute relative difference (MARD) from analyzing 22 BGM and nine CGM accuracy studies; and (4) create trend risk categories (1-5) for CGM trend accuracy.
RESULTS: The DTS Error Grid for point accuracy contains five risk zones (A-E) with straight-line borders that can be applied to both BGM and CGM accuracy data. In a data set combining point accuracy data from 18 BGMs, 2.6% of total data pairs moved from Zone A to Zone B, and an equal percentage moved in the opposite direction, when SEG zone assignments were compared with the DTS Error Grid. For every 1% increase in percent of data in Zone A, the MARD decreased by approximately 0.33%. We also created a DTS Trend Accuracy Matrix with five trend risk categories (1-5) for CGM-reported trend indicators compared with reference trends calculated from reference glucose.
CONCLUSION: The DTS Error Grid combines contemporary clinician input regarding clinical point accuracy for BGMs and CGMs. The DTS Trend Accuracy Matrix assesses the accuracy of CGM trend indicators.
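Because the Results relate Zone A percentages to MARD, a minimal sketch of the standard MARD computation may be useful. The paired readings are hypothetical, and the grid zone assignment itself requires the published zone boundaries, which are not reproduced here.

```python
import numpy as np

def mard(measured, reference):
    """Mean absolute relative difference, in percent."""
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return 100.0 * np.mean(np.abs(measured - reference) / reference)

meter = [102, 155, 68, 240, 90]  # device readings, mg/dL (hypothetical)
ysi = [100, 160, 72, 230, 95]    # laboratory reference readings, mg/dL
print(f"MARD = {mard(meter, ysi):.1f}%")  # ~4.1%
```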
Evolutionary Modeling of Combination Treatment Strategies To Overcome Resistance to Tyrosine Kinase Inhibitors in Non-Small Cell Lung Cancer
- …
