
    Involvement of bcl-2 and p21waf1 proteins in response of human breast cancer cell clones to Tomudex

    Mechanisms of resistance to Tomudex include increased thymidylate synthase activity, as well as reduced intracellular drug uptake and polyglutamation. However, little is known about other mechanisms of resistance, such as a possible protection against Tomudex-induced apoptosis mediated by bcl-2. We transfected the MDA-MB-435 human breast cancer cell line, which is characterized by a mutated p53 gene, with cDNA of the bcl-2 gene and generated two clones (MDA-bcl4 and MDA-bcl7) characterized by bcl-2 expression twofold and fourfold, respectively, that observed in the control cell clone (MDAneo). A concomitant overexpression of p21waf1 was also detected in the MDA-bcl7 clone. The MDA-bcl4 clone was three times more resistant to a 24-h Tomudex exposure than the MDAneo clone, whereas the MDA-bcl7 clone was as sensitive to Tomudex as the control cell clone. A lower sensitivity of the MDA-bcl4 clone than of the MDAneo and MDA-bcl7 clones to 5-fluorouracil and gemcitabine was also observed. No significant difference was noted in the susceptibility of the clones to fludarabine and methotrexate. Basal levels of thymidylate synthase activity were superimposable in the three clones. Tomudex induced a marked accumulation of cells in the S phase in all the clones. However, an apoptotic hypodiploid DNA peak and the characteristic nuclear morphology of apoptosis were observed only in the MDA-bcl7 clone after exposure to Tomudex. No difference in the treatment-induced modulation of proteins involved in cell cycle progression (cyclin A, cdk2, pRB, E2F-1) and apoptosis (bcl-2, bax) was observed in the three clones. The only exception was that the expression of p21waf1 in the MDA-bcl4 clone was inducible only at a Tomudex concentration much higher than that required to induce the protein in the other clones. Overall, the results indicate that bcl-2 and p21waf1 proteins concur in determining the cellular profile of sensitivity/resistance to Tomudex. © 1999 Cancer Research Campaign

    Miiuy Croaker Hepcidin Gene and Comparative Analyses Reveal Evidence for Positive Selection

    Hepcidin antimicrobial peptide (HAMP) is a small cysteine-rich peptide and a key molecule of the innate immune system against bacterial infections. Molecular cloning and genomic characterization of the HAMP gene in the miiuy croaker (Miichthys miiuy) are reported in this study. The miiuy croaker HAMP was predicted to encode a prepropeptide of 99 amino acids; a tentative RX(K/R)R cleavage motif and eight characteristic cysteine residues were also identified. The gene organization, consisting of three exons and two introns, is similar to that of the corresponding genes in mammals and other fish. Sequence polymorphism analysis of six individuals identified only two different sequences, encoding two proteins. As reported for most other species, the expression level was highest in liver, and up-regulation of transcription was seen in spleen, intestine and kidney examined at 24 h after injection of the pathogenic bacterium Vibrio anguillarum; this expression pattern implies that miiuy croaker HAMP is an important component of the first-line defense against invading pathogens. In addition, we report on the underlying mechanisms that maintain sequence diversity in fish and in mammalian species, respectively. A series of site-model tests implemented in the CODEML program revealed that moderate positive Darwinian selection is likely to drive the molecular evolution of the fish HAMP2 genes, and that the fish HAMP1 and HAMP2 genes are under different selection pressures.
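
    As a reading aid, the following is a minimal sketch (not taken from the paper) of the likelihood ratio test that underlies CODEML site-model comparisons such as M7 versus M8; the log-likelihood values and degrees of freedom below are hypothetical placeholders that would in practice be read from the CODEML output for each model.

```python
# Likelihood ratio test between nested CODEML site models (e.g. M7 vs M8).
# lnL values here are hypothetical; real values come from CODEML output files.
from scipy.stats import chi2

lnL_null = -4821.37   # hypothetical log-likelihood of the null model (e.g. M7, beta)
lnL_alt = -4815.02    # hypothetical log-likelihood of the alternative (e.g. M8, beta + omega > 1)
df = 2                # M8 adds two free parameters relative to M7

lrt_stat = 2.0 * (lnL_alt - lnL_null)   # 2 * delta(lnL)
p_value = chi2.sf(lrt_stat, df)         # upper-tail chi-square probability

print(f"LRT statistic = {lrt_stat:.2f}, p = {p_value:.4f}")
# A small p-value favours the model allowing sites with dN/dS > 1,
# i.e. evidence consistent with positive selection at some sites.
```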

    Diabetes Alters Intracellular Calcium Transients in Cardiac Endothelial Cells

    Diabetic cardiomyopathy (DCM) is a complication of diabetes that results in myocardial dysfunction independent of other etiological factors. Abnormal intracellular calcium ([Ca2+]i) homeostasis has been implicated in DCM and may precede clinical manifestation. Studies in cardiomyocytes have shown that diabetes results in impaired [Ca2+]i homeostasis due to altered sarcoplasmic reticulum Ca2+ ATPase (SERCA) and sodium-calcium exchanger (NCX) activity. Importantly, altered calcium homeostasis may also be involved in diabetes-associated endothelial dysfunction, including impaired endothelium-dependent relaxation, a diminished capacity to generate nitric oxide (NO), elevated cell adhesion molecules, and decreased angiogenic growth factors. However, the effect of diabetes on Ca2+ regulatory mechanisms in cardiac endothelial cells (CECs) remains unknown. The objective of this study was to determine the effect of diabetes on [Ca2+]i homeostasis in CECs in a streptozotocin-induced rat model of DCM. DCM-associated cardiac fibrosis was confirmed using picrosirius red staining of the myocardium. CECs isolated from the myocardium of diabetic and wild-type rats were loaded with Fura-2, and UTP-evoked [Ca2+]i transients were compared under various combinations of SERCA, plasma membrane Ca2+ ATPase (PMCA), and NCX inhibitors. Diabetes resulted in significant alterations in SERCA and NCX activities in CECs during [Ca2+]i sequestration and efflux, respectively, while no difference in PMCA activity between diabetic and wild-type cells was observed. These results improve our understanding of how diabetes affects calcium regulation in CECs and may contribute to the development of new therapies for DCM treatment.
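
    For readers unfamiliar with ratiometric Fura-2 measurements, the following is a minimal sketch (not the authors' analysis pipeline) of the standard Grynkiewicz calibration that converts a 340/380 nm fluorescence ratio into an estimate of [Ca2+]i; all calibration constants shown are hypothetical and would be determined experimentally for each preparation.

```python
# Standard ratiometric (Grynkiewicz) conversion of a Fura-2 ratio to [Ca2+]i.
# All numeric values below are hypothetical placeholders.

def fura2_calcium_nM(R, R_min, R_max, beta, Kd_nM=224.0):
    """Estimate [Ca2+]i (nM) from a Fura-2 ratio R = F340/F380.

    R_min, R_max : ratios at zero and saturating Ca2+ (from calibration)
    beta         : F380 at zero Ca2+ divided by F380 at saturating Ca2+
    Kd_nM        : Fura-2 dissociation constant (224 nM is a commonly cited
                   in vitro value; the effective in-cell value varies)
    """
    return Kd_nM * beta * (R - R_min) / (R_max - R)

# Hypothetical example: a UTP-evoked transient peaking at R = 1.8
print(f"{fura2_calcium_nM(R=1.8, R_min=0.4, R_max=6.0, beta=8.5):.0f} nM")
```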

    Global and regional burden of chronic respiratory disease in 2016 arising from non-infectious airborne occupational exposures: a systematic analysis for the Global Burden of Disease Study 2016

    OBJECTIVES: This paper presents a detailed analysis of the global and regional burden of chronic respiratory disease arising from occupational airborne exposures, as estimated in the Global Burden of Disease 2016 study. METHODS: The burden of chronic obstructive pulmonary disease (COPD) due to occupational exposure to particulate matter, gases and fumes, and secondhand smoke, and the burden of asthma resulting from occupational exposure to asthmagens, were estimated using the population attributable fraction (PAF), calculated using exposure prevalence and relative risks from the literature. PAFs were applied to the number of deaths and disability-adjusted life years (DALYs) for COPD and asthma. Pneumoconioses were estimated directly from cause of death data. Age-standardised rates were based only on persons aged 15 years and above. RESULTS: The estimated PAFs (based on DALYs) were 17% (95% uncertainty interval (UI) 14%-20%) for COPD and 10% (95% UI 9%-11%) for asthma. There were estimated to be 519,000 (95% UI 441,000-609,000) deaths from chronic respiratory disease in 2016 due to occupational airborne risk factors (COPD: 460,100 [95% UI 382,000-551,000]; asthma: 37,600 [95% UI 28,400-47,900]; pneumoconioses: 21,500 [95% UI 17,900-25,400]). The equivalent overall burden estimate was 13.6 million (95% UI 11.9-15.5 million) DALYs (COPD: 10.7 [95% UI 9.0-12.5] million; asthma: 2.3 [95% UI 1.9-2.9] million; pneumoconioses: 0.58 [95% UI 0.46-0.67] million). Rates were highest in males and in older persons, were concentrated mainly in Oceania, Asia and sub-Saharan Africa, and decreased from 1990 to 2016. CONCLUSIONS: Workplace exposures resulting in COPD, asthma and pneumoconiosis continue to be important contributors to the burden of disease in all regions of the world. This should be reducible through improved prevention and control of relevant exposures.
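
    To make the PAF step concrete, here is a minimal sketch with hypothetical numbers (not GBD inputs) of how a population attributable fraction computed from exposure prevalence and relative risk is applied to total DALYs to obtain an attributable burden, using Levin's formula for a single dichotomous exposure.

```python
# Levin's PAF for one exposure level, applied to a total burden figure.
# All numbers are hypothetical illustrations, not GBD estimates.

def paf(prevalence, relative_risk):
    """Population attributable fraction for a single dichotomous exposure."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

# Hypothetical example: 30% of workers exposed, relative risk of COPD = 1.6
paf_copd = paf(prevalence=0.30, relative_risk=1.6)

total_copd_dalys = 10_000_000               # hypothetical total COPD DALYs
attributable_dalys = paf_copd * total_copd_dalys

print(f"PAF = {paf_copd:.1%}, attributable DALYs = {attributable_dalys:,.0f}")
```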

    Global, regional, and national burden of stroke and its risk factors, 1990-2019: a systematic analysis for the Global Burden of Disease Study 2019

    Background: Regularly updated data on stroke and its pathological types, including data on their incidence, prevalence, mortality, disability, risk factors, and epidemiological trends, are important for evidence-based stroke care planning and resource allocation. The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) aims to provide a standardised and comprehensive measurement of these metrics at global, regional, and national levels. Methods: We applied GBD 2019 analytical tools to calculate stroke incidence, prevalence, mortality, disability-adjusted life-years (DALYs), and the population attributable fraction (PAF) of DALYs (with corresponding 95% uncertainty intervals [UIs]) associated with 19 risk factors, for 204 countries and territories from 1990 to 2019. These estimates were provided for ischaemic stroke, intracerebral haemorrhage, subarachnoid haemorrhage, and all strokes combined, and stratified by sex, age group, and World Bank country income level. Findings: In 2019, there were 12·2 million (95% UI 11·0–13·6) incident cases of stroke, 101 million (93·2–111) prevalent cases of stroke, 143 million (133–153) DALYs due to stroke, and 6·55 million (6·00–7·02) deaths from stroke. Globally, stroke remained the second-leading cause of death (11·6% [10·8–12·2] of total deaths) and the third-leading cause of death and disability combined (5·7% [5·1–6·2] of total DALYs) in 2019. From 1990 to 2019, the absolute number of incident strokes increased by 70·0% (67·0–73·0), prevalent strokes increased by 85·0% (83·0–88·0), deaths from stroke increased by 43·0% (31·0–55·0), and DALYs due to stroke increased by 32·0% (22·0–42·0). During the same period, age-standardised rates of stroke incidence decreased by 17·0% (15·0–18·0), mortality decreased by 36·0% (31·0–42·0), prevalence decreased by 6·0% (5·0–7·0), and DALYs decreased by 36·0% (31·0–42·0). However, among people younger than 70 years, prevalence rates increased by 22·0% (21·0–24·0) and incidence rates increased by 15·0% (12·0–18·0). In 2019, the age-standardised stroke-related mortality rate was 3·6 (3·5–3·8) times higher in the World Bank low-income group than in the World Bank high-income group, and the age-standardised stroke-related DALY rate was 3·7 (3·5–3·9) times higher in the low-income group than the high-income group. Ischaemic stroke constituted 62·4% of all incident strokes in 2019 (7·63 million [6·57–8·96]), while intracerebral haemorrhage constituted 27·9% (3·41 million [2·97–3·91]) and subarachnoid haemorrhage constituted 9·7% (1·18 million [1·01–1·39]). In 2019, the five leading risk factors for stroke were high systolic blood pressure (contributing to 79·6 million [67·7–90·8] DALYs or 55·5% [48·2–62·0] of total stroke DALYs), high body-mass index (34·9 million [22·3–48·6] DALYs or 24·3% [15·7–33·2]), high fasting plasma glucose (28·9 million [19·8–41·5] DALYs or 20·2% [13·8–29·1]), ambient particulate matter pollution (28·7 million [23·4–33·4] DALYs or 20·1% [16·6–23·0]), and smoking (25·3 million [22·6–28·2] DALYs or 17·6% [16·4–19·0]). Interpretation: The annual number of strokes and deaths due to stroke increased substantially from 1990 to 2019, despite substantial reductions in age-standardised rates, particularly among people older than 70 years. The highest age-standardised stroke-related mortality and DALY rates were in the World Bank low-income group. The fastest-growing risk factor for stroke between 1990 and 2019 was high body-mass index. 
Without urgent implementation of effective primary prevention strategies, the stroke burden will probably continue to grow across the world, particularly in low-income countries. Funding: Bill & Melinda Gates Foundation.
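
    For readers less familiar with the age-standardised rates quoted above, the following is a minimal sketch with hypothetical numbers (illustrative weights, not the GBD standard population) of direct age-standardisation: age-specific rates are weighted by a fixed standard population so that rates can be compared across countries and years with different age structures.

```python
# Direct age-standardisation: weighted sum of age-specific rates.
# Rates and weights below are illustrative, not GBD values.

age_specific_rates = {          # hypothetical stroke incidence per 100,000
    "15-49": 40.0,
    "50-69": 350.0,
    "70+": 1400.0,
}
standard_weights = {            # illustrative standard-population weights summing to 1
    "15-49": 0.62,
    "50-69": 0.28,
    "70+": 0.10,
}

asr = sum(age_specific_rates[a] * standard_weights[a] for a in age_specific_rates)
print(f"Age-standardised rate: {asr:.1f} per 100,000")
```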

    Mapping geographical inequalities in oral rehydration therapy coverage in low-income and middle-income countries, 2000-17

    Background Oral rehydration solution (ORS) is a form of oral rehydration therapy (ORT) for diarrhoea that has the potential to drastically reduce child mortality; yet, according to UNICEF estimates, less than half of children younger than 5 years with diarrhoea in low-income and middle-income countries (LMICs) received ORS in 2016. A variety of recommended home fluids (RHF) exist as alternative forms of ORT; however, it is unclear whether RHF prevent child mortality. Previous studies have shown considerable variation between countries in ORS and RHF use, but subnational variation is unknown. This study aims to produce high-resolution geospatial estimates of relative and absolute coverage of ORS, RHF, and ORT (use of either ORS or RHF) in LMICs. Methods We used a Bayesian geostatistical model including 15 spatial covariates and data from 385 household surveys across 94 LMICs to estimate annual proportions of children younger than 5 years of age with diarrhoea who received ORS or RHF (or both) on continuous continent-wide surfaces in 2000-17, and aggregated results to policy-relevant administrative units. Additionally, we analysed geographical inequality in coverage across administrative units and estimated the number of diarrhoeal deaths averted by increased coverage over the study period. Uncertainty in the mean coverage estimates was calculated by taking 250 draws from the posterior joint distribution of the model and creating uncertainty intervals (UIs) with the 2·5th and 97·5th percentiles of those 250 draws. Findings While ORS use among children with diarrhoea increased in some countries from 2000 to 2017, coverage remained below 50% in the majority (62·6%; 12 417 of 19 823) of second administrative-level units and an estimated 6 519 000 children (95% UI 5 254 000-7 733 000) with diarrhoea were not treated with any form of ORT in 2017. Increases in ORS use corresponded with declines in RHF in many locations, resulting in relatively constant overall ORT coverage from 2000 to 2017. Although ORS was uniformly distributed subnationally in some countries, within-country geographical inequalities persisted in others; 11 countries had at least a 50% difference in one of their units compared with the country mean. Increases in ORS use over time were correlated with declines in RHF use and in diarrhoeal mortality in many locations, and an estimated 52 230 diarrhoeal deaths (36 910-68 860) were averted by scaling up of ORS coverage between 2000 and 2017. Finally, we identified key subnational areas in Colombia, Nigeria, and Sudan as examples of where diarrhoeal mortality remains higher than average, while ORS coverage remains lower than average. Interpretation To our knowledge, this study is the first to produce and map subnational estimates of ORS, RHF, and ORT coverage and attributable child diarrhoeal deaths across LMICs from 2000 to 2017, allowing for tracking progress over time. Our novel results, combined with detailed subnational estimates of diarrhoeal morbidity and mortality, can support subnational needs assessments aimed at furthering policy makers' understanding of within-country disparities. Over 50 years after the discovery that led to this simple, cheap, and life-saving therapy, large gains in reducing mortality could still be made by reducing geographical inequalities in ORS coverage. Copyright (c) 2020 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY 4.0 license.
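
    The uncertainty intervals described in the Methods are simple percentile summaries of posterior draws. The sketch below illustrates this with synthetic draws (not model output): the 95% UI is taken as the 2·5th and 97·5th percentiles of 250 draws from the posterior distribution.

```python
# Percentile-based 95% uncertainty interval from posterior draws.
# The draws are synthetic; in the study they come from the geostatistical model.
import numpy as np

rng = np.random.default_rng(0)
posterior_draws = rng.beta(40, 60, size=250)    # 250 synthetic draws of ORS coverage

mean_coverage = posterior_draws.mean()
lower, upper = np.percentile(posterior_draws, [2.5, 97.5])

print(f"Coverage = {mean_coverage:.1%} (95% UI {lower:.1%}-{upper:.1%})")
```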

    Estimating global injuries morbidity and mortality: methods and data used in the Global Burden of Disease 2017 study

    BACKGROUND: While there is a long history of measuring death and disability from injuries, modern research methods must account for the wide spectrum of disability that can result from an injury, and must provide estimates with sufficient demographic, geographical and temporal detail to be useful for policy makers. The Global Burden of Disease (GBD) 2017 study used methods to provide highly detailed estimates of global injury burden that meet these criteria. METHODS: In this study, we report and discuss the methods used in GBD 2017 for injury morbidity and mortality burden estimation. In summary, these methods included estimating cause-specific mortality for every cause of injury, and then estimating incidence for every cause of injury. Non-fatal disability for each cause was then calculated based on the probabilities of experiencing different types of bodily injury. RESULTS: GBD 2017 produced morbidity and mortality estimates for 38 causes of injury. Estimates were produced in terms of incidence, prevalence, years lived with disability, cause-specific mortality, years of life lost and disability-adjusted life-years for a 28-year period for 22 age groups, 195 countries and both sexes. CONCLUSIONS: GBD 2017 demonstrated a complex and sophisticated series of analytical steps using the largest known database of morbidity and mortality data on injuries. GBD 2017 results should be used to help inform injury prevention policy making and resource allocation. We also identify important avenues for improving injury burden estimation in the future.
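
    To clarify how the reported measures relate to one another, the following is a minimal sketch with hypothetical numbers (not GBD data) of GBD-style burden accounting: disability-adjusted life-years (DALYs) are the sum of years of life lost (YLLs) and years lived with disability (YLDs), with YLDs computed from prevalence and disability weights.

```python
# DALY = YLL + YLD accounting for a single injury cause.
# All inputs below are illustrative placeholders, not GBD estimates.

deaths = 1_200                      # hypothetical deaths from one injury cause
life_expectancy_at_death = 35.0     # hypothetical remaining standard life expectancy (years)
prevalent_cases = 90_000            # hypothetical prevalent non-fatal cases
disability_weight = 0.07            # hypothetical disability weight (0 = full health, 1 = death)

ylls = deaths * life_expectancy_at_death
ylds = prevalent_cases * disability_weight
dalys = ylls + ylds

print(f"YLLs = {ylls:,.0f}, YLDs = {ylds:,.0f}, DALYs = {dalys:,.0f}")
```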

    Mapping geographical inequalities in access to drinking water and sanitation facilities in low-income and middle-income countries, 2000-17

    Background Universal access to safe drinking water and sanitation facilities is an essential human right, recognised in the Sustainable Development Goals as crucial for preventing disease and improving human wellbeing. Comprehensive, high-resolution estimates are important to inform progress towards achieving this goal. We aimed to produce high-resolution geospatial estimates of access to drinking water and sanitation facilities. Methods We used a Bayesian geostatistical model and data from 600 sources across more than 88 low-income and middle-income countries (LMICs) to estimate access to drinking water and sanitation facilities on continuous continent-wide surfaces from 2000 to 2017, and aggregated results to policy-relevant administrative units. We estimated mutually exclusive and collectively exhaustive subcategories of facilities for drinking water (piped water on or off premises, other improved facilities, unimproved, and surface water) and sanitation facilities (septic or sewer sanitation, other improved, unimproved, and open defecation) with use of ordinal regression. We also estimated the number of diarrhoeal deaths in children younger than 5 years attributed to unsafe facilities and estimated deaths that were averted by increased access to safe facilities in 2017, and analysed geographical inequality in access within LMICs. Findings Across LMICs, access to both piped water and improved water overall increased between 2000 and 2017, with progress varying spatially. For piped water, the safest water facility type, access increased from 40.0% (95% uncertainty interval [UI] 39.4-40.7) to 50.3% (50.0-50.5), but was lowest in sub-Saharan Africa, where access to piped water was mostly concentrated in urban centres. Access to both sewer or septic sanitation and improved sanitation overall also increased across all LMICs during the study period. For sewer or septic sanitation, access was 46.3% (95% UI 46.1-46.5) in 2017, compared with 28.7% (28.5-29.0) in 2000. Although some units improved access to the safest drinking water or sanitation facilities since 2000, a large absolute number of people still lacked access in several units with high (>80%) access to such facilities in 2017. More than 253 000 people did not have access to sewer or septic sanitation facilities in the city of Harare, Zimbabwe, despite 88.6% (95% UI 87.2-89.7) access overall. Many units were able to transition from the least safe facilities in 2000 to safe facilities by 2017; for units in which populations primarily practised open defecation in 2000, 686 (95% UI 664-711) of the 1830 (1797-1863) units transitioned to the use of improved sanitation. Geographical disparities in access to improved water across units decreased in 76.1% (95% UI 71.6-80.7) of countries from 2000 to 2017, and in 53.9% (50.6-59.6) of countries for access to improved sanitation, but remained evident subnationally in most countries in 2017. Interpretation Our estimates, combined with geospatial trends in diarrhoeal burden, identify where efforts to increase access to safe drinking water and sanitation facilities are most needed. By highlighting areas with successful approaches or in need of targeted interventions, our estimates can enable precision public health to effectively progress towards universal access to safe water and sanitation. Copyright (C) 2020 The Author(s). Published by Elsevier Ltd.
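
    As a generic illustration of the ordinal-regression idea in the Methods (not the paper's Bayesian geostatistical model), the sketch below shows how a cumulative-logit formulation yields mutually exclusive and collectively exhaustive facility-type proportions: cumulative probabilities are defined by ordered cutpoints, and adjacent differences give category probabilities that sum to 1. The cutpoints and linear predictor are hypothetical.

```python
# Cumulative (ordinal) logit: category probabilities from ordered cutpoints.
# Cutpoints, linear predictor, and category labels are illustrative only.
import math

def cumulative_logit_probs(eta, cutpoints):
    """Category probabilities for ordered categories, proportional-odds form."""
    cdf = [1.0 / (1.0 + math.exp(-(c - eta))) for c in cutpoints] + [1.0]
    return [cdf[0]] + [cdf[i] - cdf[i - 1] for i in range(1, len(cdf))]

# Ordered sanitation categories from least to most safe
labels = ["open defecation", "unimproved", "other improved", "sewer/septic"]
cutpoints = [-1.0, 0.2, 1.5]     # hypothetical ordered cutpoints
eta = 0.8                        # hypothetical linear predictor for one location-year

probs = cumulative_logit_probs(eta, cutpoints)
print(dict(zip(labels, probs)))
print("sum =", sum(probs))       # equals 1: categories are collectively exhaustive
```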

    Intraperitoneal drain placement and outcomes after elective colorectal surgery: international matched, prospective, cohort study

    Despite current guidelines, intraperitoneal drain placement after elective colorectal surgery remains widespread. Drains were not associated with earlier detection of intraperitoneal collections, but were associated with prolonged hospital stay and increased risk of surgical-site infections. Background Many surgeons routinely place intraperitoneal drains after elective colorectal surgery. However, enhanced recovery after surgery guidelines recommend against their routine use owing to a lack of clear clinical benefit. This study aimed to describe international variation in intraperitoneal drain placement and the safety of this practice. Methods COMPASS (COMPlicAted intra-abdominal collectionS after colorectal Surgery) was a prospective, international cohort study that enrolled consecutive adults undergoing elective colorectal surgery (February to March 2020). The primary outcome was the rate of intraperitoneal drain placement. Secondary outcomes included: rate and time to diagnosis of postoperative intraperitoneal collections; rate of surgical-site infections (SSIs); time to discharge; and 30-day major postoperative complications (Clavien-Dindo grade III or higher). After propensity score matching, multivariable logistic regression and Cox proportional hazards regression were used to estimate the independent association of the secondary outcomes with drain placement. Results Overall, 1805 patients from 22 countries were included (798 women, 44.2 per cent; median age 67.0 years). The drain insertion rate was 51.9 per cent (937 patients). After matching, drains were not associated with reduced rates (odds ratio (OR) 1.33, 95 per cent c.i. 0.79 to 2.23; P = 0.287) or earlier detection (hazard ratio (HR) 0.87, 0.33 to 2.31; P = 0.780) of collections. Although not associated with worse major postoperative complications (OR 1.09, 0.68 to 1.75; P = 0.709), drains were associated with delayed hospital discharge (HR 0.58, 0.52 to 0.66; P < 0.001) and an increased risk of SSIs (OR 2.47, 1.50 to 4.05; P < 0.001). Conclusion Intraperitoneal drain placement after elective colorectal surgery is not associated with earlier detection of postoperative collections, but prolongs hospital stay and increases SSI risk.
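
    To illustrate the propensity-score step described in the Methods, here is a minimal, generic sketch using synthetic data (not the COMPASS dataset or its exact specification): a logistic regression predicts the probability of drain placement from baseline covariates, and patients are then matched 1:1 on that score before outcomes are compared.

```python
# Generic propensity-score estimation and greedy 1:1 nearest-neighbour matching.
# Data are synthetic; covariates and coefficients are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 1000
X = rng.normal(size=(n, 3))                                  # synthetic baseline covariates
drain = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * X[:, 0] - 0.3 * X[:, 1]))))

# Estimate propensity scores: probability of drain placement given covariates
ps = LogisticRegression().fit(X, drain).predict_proba(X)[:, 1]

# Greedy 1:1 nearest-neighbour matching on the propensity score
treated = np.where(drain == 1)[0]
controls = list(np.where(drain == 0)[0])
pairs = []
for t in treated:
    if not controls:
        break
    j = min(controls, key=lambda c: abs(ps[t] - ps[c]))      # closest unmatched control
    pairs.append((t, j))
    controls.remove(j)

print(f"Matched {len(pairs)} treated/control pairs")
```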