    Evaluation of cadmium, lead, nickel and zinc status in biological samples of smokers and nonsmokers hypertensive patients

    The objective of this study was to evaluate the association between the trace and toxic elements zinc (Zn), cadmium (Cd), nickel (Ni) and lead (Pb) in biological samples (scalp hair, blood and urine) of smoker and nonsmoker hypertensive patients (n=457), residents of Hyderabad, Pakistan. For comparison, biological samples from age-matched healthy controls were selected as referents. The concentrations of the trace and toxic elements were measured by atomic absorption spectrophotometry following microwave-assisted acid digestion. The validity and accuracy of the methodology were checked using certified reference materials and by applying the conventional wet acid digestion method to the same certified reference materials and real samples. The recovery of all the studied elements in certified reference materials was in the range of 97.8–99.3%. The results showed that the mean values of Cd, Ni and Pb were significantly higher in the scalp hair, blood and urine samples of both smoker and nonsmoker patients than in referents (P<0.001), whereas the concentration of Zn was lower in the scalp hair and blood but higher in the urine samples of hypertensive patients. Zinc deficiency and high exposure to toxic metals as a result of tobacco smoking may act synergistically with the risk factors associated with hypertension.
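    The recovery check against certified reference materials reduces to a simple ratio of measured to certified concentration. A minimal sketch of that calculation (element list from the abstract; the concentration values are hypothetical placeholders, not data from the study):

        # Percent recovery against certified reference materials (CRMs).
        # Concentration values below are hypothetical placeholders, not study data.
        certified = {"Zn": 4.48, "Cd": 0.042, "Ni": 1.32, "Pb": 0.88}   # assumed CRM values (ug/g)
        measured  = {"Zn": 4.41, "Cd": 0.041, "Ni": 1.30, "Pb": 0.87}   # assumed lab results (ug/g)

        for element, certified_value in certified.items():
            recovery = 100.0 * measured[element] / certified_value
            print(f"{element}: recovery = {recovery:.1f}%")   # the abstract reports 97.8-99.3%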

    Serum ferritin levels, socio-demographic factors and desferrioxamine therapy in multi-transfused thalassemia major patients at a government tertiary care hospital of Karachi, Pakistan

    Background: Beta thalassemia is the most frequent genetic disorder of haemoglobin synthesis in Pakistan. Recurrent transfusions lead to iron overload, manifested by increased serum ferritin levels, for which chelation therapy is required.
    Findings: The study was conducted in the Pediatric Emergency unit of Civil Hospital Karachi after ethical approval by the Institutional Review Board of Dow University of Health Sciences. Seventy-nine cases of beta thalassemia major were included after written consent. The caretakers were interviewed about socio-demographic variables and the use of desferrioxamine therapy, after which a blood sample was drawn to assess the serum ferritin level. SPSS 15.0 was used for data entry and analysis. Of the seventy-nine patients included in the study, 46 (58.2%) were male and 33 (41.8%) were female. The mean age was 10.8 (± 4.5) years, with the dominant age group (46.2%) being 10 to 14 years. In 62 (78.8%) cases, the caretaker's education was below the tenth grade. The mean serum ferritin level in our study was 4236.5 ng/ml and showed a directly proportional relationship with age. Desferrioxamine was used in 46 (58.2%) cases, with monthly household income a significant factor in the use of therapy.
    Conclusions: The mean serum ferritin levels are approximately ten times higher than the recommended levels for normal individuals, with two-fifths of the patients not receiving iron chelation therapy at all. Use of iron chelation therapy and titration of the dose according to need can significantly lower the iron load, reducing the risk of iron-overload-related complications and leading to a better quality of life and improved survival in Pakistani beta thalassemia major patients.
    Conflicts of interest: None.
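    As a quick check of the "approximately ten times higher than normal" statement, the fold elevation is just the reported mean divided by an upper reference limit. A minimal sketch (the reference limit below is an assumed round figure for illustration, not a value from the paper):

        # Fold elevation of the reported mean serum ferritin over an assumed upper
        # reference limit. The 4236.5 ng/ml mean is from the abstract; the ~400 ng/ml
        # limit is an assumption for illustration only.
        mean_ferritin_ng_ml = 4236.5        # reported cohort mean
        assumed_upper_limit_ng_ml = 400.0   # assumed upper limit of normal
        fold_elevation = mean_ferritin_ng_ml / assumed_upper_limit_ng_ml
        print(f"Mean ferritin is about {fold_elevation:.1f} times the assumed upper limit")  # ~10.6x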

    Resource allocation within the National AIDS Control Program of Pakistan: a qualitative assessment of decision maker's opinions

    BACKGROUND: Limited resources, whether public or private, demand prioritisation among competing needs to maximise productivity. Despite a substantial increase in the number of reported HIV cases, little work has been done to understand how resources have been distributed and what factors may have influenced allocation within the newly introduced Enhanced National AIDS Control Program of Pakistan. The objective of this study was to identify the perceptions of decision makers about the process of resource allocation within Pakistan's Enhanced National AIDS Control Program. METHODS: A qualitative study was undertaken, and in-depth interviews were conducted with decision makers at the provincial and federal levels responsible for allocating resources within the program. RESULTS: HIV was not considered a priority issue by the study participants, and external funding for the program was thought to have been accepted because of poor foreign currency reserves and donor agency influence rather than local need. Political influences from the federal government and donor agencies were thought to manipulate the distribution of funds within the program. These influences were thought to occur despite the existence of a well-laid-out procedure for determining the allocation of public resources. Lack of collaboration among the departments involved in decision making, a pervasive lack of technical expertise, paucity of information and an atmosphere of ad hoc decision making were thought to reduce resistance to external pressures. CONCLUSION: Development of a unified program vision through a consultative process and advocacy is necessary to clarify the goals to be achieved, enhance program ownership and develop consensus about how money and effort should be directed. Enhancing public sector expertise in planning and budgeting is essential not just for the program, but also to reduce reliance on external agencies for technical support. Strengthening available databases for effective decision making is required so that financial allocations are based on real, rather than perceived, needs. With a large part of HIV program funding dedicated to public-private partnerships, it becomes imperative to develop public sector capacity to administer contracts and to coordinate and monitor the activities of the non-governmental sector.

    The Effect of Selenium Supplementation in the Prevention of DNA Damage in White Blood Cells of Hemodialyzed Patients: A Pilot Study

    Patients with chronic kidney disease (CKD) have an increased incidence of cancer. It is well known that long periods of hemodialysis (HD) treatment are linked to DNA damage due to oxidative stress. In this study, we examined the effect of selenium (Se) supplementation in CKD patients on HD on the prevention of oxidative DNA damage in white blood cells. Blood samples were drawn from 42 CKD patients on HD (at the beginning of the study and after 1 and 3 months) and from 30 healthy controls. Twenty-two patients were supplemented with 200 μg Se (as Se-rich yeast) per day and 20 with placebo (baker's yeast) for 3 months. Se concentration in plasma and DNA damage in white blood cells, expressed as the tail moment, including single-strand breaks (SSB) and oxidative base lesions in DNA detected using formamidopyrimidine glycosylase (FPG), were measured. Se concentration in patients was significantly lower than in healthy subjects (P < 0.0001) and increased significantly after 3 months of Se supplementation (P < 0.0001). The tail moment (SSB) in patients before the study was three times higher than in healthy subjects (P < 0.01). After 3 months of Se supplementation, it decreased significantly (P < 0.01) and was about 16% lower than in healthy subjects. The oxidative base lesions in DNA (tail moment, FPG) of HD patients at the beginning of the study were significantly higher (P < 0.01) than in controls, and 3 months after Se supplementation they were 2.6 times lower than in controls (P < 0.01). No changes in tail moment were observed in the placebo group. In conclusion, our study shows that in CKD patients on HD, DNA damage in white blood cells is higher than in healthy controls, and Se supplementation prevents this DNA damage.
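    The comet-assay endpoints quoted above reduce to simple arithmetic: the tail moment combines tail length with the fraction of DNA in the tail, and the FPG-sensitive (oxidative) component is commonly estimated as the difference between the FPG-treated and untreated (SSB) tail moments. A minimal sketch under those common definitions (the numbers are hypothetical placeholders, not study data):

        # Standard comet-assay quantities referenced in the abstract; values are
        # hypothetical placeholders, not measurements from the study.
        def tail_moment(tail_length_um: float, tail_dna_fraction: float) -> float:
            # One common definition: tail length times the fraction of DNA in the tail.
            return tail_length_um * tail_dna_fraction

        # Untreated comets reflect single-strand breaks (SSB); FPG digestion adds breaks
        # at oxidised bases, so the difference estimates oxidative base lesions.
        tm_ssb = tail_moment(tail_length_um=12.0, tail_dna_fraction=0.15)
        tm_fpg = tail_moment(tail_length_um=20.0, tail_dna_fraction=0.25)
        oxidative_component = tm_fpg - tm_ssb
        print(f"SSB tail moment: {tm_ssb:.2f}")
        print(f"FPG-sensitive (oxidative) component: {oxidative_component:.2f}")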

    Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: a systematic analysis for the Global Burden of Disease Study 2015

    Background: The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 provides an up-to-date synthesis of the evidence for risk factor exposure and the attributable burden of disease. By providing national and subnational assessments spanning the past 25 years, this study can inform debates on the importance of addressing risks in context.
    Methods: We used the comparative risk assessment framework developed for previous iterations of the Global Burden of Disease Study to estimate attributable deaths, disability-adjusted life-years (DALYs), and trends in exposure by age group, sex, year, and geography for 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2015. This study included 388 risk-outcome pairs that met World Cancer Research Fund-defined criteria for convincing or probable evidence. We extracted relative risk and exposure estimates from randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. We developed a metric that allows comparisons of exposure across risk factors: the summary exposure value. Using the counterfactual scenario of theoretical minimum risk level, we estimated the portion of deaths and DALYs that could be attributed to a given risk. We decomposed trends in attributable burden into contributions from population growth, population age structure, risk exposure, and risk-deleted cause-specific DALY rates. We characterised risk exposure in relation to a Socio-demographic Index (SDI).
    Findings: Between 1990 and 2015, global exposure to unsafe sanitation, household air pollution, childhood underweight, childhood stunting, and smoking each decreased by more than 25%. Global exposure for several occupational risks, high body-mass index (BMI), and drug use increased by more than 25% over the same period. All risks jointly evaluated in 2015 accounted for 57·8% (95% CI 56·6–58·8) of global deaths and 41·2% (39·8–42·8) of DALYs. In 2015, the ten largest contributors to global DALYs among Level 3 risks were high systolic blood pressure (211·8 million [192·7 million to 231·1 million] global DALYs), smoking (148·6 million [134·2 million to 163·1 million]), high fasting plasma glucose (143·1 million [125·1 million to 163·5 million]), high BMI (120·1 million [83·8 million to 158·4 million]), childhood undernutrition (113·3 million [103·9 million to 123·4 million]), ambient particulate matter (103·1 million [90·8 million to 115·1 million]), high total cholesterol (88·7 million [74·6 million to 105·7 million]), household air pollution (85·6 million [66·7 million to 106·1 million]), alcohol use (85·0 million [77·2 million to 93·0 million]), and diets high in sodium (83·0 million [49·3 million to 127·5 million]). From 1990 to 2015, attributable DALYs declined for micronutrient deficiencies, childhood undernutrition, unsafe sanitation and water, and household air pollution; reductions in risk-deleted DALY rates rather than reductions in exposure drove these declines. Rising exposure contributed to notable increases in attributable DALYs from high BMI, high fasting plasma glucose, occupational carcinogens, and drug use. Environmental risks and childhood undernutrition declined steadily with SDI; low physical activity, high BMI, and high fasting plasma glucose increased with SDI. In 119 countries, metabolic risks, such as high BMI and fasting plasma glucose, contributed the most attributable DALYs in 2015. Regionally, smoking still ranked among the leading five risk factors for attributable DALYs in 109 countries; childhood underweight and unsafe sex remained primary drivers of early death and disability in much of sub-Saharan Africa.
    Interpretation: Declines in some key environmental risks have contributed to declines in critical infectious diseases. Some risks appear to be invariant to SDI. Increasing risks, including high BMI, high fasting plasma glucose, drug use, and some occupational exposures, contribute to rising burden from some conditions, but also provide opportunities for intervention. Some highly preventable risks, such as smoking, remain major causes of attributable DALYs, even as exposure is declining. Public policy makers need to pay attention to the risks that are increasingly major contributors to global burden.
    Funding: Bill & Melinda Gates Foundation.
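    The attributable-burden step described in the Methods is, at its core, a population attributable fraction (PAF) computed against the theoretical-minimum-risk counterfactual and then applied to the outcome's deaths or DALYs. A simplified single-risk, categorical-exposure sketch (prevalences, relative risks and the DALY total below are illustrative, not GBD 2015 estimates):

        # Simplified population attributable fraction (PAF) for one risk factor with
        # categorical exposure levels, relative to a theoretical-minimum-risk
        # counterfactual in which everyone sits in the lowest-risk category.
        # All numbers are illustrative placeholders, not GBD 2015 estimates.
        prevalence     = [0.50, 0.30, 0.20]   # observed population share per exposure level
        counterfactual = [1.00, 0.00, 0.00]   # theoretical minimum risk exposure
        relative_risk  = [1.0, 1.5, 2.5]      # relative risk per exposure level

        observed = sum(p * rr for p, rr in zip(prevalence, relative_risk))
        ideal    = sum(p * rr for p, rr in zip(counterfactual, relative_risk))
        paf = (observed - ideal) / observed

        outcome_dalys = 1_000_000             # hypothetical DALY total for the linked outcome
        attributable_dalys = paf * outcome_dalys
        print(f"PAF = {paf:.3f}, attributable DALYs = {attributable_dalys:,.0f}")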

    Measurement of the inclusive and dijet cross-sections of b-jets in pp collisions at sqrt(s) = 7 TeV with the ATLAS detector

    The inclusive and dijet production cross-sections have been measured for jets containing b-hadrons (b-jets) in proton-proton collisions at a centre-of-mass energy of sqrt(s) = 7 TeV, using the ATLAS detector at the LHC. The measurements use data corresponding to an integrated luminosity of 34 pb^-1. The b-jets are identified using either a lifetime-based method, where secondary decay vertices of b-hadrons in jets are reconstructed using information from the tracking detectors, or a muon-based method, where the presence of a muon is used to identify semileptonic decays of b-hadrons inside jets. The inclusive b-jet cross-section is measured as a function of transverse momentum in the range 20 < pT < 400 GeV and rapidity in the range |y| < 2.1. The bbbar-dijet cross-section is measured as a function of the dijet invariant mass in the range 110 < m_jj < 760 GeV, the azimuthal angle difference between the two jets and the angular variable chi in two dijet mass regions. The results are compared with next-to-leading-order QCD predictions. Good agreement is observed between the measured cross-sections and the predictions obtained using POWHEG + Pythia. MC@NLO + Herwig shows good agreement with the measured bbbar-dijet cross-section. However, it does not reproduce the measured inclusive cross-section well, particularly for central b-jets with large transverse momenta.
    Comment: 10 pages plus author list (21 pages total), 8 figures, 1 table, final version published in European Physical Journal.
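    The dijet observables named above (the invariant mass m_jj, the azimuthal angle difference, and the angular variable chi) are standard functions of the two jets' kinematics. A minimal sketch of how they might be computed from (pT, eta, phi, mass), using placeholder jet values rather than ATLAS data and assuming the common definition chi = exp(|y1 - y2|):

        import math

        # Dijet observables from two jets' (pT, eta, phi, mass); placeholder values, not ATLAS data.
        def four_momentum(pt, eta, phi, mass):
            px, py, pz = pt * math.cos(phi), pt * math.sin(phi), pt * math.sinh(eta)
            energy = math.sqrt(px**2 + py**2 + pz**2 + mass**2)
            return energy, px, py, pz

        def rapidity(energy, pz):
            return 0.5 * math.log((energy + pz) / (energy - pz))

        phi1, phi2 = 0.1, 3.0                                          # placeholder azimuthal angles
        jet1 = four_momentum(pt=150.0, eta=0.4, phi=phi1, mass=12.0)   # GeV
        jet2 = four_momentum(pt=140.0, eta=-0.8, phi=phi2, mass=10.0)  # GeV

        e, px, py, pz = (a + b for a, b in zip(jet1, jet2))
        m_jj = math.sqrt(e**2 - px**2 - py**2 - pz**2)                 # dijet invariant mass

        dphi = abs(phi1 - phi2) % (2 * math.pi)
        dphi = min(dphi, 2 * math.pi - dphi)                           # azimuthal angle difference

        chi = math.exp(abs(rapidity(jet1[0], jet1[3]) - rapidity(jet2[0], jet2[3])))
        print(f"m_jj = {m_jj:.1f} GeV, dphi = {dphi:.2f}, chi = {chi:.2f}")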

    PAT4 levels control amino-acid sensitivity of rapamycin-resistant mTORC1 from the Golgi and affect clinical outcome in colorectal cancer

    Tumour cells can use strategies that make them resistant to nutrient deprivation to outcompete their neighbours. A key integrator of the cell’s responses to starvation and other stresses is amino-acid-dependent mechanistic target of rapamycin complex 1 (mTORC1). Activation of mTORC1 on late endosomes and lysosomes is facilitated by amino-acid transporters within the solute-linked carrier 36 (SLC36) and SLC38 families. Here, we analyse the functions of SLC36 family member, SLC36A4, otherwise known as proton-assisted amino-acid transporter 4 (PAT4), in colorectal cancer. We show that independent of other major pathological factors, high PAT4 expression is associated with reduced relapse-free survival after colorectal cancer surgery. Consistent with this, PAT4 promotes HCT116 human colorectal cancer cell proliferation in culture and tumour growth in xenograft models. Inducible knockdown in HCT116 cells reveals that PAT4 regulates a form of mTORC1 with two distinct properties: first, it preferentially targets eukaryotic translation initiation factor 4E-binding protein 1 (4E-BP1), and second, it is resistant to rapamycin treatment. Furthermore, in HCT116 cells two non-essential amino acids, glutamine and serine, which are often rapidly metabolised by tumour cells, regulate rapamycin-resistant mTORC1 in a PAT4-dependent manner. Overexpressed PAT4 is also able to promote rapamycin resistance in human embryonic kidney-293 cells. PAT4 is predominantly associated with the Golgi apparatus in a range of cell types, and in situ proximity ligation analysis shows that PAT4 interacts with both mTORC1 and its regulator Rab1A on the Golgi. These findings, together with other studies, suggest that differentially localised intracellular amino-acid transporters contribute to the activation of alternate forms of mTORC1. Furthermore, our data predict that colorectal cancer cells with high PAT4 expression will be more resistant to depletion of serine and glutamine, allowing them to survive and outgrow neighbouring normal and tumorigenic cells, and potentially providing a new route for pharmacological intervention
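    The survival claim above (reduced relapse-free survival with high PAT4 expression) is typically visualised with Kaplan-Meier curves before any multivariable adjustment. A minimal univariable sketch of that estimator (group labels, times and event flags are hypothetical placeholders, not the study's data; it does not perform the adjustment for other pathological factors reported in the paper):

        # Kaplan-Meier relapse-free survival for hypothetical high- vs low-PAT4 groups.
        # Times are months to relapse or censoring; event = 1 means relapse observed.
        def kaplan_meier(times, events):
            data = sorted(zip(times, events))
            at_risk, survival, curve = len(data), 1.0, []
            for t, event in data:
                if event:
                    survival *= (at_risk - 1) / at_risk   # step down at each relapse
                    curve.append((t, round(survival, 3)))
                at_risk -= 1                              # censored cases leave the risk set
            return curve

        high_pat4 = kaplan_meier([8, 12, 22, 36, 60], [1, 1, 1, 1, 0])
        low_pat4  = kaplan_meier([30, 45, 50, 62, 70], [0, 0, 0, 1, 0])
        print("high PAT4:", high_pat4)   # survival drops faster in the high-PAT4 group
        print("low  PAT4:", low_pat4)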