
    A community based study to determine incidence of cervical cancer and willingness of women to participate in cervical cancer screening program in Navsari, Gujarat, India

    Background: Carcinoma of the uterine cervix is a major health problem faced by Indian women. Regular cervical cytological examination of all sexually active women can prevent the occurrence of carcinoma cervix, and early detection of cervical cancer is possible with Pap smear tests. Methods: Women above 25 years of age who lived in the study area and were willing to participate were included. A total of 2352 women were enrolled in the study. House-to-house visits were conducted in all the village areas using a simple random sampling method, and information about cervical cancer was given. Pap tests for cervical cancer screening were carried out by a gynaecologist; cytological examination and confirmation were done by pathologists. Results: A total of 3001 women attended village-level IEC sessions, and of these, 2352 (78.4%) took part in the screening program. Of these 2352 women, 2007 (85.3% compliance) gave consent for physical cervical examination and a Pap smear. The incidence of cervical cancer was 0.2% on the basis of clinical examination and biopsy. Conclusions: The high compliance with vaginal examination and Pap testing reflects positive health-seeking behaviour among the women, but strong IEC and sensitization about the disease are needed to improve participation. The sensitivity of the Pap test was poor, and it failed to identify true positive cases.

    Self-reported tobacco smoking practices among medical students and their perceptions towards training about tobacco smoking in medical curricula: A cross-sectional, questionnaire survey in Malaysia, India, Pakistan, Nepal, and Bangladesh

    Background: Tobacco smoking issues in developing countries are usually taught non-systematically, as and when the topic arises. The World Health Organisation and the Global Health Professional Student Survey (GHPSS) have suggested introducing a separate, integrated tobacco module into medical school curricula. Our aim was to assess medical students' tobacco smoking habits, their practices towards patients' smoking habits, and their attitudes towards teaching about smoking in medical schools. Methods: A cross-sectional questionnaire survey was carried out among final-year undergraduate medical students in Malaysia, India, Nepal, Pakistan, and Bangladesh. The anonymous, self-administered questionnaire included items on demographic information, students' current practices regarding patients' tobacco smoking habits, and their perceptions of tobacco education in medical schools on a five-point Likert scale. Questions about tobacco smoking habits were adapted from the GHPSS questionnaire. An 'ever smoker' was defined as one who had smoked during their lifetime, even if they had only tried a few puffs once or twice. A 'current smoker' was defined as one who had smoked a tobacco product on one or more days in the month preceding the survey. Descriptive statistics were calculated. Results: The overall response rate was 81.6% (922/1130). The median age was 22 years; 50.7% were male and 48.2% were female. The overall prevalence of 'ever smokers' and 'current smokers' was 31.7% and 13.1%, respectively. A majority (>80%) of students asked patients about their smoking habits during clinical postings/clerkships, but only a third of them counselled patients and assessed their willingness to quit. A majority of the students agreed on doctors' roles in tobacco control: acting as role models, being competent in smoking cessation methods and counselling, and the need for training in tobacco cessation in medical schools. About 50% agreed that the current curriculum teaches about tobacco smoking, but not systematically, and that it should be included as a separate module. A majority of the students indicated that topics on health effects, nicotine addiction and its treatment, counselling, and prevention of relapse were important or very important in training about tobacco smoking. Conclusion: Medical educators should consider revising medical curricula to improve training in tobacco smoking cessation in medical schools. Our results should be supported by surveys from other medical schools in developing countries of Asia.

    Changes in the gastric enteric nervous system and muscle: A case report on two patients with diabetic gastroparesis

    Background: The pathophysiological basis of diabetic gastroparesis is poorly understood, in large part due to the almost complete lack of data on neuropathological and molecular changes in the stomachs of patients. Experimental models indicate various lesions affecting the vagus, muscle, enteric neurons, interstitial cells of Cajal (ICC), or other cellular components. The aim of this study was to use modern analytical methods to determine morphological and molecular changes in the gastric wall in patients with diabetic gastroparesis. Methods: Full-thickness gastric biopsies were obtained laparoscopically from two gastroparetic patients undergoing surgical intervention and from disease-free areas of control subjects undergoing other forms of gastric surgery. Samples were processed for histological and immunohistochemical examination. Results: Although both patients had severe refractory symptoms with malnutrition, requiring the placement of a gastric stimulator, one of them had no significant abnormalities as compared with controls. This patient had an abrupt onset of symptoms and a relatively short duration of diabetes that was well controlled. By contrast, the other patient had long-standing, brittle, and poorly controlled diabetes with numerous episodes of diabetic ketoacidosis and frequent hypoglycemic episodes. Histological examination in this patient revealed increased fibrosis in the muscle layers as well as significantly fewer nerve fibers and myenteric neurons as assessed by PGP9.5 staining. Further, significant reductions were seen in staining for neuronal nitric oxide synthase, heme oxygenase-2, and tyrosine hydroxylase, as well as for c-KIT. Conclusion: We conclude that poor metabolic control is associated with significant pathological changes in the gastric wall that affect all major components, including muscle, neurons, and ICC. However, severe symptoms can occur in the absence of these changes and may reflect vagal, central, or hormonal influences. Gastroparesis is therefore likely to be a heterogeneous disorder. Careful molecular and pathological analysis may allow more precise phenotypic differentiation, shed insight into the underlying mechanisms, and identify novel therapeutic targets.

    Basic considerations in the dermatokinetics of topical formulations

    Assessing the bioavailability of drug molecules at the site of action provides better insight into the efficiency of a dosage form. However, determining drug concentrations in the skin layers following topical application of dermatological formulations is a great challenge, and the protocols followed for oral formulations cannot be applied to topical dosage forms. The regulatory agencies are considering several possible approaches, such as tape stripping and microdialysis. On the other hand, skin bioavailability assessment of xenobiotics is equally important for topical formulations in order to evaluate toxicity: drug molecules applied on the skin surface may transport through the skin and reach the systemic circulation. Thus, real-time measurement of molecules in the skin layers has become obligatory. In the last two decades, quite a few investigations have been carried out to assess the skin bioavailability and toxicity of topical/dermatological products. This review provides a current understanding of the basics of dermatokinetics, drug depot formation, skin metabolism, and clearance of drug molecules from the skin layers following application of topical formulations.

    Global burden and strength of evidence for 88 risk factors in 204 countries and 811 subnational locations, 1990–2021: a systematic analysis for the Global Burden of Disease Study 2021

    Background: Understanding the health consequences associated with exposure to risk factors is necessary to inform public health policy and practice. To systematically quantify the contributions of risk factor exposures to specific health outcomes, the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2021 aims to provide comprehensive estimates of exposure levels, relative health risks, and attributable burden of disease for 88 risk factors in 204 countries and territories and 811 subnational locations, from 1990 to 2021. Methods: The GBD 2021 risk factor analysis used data from 54 561 total distinct sources to produce epidemiological estimates for 88 risk factors and their associated health outcomes for a total of 631 risk–outcome pairs. Pairs were included on the basis of data-driven determination of a risk–outcome association. Age-sex-location-year-specific estimates were generated at global, regional, and national levels. Our approach followed the comparative risk assessment framework predicated on a causal web of hierarchically organised, potentially combinative, modifiable risks. Relative risks (RRs) of a given outcome occurring as a function of risk factor exposure were estimated separately for each risk–outcome pair, and summary exposure values (SEVs), representing risk-weighted exposure prevalence, and theoretical minimum risk exposure levels (TMRELs) were estimated for each risk factor. These estimates were used to calculate the population attributable fraction (PAF; ie, the proportional change in health risk that would occur if exposure to a risk factor were reduced to the TMREL). The product of PAFs and disease burden associated with a given outcome, measured in disability-adjusted life-years (DALYs), yielded measures of attributable burden (ie, the proportion of total disease burden attributable to a particular risk factor or combination of risk factors). 
Adjustments for mediation were applied to account for relationships involving risk factors that act indirectly on outcomes via intermediate risks. Attributable burden estimates were stratified by Socio-demographic Index (SDI) quintile and presented as counts, age-standardised rates, and rankings. To complement estimates of RR and attributable burden, newly developed burden of proof risk function (BPRF) methods were applied to yield supplementary, conservative interpretations of risk–outcome associations based on the consistency of underlying evidence, accounting for unexplained heterogeneity between input data from different studies. Estimates reported represent the mean value across 500 draws from the estimate's distribution, with 95% uncertainty intervals (UIs) calculated as the 2·5th and 97·5th percentile values across the draws. Findings: Among the specific risk factors analysed for this study, particulate matter air pollution was the leading contributor to the global disease burden in 2021, contributing 8·0% (95% UI 6·7–9·4) of total DALYs, followed by high systolic blood pressure (SBP; 7·8% [6·4–9·2]), smoking (5·7% [4·7–6·8]), low birthweight and short gestation (5·6% [4·8–6·3]), and high fasting plasma glucose (FPG; 5·4% [4·8–6·0]). For younger demographics (ie, those aged 0–4 years and 5–14 years), risks such as low birthweight and short gestation and unsafe water, sanitation, and handwashing (WaSH) were among the leading risk factors, while for older age groups, metabolic risks such as high SBP, high body-mass index (BMI), high FPG, and high LDL cholesterol had a greater impact. 
From 2000 to 2021, there was an observable shift in global health challenges, marked by a decline in the number of all-age DALYs broadly attributable to behavioural risks (decrease of 20·7% [13·9–27·7]) and environmental and occupational risks (decrease of 22·0% [15·5–28·8]), coupled with a 49·4% (42·3–56·9) increase in DALYs attributable to metabolic risks, all reflecting ageing populations and changing lifestyles on a global scale. Age-standardised global DALY rates attributable to high BMI and high FPG rose considerably (15·7% [9·9–21·7] for high BMI and 7·9% [3·3–12·9] for high FPG) over this period, with exposure to these risks increasing annually at rates of 1·8% (1·6–1·9) for high BMI and 1·3% (1·1–1·5) for high FPG. By contrast, the global risk-attributable burden and exposure to many other risk factors declined, notably for risks such as child growth failure and unsafe water source, with age-standardised attributable DALYs decreasing by 71·5% (64·4–78·8) for child growth failure and 66·3% (60·2–72·0) for unsafe water source. We separated risk factors into three groups according to trajectory over time: those with a decreasing attributable burden, due largely to declining risk exposure (eg, diet high in trans-fat and household air pollution) but also to proportionally smaller child and youth populations (eg, child and maternal malnutrition); those for which the burden increased moderately in spite of declining risk exposure, due largely to population ageing (eg, smoking); and those for which the burden increased considerably due to both increasing risk exposure and population ageing (eg, ambient particulate matter air pollution, high BMI, high FPG, and high SBP). Interpretation: Substantial progress has been made in reducing the global disease burden attributable to a range of risk factors, particularly those related to maternal and child health, WaSH, and household air pollution. 
    Maintaining efforts to minimise the impact of these risk factors, especially in low SDI locations, is necessary to sustain progress. Successes in moderating the smoking-related burden by reducing risk exposure highlight the need to advance policies that reduce exposure to other leading risk factors such as ambient particulate matter air pollution and high SBP. Troubling increases in high FPG, high BMI, and other risk factors related to obesity and metabolic syndrome indicate an urgent need to identify and implement interventions.
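    The comparative risk assessment steps described in the Methods, converting a relative risk and an exposure prevalence into a population attributable fraction (PAF), scaling the outcome burden, and summarising uncertainty across draws, can be sketched numerically. The snippet below is an illustrative sketch only, using Levin's formula for a single binary risk factor and made-up numbers; it is not the GBD estimation code, which handles continuous exposures, TMRELs, and mediation adjustments.

    ```python
    import numpy as np

    def paf(prevalence, rr):
        """Population attributable fraction via Levin's formula for a
        binary exposure: the proportional drop in burden if exposure
        fell to the counterfactual level (here, no exposure)."""
        excess = prevalence * (rr - 1.0)
        return excess / (excess + 1.0)

    def attributable_dalys(prevalence, rr, total_dalys):
        # Attributable burden = PAF x total DALYs for the outcome.
        return paf(prevalence, rr) * total_dalys

    # Propagate uncertainty the way the abstract describes: report the
    # mean across 500 draws, with a 95% UI taken as the 2.5th and
    # 97.5th percentiles of the draw distribution.
    rng = np.random.default_rng(42)
    rr_draws = rng.normal(1.5, 0.1, size=500)   # hypothetical RR draws
    paf_draws = paf(0.30, rr_draws)             # hypothetical 30% exposure
    mean_paf = paf_draws.mean()
    ui_low, ui_high = np.percentile(paf_draws, [2.5, 97.5])
    ```

    With a 30% exposure prevalence and RR of 1.5, the point-estimate PAF is 0.15/1.15, roughly 13% of the outcome's DALYs attributable to the risk factor under these assumed inputs.
    
    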

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes of patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain, measured on a numerical analogue scale from 0 to 100%, and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10% (i.q.r. 1-30%) of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468), compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively.
Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570

    Efficacy of single-dose rasburicase in the management of tumor lysis syndrome: a case series from a regional cancer center in western India

    Background: Tumor lysis syndrome (TLS) is an oncological emergency. Rasburicase (recombinant urate oxidase) has been proven to be an effective therapy for prevention of TLS and its serious consequences in patients with hematological malignancies such as acute leukemias with high white blood cell counts, Burkitt lymphoma, and lymphoblastic lymphoma with high tumor burden. The US Food and Drug Administration-recommended daily dosing regimen for 5 days is unaffordable for many patients in developing countries such as India. Recently conducted studies have clearly shown similar efficacy for a single dose of rasburicase. Herein, we report a case series of 15 patients, including children and adults with hematologic malignancies, in whom TLS was managed with a single dose of rasburicase. Materials and Methods: We retrospectively analyzed the efficacy of single-dose rasburicase (SDR) (0.15 mg/kg intravenous infusion over 30 min) in patients with hematologic malignancies at risk for TLS. The drug was administered to five adult and 10 pediatric patients admitted to the Gujarat Cancer and Research Institute between January 2013 and December 2014. Results: The study included 15 patients, of whom 10 were pediatric (8 male : 2 female) and five were adults (5 male : 0 female). Patients with hematologic malignancies, Eastern Cooperative Oncology Group performance status 0-2, and at high or potential risk for TLS were selected. The median ages in the pediatric and adult groups were 7.7 years and 32 years, respectively. The presence of hyperuricemia (plasma uric acid (UA) level ≥7.5 mg/dl) or a diagnosis of very aggressive lymphoma or leukemia based on the World Health Organization classification of hematopoietic and lymphoid neoplasms was classified as high risk. Rasburicase was administered as a single dose of 0.15 mg/kg intravenously over 30 min. Patients were evaluated by clinical examination and blood biochemical tests at frequent intervals. Plasma samples for UA were collected at baseline before rasburicase, 6-24 h post-rasburicase, 48 h post-rasburicase, and daily during treatment. Blood samples for UA during the course of treatment were collected in prechilled tubes containing heparin and immediately immersed in and transported on ice. The blood samples were analyzed within 4 h of collection. Serum electrolytes, blood urea nitrogen, creatinine, calcium, and phosphorus were monitored daily during this period. A single dose of rasburicase produced a rapid and sustained lowering of plasma UA levels in all 15 patients. Renal parameters normalized within 72 h. UA levels remained below 4 mg/dl throughout the administration of chemotherapy until discharge, and none of the patients required repeat dosing of rasburicase. Conclusion: SDR is a highly economical and clinically effective way of managing patients with TLS and could serve as an alternative to the 5-day treatment in a resource-limited country such as India.