
    Comparison of the tuberculin skin test and the QuantiFERON-TB Gold test in detecting latent tuberculosis in health care workers in Iran

    OBJECTIVES: The tuberculin skin test (TST) and the QuantiFERON-TB Gold test (QFT) are used to identify latent tuberculosis infections (LTBIs). The aim of this study was to determine the agreement between these two tests among health care workers in Iran. METHODS: This cross-sectional study included 177 tuberculosis (TB) laboratory staff and 67 non-TB staff. TST indurations of 10 mm or more were considered positive. Student's t-test and the chi-square test were used to compare the mean score and proportion of variables between the TB laboratory staff and the non-TB laboratory staff. Kappa statistics were used to evaluate the agreement between the tests, and logistic regression was used to assess the risk factors associated with positive results for each test. RESULTS: The prevalence of LTBIs according to the QFT and the TST was 17% (95% confidence interval [CI], 12% to 21%) and 16% (95% CI, 11% to 21%), respectively. The agreement between the QFT and the TST was 77.46%, with a kappa of 0.19 (95% CI, 0.04 to 0.34). CONCLUSIONS: Although the prevalence of LTBIs based on the QFT and the TST was not significantly different, the kappa statistic between these two tests for the detection of LTBIs was low. © 2016, Korean Society of Epidemiology.
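    As a quick sanity check of the figures above, Cohen's kappa can be computed directly from a 2x2 cross-classification of the two tests. The counts below are illustrative reconstructions chosen to be consistent with the reported marginals (roughly 16-17% positivity and 77.46% agreement among 244 subjects), not the study's raw data:

```python
def cohens_kappa(table):
    """Cohen's kappa for a 2x2 agreement table.

    table[i][j] = number of subjects with test A result i and
    test B result j (0 = negative, 1 = positive).
    """
    n = sum(sum(row) for row in table)
    # Observed agreement: fraction of subjects on the main diagonal
    observed = (table[0][0] + table[1][1]) / n
    # Marginal proportions for each test
    row_marg = [sum(row) / n for row in table]
    col_marg = [sum(table[i][j] for i in range(2)) / n for j in range(2)]
    # Agreement expected by chance from the marginals
    expected = sum(row_marg[k] * col_marg[k] for k in range(2))
    return observed, (observed - expected) / (1 - expected)

# Hypothetical counts (rows = TST neg/pos, columns = QFT neg/pos)
table = [[175, 28], [27, 14]]
obs, kappa = cohens_kappa(table)
print(f"agreement {obs:.2%}, kappa {kappa:.2f}")
```

    With these invented counts, observed agreement is about 77.5% while kappa is only about 0.2, illustrating how two tests can agree often yet barely exceed chance-level agreement when positivity is low.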

    The epidemiology of tuberculosis in the Iranian population in 2016

    Background: Tuberculosis (TB) is the 10th most common infectious disease in the world, and it remains one of the major health problems in Iran despite the implementation of the National Tuberculosis Control Program. Methods: Tuberculosis data for Iran in 2016 were obtained from the Ministry of Health and Treatment. Mortality and incidence were determined by age group, sex, and province. Data were analyzed using Excel (2010) and SPSS software. Findings: The overall incidence rate of tuberculosis in the country was 9.7 per 100,000 population (10.7 in men and 8.7 in women). The highest incidence and mortality rates in both sexes occurred in those over 80 years of age. The death rate was 1 per 100,000 population, and 57.7% of the deceased were men. Most deaths were from pulmonary tuberculosis. The highest incidence and mortality rates were reported in Golestan and Sistan and Baluchestan provinces. Conclusion: The incidence and mortality rates of TB have decreased in recent decades in Iran. Nevertheless, considering the rising trend of human immunodeficiency virus (HIV) infection among patients with TB and the country's proximity to endemic neighbors, TB should be regarded as one of the most important priorities of the national health system. © 2020 Isfahan University of Medical Sciences (IUMS). All rights reserved.
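    The crude incidence figure quoted above is simple arithmetic: new cases divided by population, scaled to 100,000. A minimal sketch, assuming a 2016 population of roughly 80 million (an assumption for illustration; the abstract does not state the denominator):

```python
def rate_per_100k(cases, population):
    """Crude rate per 100,000 population."""
    return cases / population * 100_000

# About 7,760 new cases in a population of 80 million would
# reproduce the reported overall incidence of 9.7 per 100,000.
print(rate_per_100k(7_760, 80_000_000))
```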

    Diagnosis of latent tuberculosis infection among pediatric household contacts of Iranian tuberculosis cases using tuberculin skin test, IFN-γ release assay and IFN-γ-induced protein-10

    Background: Although the World Health Organization has recommended the diagnosis and prophylactic treatment of latent tuberculosis infection (LTBI) in child household contacts of tuberculosis (TB) cases, national programs in high-burden TB regions rarely implement adequate screening of this high-risk group, mainly because of resource limitations. We aimed to evaluate the prevalence of LTBI among pediatric household contacts of TB cases in two high-burden provinces in Iran. Methods: We conducted a cohort study in children who had been in household contact with an index TB case. All subjects were assessed for active TB disease. For LTBI diagnosis, the tuberculin skin test (TST) and QuantiFERON®-TB Gold Plus (QFT-Plus) were performed at the time of the index TB case diagnosis, as well as at 3, 12, and 18 months if the first results were negative. In addition, interferon-γ-induced protein-10 (IP-10) concentrations were measured for all participants. Results: A total of 230 children who had contact with an index TB case were enrolled. Three contacts were diagnosed with active TB. According to the TST/QFT-Plus results, 104 (45.2%) children were identified with LTBI during the study. Significantly increased IP-10 levels were found in LTBI patients compared to healthy contacts. Accordingly, more than 50% of LTBI contacts and about 10% of healthy contacts were considered IP-10-positive. Conclusion: This study alarmingly illustrates a high prevalence of LTBI among Iranian children exposed to TB cases. We therefore emphasize that children living in close contact with an infectious TB case should be screened effectively and receive prophylactic therapy. © 2021, The Author(s).
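    The 45.2% prevalence estimate (104 of 230 contacts) can be paired with a normal-approximation (Wald) confidence interval, a common quick check for proportions of this size. The abstract itself does not report a CI, so the interval below is purely illustrative:

```python
import math

def prevalence_ci(positives, n, z=1.96):
    """Point prevalence with a normal-approximation (Wald) 95% CI."""
    p = positives / n
    se = math.sqrt(p * (1 - p) / n)          # standard error of a proportion
    return p, p - z * se, p + z * se

# 104 of 230 child contacts classified as LTBI by TST/QFT-Plus
p, lo, hi = prevalence_ci(104, 230)
print(f"{p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

    For proportions near 0 or 1, a Wilson interval behaves better than this Wald form; here the proportion is close to 0.5, where the two agree closely.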

    Prediction of response to treatment in children with epilepsy

    Abstract Objective: This study was conducted to predict the response to treatment in patients treated with anti-epileptic drugs. Material and Methods: This analytical questionnaire-based study was conducted in 2014 among 128 patients with epilepsy admitted to Mofid Children's Hospital, Tehran, Iran. The inclusion criterion was children 2 months to 12 yr of age with epilepsy; patients who had experienced fever with seizure attacks at least once were excluded from the study. Patients were followed up for 6 months and their response to treatment was recorded. A good response to treatment was defined as the absence of seizures with two drugs during follow-up. Results: Seventy-two patients (56.3%) were boys. The age at first seizure was under 2 yr in 90 patients (70.3%). A history of febrile convulsion, a family history of epilepsy, and a history of asphyxia were found in 16 (12.5%), 41 (32%), and 27 (21.1%) patients, respectively. Seizure etiology was idiopathic in 90 patients (70.3%), and the number of seizures was 1-2 in 36 patients (28.1%). Overall, 57 patients (44.5%) had a cerebral lesion on CT scan or MRI, and EEG was abnormal in 101 patients (78.9%). At 6-month follow-up, 40 patients (31.3%) had responded well to treatment and 88 patients (68.8%) had responded poorly. History of asphyxia (OR = 6.82), neonatal jaundice (OR = 2.81), and abnormal EEG (OR = 0.19) were factors affecting response to treatment. Conclusion: Abnormal EEG is an effective predictor of treatment response in the children studied. Key Words: Pediatric, Anti-seizure drug, Response to treatment, Children, Epilepsy.
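    Odds ratios like those reported above (OR = 6.82 for asphyxia, etc.) come from logistic regression; a crude, unadjusted odds ratio from a 2x2 exposure/outcome table shows the underlying arithmetic. The counts below are invented for illustration and are not taken from the study:

```python
def odds_ratio(exposed_event, exposed_no_event,
               unexposed_event, unexposed_no_event):
    """Crude odds ratio from a 2x2 exposure/outcome table:
    (odds of the outcome in exposed) / (odds in unexposed)."""
    odds_exposed = exposed_event / exposed_no_event
    odds_unexposed = unexposed_event / unexposed_no_event
    return odds_exposed / odds_unexposed

# Hypothetical: 20/10 poor responders among exposed (odds 2.0),
# 30/60 among unexposed (odds 0.5) -> OR = 4.0
print(odds_ratio(20, 10, 30, 60))
```

    An adjusted OR from logistic regression, as in the study, additionally controls for the other covariates, so it generally differs from the crude ratio.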

    A review on development and application of plant-based bioflocculants and grafted bioflocculants

    Flocculation is extensively employed for clarification through sedimentation. The application of eco-friendly plant-based bioflocculants in wastewater treatment has attracted significant attention lately, given their high removal capability in terms of solids, turbidity, color, and dye. However, moderate flocculating properties and a short shelf life restrict their development. To enhance flocculating ability, natural polysaccharides derived from plants are chemically modified by grafting synthetic, nonbiodegradable monomers (e.g., acrylamide) onto their backbone to produce grafted bioflocculants. This review aims to provide, for the first time, an overview of the development and flocculating efficiencies of plant-based bioflocculants and grafted bioflocculants. Furthermore, the processing methods, flocculation mechanism, and current challenges are discussed. All reported studies of plant-derived bioflocculants in wastewater treatment have been conducted under lab-scale conditions. Hence, the possibility of applying natural bioflocculants in the food and beverage, mineral, paper and pulp, and oleo-chemical and biodiesel industries is discussed and evaluated.

    Fear expression is suppressed by tyrosine administration

    Animal studies have demonstrated that catecholamines regulate several aspects of fear conditioning. In humans, however, pharmacological manipulations of the catecholaminergic system have been scarce, and their primary focus has been on interfering with catecholaminergic activity after fear acquisition or expression had taken place, using L-Dopa, primarily, as the catecholaminergic precursor. Here, we sought to determine whether putative increases in presynaptic dopamine and norepinephrine induced by tyrosine administered before conditioning could affect fear expression. Electrodermal activity (EDA) of 46 healthy participants (24 placebo, 22 tyrosine) was measured in an instructed fear task. Results showed that tyrosine abolished fear expression compared to placebo. Importantly, tyrosine did not affect EDA responses to the aversive stimulus (UCS) or alter participants' mood. Therefore, the effect of tyrosine on fear expression cannot be attributed to these factors. Taken together, these findings provide evidence that the catecholaminergic system influences fear expression in humans.

    Intraperitoneal drain placement and outcomes after elective colorectal surgery: international matched, prospective, cohort study

    Despite current guidelines, intraperitoneal drain placement after elective colorectal surgery remains widespread. Drains were not associated with earlier detection of intraperitoneal collections, but were associated with prolonged hospital stay and an increased risk of surgical-site infections. Background: Many surgeons routinely place intraperitoneal drains after elective colorectal surgery. However, enhanced recovery after surgery guidelines recommend against their routine use owing to a lack of clear clinical benefit. This study aimed to describe international variation in intraperitoneal drain placement and the safety of this practice. Methods: COMPASS (COMPlicAted intra-abdominal collectionS after colorectal Surgery) was a prospective, international, cohort study that enrolled consecutive adults undergoing elective colorectal surgery (February to March 2020). The primary outcome was the rate of intraperitoneal drain placement. Secondary outcomes included: rate of and time to diagnosis of postoperative intraperitoneal collections; rate of surgical-site infections (SSIs); time to discharge; and 30-day major postoperative complications (Clavien-Dindo grade III or higher). After propensity score matching, multivariable logistic regression and Cox proportional hazards regression were used to estimate the independent association of the secondary outcomes with drain placement. Results: Overall, 1805 patients from 22 countries were included (798 women, 44.2 per cent; median age 67.0 years). The drain insertion rate was 51.9 per cent (937 patients). After matching, drains were not associated with reduced rates (odds ratio (OR) 1.33, 95 per cent c.i. 0.79 to 2.23; P = 0.287) or earlier detection (hazard ratio (HR) 0.87, 0.33 to 2.31; P = 0.780) of collections. Although not associated with worse major postoperative complications (OR 1.09, 0.68 to 1.75; P = 0.709), drains were associated with delayed hospital discharge (HR 0.58, 0.52 to 0.66; P < 0.001) and an increased risk of SSIs (OR 2.47, 1.50 to 4.05; P < 0.001). Conclusion: Intraperitoneal drain placement after elective colorectal surgery is not associated with earlier detection of postoperative collections, but prolongs hospital stay and increases SSI risk.
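    The propensity-matching step used in analyses like this (pairing drained with undrained patients of similar estimated propensity score before comparing outcomes) can be sketched as a greedy 1:1 nearest-neighbour match with a caliper. This is a simplified illustration with invented scores and a hypothetical caliper, not the COMPASS analysis itself:

```python
def greedy_match(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.

    treated, control: lists of (unit_id, propensity_score) tuples.
    Returns matched (treated_id, control_id) pairs whose score
    difference is within the caliper; each control is used once.
    """
    available = dict(control)                      # control id -> score
    pairs = []
    for t_id, t_score in sorted(treated, key=lambda x: x[1]):
        if not available:
            break
        # Closest remaining control by absolute score difference
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]                    # match without replacement
    return pairs

treated = [("t1", 0.62), ("t2", 0.35)]
control = [("c1", 0.60), ("c2", 0.33), ("c3", 0.90)]
print(greedy_match(treated, control))
```

    In practice the scores come from a logistic model of drain placement on baseline covariates, and optimal (rather than greedy) matching is often preferred; the caliper value here is an arbitrary assumption.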