
    Baseline anti-NS4a antibodies in combination with on-treatment quantitative HCV-RNA reliably identifies nonresponders to pegylated interferon-ribavirin combination therapy after 4 weeks of treatment

    Background Early detection of nonresponders to hepatitis C therapy limits unnecessary exposure to treatment and its side-effects. A recent algorithm combining baseline anti-NS4a antibodies and on-treatment quantitative PCR identified nonresponders to a combination of interferon and ribavirin after 1 week of treatment. Aim To validate a stopping rule based on baseline anti-NS4a antibody levels and early on-treatment virological response in treatment-naive genotype 1 chronic hepatitis C patients treated with the current standard pegylated interferon and ribavirin combination therapy. Methods Eighty-nine genotype 1 patients from the Dynamically Individualized Treatment of hepatitis C Infection and Correlates of Viral/Host dynamics Study treated for 48 weeks with standard 180 μg pegylated interferon (PEG-IFN)-alpha-2a (weekly) and ribavirin 1000-1200 mg (daily) were analysed. Baseline anti-NS4a antibody enzyme-linked immunosorbent assay (NS4a AA 1687-1718) was performed on pretreatment serum. Hepatitis C virus-RNA was assessed at days 0, 1, 4, 7, 8, 15, 22 and 29, at weeks 6, 7, 8, 10 and 12, and every 6 weeks thereafter until end of treatment. Multiple logistic regression analysis was performed. Results Overall, 54 of 89 (61%) patients achieved sustained virological response. A baseline anti-NS4a antibody titre less than 1/1250 correlated with absence of a favourable initial viral decline according to variable response types (P=0.015). The optimal algorithm was developed using the combination of the absence of anti-NS4a Ab (titre <1/1250) and a hepatitis C virus-RNA above 100 000 IU/ml at week 4. This algorithm has a specificity of 43% and a negative predictive value of 100% to detect nonresponse to standard PEG-IFN-alpha-2a and ribavirin therapy at week 4 of therapy (intention-to-treat analysis). Conclusion The decision to stop therapy in genotype 1 chronic hepatitis C patients treated with PEG-IFN-alpha-2a and ribavirin can be confidently made after 4 weeks of treatment, based on the absence of baseline anti-NS4a Ab and a week-4 hepatitis C virus-RNA above 100 000 IU/ml. Eur J Gastroenterol Hepatol 22:1443-1448 © 2010 Wolters Kluwer Health | Lippincott Williams & Wilkins
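    The specificity and negative predictive value quoted for the stopping rule are the standard 2x2 diagnostic quantities; they are not defined in the abstract, so the usual definitions are restated here for orientation (how "positive" is labelled depends on whether response or nonresponse is taken as the target condition):

        \mathrm{Sens}=\frac{TP}{TP+FN},\qquad \mathrm{Spec}=\frac{TN}{TN+FP},\qquad \mathrm{PPV}=\frac{TP}{TP+FP},\qquad \mathrm{NPV}=\frac{TN}{TN+FN}

    A negative predictive value of 100% implies zero false negatives in the study sample; for this stopping rule it means that no patient flagged at week 4 as destined not to respond went on to achieve sustained virological response, which is the property that justifies stopping treatment early.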

    Severity scoring of manganese health effects for categorical regression

    Characterizing the U-shaped exposure-response relationship for manganese (Mn) is necessary for estimating the risk of adverse health effects from Mn toxicity due to excess or deficiency. Categorical regression has emerged as a powerful tool for exposure-response analysis because of its ability to synthesize relevant information across multiple studies and species into a single integrated analysis of all relevant data. This paper documents the development of a database on Mn toxicity designed to support the application of categorical regression techniques. Specifically, we describe (i) the conduct of a systematic search of the literature on Mn toxicity to gather data appropriate for dose-response assessment; (ii) the establishment of inclusion/exclusion criteria for data to be included in the categorical regression modeling database; (iii) the development of a categorical severity scoring matrix for Mn health effects to permit the inclusion of diverse health outcomes in a single categorical regression analysis using the severity score as the outcome variable; and (iv) the convening of an international expert panel to both review the severity scoring matrix and assign severity scores to health outcomes observed in studies (including case reports, epidemiological investigations, and in vivo experimental studies) selected for inclusion in the categorical regression database. Exposure information including route, concentration, duration, health endpoint(s), and characteristics of the exposed population was abstracted from included studies and stored in a computerized manganese database (MnDB), providing a comprehensive repository of exposure-response information with the ability to support categorical regression modeling of oral exposure data.
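    For readers unfamiliar with categorical regression, severity scores of the kind described above are typically analysed with a cumulative-odds model in which the probability of observing a severity of at least s is related to exposure concentration and duration. The form below is a generic sketch of that approach, not necessarily the exact specification used with the MnDB; the actual analysis may include route, species and population covariates:

        P(S \ge s \mid C, T) = \frac{1}{1 + \exp\!\left[-\left(\alpha_s + \beta_1 \log C + \beta_2 \log T\right)\right]}, \qquad s = 1, \dots, S_{\max}

    Here S is the severity score assigned by the expert panel, C the exposure concentration, T the exposure duration, and the α_s are severity-specific intercepts; fitting one such model across all included studies is what allows diverse health outcomes to be combined in a single exposure-response analysis.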

    Ambient temperature as a trigger of preterm delivery in a temperate climate.

    BACKGROUND: Recent evidence suggests that elevated ambient temperatures may trigger preterm delivery. Since results from studies in temperate climates are inconclusive, we investigated the association between temperature and the risk of preterm birth in Flanders (Belgium). METHODS: We used data on 807 835 singleton deliveries (January 1998-July 2011). We combined a quasi-Poisson model with distributed lag non-linear models to allow for delayed and non-linear temperature effects, accounting for the daily pregnancies at risk and their gestational age distribution. RESULTS: For moderate heat (95th vs 50th centile) up to 1 day before delivery (lag 0-1), the risk of preterm birth increased by 8.5% (95% CI 2.4% to 15.0%) when minimum temperature increased from 8.3°C to 16.3°C and by 9.6% (95% CI 1.1% to 18.7%) when maximum temperature increased from 14.7°C to 26.5°C. Corresponding estimates for extreme heat (99th vs 50th centile) were 15.6% (95% CI 4.8% to 27.6%) for minimum temperature (19.0°C vs 8.3°C) and 14.5% (95% CI 0.5% to 30.6%) for maximum temperature (30.7°C vs 14.7°C). Despite the increased risk of preterm birth associated with cold at lag 2 (and lag 1 for minimum temperature), cumulative cold effects were small. The per cent change in preterm birth associated with moderate cold (5th vs 50th centile) up to 3 days before delivery (lag 0-3) was 2.1% (95% CI -4.1% to 8.7%) for minimum temperature (-2.0°C vs 8.3°C) and 0.6% (95% CI -7.3% to 9.2%) for maximum temperature (2.5°C vs 14.7°C). CONCLUSIONS: Even in a temperate climate, ambient temperature may trigger preterm delivery, suggesting that pregnant women should avoid temperature extremes.
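    The modelling approach summarised in the Methods (a quasi-Poisson count model combined with a distributed lag non-linear model) can be written schematically as below. This is a generic formulation of that class of model; the way the pregnancies at risk enter (shown here, for example, as an offset) and the exact confounder terms are assumptions, since the abstract does not list them:

        \log E[Y_t] = \log N_t + \alpha + cb(x_t, \ell) + \text{seasonality and trend terms}, \qquad \operatorname{Var}(Y_t) = \phi\, E[Y_t]

    Here Y_t is the number of preterm births on day t, N_t the number of ongoing pregnancies at risk, cb(x_t, ℓ) the cross-basis capturing the non-linear and lagged effect of temperature x over lags ℓ, and φ the overdispersion parameter. The reported percentage increases correspond to exponentiating the cross-basis contribution cumulated over the stated lags when comparing the 95th or 99th temperature centile with the 50th.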

    Sexual maturation in relation to polychlorinated aromatic hydrocarbons: Sharpe and Skakkebaek's hypothesis revisited.

    Polychlorinated aromatic hydrocarbons (PCAHs) have been described as endocrine disruptors in animals and in accidentally or occupationally exposed humans. In the present study we examined the effect of moderate exposure to PCAHs on sexual maturation. Two hundred adolescents (mean age, 17.4 years) who resided in two polluted suburbs and a rural control area in Flanders (Belgium) participated. We measured the serum concentration of polychlorinated biphenyl (PCB) congeners 138, 153, and 180 and dioxin-like compounds [chemically activated luciferase expression (CALUX) assay] as biomarkers of exposure. School physicians assessed the pubertal development of boys and girls and measured testicular volume. In one suburb near two waste incinerators, compared with the other suburb and the control area, fewer boys (p < 0.001) had reached the adult stages of genital development (62% vs. 92% and 100%, respectively) and pubic hair growth (48% vs. 77% and 100%). Also, in the same suburb, fewer girls (p = 0.04) had reached the adult stage of breast development (67% vs. 90% and 79%). In individual boys, a doubling of the serum concentration of PCB congener 138 increased the odds of not having matured into the adult stage of genital development by 3.5 (p = 0.04); similarly for PCB congener 153 in relation to male pubic hair growth, the odds ratio was 3.5 (p = 0.04). In girls, a doubling of the serum dioxin concentration increased the odds of not having reached the adult stage of breast development by 2.3 (p = 0.02). Left plus right testicular volume was lower in both polluted areas than in the control area (42.4 mL vs. 47.3 mL, p = 0.005) but was not related to the current exposure of the adolescents to PCAHs. Through endocrine disruption, environmental exposure to PCAHs may interfere with sexual maturation and in the long run adversely affect human reproduction.
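    The per-doubling odds ratios reported above are the form of estimate obtained when the exposure is entered into a logistic regression on a log2 scale (or when a natural-log coefficient is rescaled). The abstract does not spell out the transformation, so the sketch below is an assumed, conventional parameterization rather than the authors' exact model:

        \operatorname{logit} P(\text{not yet adult stage}) = \beta_0 + \beta_1 \log_2(\text{PCB 138}) + \text{covariates}, \qquad \mathrm{OR}_{\text{per doubling}} = e^{\beta_1}

    With an odds ratio of 3.5 per doubling, β_1 = ln 3.5 ≈ 1.25; had the exposure instead been entered as a natural logarithm with coefficient β, the per-doubling odds ratio would be exp(β ln 2).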

    Work-related musculoskeletal disorders: Comparison of data sources for surveillance

    Work-related upper extremity musculoskeletal disorders “associated with repeated trauma” account for more than 60% of all newly reported occupational illness, 332,000 in 1994 according to the U.S. Department of Labor. These numbers do not include, for example, those disorders categorized as “injuries due to overexertion in lifting,” approximately 370,000. Early identification of potential disorders and associated risk factors is needed to reduce these disorders. There are a number of possible methods for conducting surveillance for work-related musculoskeletal disorders (WMDs) based on health outcome: workers' compensation, sickness and accident insurance, OSHA 200 logs, plant medical records, self-administered questionnaires, professional interviews, and physical examinations. In addition, hazard surveillance based on evaluation of job exposures to physical stressors by nonoccupational health personnel is possible. As part of a large labor-management-initiated intervention study to reduce the incidence of WMDs in four automotive plants, we were able to compare the strengths and limitations of each of these surveillance tools. University-administered health interviews yielded the highest rate of symptoms; combined physical examination plus interview (point prevalence) rates were similar to self-administered questionnaire (period prevalence) rates. Plant medical records yielded the lowest rate of WMDs. WMD status on self-administered questionnaire and on physical examination was associated with risk factor exposure scores. This study suggests that symptoms questionnaires and checklist-based hazard surveillance are feasible within the context of joint labor-management ergonomics programs and are more sensitive indicators of ergonomic problems than pre-existing data sources. Am. J. Ind. Med. 31:600–608, 1997. © 1997 Wiley-Liss, Inc. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/34815/1/15_ftp.pd

    Immunologic biomarkers in relation to exposure markers of PCBs and dioxins in Flemish adolescents (Belgium).

    In this study, we investigated 17- to 18-year-old boys and girls to determine whether changes in humoral or cellular immunity or respiratory complaints were related to blood serum levels of polychlorinated biphenyls (PCBs) and dioxin-like compounds after lifetime exposure in Flanders (Belgium). We obtained blood samples from and administered questionnaires to 200 adolescents recruited from a rural area and two urban suburbs. Physicians recorded medical history and respiratory diseases. We measured immunologic biomarkers such as differential blood cell counts, lymphocyte phenotypes, and serum immunoglobulins. As biomarkers of exposure, we determined the serum concentrations of PCBs (PCB 138, PCB 153, and PCB 180) and dioxin-like compounds [chemical-activated luciferase expression (CALUX) bioassay]. The percentages of eosinophils and natural killer cells in blood were negatively correlated with CALUX toxic equivalents (TEQs) in serum (p = 0.009 and p = 0.05, respectively). Increased serum CALUX TEQs resulted in an increase in serum IgA levels (p = 0.05). Furthermore, levels of specific IgEs (measured by radioallergosorbent tests) against cat dander, house dust mite, and grass pollen were also significantly and negatively associated with the CALUX TEQ, with odds ratios (ORs) equal to 0.63 [95% confidence interval (CI), 0.42-0.96], 0.68 (0.5-0.93), and 0.70 (0.52-0.95), respectively. In addition, reported allergies of the upper airways and past use of antiallergic drugs were negatively associated with CALUX TEQs, with ORs equal to 0.66 (0.47-0.93) and 0.58 (0.39-0.85), respectively. We found a negative association between IgGs and marker PCBs in serum (p = 0.009). This study shows that immunologic measurements and respiratory complaints in adolescents were associated with environmental exposure to polyhalogenated aromatic hydrocarbons (PHAHs). The negative correlation between PHAHs and allergic responses in adolescents suggested that exposure may entail alterations in the immune status.

    Two-Year Responses of Renal Function to First Occupational Lead Exposure

    Introduction Whether lead exposure still contributes to renal impairment in advanced countries is debated, because blood lead (BL) level is declining toward preindustrial levels and because longitudinal studies correlating renal function and BL changes over time are scarce. Methods The Study for Promotion of Health in Recycling Lead (SPHERL) evaluated the 2-year renal function responses in 251 workers (mean age, 29.7 years) transitioning from environmental to occupational exposure. The main study end point was the estimated glomerular filtration rate (eGFR) derived from serum creatinine (eGFRcrt), cystatin C (eGFRcys), or both (eGFRcc). BL level was measured by inductively coupled plasma mass spectrometry (detection limit 0.5 μg/dl). Results During follow-up, the mean baseline BL level of 4.13 μg/dl increased 3.30-fold. In fully adjusted mixed models, additionally accounting for the within-participant clustering of the 1- and 2-year follow-up data, a 3-fold BL level increment was not significantly correlated with changes in eGFR, with estimates amounting to −0.86 (95% CI: −2.39 to 0.67), −1.58 (−3.34 to 0.18), and −1.32 (−2.66 to 0.03) ml/min per 1.73 m² for eGFRcrt, eGFRcys, and eGFRcc, respectively. Baseline BL level and the cumulative lead burden did not materially modify these estimates, but baseline eGFR was a major determinant of eGFR changes, showing regression to the mean during follow-up. Responses of serum osmolarity, urinary specific gravity, and the urinary albumin-to-creatinine ratio (ACR) were also unrelated to the BL level increment. The age-related decreases in eGFRcrt, eGFRcys, and eGFRcc were −1.41, −0.96, and −1.10 ml/min per 1.73 m² per year, respectively. Conclusion In the current study, the 2-year changes in renal function were unrelated to the increase in BL level. However, given the CIs around the point estimates of the changes in eGFRcc and eGFRcys, a larger study with longer follow-up is being planned.
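    The mixed models described in the Results can be sketched as below. This is one plausible parameterization (a random participant intercept, with the blood lead ratio scaled so that the coefficient reads per 3-fold increment); the "fully adjusted" covariate set z is not listed in the abstract and is left generic:

        \Delta \mathrm{eGFR}_{it} = \beta_0 + \beta_1 \frac{\log(\mathrm{BL}_{it}/\mathrm{BL}_{i0})}{\log 3} + \boldsymbol{\gamma}^{\top}\mathbf{z}_{it} + u_i + \varepsilon_{it}, \qquad u_i \sim N(0, \sigma_u^2),\quad \varepsilon_{it} \sim N(0, \sigma_\varepsilon^2)

    Here ΔeGFR_it is the change from baseline at the 1- or 2-year visit t for worker i, the random intercept u_i captures the within-participant clustering of the two follow-up measurements, and β_1 is the quantity reported as −0.86, −1.58 and −1.32 ml/min per 1.73 m² per 3-fold blood lead increment for the three eGFR definitions.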

    Bone Resorption and Environmental Exposure to Cadmium in Women: A Population Study

    BACKGROUND: Environmental exposure to cadmium decreases bone density indirectly through hypercalciuria resulting from renal tubular dysfunction. OBJECTIVE: We sought evidence for a direct osteotoxic effect of cadmium in women. METHODS: We randomly recruited 294 women (mean age, 49.2 years) from a Flemish population with environmental cadmium exposure. We measured 24-hr urinary cadmium and blood cadmium as indexes of lifetime and recent exposure, respectively. We assessed the multivariate-adjusted association of exposure with specific markers of bone resorption, urinary hydroxylysylpyridinoline (HP) and lysylpyridinoline (LP), as well as with calcium excretion, various calciotropic hormones, and forearm bone density. RESULTS: In all women, the effect sizes associated with a doubling of lifetime exposure were 8.4% (p = 0.009) for HP, 6.9% (p = 0.10) for LP, 0.77 mmol/day (p = 0.003) for urinary calcium, -0.009 g/cm² (p = 0.055) for proximal forearm bone density, and -16.8% (p = 0.065) for serum parathyroid hormone. In 144 postmenopausal women, the corresponding effect sizes were -0.01223 g/cm² (p = 0.008) for distal forearm bone density, 4.7% (p = 0.064) for serum calcitonin, and 10.2% for bone-specific alkaline phosphatase. In all women, the effect sizes associated with a doubling of recent exposure were 7.2% (p = 0.001) for urinary HP, 7.2% (p = 0.021) for urinary LP, -9.0% (p = 0.097) for serum parathyroid hormone, and 5.5% (p = 0.008) for serum calcitonin. Only one woman had renal tubular dysfunction (urinary retinol-binding protein > 338 μg/day). CONCLUSIONS: In the absence of renal tubular dysfunction, environmental exposure to cadmium increases bone resorption in women, suggesting a direct osteotoxic effect with increased calciuria and reactive changes in calciotropic hormones.

    Two-year neurocognitive responses to first occupational lead exposure

    Objectives Lead exposure causes neurocognitive dysfunction in children, but its association with neurocognition in adults at current occupational exposure levels is uncertain, mainly due to the lack of longitudinal studies. In the Study for Promotion of Health in Recycling Lead (NCT02243904), we assessed the two-year responses of neurocognitive function among workers without previous known occupational exposure newly hired at lead recycling plants. Methods Workers completed the digit-symbol test (DST) and Stroop test (ST) at baseline and annual follow-up visits. Blood lead (BL) was measured by inductively coupled plasma mass spectrometry (detection limit 0.5 μg/dL). Statistical methods included multivariable-adjusted mixed models with participants modelled as a random effect. Results DST was administered to 260 participants (11.9% women; 46.9%/45.0% whites/Hispanics; mean age 29.4 years) and ST to 168 participants. Geometric means were 3.97 and 4.13 μg/dL for baseline BL, and 3.30 and 3.44 for the last-follow-up-to-baseline BL ratio in the DST and ST cohorts, respectively. In partially adjusted models, a doubling of the BL ratio was associated with a 0.66% [95% confidence interval (CI) 0.03–1.30%; P=0.040] increase in latency time (DST) and a 0.35% (95% CI −1.63 to 1.63%; P=0.59) decrease in the interference effect (ST). In fully adjusted models, none of the associations of the changes in the DST and ST test results with the blood lead changes reached statistical significance (P≥0.12). Conclusions An over 3-fold increase in blood lead over two years of occupational exposure was not associated with a relevant decline in cognitive performance.
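    The percent changes per doubling of the blood lead ratio quoted in the Results are the kind of estimate obtained when the test outcome is log-transformed and the exposure ratio is entered on a log2 scale. The abstract does not state the transformations explicitly, so the back-transformation below is an assumed, conventional one rather than the authors' documented procedure:

        \%\Delta_{\text{per doubling}} = 100\left(e^{\beta_1} - 1\right), \qquad \text{with } \log(\text{latency time}_{it}) = \beta_0 + \beta_1 \log_2\!\left(\frac{\mathrm{BL}_{it}}{\mathrm{BL}_{i0}}\right) + \dots + u_i + \varepsilon_{it}

    For the reported 0.66% increase in DST latency time, this corresponds to β_1 ≈ ln(1.0066) ≈ 0.0066 per doubling of the blood lead ratio.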