44 research outputs found

    Telomere length in peripheral leukocyte DNA and gastric cancer risk

    No full text
    Telomere length reflects lifetime cumulative oxidative stress from environmental exposures such as cigarette smoking and chronic inflammation. Shortened telomeres are thought to cause genomic instability and have been associated with several cancers. We examined the association of telomere length in peripheral leukocyte DNA with gastric cancer risk, as well as potential confounding factors and risk modifiers for telomere length-related risk. In a population-based study of gastric cancer conducted in a high-risk population in Warsaw, Poland, between 1994 and 1996, we measured relative telomere length in 300 cases and 416 age- and gender-matched controls using quantitative real-time PCR. Among controls, telomeres were significantly shorter in association with aging (P < 0.001), increasing pack-years of cigarette smoking (P = 0.02), decreasing fruit intake (P = 0.04), and Helicobacter pylori positivity (P = 0.03). Gastric cancer cases had significantly shorter telomeres (mean ± SD relative telomere length, 1.25 ± 0.34) than controls (1.34 ± 0.35; P = 0.0008). Gastric cancer risk doubled [odds ratio (OR), 2.04; 95% confidence interval (95% CI), 1.33-3.13] among subjects in the shortest compared with the longest quartile of telomere length (P(trend) < 0.001). Telomere length-associated risks were higher among individuals with the lowest risk profiles: those who were H. pylori-negative (OR, 5.45; 95% CI, 2.10-14.1), nonsmokers (OR, 3.07; 95% CI, 1.71-5.51), and individuals with high intake of fruits (OR, 2.43; 95% CI, 1.46-4.05) or vegetables (OR, 2.39; 95% CI, 1.51-3.81). Our results suggest that telomere length in peripheral leukocyte DNA is associated with H. pylori positivity, cigarette smoking, and dietary fruit intake, and that shortened telomeres increased gastric cancer risk in this high-risk Polish population.
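
    As a concrete illustration of the quantities reported above, the sketch below shows how a relative telomere length (T/S ratio) is commonly derived from qPCR cycle-threshold values and how an unadjusted odds ratio with a Wald 95% CI is computed from a 2x2 table. The abstract does not specify the study's exact quantification formula or regression models, so the function names, the 2^-ΔΔCt form, and the example counts are assumptions for illustration only.

```python
# Illustrative sketch (assumed, not from the paper): a relative telomere length
# (T/S ratio) via the 2^-(delta delta Ct) approach commonly used with qPCR, and
# an unadjusted odds ratio with a Wald 95% CI from a 2x2 table.
import math

def relative_telomere_length(ct_telomere: float, ct_single_copy: float,
                             ct_telomere_ref: float, ct_single_copy_ref: float) -> float:
    """T/S ratio of a sample relative to a reference DNA, from cycle-threshold values."""
    delta_sample = ct_telomere - ct_single_copy
    delta_reference = ct_telomere_ref - ct_single_copy_ref
    return 2.0 ** -(delta_sample - delta_reference)

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio and Wald 95% CI; a/b = exposed cases/controls, c/d = unexposed cases/controls."""
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * se_log_or)
    upper = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, (lower, upper)

# Hypothetical counts; the study's OR of 2.04 (95% CI 1.33-3.13) came from models
# on matched data, not from this crude calculation.
print(odds_ratio_ci(90, 75, 60, 102))
```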

    Strategic Approaches for the Management of Environmental Risk Uncertainties Posed by Nanomaterials

    No full text
    Central to the responsible development of nanotechnologies is an understanding of the risks they pose to the environment. As with any novel material or emerging technology, a scarcity of data introduces potentially high uncertainty into the characterisation of risk. Early priorities are to identify the key areas of risk uncertainty and to define strategic approaches for managing and reducing them; this is important because the information subsequently gathered supports decision making and policy development. We identify one important source of uncertainty for the quantification of both hazard and exposure for nanomaterials: the complexity of their behaviour in natural systems. We then outline two approaches for managing this uncertainty, based on experience with chemicals: one that primarily focuses on hazard and one that initially focuses on exposure. While each approach places emphasis on different information requirements, a common feature is the considerable time lag between information gathering and subsequent decision making based on the evidence gathered. Complementary environmental surveillance approaches can act as a safety net, although it is not yet clear how fit for purpose current monitoring programmes are in this regard.
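
    To make the hazard/exposure split concrete, the sketch below shows the risk characterisation ratio used in conventional chemicals risk assessment, which the abstract draws on for comparison: predicted environmental concentration (PEC) divided by predicted no-effect concentration (PNEC). The paper does not prescribe this formula for nanomaterials; the names and numbers here are illustrative assumptions.

```python
# Minimal sketch (assumed, not from the paper) of the chemicals-style risk
# characterisation the abstract alludes to: exposure (PEC) is compared against
# the concentration below which no adverse effects are expected (PNEC).

def risk_characterisation_ratio(pec_ug_per_l: float, pnec_ug_per_l: float) -> float:
    """RCR = PEC / PNEC; values >= 1 flag a potential environmental risk."""
    return pec_ug_per_l / pnec_ug_per_l

# Hypothetical concentrations, for illustration only.
rcr = risk_characterisation_ratio(pec_ug_per_l=0.4, pnec_ug_per_l=1.0)
print(f"RCR = {rcr:.2f} -> {'potential risk' if rcr >= 1 else 'no risk indicated'}")
```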

    A guide to enteral nutrition in intensive care units: 10 expert tips for the daily practice

    No full text
    The preferential use of the oral/enteral route over gut rest in critically ill patients is uniformly recommended and applied. This article provides practical guidance on enteral nutrition in compliance with recent American and European guidelines. Low-dose enteral nutrition can be safely started within 48 h after admission, even during treatment with small or moderate doses of vasopressor agents. Percutaneous access should be used when enteral nutrition is anticipated for ≥ 4 weeks. Energy delivery should not be calculated to match energy expenditure before day 4–7, and the use of energy-dense formulas can be restricted to patients unable to tolerate full-volume isocaloric enteral nutrition or to patients who require fluid restriction. Low-dose protein (maximum 0.8 g/kg/day) can be provided during the early phase of critical illness, while a protein target of > 1.2 g/kg/day could be considered during the rehabilitation phase. The occurrence of refeeding syndrome should be assessed by daily measurement of plasma phosphate, and a phosphate drop of 30% should be managed by reducing the enteral feeding rate and giving high-dose thiamine. Vomiting and increased gastric residual volume may indicate gastric intolerance, while sudden abdominal pain, distension, gastrointestinal paralysis, or rising abdominal pressure may indicate lower gastrointestinal intolerance.
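
    The numeric thresholds quoted above lend themselves to a short, illustrative sketch; the function names, units, and structure below are assumptions, and none of this replaces the clinical judgement the guidance presupposes.

```python
# Illustrative sketch only (names and structure assumed): the guideline
# thresholds quoted in the abstract expressed as simple checks.

def protein_target_g_per_kg(phase: str) -> float:
    """Return the anchor protein dose from the abstract:
    at most 0.8 g/kg/day in the early phase, at least 1.2 g/kg/day in rehabilitation."""
    return 0.8 if phase == "early" else 1.2

def refeeding_flag(baseline_phosphate: float, current_phosphate: float) -> bool:
    """Flag possible refeeding syndrome when plasma phosphate falls by 30% or more."""
    drop = (baseline_phosphate - current_phosphate) / baseline_phosphate
    return drop >= 0.30

def prefer_percutaneous_access(anticipated_weeks: float) -> bool:
    """Percutaneous access when enteral nutrition is anticipated for >= 4 weeks."""
    return anticipated_weeks >= 4

# Example: a fall from 1.20 to 0.80 mmol/L is a ~33% drop, so the abstract's
# response applies (reduce the enteral feeding rate, give high-dose thiamine).
if refeeding_flag(1.20, 0.80):
    print("Reduce enteral feeding rate and give high-dose thiamine.")
```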

    Cytomegalovirus detection by immunoperoxidase assay and viral isolation

    No full text
    An immunoperoxidase assay for cytomegalovirus (IPCMV) was applied to detect CMV early antigens in 150 urine samples from immunocompromised patients, using the commercially available anti-CMV monoclonal antibody El3, which recognizes viral early proteins. Detection of early antigen by IPCMV was compared with conventional isolation in human fibroblast cell culture with respect to sensitivity and specificity, in order to evaluate its usefulness in the diagnosis of CMV infection. The IPCMV assay showed a sensitivity of 89.8% and a specificity of 91.3% when compared with the isolation method. Its main advantage was the shorter time to results: 48-72 h can be enough to provide evidence of CMV infection, whereas with the isolation technique the cytopathogenic effect appeared on average 14 days after sample inoculation.
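
    For readers unfamiliar with the figures quoted above, the sketch below shows how sensitivity and specificity are derived when a rapid assay (IPCMV) is scored against a reference method (viral isolation). The counts are hypothetical; only the 89.8% and 91.3% values come from the study.

```python
# Minimal sketch (not from the paper): deriving sensitivity and specificity of the
# rapid IPCMV assay against viral isolation as the reference method.

def sensitivity_specificity(tp: int, fp: int, fn: int, tn: int) -> tuple[float, float]:
    """tp/fn: isolation-positive samples the assay did / did not detect;
    fp/tn: isolation-negative samples the assay did / did not call positive."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts; they do not reconstruct the study's actual 2x2 table.
sens, spec = sensitivity_specificity(tp=44, fp=9, fn=5, tn=92)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```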