210 research outputs found

    Parity Violation in Neutron Resonances in 107,109Ag

    Parity nonconservation (PNC) was studied in p-wave resonances in Ag by measuring the helicity dependence of the neutron total cross section. Transmission measurements on natural Ag were performed in the energy range 32 to 422 eV with the time-of-flight method at the Manuel Lujan Neutron Scattering Center at Los Alamos National Laboratory. A total of 15 p-wave neutron resonances were studied in 107Ag and nine p-wave resonances in 109Ag. Statistically significant asymmetries were observed for eight resonances in 107Ag and for four resonances in 109Ag. An analysis treating the PNC matrix elements as random variables yields weak spreading widths of Γw = (2.67 +2.65/−1.21) × 10⁻⁷ eV for 107Ag and Γw = (1.30 +2.49/−0.74) × 10⁻⁷ eV for 109Ag.

    Parity Violation in Neutron Resonances in 115In

    Parity nonconservation (PNC) was studied in p-wave resonances in indium by measuring the helicity dependence of the neutron total cross section in the neutron energy range 6.0–316 eV with the time-of-flight method at LANSCE. A total of 36 p-wave neutron resonances were studied in 115In, and statistically significant asymmetries were observed for nine cases. An analysis treating the PNC matrix elements as random variables yields a weak matrix element of M = (0.67 +0.16/−0.12) meV and a weak spreading width of Γw = (1.30 +0.76/−0.43) × 10⁻⁷ eV.
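    In analyses of this kind, the spreading width is conventionally related to the root-mean-square weak matrix element M and the mean level spacing D by Γw = 2πM²/D. A minimal Python sketch of that conversion follows; the level-spacing value used is an illustrative assumption, not a number taken from the paper:

```python
import math

def spreading_width(m_ev, d_ev):
    """Weak spreading width from the rms PNC matrix element M and the
    mean level spacing D, via the conventional relation
    Gamma_w = 2 * pi * M**2 / D (all quantities in eV)."""
    return 2.0 * math.pi * m_ev ** 2 / d_ev

# Illustrative numbers: M = 0.67 meV as reported for 115In;
# D = 22 eV is an ASSUMED p-wave level spacing, chosen only for this sketch.
M = 0.67e-3   # eV
D = 22.0      # eV (assumption)
gamma_w = spreading_width(M, D)   # comes out on the order of 1e-7 eV
```

    With these inputs the sketch reproduces the order of magnitude of the reported Γw, which is the consistency check the relation is typically used for.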

    Moving Forward in Human Cancer Risk Assessment

    The goal of human risk assessment is to decide whether a given exposure level to a particular chemical or substance is acceptable to human health, and to provide risk management measures based on an evaluation and prediction of the effects of that exposure on human health. Within this framework, the current safety paradigm for assessing possible carcinogenic properties of drugs, cosmetics, industrial chemicals and environmental exposures relies mainly on in vitro genotoxicity testing followed by 2-year bioassays in mice and rats. This testing paradigm was developed 40 to 50 years ago with the initial premise that "mutagens are also carcinogens" and that the carcinogenic risk to humans can be extrapolated from the tumor incidence after lifetime exposure to maximally tolerated doses of chemicals in rodents. Genotoxicity testing is used as a surrogate for carcinogenicity testing and is required for initiation of clinical trials (Jacobs and Jacobson-Kram 2004) and for the safety assessment of most industrial chemicals. Although the carcinogenicity-testing paradigm has effectively protected patients and consumers from the introduction of harmful carcinogens as drugs and other products, it is clearly not sustainable in the future. The causal link between genetic damage and carcinogenicity is well documented; however, the limitations of genotoxicity/carcinogenicity testing assays, the presence of additional non-genotoxic mechanisms, issues of species-specific effects, and the lack of mechanistic insights pose an enormous scientific challenge. The 2-year rodent carcinogenicity bioassays are associated with technical complexity, high costs and a high animal burden, as well as the uncertainty of extrapolating from rodents to humans. Additional frustrations stem from the limited predictability of the 2-year bioassay, in particular the problem of false positives.
    For instance, the Carcinogenic Potency Database (CPDB), which includes results from chronic, long-term animal cancer tests in mice, rats and hamsters amounting to 6540 individual experiments with 1547 chemicals, reports positive findings in rodent studies for 751 of those chemicals (51%). Similarly, when one considers all chronically used human pharmaceuticals, some 50% induce tumors in rodents. Yet only 20 human pharmaceutical compounds have been identified as carcinogens in epidemiological studies, despite the fact that a large number of epidemiological studies have been carried out on these compounds, e.g. NSAIDs, benzodiazepines and phenobarbital. This high incidence of tumors in bioassays has led to questions concerning the human relevance of tumors induced in rodents (Knight et al. 2006; Ward 2008). In summary, dependence on the rodent model as the gold standard of cancer risk assessment neglects the high number of false positives and clearly has serious limitations. Consequently, there is a growing appeal for a paradigm change after "50 years of rats and mice". For instance, the current demands for the volume of carcinogenicity testing, together with the limitations on animal usage initially stipulated by REACH (Combes et al. 2006), will require revolutionary change in the testing paradigm. For the purpose of developing a road map for this needed paradigm change in carcinogenicity testing, a workshop entitled "Genomics in Cancer Risk Assessment" was held in August 2009 in Venice, Italy. This workshop brought together toxicologists from academia and industry with governmental regulators and risk assessors from the US and the EU to discuss the state of the art in developing alternative testing strategies for genotoxicity and carcinogenicity, focusing on the contribution of the 'omics technologies. What follows is a highlight of the major conclusions and suggestions from this workshop as a path forward.

    The combined use of surgical debulking and diode laser photocoagulation for limbal melanoma treatment: a retrospective study of 21 dogs

    Objective To evaluate the effectiveness and safety of debulking and diode laser photocoagulation (DPC) for the treatment of limbal melanoma (LM). Procedure Retrospective multi-institutional case series. Medical records of animals diagnosed with LM at the Centro Veterinario Specialistico (CVS) and at the Long Island Veterinary Specialists from 1994 to 2014 were retrieved. Signalment, location, extent of tumors, recurrence rate, and early and late complications were reported. Patient follow-up information was obtained from veterinary ophthalmologists, primary care veterinarians, and where appropriate, owners. Results Twenty-one eyes of 21 dogs (13 females and 8 males) were included in this study. The dogs' average age was 6 years (range: 7 months to 11 years). The follow-up period ranged from 1 to 108 months (median 48 months) after the last DPC procedure. Long-term follow-up was obtained by telephone interviews in 6 of 20 cases and by clinical re-evaluations in 14 of 20 cases. The most common early complications were moderate anterior uveitis and peripheral corneal edema (21/21 eyes). Late complications included corneal fibrosis and/or pigmentation (20/21). In one case, a severe bullous keratopathy associated with extensive corneal fibrosis was observed (1/21). One case was blind due to concurrent Sudden Acquired Retinal Degeneration (SARD). However, after surgery 2 of 20 eyes lost vision and one of these was enucleated. Conclusions Debulking, in addition to diode laser photocoagulation, was technically straightforward to perform, minimally invasive, well tolerated, and highly successful in this case series.

    Identification of Giardia lamblia DHHC Proteins and the Role of Protein S-palmitoylation in the Encystation Process

    Protein S-palmitoylation, a hydrophobic post-translational modification, is performed by protein acyltransferases that have a common DHHC Cys-rich domain (DHHC proteins), and provides a regulatory switch for protein membrane association. In this work, we analyzed the presence of DHHC proteins in the protozoan parasite Giardia lamblia and the function of the reversible S-palmitoylation of proteins during parasite differentiation into cysts. Two specific events were observed: encysting cells displayed a larger amount of palmitoylated proteins, and parasites treated with palmitoylation inhibitors produced a reduced number of mature cysts. With bioinformatics tools, we found nine DHHC proteins, potential protein acyltransferases, in the Giardia proteome. These proteins displayed a conserved structure when compared to different organisms and are distributed in different monophyletic clades. Although all Giardia DHHC proteins were found to be present in trophozoites and encysting cells, these proteins showed a different intracellular localization in trophozoites and seemed to be differently involved in the encystation process when they were overexpressed. dhhc transgenic parasites showed a different pattern of cyst wall protein expression and yielded different amounts of mature cysts when they were induced to encyst. Our findings disclosed some important issues regarding the role of DHHC proteins and palmitoylation during Giardia encystation.

    Meeting Report: Validation of Toxicogenomics-Based Test Systems: ECVAM–ICCVAM/NICEATM Considerations for Regulatory Use

    This is the report of the first workshop “Validation of Toxicogenomics-Based Test Systems” held 11–12 December 2003 in Ispra, Italy. The workshop was hosted by the European Centre for the Validation of Alternative Methods (ECVAM) and organized jointly by ECVAM, the U.S. Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), and the National Toxicology Program (NTP) Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM). The primary aim of the workshop was for participants to discuss and define principles applicable to the validation of toxicogenomics platforms as well as validation of specific toxicologic test methods that incorporate toxicogenomics technologies. The workshop was viewed as an opportunity for initiating a dialogue between technologic experts, regulators, and the principal validation bodies and for identifying those factors to which the validation process would be applicable. It was felt that doing so now, while the technology is evolving and its associated challenges are being identified, would lay the basis for future validation of the technology when it reaches the appropriate stage. Because of the complexity of the issue, different aspects of the validation of toxicogenomics-based test methods were covered. The three focus areas were a) biologic validation of toxicogenomics-based test methods for regulatory decision making, b) technical and bioinformatics aspects related to validation, and c) validation issues as they relate to regulatory acceptance and use of toxicogenomics-based test methods. In this report we summarize the discussions and describe in detail the recommendations for future direction and priorities.

    Minería de datos y big data: aplicaciones en riesgo crediticio, salud y análisis de mercado

    This research line focuses on the study and development of intelligent systems for solving Data Mining and Big Data problems using Machine Learning techniques. The systems developed are applied in particular to text processing and pattern recognition in images. In the Data Mining area, work is proceeding on two fronts: the generation of easily interpretable models through the extraction of classification rules that can justify decision making, and the development of new strategies for handling large volumes of data. In the Big Data area, several contributions are being made using the Spark Streaming framework. In this direction, a dynamic clustering technique that runs in a distributed fashion is under investigation. In addition, an application has been implemented in Spark Streaming that computes the Hurst index online, updating it every few seconds in order to study a given trading market. In the Text Mining area, extractive summarization strategies have been developed that select the most representative paragraphs using selection metrics and optimization techniques. Methods capable of determining the subjectivity of sentences written in Spanish have also been developed. Topic area: Databases and Data Mining
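    The Hurst-index computation mentioned above can be illustrated with a minimal, batch (non-streaming) rescaled-range (R/S) estimator in pure Python. The window sizes and the averaging scheme below are illustrative assumptions, not the authors' distributed Spark Streaming implementation:

```python
import math
import random

def rescaled_range(series):
    """R/S statistic of one window: range of cumulative mean-deviations
    divided by the window's standard deviation."""
    n = len(series)
    mean = sum(series) / n
    devs = [x - mean for x in series]
    cum, cums = 0.0, []
    for d in devs:
        cum += d
        cums.append(cum)
    r = max(cums) - min(cums)
    s = math.sqrt(sum(d * d for d in devs) / n)
    return r / s if s > 0 else 0.0

def hurst_exponent(series, window_sizes=(8, 16, 32, 64, 128)):
    """Estimate H as the least-squares slope of log(R/S) vs log(window size),
    averaging R/S over non-overlapping windows of each size."""
    xs, ys = [], []
    for w in window_sizes:
        chunks = [series[i:i + w] for i in range(0, len(series) - w + 1, w)]
        rs = sum(rescaled_range(c) for c in chunks) / len(chunks)
        xs.append(math.log(w))
        ys.append(math.log(rs))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

random.seed(1)
noise = [random.gauss(0.0, 1.0) for _ in range(2048)]
h = hurst_exponent(noise)  # white noise has no memory, so H should be near 0.5
```

    A streaming version would maintain these window statistics incrementally over a sliding buffer and re-emit H every few seconds, which is the kind of job Spark Streaming micro-batches are suited for.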