
    Genotoxic mixtures and dissimilar action: Concepts for prediction and assessment

    This article has been made available through the Brunel Open Access Publishing Fund. It is distributed under the terms of the Creative Commons Attribution License, which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.

    Combinations of genotoxic agents have frequently been assessed without clear assumptions regarding their expected (additive) mixture effects, often leading to claims of synergism that might in fact be compatible with additivity. We have shown earlier that the combined effects of chemicals that induce micronuclei (MN) in the cytokinesis-block micronucleus assay in Chinese hamster ovary-K1 cells by a similar mechanism were additive according to the concept of concentration addition (CA). Here, we extended these studies and investigated for the first time whether valid additivity expectations can be formulated for MN-inducing chemicals that operate through a variety of mechanisms, including aneugens and clastogens (DNA cross-linkers, topoisomerase II inhibitors, minor groove binders). We expected their effects to follow the additivity principles of independent action (IA). With two mixtures, one composed of various aneugens (colchicine, flubendazole, vinblastine sulphate, griseofulvin, paclitaxel) and another composed of aneugens and clastogens (flubendazole, doxorubicin, etoposide, melphalan and mitomycin C), we observed mixture effects that fell between the additivity predictions derived from CA and IA. We achieved better agreement between observation and prediction by grouping the chemicals into common assessment groups and using hybrid CA/IA prediction models. The combined effects of four dissimilarly acting compounds (flubendazole, paclitaxel, doxorubicin and melphalan) also fell between the CA and IA predictions. Two binary mixtures (flubendazole/paclitaxel and flubendazole/doxorubicin) showed effects in reasonable agreement with IA additivity. Our studies provide a systematic basis for the investigation of mixtures that affect endpoints of relevance to genotoxicity and show that their effects are largely additive. Funding: UK Food Standards Agency.
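
    The two additivity concepts at the heart of this abstract have simple closed forms: under concentration addition, the mixture concentration producing effect x is the weighted harmonic combination of the components' individual EC_x values, while under independent action the mixture effect is one minus the product of the individual non-effects. The sketch below, with hypothetical log-logistic concentration-response parameters (not values from the study), illustrates both predictions.

```python
# Minimal sketch of the two additivity concepts named in the abstract:
# concentration addition (CA) and independent action (IA).
# The log-logistic parameters below are hypothetical placeholders.
import numpy as np

def log_logistic(c, ec50, slope):
    """Fraction-affected response of a single chemical at concentration c."""
    return 1.0 / (1.0 + (ec50 / c) ** slope)

def inverse_log_logistic(effect, ec50, slope):
    """Concentration of a single chemical producing a given effect."""
    return ec50 * (effect / (1.0 - effect)) ** (1.0 / slope)

def ca_mixture_ec(effect, fractions, ec50s, slopes):
    """CA: mixture EC_effect = (sum_i p_i / EC_effect,i)^-1."""
    ec_i = np.array([inverse_log_logistic(effect, e, s)
                     for e, s in zip(ec50s, slopes)])
    return 1.0 / np.sum(np.asarray(fractions) / ec_i)

def ia_mixture_effect(conc, fractions, ec50s, slopes):
    """IA: E_mix = 1 - prod_i (1 - E_i(p_i * c_mix))."""
    c_i = conc * np.asarray(fractions)
    e_i = np.array([log_logistic(c, e, s)
                    for c, e, s in zip(c_i, ec50s, slopes)])
    return 1.0 - np.prod(1.0 - e_i)

# Hypothetical two-component mixture with equal concentration fractions.
fractions = [0.5, 0.5]
ec50s, slopes = [1.0, 3.0], [2.0, 1.5]
c_ca = ca_mixture_ec(0.5, fractions, ec50s, slopes)
print("CA-predicted mixture EC50:", c_ca)
print("IA-predicted effect at that concentration:",
      ia_mixture_effect(c_ca, fractions, ec50s, slopes))
```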

    Oscillations of aqueous PEDOT:PSS fluid droplets and the properties of complex fluids in drop-on-demand inkjet printing

    Shear-thinning aqueous poly(3,4-ethylenedioxythiophene):poly(styrene sulphonate) (PEDOT:PSS) fluids were studied under the conditions of drop-on-demand inkjet printing. Ligament retraction caused oscillation of the resulting drops, from which values of surface tension and viscosity were derived. Effective viscosities of <4 mPa s at drop oscillation frequencies of 13–33 kHz were consistent with conventional high-frequency rheometry, with only a small possible contribution from viscoelasticity with a relaxation time of about 6 μs. Strong evidence was found that the viscosity, reduced by shear-thinning in the printhead nozzle, recovered as the drop formed. The low viscosity values measured for the drops in flight were associated with the strong oscillation induced by ligament retraction, while for a weakly perturbed drop the viscosity remained high. Surface tension values in the presence of surfactant were significantly higher than the equilibrium values, and consistent with the surface age of the drops. [Graphical abstract - see article]

    This work was supported by EPSRC and a consortium of industrial partners (EPSRC Grant no. EP/H018913/1: Innovation in industrial inkjet technology). The high-speed camera and high-power flash lamp were provided by the EPSRC Engineering Instrument Pool, and we thank Adrian Walker for his help. This is the final version of the article. It first appeared from Elsevier via http://dx.doi.org/10.1016/j.jnnfm.2015.05.00
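
    Deriving surface tension and viscosity from the oscillation of a drop in flight is conventionally done with the classical Rayleigh frequency and Lamb damping relations for the fundamental (n = 2) mode of a free drop; assuming that is the method in play here, the sketch below shows the inversion. All numerical inputs are hypothetical, chosen only to be of the same order as a small inkjet drop oscillating at tens of kHz.

```python
# Sketch of extracting surface tension and viscosity from drop
# oscillations via the classical Rayleigh frequency and Lamb damping
# relations for the fundamental (n = 2) mode of a free drop.
# All input values are hypothetical.
import math

def surface_tension(freq_hz, radius_m, density):
    """Rayleigh: omega^2 = 8 * sigma / (rho * R^3) for the n = 2 mode."""
    omega = 2.0 * math.pi * freq_hz
    return omega ** 2 * density * radius_m ** 3 / 8.0

def viscosity(damping_time_s, radius_m, density):
    """Lamb: tau = rho * R^2 / (5 * mu) for the n = 2 mode."""
    return density * radius_m ** 2 / (5.0 * damping_time_s)

rho = 1000.0   # kg/m^3, water-like ink density (assumed)
R = 25e-6      # m, drop radius (assumed)
f = 20e3       # Hz, observed oscillation frequency (assumed)
tau = 60e-6    # s, observed amplitude decay time (assumed)

print("surface tension ~ %.1f mN/m" % (1e3 * surface_tension(f, R, rho)))
print("viscosity ~ %.1f mPa s" % (1e3 * viscosity(tau, R, rho)))
```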

    Compatibility of neutrino DIS data and global analyses of parton distribution functions

    Neutrino and antineutrino deep inelastic scattering (DIS) data provide useful constraints on the flavor decomposition in global fits of parton distribution functions (PDFs). The smallness of the cross-sections requires the use of nuclear targets in the experimental setup. Understanding the nuclear corrections is, for this reason, of utmost importance for a precise determination of the PDFs. Here, we explore the nuclear effects in neutrino and antineutrino-nucleon DIS by comparing the NuTeV, CDHSW, and CHORUS cross-sections to the predictions derived from the latest parton distribution functions and their nuclear modifications. We obtain a good description of these data and find no apparent disagreement between the nuclear effects in neutrino DIS and those in charged-lepton DIS. These results also indicate that further improvements in the knowledge of the nuclear PDFs could be obtained by a more extensive use of these sets of neutrino data. Comment: 16 pages, 8 figures
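
    Agreement between cross-section data and PDF-based predictions is typically quantified with a chi-squared. The sketch below shows the simplest uncorrelated-uncertainty version of such a comparison; the bin values are hypothetical placeholders, and a real analysis of NuTeV, CDHSW, and CHORUS data would also treat correlated systematics and normalizations.

```python
# Minimal sketch of a data-versus-theory chi^2 comparison of the kind
# described in the abstract. Uncertainties are taken as uncorrelated
# and all numbers are hypothetical placeholders.
import numpy as np

def chi2(data, theory, sigma):
    """Sum over bins of ((data - theory) / sigma)^2."""
    return float(np.sum(((data - theory) / sigma) ** 2))

# Hypothetical cross-section ratios in a few (x, Q^2) bins.
data        = np.array([1.02, 0.95, 1.10, 0.99])
theory_free = np.array([1.00, 1.00, 1.00, 1.00])  # free-nucleon PDFs
theory_nuc  = np.array([1.01, 0.96, 1.08, 1.00])  # with nuclear PDFs
sigma       = np.array([0.03, 0.03, 0.04, 0.03])

print("chi2/N without nuclear effects:", chi2(data, theory_free, sigma) / len(data))
print("chi2/N with nuclear effects:   ", chi2(data, theory_nuc, sigma) / len(data))
```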

    Theoretical Uncertainties in Electroweak Boson Production Cross Sections at 7, 10, and 14 TeV at the LHC

    We present an updated study of the systematic errors in the measurements of the electroweak boson cross-sections at the LHC for various experimental cuts at center-of-mass energies of 7, 10, and 14 TeV. The sizes of both the electroweak and NNLO QCD contributions are estimated, together with the systematic error from the parton distributions. The effects of new versions of the MSTW, CTEQ, and NNPDF PDFs are considered. Comment: PDFLatex with JHEP3.cls. 22 pages, 43 figures. Version 2 adds the CT10W PDF set to the analysis and updates the final systematic error table and conclusions, plus several citations and minor wording changes. Version 3 adds some references on electroweak and mixed QED/QCD corrections. Version 4 adds more references and acknowledgements
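
    For the PDF sets named here, the systematic error from the parton distributions is propagated with two standard recipes: the symmetric Hessian master formula over eigenvector pairs (MSTW and CTEQ style) and the standard deviation over Monte Carlo replicas (NNPDF style). The sketch below implements both on hypothetical cross-section values.

```python
# Sketch of the two standard recipes for propagating PDF
# uncertainties to a cross section: the symmetric Hessian master
# formula for eigenvector sets (MSTW/CTEQ style) and the standard
# deviation over replicas (NNPDF style). Cross sections are
# hypothetical placeholders.
import numpy as np

def hessian_symmetric(x_plus, x_minus):
    """Symmetric Hessian: dX = 0.5 * sqrt(sum_k (X_k+ - X_k-)^2)."""
    diff = np.asarray(x_plus) - np.asarray(x_minus)
    return 0.5 * np.sqrt(np.sum(diff ** 2))

def replica_uncertainty(x_replicas):
    """Monte Carlo replicas: dX is the standard deviation over replicas."""
    return float(np.std(np.asarray(x_replicas), ddof=1))

# Hypothetical W cross sections (nb) for three eigenvector pairs.
x_up   = [10.45, 10.52, 10.48]
x_down = [10.35, 10.30, 10.38]
print("Hessian PDF uncertainty: %.3f nb" % hessian_symmetric(x_up, x_down))

# Hypothetical replica ensemble for the same observable.
replicas = np.random.default_rng(0).normal(10.4, 0.08, size=100)
print("Replica PDF uncertainty: %.3f nb" % replica_uncertainty(replicas))
```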

    Measurements in two bases are sufficient for certifying high-dimensional entanglement

    High-dimensional encoding of quantum information provides a promising method of transcending current limitations in quantum communication. One of the central challenges in the pursuit of such an approach is the certification of high-dimensional entanglement. In particular, it is desirable to do so without resorting to inefficient full state tomography. Here, we show how carefully constructed measurements in two bases (one of which is not orthonormal) can be used to faithfully and efficiently certify bipartite high-dimensional states and their entanglement for any physical platform. To showcase the practicality of this approach under realistic conditions, we put it to the test for photons entangled in their orbital angular momentum. In our experimental setup, we are able to verify 9-dimensional entanglement for a pair of photons on an 11-dimensional subspace each, at present the highest amount certified without any assumptions on the state. Comment: 11+14 pages, 2+7 figures
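
    A standard route to such certification (plausibly the logic underlying the reported bounds, though the paper's exact witness may differ) is a fidelity bound: if the fidelity of a state in a d x d space with a target maximally entangled state exceeds k/d, its Schmidt number must be at least k + 1. The sketch below applies this with a hypothetical fidelity lower bound of 0.8 on an 11-dimensional subspace.

```python
# Sketch of fidelity-based dimensionality certification: fidelity F
# with a maximally entangled state above k/d certifies Schmidt number
# (entanglement dimensionality) at least k + 1. The fidelity value is
# a hypothetical stand-in for the two-basis measurement estimate.
import math

def certified_dimension(fidelity_lower_bound, d):
    """Largest k + 1 such that k / d < fidelity_lower_bound."""
    k = math.ceil(fidelity_lower_bound * d) - 1  # largest integer strictly below F * d
    return max(k + 1, 1)

# Hypothetical: F >= 0.8 measured on an 11-dimensional subspace.
print(certified_dimension(0.80, 11))  # -> 9, i.e. at least 9-dimensional entanglement
```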

    On staying grounded and avoiding Quixotic dead ends

    The 15 articles in this special issue on The Representation of Concepts illustrate the rich variety of theoretical positions and supporting research that characterize the area. Although much agreement exists among contributors, much disagreement exists as well, especially about the roles of grounding and abstraction in conceptual processing. I first review theoretical approaches raised in these articles that I believe are Quixotic dead ends, namely, approaches that are principled and inspired but likely to fail. In the process, I review various theories of amodal symbols, their distortions of grounded theories, and fallacies in the evidence used to support them. Incorporating further contributions across articles, I then sketch a theoretical approach that I believe is likely to be successful, which includes grounding, abstraction, flexibility, explaining classic conceptual phenomena, and making contact with real-world situations. This account further proposes that (1) a key element of grounding is neural reuse, (2) abstraction takes the forms of multimodal compression, distilled abstraction, and distributed linguistic representation (but not amodal symbols), and (3) flexible context-dependent representations are a hallmark of conceptual processing.

    Light Stop NLSPs at the Tevatron and LHC

    How light can the stop be given current experimental constraints? Can it still be lighter than the top? In this paper, we study this and related questions in the context of gauge-mediated supersymmetry breaking, where a stop NLSP decays into a W, a b, and a gravitino. Focusing on the case of prompt decays, we simulate several existing Tevatron and LHC analyses that would be sensitive to this scenario, and find that they allow the stop to be as light as 150 GeV, mostly due to the large top production background. With more data, the existing LHC analyses will be able to push the limit up to at least 180 GeV. We hope this work will motivate more dedicated experimental searches for this simple scenario, in which, for most purposes, the only free parameters are the stop mass and lifetime. Comment: 31 pages, 11 figures; v2: added minor clarifications and references
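
    A toy version of the statistics behind such exclusions is the 95% CL upper limit on signal events in a single counting channel dominated by top-pair background. The sketch below uses a simple Poisson confidence bound with hypothetical event counts; real analyses combine many channels and use CLs with systematic uncertainties.

```python
# Toy illustration of a counting-experiment exclusion: the 95% CL
# upper limit on signal events given an observed count and an
# expected (top-dominated) background. Event counts are hypothetical.
import math

def poisson_cdf(n, mu):
    """P(N <= n) for a Poisson distribution with mean mu."""
    return sum(math.exp(-mu) * mu ** k / math.factorial(k) for k in range(n + 1))

def upper_limit_95(n_obs, background, step=0.01):
    """Smallest signal s with P(N <= n_obs | s + b) <= 0.05."""
    s = 0.0
    while poisson_cdf(n_obs, s + background) > 0.05:
        s += step
    return s

# Hypothetical channel: large top background, observation consistent with it.
n_obs, b = 52, 50.0
print("95%% CL upper limit on signal events: %.1f" % upper_limit_95(n_obs, b))
```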