120 research outputs found

    Restoration of catalytic functions in Cre recombinase mutants by electrostatic compensation between active site and DNA substrate

    Get PDF
    Two conserved catalytic arginines, Arg-173 and Arg-292, of the tyrosine site-specific recombinase Cre are essential for the transesterification steps of strand cleavage and joining in native DNA substrates containing scissile phosphate groups. The active site tyrosine (Tyr-324) provides the nucleophile for the cleavage reaction, and forms a covalent 3′-phosphotyrosyl intermediate. The 5′-hydroxyl group formed during cleavage provides the nucleophile for the joining reaction between DNA partners, yielding strand exchange. Previous work showed that substitution of the scissile phosphate (P) by methylphosphonate (MeP) permits strand cleavage by a Cre variant lacking Arg-292. We now demonstrate that MeP activation and cleavage are not blocked by substitution of Arg-173, or even by simultaneous substitution of Arg-173 and Arg-292 with alanine. Furthermore, Cre(R173A) and Cre(R292A) are competent in strand joining, Cre(R173A) being the less efficient. No joining activity is detected with Cre(R173A, R292A). Consistent with their ability to cleave and join strands, Cre(R173A) and Cre(R292A) can promote recombination between two MeP-full-site DNA partners. These findings shed light on the overall contribution of active site electrostatics, and tease apart the distinctive contributions of the individual arginines, to the chemical steps of recombination. They have general implications for active site mechanisms that promote important phosphoryl transfer reactions in nucleic acids.

    Cellular shear adhesion force measurement and simultaneous imaging by atomic force microscope

    Get PDF
    This paper presents a sensitive and fast method for measuring cellular shear adhesion force using an atomic force microscope (AFM). In this work, the AFM served both as a tool for imaging cells at the nanoscale and as a force sensor for measuring the shear adhesion force between the cell and the substrate. After imaging, cellular shear adhesion forces were measured at different positions on the cell at the nanoscale. Moreover, experiments were performed with different probe pushing speeds and at various locations on the cells to study their influence. The adhesion measured in the upper portion of the cell differs from that measured in the lower portion. This may indicate that the cancer cells tend toward metastasis after being cultured for 16 to 20 hours, which is significant for preventing metastasis in patients diagnosed with early cancer lesions. Furthermore, the cellular shear adhesion forces of two types of living cancer cells were obtained from measurements of the AFM cantilever deflection in the torsional and vertical directions. The results demonstrate that the shear adhesion force of cancer cells is twice that of the same type of cancer cells treated with TRAIL. The method also provides a way to measure the cellular shear adhesion force between the cell and the substrate, and to simultaneously explore cells using AFM imaging and manipulation.
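    At its core, the force-from-deflection conversion the abstract describes is Hooke's law applied to calibrated torsional (lateral) and vertical deflection signals of the cantilever. The following is a minimal sketch of that conversion only; the function names and the calibration values in the example are our own placeholders, not the authors' code or data.

```python
# Minimal sketch (not the authors' code): converting calibrated AFM
# cantilever deflection signals into forces via Hooke's law.

def shear_force(lateral_signal_v, lateral_sensitivity_nm_per_v, k_lateral_n_per_m):
    """Shear (lateral) force from the torsional deflection signal."""
    deflection_m = lateral_signal_v * lateral_sensitivity_nm_per_v * 1e-9
    return k_lateral_n_per_m * deflection_m  # newtons

def normal_force(vertical_signal_v, vertical_sensitivity_nm_per_v, k_normal_n_per_m):
    """Normal force from the vertical deflection signal."""
    deflection_m = vertical_signal_v * vertical_sensitivity_nm_per_v * 1e-9
    return k_normal_n_per_m * deflection_m  # newtons

# Example with illustrative placeholder calibration values:
f = shear_force(lateral_signal_v=0.5, lateral_sensitivity_nm_per_v=100.0,
                k_lateral_n_per_m=0.2)
print(f"{f * 1e9:.1f} nN")  # 10.0 nN
```

    In practice the lateral sensitivity and torsional spring constant must be calibrated per cantilever, which is where most of the experimental difficulty lies.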

    A conceptual modeling methodology based on niches and granularity

    Get PDF
    This paper presents a methodology for conceptual modeling which is based on a new modeling primitive, the niche, and associated constructs granularity and reconciliation. A niche is an environment where entities interact for a specific purpose, playing specific roles, and according to the norms and constraints of that environment. Granularity refers to the relative level of power or influence of an entity within a niche. Reconciliation is a relationship from N entities onto one reconciled entity, and represents explicitly a situation where two or more different perspectives of the same entity have been reconciled, by negotiation, into a single consensus view. The methodology we propose provides a systematic method of designing conceptual models along with a process for normalising inappropriate relationships. Normalising is a prescriptive process for identifying and remedying inconsistencies within a model based on granularities. Drawing on a number of case studies, we show how niches and granularity make complexity easier to manage, highlight inaccuracies in a model, identify opportunities for achieving project goals, and reduce semantic heterogeneity.
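    As a reading aid only, the three primitives above can be expressed as simple data structures. All class names, fields, and the consensus rule below are our own illustrative assumptions, not part of the paper's methodology.

```python
# Illustrative sketch only: a minimal data-structure reading of the
# primitives niche, granularity, and reconciliation. Names are ours.
from dataclasses import dataclass, field

@dataclass
class Entity:
    name: str
    granularity: int  # relative power/influence within a niche

@dataclass
class Niche:
    purpose: str                                  # why entities interact here
    members: list = field(default_factory=list)   # entities playing roles

    def add(self, entity):
        self.members.append(entity)

def reconcile(perspectives, consensus_name):
    """N perspectives of one entity -> a single reconciled entity.
    As a toy consensus rule, keep the highest granularity level."""
    level = max(e.granularity for e in perspectives)
    return Entity(consensus_name, level)

# Two perspectives of the same entity, reconciled into one consensus view:
project = Niche(purpose="course scheduling")
a = Entity("Lecturer (admin view)", granularity=2)
b = Entity("Lecturer (student view)", granularity=1)
project.add(reconcile([a, b], "Lecturer"))
print(project.members[0].granularity)  # 2
```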

    Planck pre-launch status : The Planck mission

    Get PDF
    Peer reviewed

    Planck early results I : The Planck mission

    Get PDF
    Peer reviewed

    Planck early results: first assessment of the High Frequency Instrument in-flight performance

    Get PDF
    The Planck High Frequency Instrument (HFI) is designed to measure the temperature and polarization anisotropies of the Cosmic Microwave Background and galactic foregrounds in six wide bands centered at 100, 143, 217, 353, 545 and 857 GHz, at an angular resolution of 10' (100 GHz), 7' (143 GHz), and 5' (217 GHz and higher). HFI has been operating flawlessly since launch on 14 May 2009. The bolometers cooled to 100 mK as planned. The settings of the readout electronics, such as the bolometer bias current, that optimize HFI's noise performance in orbit are nearly the same as those chosen during ground testing. Observations of Mars, Jupiter, and Saturn verified both the optical system and the time response of the detection chains. The optical beams are close to predictions from physical optics modeling. The time response of the detection chains is close to pre-launch measurements. The detectors suffer from an unexpectedly high flux of cosmic rays related to low solar activity. Owing to the redundancy of Planck's observation strategy, removing the few percent of data contaminated by glitches does not significantly affect the sensitivity. Cosmic rays significantly heat the bolometer plate, and the modulation of this heat load on timescales of days to months creates a common drift of all bolometer signals, which does not affect the scientific capabilities. Only high-energy cosmic-ray showers induce inhomogeneous heating, which is a probable source of low-frequency noise. Comment: Submitted to A&A. 22 pages, 6 tables, 21 figures. One of a set of simultaneous papers for the Planck Mission.

    Planck early results. II. The thermal performance of Planck

    Get PDF
    The performance of the Planck instruments in space is enabled by their low operating temperatures, 20 K for LFI and 0.1 K for HFI, achieved through a combination of passive radiative cooling and three active mechanical coolers. The scientific requirement for very broad frequency coverage led to two detector technologies with widely different temperature and cooling needs. Active coolers could satisfy these needs; a helium cryostat, as used by previous cryogenic space missions (IRAS, COBE, ISO, Spitzer, AKARI), could not. Radiative cooling is provided by three V-groove radiators and a large telescope baffle. The active coolers are a hydrogen sorption cooler (<20 K), a 4He Joule-Thomson cooler (4.7 K), and a 3He-4He dilution cooler (1.4 K and 0.1 K). The flight system was at ambient temperature at launch and cooled in space to operating conditions. The HFI bolometer plate reached 93 mK on 3 July 2009, 50 days after launch. The solar panel always faces the Sun, shadowing the rest of Planck, and operates at a mean temperature of 384 K. At the other end of the spacecraft, the telescope baffle operates at 42.3 K and the telescope primary mirror operates at 35.9 K. The temperatures of key parts of the instruments are stabilized by both active and passive methods. Temperature fluctuations are driven by changes in the distance from the Sun, sorption cooler cycling and fluctuations in gas-liquid flow, and fluctuations in cosmic ray flux on the dilution and bolometer plates. These fluctuations do not compromise the science data.

    Euclid preparation XVIII. The NISP photometric system

    Get PDF
    Euclid will be the first space mission to survey most of the extragalactic sky in the 0.95–2.02 μm range, to a 5σ point-source median depth of 24.4 AB mag. This unique photometric dataset will find wide use beyond Euclid's core science. In this paper, we present accurate computations of the Euclid Y_E, J_E, and H_E passbands used by the Near-Infrared Spectrometer and Photometer (NISP), and the associated photometric system. We pay particular attention to passband variations in the field of view, accounting for, among other factors, spatially variable filter transmission and variations in the angle of incidence on the filter substrate using optical ray tracing. The response curves' cut-on and cut-off wavelengths – and their variation in the field of view – are determined with ∼0.8 nm accuracy, essential for the photometric redshift accuracy required by Euclid. After computing the photometric zero points in the AB mag system, we present linear transformations from and to common ground-based near-infrared photometric systems, for normal stars, red and brown dwarfs, and galaxies separately. A Python tool to compute accurate magnitudes for arbitrary passbands and spectral energy distributions is provided. We discuss various factors, from space weathering to material outgassing, that may slowly alter Euclid's spectral response. At the absolute flux scale, the Euclid in-flight calibration program connects the NISP photometric system to Hubble Space Telescope spectrophotometric white dwarf standards; at the relative flux scale, the chromatic evolution of the response is tracked at the milli-mag level. In this way, we establish an accurate photometric system that is fully controlled throughout Euclid's lifetime.
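    The core computation behind such a tool — a synthetic AB magnitude for an arbitrary passband and spectral energy distribution — can be sketched in a few lines. This is not the paper's Python tool; the toy bell-shaped passband and flat test spectrum below are illustrative assumptions.

```python
# Minimal sketch of synthetic AB photometry for an arbitrary passband;
# NOT the Euclid/NISP tool the paper provides. Toy passband and flat
# test spectrum below are illustrative assumptions.
import numpy as np

F_AB = 3.631e-20  # 3631 Jy in erg s^-1 cm^-2 Hz^-1 (AB reference flux)

def _trapz(y, x):
    """Trapezoidal integration (avoids NumPy version differences)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * (x[1:] - x[:-1])))

def ab_magnitude(nu_hz, f_nu, transmission):
    """AB magnitude of the photon-weighted mean flux density in a band."""
    num = _trapz(f_nu * transmission / nu_hz, nu_hz)
    den = _trapz(transmission / nu_hz, nu_hz)
    return -2.5 * np.log10(num / den / F_AB)

# Sanity check: a flat spectrum at the AB reference flux must give
# magnitude 0 in any passband.
nu = np.linspace(1.5e14, 3.2e14, 500)              # roughly 0.94-2.0 um in Hz
T = np.exp(-0.5 * ((nu - 2.3e14) / 3.0e13) ** 2)   # toy bell-shaped passband
f = np.full_like(nu, F_AB)
print(ab_magnitude(nu, f, T))  # essentially 0.0 (machine precision)
```

    The 1/ν weighting reflects photon-counting detectors; an energy-integrating convention would drop it, which is one of the details a production tool must pin down.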
