
    Roles of replication fork-interacting and Chk1-activating domains from claspin in a DNA replication checkpoint response

    Claspin is essential for the ATR-dependent activation of Chk1 in Xenopus egg extracts containing incompletely replicated DNA. Claspin associates with replication forks upon origin unwinding. We show that Claspin contains a replication fork-interacting domain (RFID, residues 265–605) that associates with Cdc45, DNA polymerase ε, replication protein A, and two replication factor C complexes on chromatin. The RFID contains two basic patches (BP1 and BP2) at amino acids 265–331 and 470–600, respectively. Deletion of either BP1 or BP2 compromises optimal binding of Claspin to chromatin. Absence of BP1 has no effect on the ability of Claspin to mediate activation of Chk1. By contrast, removal of BP2 causes a large reduction in the Chk1-activating potency of Claspin. We also find that Claspin contains a small Chk1-activating domain (residues 776–905) that does not bind stably to chromatin but is fully effective at high concentrations for mediating activation of Chk1. These results indicate that stable retention of Claspin on chromatin is not necessary for activation of Chk1. Instead, our findings suggest that only transient interaction of Claspin with replication forks potentiates its Chk1-activating function. Another implication of this work is that stable binding of Claspin to chromatin may play a role in other functions besides the activation of Chk1.

    The Faint End of the Quasar Luminosity Function at z ~ 4: Implications for Ionization of the Intergalactic Medium and Cosmic Downsizing

    We present an updated determination of the z ~ 4 QSO luminosity function (QLF), improving the quality of the determination of the faint end of the QLF presented by Glikman et al. (2010). We have observed an additional 43 candidates from our survey sample, yielding one additional QSO at z = 4.23 and increasing the completeness of our spectroscopic follow-up to 48% for candidates brighter than R = 24 over our survey area of 3.76 deg^2. We study the effect of using K-corrections to compute the rest-frame absolute magnitude at 1450 Å compared with measuring M_(1450) directly from the object spectra. We find a luminosity-dependent bias: template-based K-corrections overestimate the luminosity of low-luminosity QSOs, likely due to their reliance on templates derived from higher luminosity QSOs. Combining our sample with bright quasars from the Sloan Digital Sky Survey and using spectrum-based M_(1450) for all the quasars, we fit a double power law to the binned QLF. Our best fit has a bright-end slope, α = –3.3 ± 0.2, and faint-end slope, β = –1.6^(+0.8)_(–0.6). Our new data revise the faint-end slope of the QLF down to flatter values similar to those measured at z ~ 3. The break luminosity, though poorly constrained, is at M* = –24.1^(+0.7)_(–1.9), approximately 1-1.5 mag fainter than at z ~ 3. This QLF implies that QSOs account for about half the radiation needed to ionize the intergalactic medium at these redshifts.
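    The abstract quotes double power-law parameters without the functional form. A minimal sketch follows, assuming the standard magnitude-space parameterization; the normalization phi_star is an illustrative placeholder not given above.

```python
import numpy as np

def qlf_double_power_law(M, phi_star, M_star, alpha, beta):
    """Standard double power-law QLF in absolute magnitude:
    Phi(M) = Phi* / (10^[0.4(alpha+1)(M - M*)] + 10^[0.4(beta+1)(M - M*)]).
    """
    dM = M - M_star
    return phi_star / (10.0 ** (0.4 * (alpha + 1.0) * dM) +
                       10.0 ** (0.4 * (beta + 1.0) * dM))

# Quoted best-fit slopes and break luminosity; phi_star is a placeholder value.
M_1450 = np.linspace(-28.0, -22.0, 61)
phi = qlf_double_power_law(M_1450, phi_star=1e-7, M_star=-24.1,
                           alpha=-3.3, beta=-1.6)
```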

    Computational Design and Characterization of a Temperature-Sensitive Plasmid Replicon for Gram Positive Thermophiles

    Temperature-sensitive (Ts) plasmids are useful tools for genetic engineering, but there are currently none compatible with the gram positive, thermophilic, obligate anaerobe Clostridium thermocellum. Traditional mutagenesis techniques yield Ts mutants at a low frequency and therefore require the development of high-throughput screening protocols, which are also not available for this organism. Recently there has been progress in the development of computer algorithms that can predict Ts mutations. Most plasmids currently used for genetic modification of C. thermocellum are based on the replicon of plasmid pNW33N, which replicates using the RepB replication protein. To address the lack of a Ts plasmid for this organism, we set out to create one by mutating the gene coding for the RepB replication protein, using an algorithm designed by Varadarajan et al. (1996) for predicting Ts mutants based on the amino-acid sequence of the protein.

    Perturbative matching of staggered four-fermion operators with hypercubic fat links

    We calculate the one-loop matching coefficients between continuum and lattice four-fermion operators for lattice operators constructed using staggered fermions and improved by the use of fattened links. In particular, we consider hypercubic fat links and SU(3)-projected Fat-7 links, and their mean-field improved versions. We calculate only current-current diagrams, so that our results apply to operators whose flavor structure does not allow "eye diagrams". We present general formulae, based on two independent approaches, and give numerical results for the cases in which the operators have the taste (staggered flavor) of the pseudo-Goldstone pion. We find that the one-loop corrections are reduced to the 10-20% level, resolving the problem of large perturbative corrections for staggered fermion calculations of matrix elements.
    Comment: 37 pages, no figures, 20 tables
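    Schematically, one-loop matching of this kind expresses a continuum operator renormalized at scale μ in terms of lattice operators at scale 1/a. The sketch below shows only the generic structure; sign and convention choices vary between papers, and the actual coefficients c_ij are those computed in the paper, not reproduced here.

```latex
% Generic structure of one-loop operator matching (conventions vary):
\begin{equation}
  \mathcal{O}_i^{\mathrm{cont}}(\mu)
  = \sum_j \left[ \delta_{ij}
    + \frac{g^2}{16\pi^2}
      \left( -\gamma_{ij} \ln(\mu a) + c_{ij} \right)
    \right] \mathcal{O}_j^{\mathrm{lat}}(1/a)
  + \mathcal{O}(g^4, a) .
\end{equation}
```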

    A Call to Arms: Revisiting Database Design

    Good database design is crucial to obtain a sound, consistent database, and - in turn - good database design methodologies are the best way to achieve the right design. These methodologies are taught to most Computer Science undergraduates, as part of any Introduction to Database class. They can be considered part of the "canon", and indeed, the overall approach to database design has been unchanged for years. Moreover, none of the major database research assessments identify database design as a strategic research direction. Should we conclude that database design is a solved problem? Our thesis is that database design remains a critical unsolved problem. Hence, it should be the subject of more research. Our starting point is the observation that traditional database design is not used in practice - and if it were used, it would result in designs that are not well adapted to current environments. In short, database design has failed to keep up with the times. In this paper, we put forth arguments to support our viewpoint, analyze the root causes of this situation, and suggest some avenues of research.
    Comment: Removed spurious column break. Nothing else was changed

    Accelerated Cardiac Diffusion Tensor Imaging Using Joint Low-Rank and Sparsity Constraints

    Objective: The purpose of this manuscript is to accelerate cardiac diffusion tensor imaging (CDTI) by integrating low-rankness and compressed sensing. Methods: Diffusion-weighted images exhibit both transform sparsity and low-rankness. These properties can be jointly exploited to accelerate CDTI, especially when a phase map is applied to correct for phase inconsistency across diffusion directions, thereby enhancing low-rankness. The proposed method is evaluated both ex vivo and in vivo, and is compared to methods using either a low-rank or a sparsity constraint alone. Results: Compared to using a low-rank or sparsity constraint alone, the proposed method preserves more accurate helix angle features, the transmural continuum across the myocardial wall, and mean diffusivity at higher acceleration, while yielding significantly lower bias and a higher intraclass correlation coefficient. Conclusion: Low-rankness and compressed sensing together facilitate acceleration for both ex vivo and in vivo CDTI, improving reconstruction accuracy compared to employing either constraint alone. Significance: Compared to previous methods for accelerating CDTI, the proposed method has the potential to reach higher acceleration while preserving myofiber architecture features, which may allow more spatial coverage, higher spatial resolution, and a shorter temporal footprint in the future.
    Comment: 11 pages, 16 figures, published in IEEE Transactions on Biomedical Engineering
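    As an illustration of the kind of joint low-rank plus sparsity reconstruction described above (not the authors' implementation: the operators, the naive image-domain sparsity, the alternating proximal steps, and all parameter values below are simplifying assumptions), a minimal sketch is:

```python
import numpy as np

def svt(X, tau):
    """Singular-value soft-thresholding: prox of tau * (nuclear norm)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def soft(X, tau):
    """Complex soft-thresholding: prox of tau * (l1 norm)."""
    mag = np.abs(X)
    return np.maximum(1.0 - tau / np.maximum(mag, 1e-12), 0.0) * X

def reconstruct(y, mask, shape, n_dirs, lam_lr=0.01, lam_sp=0.005, n_iter=100):
    """Toy joint low-rank + sparse reconstruction of diffusion-weighted images.

    y, mask : undersampled k-space data and binary sampling mask, (ny, nx, n_dirs).
    Images across diffusion directions are stacked into a Casorati matrix
    (pixels x directions); low-rankness is promoted by SVT and sparsity
    (naively, in the image domain) by soft-thresholding, each applied in turn
    after a gradient step on the data-consistency term.
    """
    ny, nx = shape
    X = np.zeros((ny * nx, n_dirs), dtype=complex)
    for _ in range(n_iter):
        imgs = X.reshape(ny, nx, n_dirs)
        ksp = np.fft.fft2(imgs, axes=(0, 1), norm="ortho")
        grad = np.fft.ifft2(mask * (mask * ksp - y), axes=(0, 1), norm="ortho")
        X = X - grad.reshape(ny * nx, n_dirs)   # unit step; operator norm <= 1
        X = svt(X, lam_lr)                      # approximate prox of nuclear norm
        X = soft(X, lam_sp)                     # approximate prox of l1 norm
    return X.reshape(ny, nx, n_dirs)
```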