
    Manipulating gene expression for the metabolic engineering of plants

    Introducing and expressing foreign genes in plants presents many technical challenges that are not encountered with microbial systems. This review addresses the variety of issues that must be considered, and the range of options available, when choosing transformation systems and designing recombinant transgenes to ensure appropriate expression in plant cells. Tissue specificity, proper developmental regulation, and correct subcellular localization of products must all be dealt with for successful metabolic engineering in plants.

    A Bayesian approach to star-galaxy classification

    Star-galaxy classification is one of the most fundamental data-processing tasks in survey astronomy, and a critical starting point for the scientific exploitation of survey data. For bright sources this classification can be done with almost complete reliability, but for the numerous sources close to a survey's detection limit each image encodes only limited morphological information. In this regime, from which many of the new scientific discoveries are likely to come, it is vital to utilise all the available information about a source, both from multiple measurements and from prior knowledge about the star and galaxy populations. It is also more useful and realistic to provide classification probabilities than decisive classifications. All these desiderata can be met by adopting a Bayesian approach to star-galaxy classification, and we develop a very general formalism for doing so. An immediate implication of applying Bayes's theorem to this problem is that it is formally impossible to combine morphological measurements in different bands without using colour information as well; however, we develop several approximations that disregard colour information as much as possible. The resultant scheme is applied to data from the UKIRT Infrared Deep Sky Survey (UKIDSS), and tested by comparing the results to deep Sloan Digital Sky Survey (SDSS) Stripe 82 measurements of the same sources. The Bayesian classification probabilities obtained from the UKIDSS data agree well with the deep SDSS classifications both overall (a mismatch rate of 0.022, compared to 0.044 for the UKIDSS pipeline classifier) and close to the UKIDSS detection limit (a mismatch rate of 0.068, compared to 0.075 for the UKIDSS pipeline classifier). The Bayesian formalism developed here can be applied to improve the reliability of any star-galaxy classification scheme based on the measured values of morphology statistics alone. Comment: Accepted 22 November 2010, 19 pages, 17 figures.
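    The core step of such a scheme, applying Bayes's theorem to a single morphology statistic with priors on the star and galaxy populations, can be sketched as follows. Everything numerical here (the Gaussian class likelihoods, the prior star fraction, the locus positions) is an illustrative assumption, not a value taken from the paper:

```python
import math

def gaussian(x, mu, sigma):
    """Gaussian probability density at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def p_star(m, prior_star=0.2, star=(0.0, 1.0), galaxy=(3.0, 1.5)):
    """Posterior probability that a source is a star, given one morphology
    statistic m, via Bayes's theorem with Gaussian class likelihoods.
    The likelihood parameters and prior star fraction are illustrative."""
    like_star = gaussian(m, *star) * prior_star
    like_gal = gaussian(m, *galaxy) * (1.0 - prior_star)
    return like_star / (like_star + like_gal)

# A measurement near the assumed stellar locus favours "star" despite the
# low prior; one near the galaxy locus strongly favours "galaxy".
print(round(p_star(0.0), 3))
print(round(p_star(3.0), 3))
```

    Returning the posterior probability itself, rather than thresholding it, is what allows downstream analyses to propagate classification uncertainty, which is the point the abstract makes about probabilities versus decisive classifications.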

    Absence of Whisker-Related Pattern Formation in Mice with NMDA Receptors Lacking Coincidence Detection Properties and Calcium Signaling

    Precise refinement of synaptic connectivity is the result of activity-dependent mechanisms in which coincidence-dependent calcium signaling by NMDA receptors (NMDARs) under control of the voltage-dependent Mg2+ block might play a special role. In the developing rodent trigeminal system, the pattern of synaptic connections between whisker-specific inputs and their target cells in the brainstem is refined to form functionally and morphologically distinct units (barrelettes). To test the role of NMDA receptor signaling in this process, we introduced the N598R mutation into the native NR1 gene. This leads to the expression of functional NMDARs that are Mg2+ insensitive and Ca2+ impermeable. Newborn mice expressing exclusively NR1 N598R-containing NMDARs do not show any whisker-related patterning in the brainstem, whereas the topographic projection of trigeminal afferents and gross brain morphology appear normal. Furthermore, the NR1 N598R mutation does not affect expression levels of NMDAR subunits and other important neurotransmitter receptors. Our results show that coincidence detection by, and/or Ca2+ permeability of, NMDARs is necessary for the development of somatotopic maps in the brainstem and suggest that highly specific signaling underlies synaptic refinement.

    Tree planting in organic soils does not result in net carbon sequestration on decadal timescales

    Tree planting is increasingly being proposed as a strategy to combat climate change through carbon (C) sequestration in tree biomass. However, total ecosystem C storage, which includes soil organic C (SOC), must be considered to determine whether planting trees for climate change mitigation results in increased C storage. We show that planting two native tree species of widespread Eurasian distribution (Betula pubescens and Pinus sylvestris) onto heather (Calluna vulgaris) moorland with podzolic and peaty podzolic soils in Scotland did not lead to an increase in net ecosystem C stock 12 or 39 years after planting. Plots with trees had greater soil respiration and lower SOC in organic soil horizons than heather control plots, and the decline in SOC cancelled out the increment in C stocks in tree biomass on decadal timescales. At all four experimental sites sampled, there was no net gain in ecosystem C stocks 12–39 years after afforestation: we found a net ecosystem C loss at one of four sites with deciduous B. pubescens stands, no net gain in ecosystem C at three sites planted with B. pubescens, and no net gain at additional stands of P. sylvestris. We hypothesize that altered mycorrhizal communities and autotrophic C inputs have led to positive ‘priming’ of soil organic matter, resulting in SOC loss and constraining the benefits of tree planting for ecosystem C sequestration. The results are of direct relevance to current policies, which promote tree planting on the assumption that it will increase net ecosystem C storage and contribute to climate change mitigation. Ecosystem-level biogeochemistry and C fluxes must be better quantified and understood before we can be assured that large-scale tree planting in regions with considerable pre-existing SOC stocks will have the intended policy and climate change mitigation outcomes.

    Gravitationally lensed quasars and supernovae in future wide-field optical imaging surveys

    Cadenced optical imaging surveys in the next decade will be capable of detecting time-varying galaxy-scale strong gravitational lenses in large numbers, increasing the size of the statistically well-defined samples of multiply-imaged quasars by two orders of magnitude, and discovering the first strongly-lensed supernovae. We carry out a detailed calculation of the likely yields of several planned surveys, using realistic distributions for the lens and source properties and taking magnification bias and image configuration detectability into account. We find that upcoming wide-field synoptic surveys should detect several thousand lensed quasars. In particular, the LSST should find 8000 lensed quasars, 3000 of which will have well-measured time delays, and also ~130 lensed supernovae, compared with the ~15 lensed supernovae predicted to be found by JDEM. We predict the quad fraction to be ~15% for the lensed quasars and ~30% for the lensed supernovae. Generating a mock catalogue of around 1500 well-observed double-image lenses, we compute the available precision on the Hubble constant and the dark energy equation-of-state parameters for the time delay distance experiment (assuming priors from Planck): the predicted marginalised 68% confidence intervals are \sigma(w_0)=0.15, \sigma(w_a)=0.41, and \sigma(h)=0.017. While this is encouraging in the sense that these uncertainties are only 50% larger than those predicted for a space-based type-Ia supernova sample, we show how the dark energy figure of merit degrades with decreasing knowledge of the lens mass distribution. (Abridged) Comment: 17 pages, 10 figures, 3 tables, accepted for publication in MNRAS; mock LSST lens catalogue may be available at http://kipac-prod.stanford.edu/collab/research/lensing/mocklen
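    The reason time delays constrain the Hubble constant is that the observable they measure, the time-delay distance D_dt = (1 + z_lens) D_l D_s / D_ls, is inversely proportional to h. A minimal numerical sketch in flat LambdaCDM follows; the lens and source redshifts and the cosmological parameters are illustrative choices, and the lens modelling that the paper's forecast actually depends on is entirely omitted:

```python
import numpy as np

C_KM_S = 299792.458  # speed of light in km/s

def comoving_distance(z, h=0.7, om=0.3):
    """Line-of-sight comoving distance (Mpc) in flat LambdaCDM,
    via a trapezoid-rule integral of 1/E(z)."""
    zz = np.linspace(0.0, z, 2001)
    f = 1.0 / np.sqrt(om * (1.0 + zz) ** 3 + (1.0 - om))
    dz = zz[1] - zz[0]
    integral = dz * (f.sum() - 0.5 * (f[0] + f[-1]))
    return C_KM_S / (100.0 * h) * integral

def time_delay_distance(z_lens, z_src, h=0.7, om=0.3):
    """D_dt = (1 + z_lens) * D_l * D_s / D_ls, using angular diameter
    distances for a flat universe."""
    dc_l = comoving_distance(z_lens, h, om)
    dc_s = comoving_distance(z_src, h, om)
    d_l = dc_l / (1.0 + z_lens)            # angular diameter distance to lens
    d_s = dc_s / (1.0 + z_src)             # ... to source
    d_ls = (dc_s - dc_l) / (1.0 + z_src)   # ... lens to source (flat case)
    return (1.0 + z_lens) * d_l * d_s / d_ls

# Every distance scales as 1/h, so D_dt does too: a measured time delay
# plus a lens model pins down the Hubble constant.
ratio = time_delay_distance(0.5, 2.0, h=0.70) / time_delay_distance(0.5, 2.0, h=0.77)
print(round(ratio, 3))  # 1.1 == 0.77 / 0.70
```

    The 1/h scaling is exact here because h enters only through the prefactor of the comoving distance; the sensitivity to w_0 and w_a quoted in the abstract enters through the E(z) integrand instead.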

    Recycling oriented vertical vibratory separation of copper and polypropylene particles

    Vibration has been employed in various engineering processes for material handling. The well-known Brazil nut effect, in which large particles tend to rise to the top under vibration, has motivated extensive research on vibration-induced particle segregation; particle size and density are the two determining factors for particle behaviour under vibration. Previous research at the University of Nottingham has shown vertical vibratory separation to be a promising, environmentally friendly mechanical method for recovering the metallic fraction from shredded Waste Electrical and Electronic Equipment (WEEE) streams. A pilot-scale thin-cell vibratory separator has been developed to investigate the potential for WEEE recycling applications, with shredded copper and polypropylene particles chosen to mimic the metallic and non-metallic fractions in WEEE. This paper presents vibratory separation experiments conducted in a controlled environment and with the addition of a solid lubricant. The results demonstrate the effect of relative humidity and solid lubricant on improving the flowability of the granular system, and hence on achieving successful vibratory separation. The proposed mechanisms for the action of moisture and solid lubricant are a lubrication effect and the elimination of static electricity.

    Data challenges of time domain astronomy

    Astronomy has been at the forefront of the development of the techniques and methodologies of data-intensive science for over a decade, with large sky surveys and distributed efforts such as the Virtual Observatory. However, it faces a new data deluge with the next generation of synoptic sky surveys, which are opening up the time domain for discovery and exploration. This brings both new scientific opportunities and fresh challenges, in terms of data rates from robotic telescopes and exponential complexity in linked data, but also for data mining algorithms used in classification and decision making. In this paper, we describe how an informatics-based approach, part of the so-called "fourth paradigm" of scientific discovery, is emerging to deal with these challenges. We review our experiences with the Palomar-Quest and Catalina Real-Time Transient Sky Surveys; in particular, we address the heterogeneity of data associated with transient astronomical events (and other sensor networks) and how to manage and analyze it. Comment: 15 pages, 3 figures, to appear in special issue of Distributed and Parallel Databases on Data Intensive eScience.

    Photometric redshifts for the next generation of deep radio continuum surveys - I: template fitting

    We present a study of photometric redshift performance for galaxies and active galactic nuclei detected in deep radio continuum surveys. Using two multi-wavelength datasets, over the NOAO Deep Wide Field Survey Boötes and COSMOS fields, we assess photometric redshift (photo-z) performance for a sample of 4,500 radio continuum sources with spectroscopic redshifts, relative to those of 63,000 non-radio-detected sources in the same fields. We investigate the performance of three photometric redshift template sets as a function of redshift, radio luminosity, and infrared/X-ray properties. We find that no single template library is able to provide the best performance across all subsets of the radio-detected population, with variation in the optimum template set both between subsets and between fields. Through a hierarchical Bayesian combination of the photo-z estimates from all three template sets, we are able to produce a consensus photo-z estimate which equals or improves upon the performance of any individual template set.
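    The consensus estimate can be pictured as a weighted mixture of the redshift posteriors produced by the individual template sets, with weights reflecting the relative support for each set. The sketch below is a deliberately simplified stand-in for the paper's hierarchical Bayesian combination; all numbers (posterior means, widths, and weights) are invented for illustration:

```python
import numpy as np

def gaussian_pdf(z, mu, sigma):
    """Gaussian redshift posterior evaluated on a grid."""
    return np.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def consensus_photoz(z, pdfs, weights):
    """Weighted mixture of per-template-set posteriors, renormalised
    on the (uniform) redshift grid."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    mix = sum(wi * p for wi, p in zip(w, pdfs))
    dz = z[1] - z[0]
    return mix / (mix.sum() * dz)

z = np.linspace(0.0, 3.0, 601)
pdfs = [gaussian_pdf(z, 1.00, 0.08),   # template set A (invented posterior)
        gaussian_pdf(z, 1.05, 0.10),   # template set B (invented posterior)
        gaussian_pdf(z, 0.95, 0.12)]   # template set C (invented posterior)
weights = [0.5, 0.3, 0.2]              # assumed relative support per set
p = consensus_photoz(z, pdfs, weights)
z_peak = z[np.argmax(p)]
print(round(z_peak, 2))  # consensus peak sits among the individual means
```

    A full hierarchical treatment would infer the weights from the data themselves rather than fixing them by hand, which is what lets the combination match or beat the best individual template set in each subset of the population.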