
    Deciphering the past to inform the future: preparing for the next (“really big”) extreme event

    Climate change will bring more extremes in temperature and precipitation that will impact productivity and ecosystem resilience throughout agroecosystems worldwide. Historical events can be used to identify drivers that will shape future events. A catastrophic drought in the US in the 1930s resulted in an abrupt boundary between areas severely impacted by the Dust Bowl and areas that were less severely affected. Historical primary production data confirmed the location of this boundary at the border between two states (Nebraska and Iowa). Local drivers of weather and soils explained production responses across the boundary before and after the drought (1926–1948). During the drought, however, features at the landscape scale (soil properties and wind velocities) and regional scale (the Missouri River, its floodplain, and the nearby Loess Hills) explained most of the observed variance in primary production. The impact of future extreme events may be affected by land surface properties that either accentuate or ameliorate the effects of these events. Consideration of large-scale geomorphic processes may be necessary to interpret and manage for catastrophic events.

    Topologically Protected Quantum State Transfer in a Chiral Spin Liquid

    Topology plays a central role in ensuring the robustness of a wide variety of physical phenomena. Notable examples range from the robust current-carrying edge states associated with the quantum Hall and quantum spin Hall effects to proposals involving topologically protected quantum memory and quantum logic operations. Here, we propose and analyze a topologically protected channel for the transfer of quantum states between remote quantum nodes. In our approach, state transfer is mediated by the edge mode of a chiral spin liquid. We demonstrate that the proposed method is intrinsically robust to realistic imperfections associated with disorder and decoherence. Possible experimental implementations and applications to the detection and characterization of spin liquid phases are discussed.

    mTCTScan: a comprehensive platform for annotation and prioritization of mutations affecting drug sensitivity in cancers

    Cancer therapies have progressed rapidly in recent years, with a number of novel small-molecule kinase inhibitors and monoclonal antibodies now widely used to treat various types of human cancer. During cancer treatment, mutations can have important effects on drug sensitivity. However, the relationship between tumor genomic profiles and the effectiveness of cancer drugs remains elusive. We introduce the Mutation To Cancer Therapy Scan (mTCTScan) web server (http://jjwanglab.org/mTCTScan), which systematically analyzes mutations affecting cancer drug sensitivity based on individual genomic profiles. The platform was developed by leveraging the latest knowledge of mutation-drug sensitivity associations and the results of large-scale chemical screening in human cancer cell lines. Using an evidence-based scoring scheme that integrates current evidence, mTCTScan prioritizes mutations according to their associations with cancer drugs and preclinical compounds. It can also report related drugs/compounds with a sensitivity classification that takes the context of the entire genomic profile into account. In addition, mTCTScan incorporates comprehensive filtering functions and cancer-related annotations to better interpret mutation effects and their associations with cancer drugs. This platform will greatly benefit both researchers and clinicians in interrogating the mechanisms of mutation-dependent drug response, with significant impact on cancer precision medicine.
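
    The abstract does not spell out the server's actual scoring algorithm, so the sketch below only illustrates the described idea in outline: match a patient's mutations against curated mutation-drug sensitivity associations, weight each hit by its evidence level, and rank. All names, weights, and data structures here are hypothetical, not mTCTScan's implementation.

# Hypothetical illustration of evidence-weighted mutation prioritization,
# loosely following the workflow described for mTCTScan; the weights and
# data layout are assumptions, not the server's actual scheme.
from dataclasses import dataclass

# Assumed evidence tiers: clinical validation > preclinical study > cell-line screen.
EVIDENCE_WEIGHT = {"clinical": 3.0, "preclinical": 2.0, "cell_line": 1.0}

@dataclass
class Association:
    gene: str
    variant: str          # e.g. "V600E"
    drug: str
    effect: str           # "sensitive" or "resistant"
    evidence: str         # key into EVIDENCE_WEIGHT

def prioritize(patient_variants, knowledge_base):
    """Rank a patient's mutations by the evidence-weighted strength of their drug associations."""
    results = []
    for gene, variant in patient_variants:
        hits = [a for a in knowledge_base if a.gene == gene and a.variant == variant]
        score = sum(EVIDENCE_WEIGHT[a.evidence] for a in hits)
        results.append(((gene, variant), score, [(a.drug, a.effect) for a in hits]))
    return sorted(results, key=lambda r: r[1], reverse=True)

if __name__ == "__main__":
    kb = [Association("BRAF", "V600E", "vemurafenib", "sensitive", "clinical"),
          Association("KRAS", "G12D", "erlotinib", "resistant", "preclinical")]
    for (gene, var), score, drugs in prioritize([("BRAF", "V600E"), ("KRAS", "G12D")], kb):
        print(gene, var, score, drugs)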

    Noncommutative geometry inspired black holes in higher dimensions at the LHC

    When embedding models of noncommutative geometry inspired black holes into the paradigm of large extra dimensions, it is natural to relate the noncommutativity scale to the higher-dimensional Planck scale. If the Planck scale is of the order of a TeV, noncommutative geometry inspired black holes could become accessible to experiments. In this paper, we present a detailed phenomenological study of the production and decay of these black holes at the Large Hadron Collider (LHC). Noncommutative geometry inspired black holes are relatively cold and can be well described by the microcanonical ensemble during their entire decay. One of the main consequences of the model is the existence of a black hole remnant. The mass of the black hole remnant increases with decreasing noncommutativity mass scale and decreasing number of dimensions. The experimental signatures could be quite different from previous studies of black holes and remnants at the LHC, since the mass of the remnant could be well above the Planck scale. Although the black hole remnant can be very heavy, and perhaps even charged, it could result in very little activity in the central detectors of the LHC experiments compared with the usual anticipated black hole signatures. If this type of noncommutative geometry inspired black hole can be produced and detected, it would result in an additional mass threshold above the Planck scale at which new physics occurs.

    Towards a large-scale quantum simulator on diamond surface at room temperature

    Strongly correlated quantum many-body systems exhibit a variety of exotic phases with long-range quantum correlations, such as spin liquids and supersolids. Despite the rapid increase in the computational power of modern computers, the numerical simulation of these complex systems becomes intractable even for a few dozen particles. Feynman's idea of quantum simulators offers an innovative way to bypass this computational barrier. However, the proposed realizations of such devices either require very low temperatures (ultracold gases in optical lattices, trapped ions, superconducting devices) and considerable technological effort, or are extremely hard to scale in practice (NMR, linear optics). In this work, we propose a new architecture for a scalable quantum simulator that can operate at room temperature. It consists of strongly interacting nuclear spins attached to the diamond surface by direct chemical treatment of the surface, or by means of a functionalized graphene sheet. The initialization, control and read-out of this quantum simulator can be accomplished with nitrogen-vacancy centers implanted in diamond. The system can be engineered to simulate a wide variety of interesting strongly correlated models with long-range dipole-dipole interactions. Owing to the superior coherence times of nuclear spins and nitrogen-vacancy centers in diamond, our proposal offers new opportunities towards large-scale quantum simulation at room temperature.
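
    For concreteness, the long-range dipole-dipole coupling between surface nuclear spins referred to above has the standard textbook form (a generic expression, not a parameterization taken from this work):

        H_dd = \sum_{i<j} \frac{\mu_0 \gamma_i \gamma_j \hbar^2}{4 \pi r_{ij}^{3}} \left[ \mathbf{S}_i \cdot \mathbf{S}_j - 3 (\mathbf{S}_i \cdot \hat{\mathbf{r}}_{ij})(\mathbf{S}_j \cdot \hat{\mathbf{r}}_{ij}) \right],

    where r_{ij} is the distance between spins i and j and \gamma_i their gyromagnetic ratios; the 1/r_{ij}^3 decay is what makes the couplings long-ranged, with the coupling pattern set by the geometry of the spins on the surface.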

    Total flavonoid fraction of the Herba epimedii extract suppresses urinary calcium excretion and improves bone properties in ovariectomised mice


    Snapshot photoacoustic topography through an ergodic relay of optical absorption in vivo

    Photoacoustic tomography (PAT) has demonstrated versatile biomedical applications, ranging from tracking single cells to monitoring whole-body dynamics of small animals and diagnosing human breast cancer. Currently, PAT has two major implementations: photoacoustic computed tomography (PACT) and photoacoustic microscopy (PAM). PACT uses a multi-element ultrasonic array for parallel detection, which is relatively complex and expensive. In contrast, PAM requires point-by-point scanning with a single-element detector, which limits imaging throughput. The trade-off between system cost and throughput demands a new imaging method. To this end, we have developed photoacoustic topography through an ergodic relay (PATER). PATER can capture a wide-field image with only a single-element ultrasonic detector from a single laser shot. This protocol describes the detailed procedures for PATER system construction, including component selection, equipment setup and system alignment. A step-by-step guide for in vivo imaging of a mouse brain is provided as an example application. Data acquisition, image reconstruction and troubleshooting procedures are also elaborated. It takes ~130 min to carry out this protocol, including ~60 min for both calibration and snapshot wide-field data acquisition using a laser with a 2-kHz pulse repetition rate. PATER offers low-cost snapshot wide-field imaging of fast dynamics, such as visualizing blood pulse wave propagation and tracking melanoma tumor cell circulation in mice in vivo. We envision that PATER will have wide biomedical applications and anticipate that the compact size of the setup will allow it to be further developed as a wearable device to monitor human vital signs.
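
    As a rough illustration of the reconstruction step described in this protocol (inverting a single-shot relay signal against a point-by-point calibration), the sketch below frames it as a regularized linear inversion. The ridge regularizer, array shapes, and synthetic data are illustrative assumptions, not the protocol's prescribed parameters or the authors' code.

# Minimal sketch of calibration-matrix-based reconstruction, in the spirit of the
# PATER workflow described above (assumptions only, not the released implementation).
# K[:, i] is assumed to hold the relay signal recorded while illuminating pixel i
# during calibration; s is the single-shot wide-field signal.
import numpy as np

def reconstruct(K: np.ndarray, s: np.ndarray, lam: float = 1e-2) -> np.ndarray:
    """Ridge-regularized least squares: argmin_x ||K x - s||^2 + lam * ||x||^2."""
    n_pix = K.shape[1]
    return np.linalg.solve(K.T @ K + lam * np.eye(n_pix), K.T @ s)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_samples, side = 2000, 32
    K = rng.standard_normal((n_samples, side * side))       # per-pixel calibration responses
    truth = rng.random(side * side)                          # hypothetical absorption map
    s = K @ truth + 0.01 * rng.standard_normal(n_samples)    # simulated single-shot signal
    img = reconstruct(K, s).reshape(side, side)
    print(img.shape)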

    Algebraic Comparison of Partial Lists in Bioinformatics

    The outcome of a functional genomics pipeline is usually a partial list of genomic features, ranked by their relevance in modelling the biological phenotype in terms of a classification or regression model. Because of resampling protocols, or simply within a meta-analysis comparison, sets of alternative feature lists (possibly of different lengths) are often obtained instead of a single list. Here we introduce a method, based on the algebraic theory of symmetric groups, for studying the variability between lists ("list stability") in the case of lists of unequal length. We provide algorithms that evaluate stability for lists embedded in the full feature set or restricted to the features occurring in the partial lists. The method is demonstrated first on synthetic data in a gene filtering task and then for finding gene profiles on a recent prostate cancer dataset.
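
    The paper's symmetric-group formalism is not reproduced here; the sketch below only illustrates the underlying task of measuring agreement among partial ranked lists, by completing each list over a common feature universe (unlisted features share a tied bottom rank) and averaging a Canberra-type pairwise distance. The completion rule and the distance are illustrative stand-ins, not the authors' exact construction.

# Illustrative sketch: stability of a set of partial ranked feature lists.
# Unlisted features get a tied bottom rank; completed rankings are compared
# pairwise with a Canberra-type distance (assumptions for illustration only).
from itertools import combinations

def complete_ranking(partial, universe):
    """Feature -> rank; features absent from the partial list share a tied bottom rank."""
    ranks = {f: i + 1 for i, f in enumerate(partial)}
    bottom = (len(partial) + 1 + len(universe)) / 2.0     # average of the unused positions
    return {f: ranks.get(f, bottom) for f in universe}

def canberra(r1, r2):
    """Canberra distance between two rank vectors defined over the same universe."""
    return sum(abs(r1[f] - r2[f]) / (r1[f] + r2[f]) for f in r1)

def list_stability(partial_lists, universe):
    """Mean pairwise distance: smaller values mean the lists agree more (higher stability)."""
    rankings = [complete_ranking(p, universe) for p in partial_lists]
    pairs = list(combinations(rankings, 2))
    return sum(canberra(a, b) for a, b in pairs) / len(pairs)

if __name__ == "__main__":
    universe = [f"g{i}" for i in range(10)]
    lists = [["g1", "g2", "g3"], ["g2", "g1", "g5"], ["g1", "g2", "g4", "g7"]]
    print(round(list_stability(lists, universe), 3))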

    Visualization of acetylcholine distribution in central nervous system tissue sections by tandem imaging mass spectrometry

    Metabolite distribution imaging via imaging mass spectrometry (IMS) is an increasingly utilized tool in the field of neurochemistry. As most previous IMS studies analyzed the relative abundances of larger metabolite species, it is important to expand its application to smaller molecules, such as neurotransmitters. This study aimed to develop an IMS application to visualize neurotransmitter distribution in central nervous system tissue sections. Here, we raise two technical problems that must be resolved to achieve neurotransmitter imaging: (1) the lower concentrations of bioactive molecules, compared with those of membrane lipids, require higher sensitivity and/or signal-to-noise (S/N) ratios in signal detection, and (2) the molecular turnover of neurotransmitters is rapid, so tissue preparation procedures should be performed carefully to minimize postmortem changes. We first evaluated intrinsic sensitivity and matrix interference using matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS) to detect six neurotransmitters and chose acetylcholine (ACh) as a model for study. Next, we examined both single MS imaging and MS/MS imaging for ACh and found that, via an ion transition from m/z 146 to m/z 87 in MS/MS imaging, ACh could be visualized with a high S/N ratio. Furthermore, we found that an in situ freezing method for brain samples improved IMS data quality in terms of the number of effective pixels and the image contrast (i.e., the sensitivity and dynamic range). Therefore, by addressing the aforementioned problems, we demonstrated the tissue distribution of ACh, the most suitable molecular specimen for positive-ion detection by IMS, revealing its localization in central nervous system tissues.
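
    To make the MS/MS imaging step concrete, the sketch below shows how one might build an ion image for the m/z 146 to m/z 87 transition reported for ACh from per-pixel product-ion spectra. The data layout, variable names, and m/z tolerance are hypothetical, not taken from the study's acquisition software.

# Hypothetical sketch: ion image for the ACh MS/MS transition
# (precursor m/z 146 -> fragment m/z 87) from per-pixel product-ion spectra.
# The spectra layout and the m/z tolerance are illustrative assumptions.
import numpy as np

PRECURSOR, FRAGMENT, TOL = 146.1, 87.0, 0.2   # m/z values and matching window (assumed)

def transition_image(spectra, shape):
    """spectra: {(row, col): (precursor_mz, fragment_mz_list, intensity_list)} product-ion scans."""
    img = np.zeros(shape)
    for (r, c), (prec, mz, inten) in spectra.items():
        if abs(prec - PRECURSOR) > TOL:
            continue                                # not an ACh precursor scan at this pixel
        mz, inten = np.asarray(mz), np.asarray(inten)
        img[r, c] = inten[np.abs(mz - FRAGMENT) <= TOL].sum()  # summed m/z ~87 fragment signal
    return img

if __name__ == "__main__":
    spectra = {(0, 0): (146.12, [87.02, 60.1], [500.0, 40.0]),   # pixel with a strong ACh fragment
               (0, 1): (146.09, [58.9, 104.3], [30.0, 25.0])}    # pixel without it
    print(transition_image(spectra, (1, 2)))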