
    A meta-analysis of Boolean network models reveals design principles of gene regulatory networks

    Gene regulatory networks (GRNs) describe how a collection of genes governs the processes within a cell. Understanding how GRNs manage to consistently perform a particular function constitutes a key question in cell biology. GRNs are frequently modeled as Boolean networks, which are intuitive, simple to describe, and can yield qualitative results even when data is sparse. We generate an expandable database of published, expert-curated Boolean GRN models and extract the rules governing these networks. A meta-analysis of this diverse set of models enables us to identify fundamental design principles of GRNs. The biological term canalization reflects a cell's ability to maintain a stable phenotype despite ongoing environmental perturbations. Accordingly, Boolean canalizing functions are functions whose output is already determined whenever a specific variable takes on its canalizing input, regardless of all other inputs. We provide a detailed analysis of the prevalence of canalization and show that most rules describing the regulatory logic are highly canalizing. Independently of this, we also find that most rules exhibit a high level of redundancy. An analysis of the prevalence of small network motifs, e.g. feed-forward loops or feedback loops, in the wiring diagrams of the identified models reveals several highly abundant motif types, as well as a surprisingly high overabundance of negative regulations in complex feedback loops. Lastly, we provide the strongest evidence thus far in favor of the hypothesis that GRNs operate at the critical edge between order and chaos. (Comment: 12 pages, 8 figures)
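    The canalization property described in this abstract can be tested directly from a rule's truth table. A minimal Python sketch (the function name and examples are illustrative, not taken from the paper's database): a function is canalizing if some input variable has a value that fixes the output regardless of all other inputs.

    ```python
    from itertools import product

    def is_canalizing(f, n):
        """Return True if the n-input Boolean function f (taking a tuple
        of 0/1 values) is canalizing: some input i has a canalizing value
        a such that f is constant whenever input i equals a."""
        for i in range(n):
            for a in (0, 1):
                outputs = {f(x) for x in product((0, 1), repeat=n) if x[i] == a}
                if len(outputs) == 1:  # output fixed by x_i == a alone
                    return True
        return False

    # AND is canalizing (either input being 0 forces output 0);
    # XOR is not, since no single input ever determines the output.
    print(is_canalizing(lambda x: x[0] and x[1], 2))  # True
    print(is_canalizing(lambda x: x[0] ^ x[1], 2))    # False
    ```

    The same enumeration extends naturally to nested canalization by recursing on the subfunction left after removing the canalizing variable.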

    Metataxonomic and Metagenomic Approaches vs. Culture-Based Techniques for Clinical Pathology.

    Diagnoses that are both timely and accurate are critically important for patients with life-threatening or drug-resistant infections. Technological improvements in High-Throughput Sequencing (HTS) have led to its use in pathogen detection and its application in clinical diagnoses of infectious diseases. The present study compares two HTS methods, 16S rRNA marker gene sequencing (metataxonomics) and whole metagenomic shotgun sequencing (metagenomics), in their respective abilities to match the same diagnosis as traditional culture methods (culture inference) for patients with ventilator-associated pneumonia (VAP). The metagenomic analysis was able to produce the same diagnosis as culture methods at the species level for five of the six samples, while the metataxonomic analysis was only able to produce results with the same species-level identification as culture for two of the six samples. These results indicate that metagenomic analyses have the accuracy needed for a clinical diagnostic tool, but full integration in diagnostic protocols is contingent on technological improvements to decrease turnaround time and lower costs.

    Bird sensitivity to disturbance as an indicator of forest patch conditions: An issue in environmental assessments

    An Environmental Assessment (EA) is one of the steps within the Environmental Impact Assessment process. Birds are often used in EAs to help decision makers evaluate potential human impacts from proposed development activities. A “sensitivity to human disturbance” index, created by Parker III et al. (1996) for all Neotropical species, is commonly treated as an ecological indicator. However, this index was created subjectively and, for most species, there have been no rigorous field tests to validate its effectiveness. Therefore, in this study, we aim to: (1) evaluate whether, at the local scale, birds from forest patches in a human-modified landscape (HML) may differ in sensitivity from Parker's classification; (2) evaluate the effectiveness of the species richness value at each sensitivity level as an ecological indicator; (3) gather information on how often and in which manner Parker's classification has been used in EAs. To do so, bird sampling was performed in eight forest patches in an HML over one year. We then created a local sensitivity-to-disturbance classification using information on the threat status, endemism, spatial distribution, and relative abundance of all species in the study area. We found that 37% of the forest birds showed local sensitivity levels that differ from Parker's classification. Our results show that only the richness of high-sensitivity species from our local classification fitted the ecological-indicator assumptions, helping to evaluate the environmental conditions of the studied patches. We conclude that the species richness of each of Parker's bird sensitivity levels does not necessarily perform as an ecological indicator at the local scale, particularly in HMLs. Nevertheless, Parker's Neotropical bird sensitivity classification was used in 50% of the EAs we reviewed; of these, 76% assumed that it was an accurate ecological indicator of the local forest conditions for birds.
The lack of clear criteria in Parker's classification allows diverse interpretations by ornithologists, and there is no agreement about the ecological meaning of each sensitivity level or what environmental conditions each level may indicate. Therefore, the use of Parker's classification in EAs may jeopardize accurate interpretations of proposed anthropogenic impacts. Furthermore, because a bird species' sensitivity often varies between locations, we argue that Parker's generalized classification of bird sensitivity should not be used as an indicator of forest environmental conditions in EAs throughout HMLs in the Neotropics. Rather, local bird ecological indices should be explored; otherwise, erroneous predictions of anthropogenic impacts will continue to be common.

    LSST Science Book, Version 2.0

    A survey that can cover the sky in optical bands over wide fields to faint magnitudes with a fast cadence will enable many of the exciting science opportunities of the next decade. The Large Synoptic Survey Telescope (LSST) will have an effective aperture of 6.7 meters and an imaging camera with a field of view of 9.6 deg^2, and will be devoted to a ten-year imaging survey over 20,000 deg^2 south of +15 deg. Each pointing will be imaged 2000 times with fifteen-second exposures in six broad bands from 0.35 to 1.1 microns, to a total point-source depth of r~27.5. The LSST Science Book describes the basic parameters of the LSST hardware, software, and observing plans. The book discusses educational and outreach opportunities, then goes on to describe a broad range of science that LSST will revolutionize: mapping the inner and outer Solar System, stellar populations in the Milky Way and nearby galaxies, the structure of the Milky Way disk and halo and other objects in the Local Volume, transient and variable objects both at low and high redshift, and the properties of normal and active galaxies at low and high redshift. It then turns to far-field cosmological topics, exploring properties of supernovae to z~1, strong and weak lensing, the large-scale distribution of galaxies and baryon oscillations, and how these different probes may be combined to constrain cosmological models and the physics of dark energy. (Comment: 596 pages. Also available at full resolution at http://www.lsst.org/lsst/sciboo)
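    The survey parameters quoted in the abstract imply the overall scale of the observing program. A back-of-the-envelope sketch (using only the numbers above; this is not an official LSST calculation, and ignores overlap between pointings):

    ```python
    # Quoted LSST survey parameters, taken from the abstract
    fov_deg2 = 9.6          # camera field of view per pointing, deg^2
    area_deg2 = 20_000      # total survey footprint, deg^2
    visits = 2000           # imagings per pointing over ten years
    exposure_s = 15         # integration time per exposure, seconds

    pointings = area_deg2 / fov_deg2
    total_exposure_hours = pointings * visits * exposure_s / 3600
    print(f"~{pointings:.0f} pointings, "
          f"~{total_exposure_hours:.0f} h of open-shutter time")
    # ~2083 pointings and ~17361 h of integration over the ten-year survey
    ```

    The result (roughly 1,700 open-shutter hours per year) illustrates why the fast cadence and large field of view are essential to the design.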

    Quiescence and γH2AX in neuroblastoma are regulated by ouabain/Na,K-ATPase

    Cellular quiescence is a state of reversible proliferation arrest that is induced by anti-mitogenic signals. The endogenous cardiac glycoside ouabain is a specific ligand of the ubiquitous sodium pump, Na,K-ATPase, which is also known to regulate cell growth through unknown signalling pathways. To investigate the role of ouabain/Na,K-ATPase in uncontrolled neuroblastoma growth we used xenografts, flow cytometry, immunostaining, comet assay, real-time PCR, and electrophysiology after various treatment strategies. The ouabain/Na,K-ATPase complex induced quiescence in malignant neuroblastoma. Tumour growth was reduced by >50% when neuroblastoma cells were xenografted into immune-deficient mice that were fed with ouabain. Ouabain induced S-G2 phase arrest, activated the DNA-damage response (DDR) pathway marker γH2AX, increased the cell cycle regulator p21Waf1/Cip1, and upregulated the quiescence-specific transcription factor hairy and enhancer of split 1 (HES1), causing neuroblastoma cells to ultimately enter G0. Cells re-entered the cell cycle and resumed proliferation, without showing DNA damage, when ouabain was removed. Conclusion: These findings demonstrate a novel action of ouabain/Na,K-ATPase as a regulator of quiescence in neuroblastoma, suggesting that ouabain can be used in chemotherapies to suppress tumour growth and/or arrest cells to increase the therapeutic index in combination therapies.

    Readout of a quantum processor with high dynamic range Josephson parametric amplifiers

    We demonstrate a high dynamic range Josephson parametric amplifier (JPA) in which the active nonlinear element is implemented using an array of rf-SQUIDs. The device is matched to the 50 Ω environment with a Klopfenstein-taper impedance transformer and achieves a bandwidth of 250-300 MHz, with input saturation powers up to -95 dBm at 20 dB gain. A 54-qubit Sycamore processor was used to benchmark these devices, providing a calibration for readout power, an estimate of amplifier added noise, and a platform for comparison against standard impedance-matched parametric amplifiers with a single dc-SQUID. We find that the high-power rf-SQUID array design has no adverse effect on system noise, readout fidelity, or qubit dephasing, and we estimate an upper bound on amplifier added noise at 1.6 times the quantum limit. Lastly, amplifiers with this design show no degradation in readout fidelity due to gain compression, which can occur in multi-tone multiplexed readout with traditional JPAs. (Comment: 9 pages, 8 figures)
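    For a sense of scale, the quoted saturation power can be converted to watts with the standard dBm relation (a generic unit conversion, not a calculation from the paper):

    ```python
    def dbm_to_watts(p_dbm):
        """Convert a power in dBm to watts: P[W] = 1 mW * 10^(dBm/10)."""
        return 1e-3 * 10 ** (p_dbm / 10)

    # Saturation power quoted for the rf-SQUID array JPA at 20 dB gain
    p_sat = dbm_to_watts(-95)
    print(f"-95 dBm = {p_sat:.2e} W")  # ≈ 3.16e-13 W
    ```

    A saturation power of a few hundred femtowatts is orders of magnitude above the single-photon powers of dispersive readout tones, which is what allows multi-tone multiplexed readout without gain compression.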

    Measurement-Induced State Transitions in a Superconducting Qubit: Within the Rotating Wave Approximation

    Superconducting qubits typically use a dispersive readout scheme, where a resonator is coupled to a qubit such that its frequency is qubit-state dependent. Measurement is performed by driving the resonator, where the transmitted resonator field yields information about the resonator frequency and thus the qubit state. Ideally, we could use arbitrarily strong resonator drives to achieve a target signal-to-noise ratio in the shortest possible time. However, experiments have shown that when the average resonator photon number exceeds a certain threshold, the qubit is excited out of its computational subspace, which we refer to as a measurement-induced state transition. These transitions degrade readout fidelity, and constitute leakage, which precludes further operation of the qubit in, for example, error correction. Here we study these transitions using a transmon qubit by experimentally measuring their dependence on qubit frequency, average photon number, and qubit state, in the regime where the resonator frequency is lower than the qubit frequency. We observe signatures of resonant transitions between levels in the coupled qubit-resonator system that exhibit noisy behavior when measured repeatedly in time. We provide a semi-classical model of these transitions based on the rotating wave approximation and use it to predict the onset of state transitions in our experiments. Our results suggest the transmon is excited to levels near the top of its cosine potential following a state transition, where the charge dispersion of higher transmon levels explains the observed noisy behavior of state transitions. Moreover, occupation in these higher energy levels poses a major challenge for fast qubit reset.

    Overcoming leakage in scalable quantum error correction

    Leakage of quantum information out of computational states into higher energy states represents a major challenge in the pursuit of quantum error correction (QEC). In a QEC circuit, leakage builds over time and spreads through multi-qubit interactions. This leads to correlated errors that degrade the exponential suppression of logical error with scale, challenging the feasibility of QEC as a path towards fault-tolerant quantum computation. Here, we demonstrate the execution of a distance-3 surface code and distance-21 bit-flip code on a Sycamore quantum processor where leakage is removed from all qubits in each cycle. This shortens the lifetime of leakage and curtails its ability to spread and induce correlated errors. We report a ten-fold reduction in steady-state leakage population on the data qubits encoding the logical state and an average leakage population of less than 1×10⁻³ throughout the entire device. The leakage removal process itself efficiently returns leakage population back to the computational basis, and adding it to a code circuit prevents leakage from inducing correlated error across cycles, restoring a fundamental assumption of QEC. With this demonstration that leakage can be contained, we resolve a key challenge for practical QEC at scale. (Comment: Main text: 7 pages, 5 figures)