
    Sparse dimensionality reduction approaches in Mendelian randomization with highly correlated exposures.

    Multivariable Mendelian randomization (MVMR) is an instrumental variable technique that generalizes the MR framework to multiple exposures. Framed as a linear regression problem, it is subject to the pitfall of multicollinearity: the bias and efficiency of MVMR estimates thus depend heavily on the correlation among exposures. Dimensionality reduction techniques such as principal component analysis (PCA) provide transformations of all the included variables that are effectively uncorrelated. We propose the use of sparse PCA (sPCA) algorithms that create principal components from subsets of the exposures, with the aim of providing more interpretable and reliable MR estimates. The approach consists of three steps. We first apply a sparse dimension reduction method and transform the variant-exposure summary statistics to principal components. We then choose a subset of the principal components based on data-driven cutoffs and estimate their strength as instruments with an adjusted F-statistic. Finally, we perform MR with these transformed exposures. This pipeline is demonstrated in a simulation study of highly correlated exposures and in an applied example using summary data from a genome-wide association study of 97 highly correlated lipid metabolites. As a positive control, we tested the causal associations of the transformed exposures with coronary heart disease (CHD). Compared to the conventional inverse-variance weighted MVMR method and a weak-instrument-robust MVMR method (MR GRAPPLE), sparse component analysis achieved a superior balance of sparsity and biologically insightful grouping of the lipid traits.
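    The three-step pipeline described in this abstract can be illustrated with a toy numerical sketch. Everything below is hypothetical, not the authors' implementation: the summary statistics are simulated, a hard loading threshold stands in for a genuine sPCA algorithm, and a simple explained-variance cutoff stands in for the data-driven component selection.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated variant-exposure summary statistics (hypothetical, not the lipid
# data from the paper): 200 variants, 6 exposures in two correlated blocks.
n_variants, n_exposures = 200, 6
factors = rng.normal(size=(n_variants, 2))
loadings = np.zeros((2, n_exposures))
loadings[0, :3] = 1.0    # exposures 0-2 driven by factor 1
loadings[1, 3:] = 0.8    # exposures 3-5 driven by factor 2
beta_X = factors @ loadings + 0.1 * rng.normal(size=(n_variants, n_exposures))

# Step 1: sparse dimension reduction. Here, ordinary PCA with hard-thresholded
# loadings is a simplified stand-in for a true sPCA algorithm.
U, s, Vt = np.linalg.svd(beta_X - beta_X.mean(axis=0), full_matrices=False)
V_sparse = np.where(np.abs(Vt.T) > 0.2, Vt.T, 0.0)

# Step 2: keep components by a data-driven cutoff (explained variance > 10%).
explained = s**2 / np.sum(s**2)
keep = explained > 0.10
pcs = beta_X @ V_sparse[:, keep]          # transformed exposures

# Step 3: inverse-variance weighted MVMR on the transformed exposures.
true_effect = np.array([0.5, -0.3])       # effects used to simulate the outcome
se_Y = np.full(n_variants, 0.05)
beta_Y = pcs @ true_effect + se_Y * rng.normal(size=n_variants)
W = np.diag(1.0 / se_Y**2)
est = np.linalg.solve(pcs.T @ W @ pcs, pcs.T @ W @ beta_Y)
print(est)                                # close to the effects set above
```

    Because the kept components load only on distinct blocks of exposures, the resulting MR estimates remain attributable to interpretable groups of traits rather than to diffuse mixtures of all of them.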

    Stochastic theory of large-scale enzyme-reaction networks: Finite copy number corrections to rate equation models

    Chemical reactions inside cells occur in compartment volumes in the range of atto- to femtolitres. Physiological concentrations realized in such small volumes imply low copy numbers of interacting molecules, with the consequence of considerable fluctuations in the concentrations. In contrast, rate equation models are based on the implicit assumption of infinitely large numbers of interacting molecules, or equivalently, that reactions occur in infinite volumes at constant macroscopic concentrations. In this article, we compute the finite-volume corrections (or equivalently the finite copy number corrections) to the solutions of the rate equations for chemical reaction networks composed of arbitrarily large numbers of enzyme-catalyzed reactions confined inside a small sub-cellular compartment. This is achieved by applying a mesoscopic version of the quasi-steady-state assumption to the exact Fokker-Planck equation associated with the Poisson representation of the chemical master equation. The procedure yields impressively simple and compact expressions for the finite-volume corrections. We prove that the predictions of the rate equations will always underestimate the actual steady-state substrate concentrations for an enzyme-reaction network confined in a small volume. In particular, we show that the finite-volume corrections increase with decreasing sub-cellular volume, decreasing Michaelis-Menten constants, and increasing enzyme saturation. The magnitude of the corrections depends sensitively on the topology of the network. The predictions of the theory are shown to be in excellent agreement with stochastic simulations for two types of networks typically associated with protein methylation and metabolism. Comment: 13 pages, 4 figures; published in The Journal of Chemical Physics.
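    The comparison this abstract describes, stochastic kinetics versus the rate-equation prediction in a small compartment, can be sketched with a standard Gillespie simulation of a single Michaelis-Menten cycle with constant substrate input. All parameter values below are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters: one enzyme species degrading a substrate produced
# at constant rate, in a compartment of volume Omega (Omega = 1 means
# concentrations equal molecule counts).
Omega = 1.0
k_in, k1, k2, k3 = 2.5, 10.0, 1.0, 1.0   # input, binding, unbinding, catalysis
E_tot = 5                                 # total enzyme copies

# Rate-equation steady state: k_in = Vmax*s/(Km+s), Vmax = k3*E_tot,
# Km = (k2+k3)/k1, giving s* = Km*phi/(1-phi) with phi = k_in/Vmax.
Vmax, Km = k3 * E_tot, (k2 + k3) / k1
phi = k_in / Vmax
s_det = Km * phi / (1 - phi)

# Gillespie simulation of:  0 -> S,  E+S -> C,  C -> E+S,  C -> E+P
S, C = 0, 0
t, t_end = 0.0, 1000.0
area = 0.0                                # integral of S(t), for the time average
while t < t_end:
    E = E_tot - C
    a = np.array([k_in, k1 * E * S / Omega, k2 * C, k3 * C])
    a_tot = a.sum()
    dt = rng.exponential(1.0 / a_tot)
    area += S * min(dt, t_end - t)
    t += dt
    r = rng.choice(4, p=a / a_tot)
    if r == 0: S += 1
    elif r == 1: S -= 1; C += 1
    elif r == 2: S += 1; C -= 1
    else: C -= 1
s_mean = area / t_end / Omega             # time-averaged substrate concentration

print(f"rate equation: {s_det:.3f}, stochastic mean: {s_mean:.3f}")
```

    With these illustrative parameters the time-averaged stochastic concentration typically exceeds the deterministic value, in line with the paper's result that rate equations underestimate steady-state substrate levels in small volumes.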

    Search for Sterile Neutrinos with a Radioactive Source at Daya Bay

    The far site detector complex of the Daya Bay reactor experiment is proposed as a location to search for sterile neutrinos with eV-scale mass. Antineutrinos from a 500 kCi 144Ce-144Pr beta-decay source (Q = 2.996 MeV) would be detected by four identical 20-ton antineutrino targets. The site layout allows flexible source placement; several specific source locations are discussed. In one year, the 3+1 sterile neutrino hypothesis can be tested over essentially the full suggested range of the parameters Delta m^2_new and sin^2 2theta_new (90% C.L.). The backgrounds from six nuclear reactors at >1.6 km distance are shown to be manageable. Advantages of performing the experiment at the Daya Bay far site are described.
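    The sensitivity quoted in this abstract rests on the standard short-baseline survival probability in the 3+1 model, P = 1 - sin^2(2 theta_new) * sin^2(1.27 Delta m^2 L / E). A minimal numerical sketch with illustrative parameter values (not the Daya Bay sensitivity analysis) follows.

```python
import numpy as np

# Short-baseline electron-antineutrino survival probability in the 3+1 model:
# P = 1 - sin^2(2 theta_new) * sin^2(1.27 * Dm2[eV^2] * L[m] / E[MeV]).
def survival(L_m, E_MeV, dm2_eV2, sin2_2theta):
    return 1.0 - sin2_2theta * np.sin(1.27 * dm2_eV2 * L_m / E_MeV) ** 2

# Illustrative values for a 144Ce-144Pr source (Q = 2.996 MeV): antineutrino
# energies near 2 MeV and detector baselines of a few metres, so an eV^2-scale
# Delta m^2 produces an oscillation pattern across the detector array.
L = np.linspace(1.0, 10.0, 10)            # baselines in metres
print(survival(L, 2.0, dm2_eV2=1.0, sin2_2theta=0.1))
```

    The metre-scale variation of this probability across the four targets is what makes a compact radioactive source, rather than a distant reactor, suitable for the search.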

    A statistical mechanics description of environmental variability in metabolic networks

    Many of the chemical reactions that take place within a living cell are irreversible. Due to evolutionary pressures, the number of allowable reactions within these systems is highly constrained, and thus the resulting metabolic networks display considerable asymmetry. In this paper, we explore possible evolutionary factors pertaining to the reduced symmetry observed in these networks, and demonstrate the important role environmental variability plays in shaping their structural organization. Interpreting the returnability index as an equilibrium constant for a reaction network in equilibrium with a hypothetical reference system enables us to quantify the extent to which a metabolic network is in disequilibrium. Further, by introducing a new directed centrality measure via an extension of the subgraph centrality metric to directed networks, we are able to characterise individual metabolites by their participation within metabolic pathways. To demonstrate these ideas, we study 116 metabolic networks of bacteria. In particular, we find that the equilibrium constant for the metabolic networks decreases significantly in line with variability in bacterial habitats, supporting the view that environmental variability promotes disequilibrium within these biochemical reaction systems.
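    Both quantities this abstract mentions can be computed from the matrix exponential of the adjacency matrix. The sketch below uses a hypothetical four-metabolite network, taking returnability as the closed-walk weight of the directed graph relative to its fully reversible (undirected) counterpart, and directed subgraph centrality as the diagonal of exp(A); these are assumed simplified forms, not the paper's exact definitions.

```python
import numpy as np

def expm_series(A, terms=30):
    """exp(A) via its power series; adequate for small adjacency matrices."""
    out, term = np.eye(A.shape[0]), np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        out += term
    return out

# Toy directed reaction graph (hypothetical, not one of the 116 bacterial
# networks): an edge i -> j means metabolite i is converted into metabolite j.
n = 4
A = np.zeros((n, n))
for i, j in [(0, 1), (1, 2), (2, 0), (2, 3)]:  # 3-cycle plus one irreversible branch
    A[i, j] = 1.0

# Returnability: closed-walk weight of the directed graph relative to its
# fully reversible version; lies in [0, 1], with lower values indicating
# stronger disequilibrium.
A_und = np.maximum(A, A.T)
K_r = (np.trace(expm_series(A)) - n) / (np.trace(expm_series(A_und)) - n)

# Directed subgraph centrality: diagonal of exp(A). A metabolite that no
# closed pathway returns to (here, metabolite 3) scores exactly 1.
centrality = np.diag(expm_series(A))
print(K_r, centrality)
```

    The irreversible branch to metabolite 3 contributes no closed walks, so it lowers the network's returnability relative to its undirected counterpart while leaving that metabolite's centrality at the minimum value.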

    Site amplification, attenuation, and scattering from noise correlation amplitudes across a dense array in Long Beach, CA

    For accurate seismic hazard evaluation, both the spatial and frequency-dependent variabilities in the amplitudes of earthquake ground motions are needed. While this information is rarely fully available due to the paucity of relevant seismic data, dense arrays like the 5200-geophone array in Long Beach, California provide the opportunity to study this amplitude variability. Here we show that ambient noise correlation amplitudes from the Long Beach array can be used to directly determine frequency-dependent site amplification factors. We analyze Rayleigh-wavefield amplitude gradients from ambient noise correlations that are processed so that relative amplitudes satisfy the wave equation and are therefore meaningful. Ultimately, we construct maps of site amplification across Long Beach at frequencies of 0.67, 1.0, and 2.0 Hz. These maps correlate well with known local structure, notably the Newport-Inglewood Fault, and with known velocity structure. Through this process, we also obtain constraints on average attenuation structure and local scattering.

    An integrated study of earth resources in the State of California using remote sensing techniques

    The author has identified the following significant results. The supply, demand, and impact relationships of California's water resources, as exemplified by the Feather River project and other aspects of the California Water Plan, are discussed.