
    Characteristics of oligonucleotide frequencies across genomes: Conservation versus variation, strand symmetry, and evolutionary implications

    Get PDF
    One of the objectives of evolutionary genomics is to reveal the genetic information contained in the primordial genome (called the primary genetic information in this paper, with the primordial genome defined here as the most primitive nucleic acid genome of earth's life) by searching for primitive traits or relics remaining in modern genomes. The shorter a sequence is, the less likely it is to be modified during genome evolution; some characteristics of very short nucleotide sequences therefore have a considerable chance of persisting through billions of years of evolution. Consequently, certain genomic features of mononucleotides, dinucleotides, and higher-order oligonucleotides may be conserved across genomes, and some, if not all, of these features would be relics of the primary genetic information. Based on this assumption, we analyzed the frequency patterns of mononucleotides, dinucleotides, and higher-order oligonucleotides in the whole-genome sequences of 458 species (including archaea, bacteria, and eukaryotes), and we also studied the phenomenon of strand symmetry in these genomes. The results show that the frequencies of some dinucleotides and higher-order oligonucleotides are indeed conserved across genomes, and that strand symmetry is a ubiquitous and explicit phenomenon that may contribute to this frequency conservation. We propose a new hypothesis for the origin of strand symmetry and frequency conservation as well as for the constitution of early genomes, and we conclude that strand symmetry and the pattern of frequency conservation would be original features of the primary genetic information.
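
    The frequency analysis and strand-symmetry check described in this abstract can be illustrated with a minimal sketch. The code below is not the authors' pipeline; the toy sequence, the function names, and the symmetry-deviation measure are illustrative assumptions. It counts overlapping k-mer frequencies on one strand and compares each k-mer with its reverse complement, which is the intuitive content of strand symmetry.

    # Minimal sketch (not the authors' pipeline): count k-mer frequencies on one
    # strand and check strand symmetry, i.e. that each k-mer occurs about as
    # often as its reverse complement on the same strand.
    from collections import Counter

    COMPLEMENT = str.maketrans("ACGT", "TGCA")

    def kmer_frequencies(seq, k):
        """Return relative frequencies of all overlapping k-mers in seq."""
        counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
        total = sum(counts.values())
        return {kmer: n / total for kmer, n in counts.items()}

    def reverse_complement(kmer):
        return kmer.translate(COMPLEMENT)[::-1]

    def strand_symmetry_deviation(freqs):
        """Mean relative difference between each k-mer and its reverse complement."""
        devs = []
        for kmer, f in freqs.items():
            f_rc = freqs.get(reverse_complement(kmer), 0.0)
            if f + f_rc > 0:
                devs.append(abs(f - f_rc) / (f + f_rc))
        return sum(devs) / len(devs)

    # Toy usage with a made-up sequence; a real analysis would read a whole genome.
    genome = "ATGCGCGTATATCGCGCATATGCGCGCATAT" * 100
    freqs = kmer_frequencies(genome, k=2)
    print(f"dinucleotide strand-symmetry deviation: {strand_symmetry_deviation(freqs):.3f}")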

    The Wiener and Terminal Wiener indices of trees

    Full text link
    Heydari \cite{heydari2013} presented very nice formulae for the Wiener and terminal Wiener indices of generalized Bethe trees. It is a pity that these formulae contain some errors. In this paper, we correct these errors and characterize all trees with the minimum terminal Wiener index among the trees of order $n$ with maximum degree $\Delta$. Comment: 13 pages
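
    For readers unfamiliar with the two indices, here is a hedged illustration of the standard definitions: the Wiener index sums the distances over all unordered vertex pairs of a tree, while the terminal Wiener index sums the distances over pairs of leaves (degree-1 vertices). The example tree and helper names below are illustrative, not taken from the paper.

    # Compute the Wiener and terminal Wiener indices of a tree given as adjacency lists.
    from collections import deque

    def bfs_distances(adj, source):
        """Distances from source to every vertex of an unweighted graph."""
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return dist

    def wiener_indices(adj):
        """Return (Wiener index, terminal Wiener index) of a tree."""
        vertices = sorted(adj)
        leaves = {v for v in vertices if len(adj[v]) == 1}
        wiener = terminal = 0
        for i, u in enumerate(vertices):
            dist = bfs_distances(adj, u)
            for v in vertices[i + 1:]:
                wiener += dist[v]
                if u in leaves and v in leaves:
                    terminal += dist[v]
        return wiener, terminal

    # A path on 4 vertices: W = 1+2+3+1+2+1 = 10, TW = d(0,3) = 3.
    path4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
    print(wiener_indices(path4))  # (10, 3)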

    Joint power and admission control via p norm minimization deflation

    Full text link
    In an interference network, joint power and admission control aims to support a maximum number of links at their specified signal-to-interference-plus-noise ratio (SINR) targets while using a minimum total transmission power. In our previous work, we formulated the joint control problem as a sparse $\ell_0$-minimization problem and relaxed it to an $\ell_1$-minimization problem. In this work, we propose to approximate the $\ell_0$-minimization problem by an $\ell_p$-minimization problem with $0<p<1$, since intuitively the $\ell_p$ norm approximates the $\ell_0$ norm better than the $\ell_1$ norm does. We first show that the $\ell_p$-minimization problem is strongly NP-hard and then derive a reformulation of it such that well-developed interior-point algorithms can be applied to solve it. The solution of the $\ell_p$-minimization problem can efficiently guide the removal of links (deflation). Numerical simulations show that the proposed heuristic outperforms the existing algorithms. Comment: 2013 IEEE International Conference on Acoustics, Speech, and Signal Processing
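
    The intuition that the $\ell_p$ quasi-norm with $0<p<1$ tracks the $\ell_0$ norm (the number of nonzeros, i.e. of unsupported links) more closely than the $\ell_1$ norm can be checked numerically. The vector of constraint violations below is a made-up example, not data from the paper.

    # Compare l0, l1, and l_p (p = 0.5) values on a sparse vector of violations.
    import numpy as np

    def lp_value(x, p):
        """Return sum_i |x_i|^p, the quantity minimized in an l_p relaxation."""
        return float(np.sum(np.abs(x) ** p))

    violations = np.array([0.0, 0.0, 0.9, 0.0, 0.05, 0.0])  # two unsupported links
    print("l0 (count of nonzeros):", np.count_nonzero(violations))        # 2
    print("l1 relaxation value:   ", lp_value(violations, 1.0))            # 0.95
    print("l0.5 relaxation value: ", round(lp_value(violations, 0.5), 3))  # ~1.172, closer to 2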

    Sample Approximation-Based Deflation Approaches for Chance SINR Constrained Joint Power and Admission Control

    Full text link
    Consider the joint power and admission control (JPAC) problem for a multi-user single-input single-output (SISO) interference channel. Most existing works on JPAC assume perfect instantaneous channel state information (CSI). In this paper, we consider the JPAC problem with imperfect CSI; that is, we assume that only the channel distribution information (CDI) is available. We formulate the JPAC problem as a chance (probabilistic) constrained program, where each link's SINR outage probability is enforced to be less than or equal to a specified tolerance. To circumvent the computational difficulty of the chance SINR constraints, we propose to use the sample (scenario) approximation scheme to convert them into finitely many simple linear constraints. Furthermore, we reformulate the sample approximation of the chance SINR constrained JPAC problem as a composite group-sparse minimization problem and then approximate it by a second-order cone program (SOCP). The solution of the SOCP approximation can be used to check the simultaneous supportability of all links in the network and to guide an iterative link removal procedure (the deflation approach). We exploit the special structure of the SOCP approximation and custom-design an efficient algorithm for solving it. Finally, we illustrate the effectiveness and efficiency of the proposed sample approximation-based deflation approaches by simulations. Comment: The paper has been accepted for publication in IEEE Transactions on Wireless Communications
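
    The sample (scenario) approximation idea can be sketched as follows: each chance SINR constraint is replaced by N deterministic linear constraints in the transmit powers, one per drawn channel realization. The Rayleigh-fading CDI, noise power, and SINR targets below are illustrative assumptions, not the setup of the paper.

    # Draw channel samples from the CDI and check the resulting linear SINR constraints.
    import numpy as np

    rng = np.random.default_rng(0)
    K, N = 4, 100                 # number of links, number of channel samples
    gamma = np.full(K, 1.0)       # SINR targets
    sigma2 = 0.1                  # noise power

    # N samples of the K x K gain matrix G[n, k, j] = |h_kj|^2 (Rayleigh fading).
    G = rng.exponential(scale=1.0, size=(N, K, K))

    def sinr_constraints_satisfied(p):
        """Check g_kk p_k >= gamma_k (sum_{j != k} g_kj p_j + sigma2) for all samples."""
        signal = G[:, np.arange(K), np.arange(K)] * p   # (N, K) direct-link terms
        interference = G @ p - signal                    # (N, K) cross-link terms
        return bool(np.all(signal >= gamma * (interference + sigma2)))

    p = np.full(K, 1.0)
    print("all sampled SINR constraints met:", sinr_constraints_satisfied(p))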

    Decomposition by Successive Convex Approximation: A Unifying Approach for Linear Transceiver Design in Heterogeneous Networks

    Get PDF
    We study the downlink linear precoder design problem in a multi-cell dense heterogeneous network (HetNet). The problem is formulated as a general sum-utility maximization (SUM) problem, which includes as special cases many practical precoder design problems such as multi-cell coordinated linear precoding, full and partial per-cell coordinated multi-point transmission, zero-forcing precoding, and joint BS clustering and beamforming/precoding. The SUM problem is difficult due to its non-convexity and the tight coupling of the users' precoders. In this paper we propose a novel convex approximation technique to approximate the original problem by a series of convex subproblems, each of which decomposes across all the cells. The convexity of the subproblems allows for efficient computation, while their decomposability leads to distributed implementation. Our approach hinges upon the identification of certain key convexity properties of the sum-utility objective, which allow us to transform the problem into a form that can be solved using a popular algorithmic framework called BSUM (Block Successive Upper-Bound Minimization). Simulation experiments show that the proposed framework is effective for solving interference management problems in large HetNets. Comment: Accepted by IEEE Transactions on Wireless Communications
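
    The algorithmic pattern behind BSUM and successive convex approximation is easy to illustrate on a toy problem: a nonconvex coupled objective is minimized one block at a time, with each per-block subproblem convex and cheap to solve. The objective, closed-form block updates, and starting point below are illustrative choices, not the precoder design problem treated in the paper.

    # Toy block-wise minimization of f(x, y) = (x*y - 1)^2 + 0.1*(x^2 + y^2):
    # each block subproblem is a convex quadratic with a closed-form minimizer.
    def f(x, y):
        return (x * y - 1.0) ** 2 + 0.1 * (x ** 2 + y ** 2)

    x, y = 2.0, 0.1                # arbitrary starting point
    for it in range(50):
        x = y / (y ** 2 + 0.1)     # exact minimizer of the convex subproblem in x
        y = x / (x ** 2 + 0.1)     # exact minimizer of the convex subproblem in y

    print(f"x = {x:.4f}, y = {y:.4f}, f = {f(x, y):.6f}")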

    A Kinetic Model for Cell Damage Caused by Oligomer Formation

    Get PDF
    It is well known that the formation of amyloid fibers may cause irreversible damage to cells, yet the underlying mechanism has not been fully uncovered. In this paper, we construct a mathematical model consisting of infinitely many ODEs in the form of mass-action equations together with two reaction-convection PDEs, and then simplify it to a system of 5 ODEs by using the maximum entropy principle. The model is based on four simple assumptions, one of which is that cell damage is caused by oligomers rather than by mature fibrils. With the simplified model, the effects of nucleation and elongation, fragmentation, and protein and seed concentrations on amyloid formation and cell damage are extensively explored and compared with experiments. We hope that our results can provide valuable insight into the processes of amyloid formation and the cell damage they cause. Comment: 16 pages + 5 figures for main text; 8 pages + 4 figures for Supporting Material
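
    As a hedged sketch of what a coarse-grained mass-action model of this kind looks like, the code below integrates three ODEs for the monomer concentration, the fibril number, and the fibril mass under primary nucleation, elongation, and fragmentation. The rate constants, the nucleus size, and the reduction to three equations are illustrative assumptions; this is not the five-ODE system derived in the paper.

    # Integrate a toy nucleation-elongation-fragmentation model with SciPy.
    import numpy as np
    from scipy.integrate import solve_ivp

    k_n, k_plus, k_minus, n_c = 1e-4, 5e2, 1e-3, 2   # illustrative rate constants

    def rhs(t, y):
        m, P, M = y
        nucleation = k_n * m ** n_c          # primary nucleation creates new fibrils
        growth = 2.0 * k_plus * m * P        # elongation consumes monomer at fibril ends
        fragmentation = k_minus * M          # breakage increases the number of fibrils
        dm = -n_c * nucleation - growth
        dP = nucleation + fragmentation
        dM = n_c * nucleation + growth
        return [dm, dP, dM]

    sol = solve_ivp(rhs, (0.0, 50.0), [1.0, 0.0, 0.0], dense_output=True)
    t = np.linspace(0.0, 50.0, 6)
    print(np.round(sol.sol(t).T, 4))         # columns: monomer, fibril number, fibril mass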