STUDIES ON CORRELATED MUTATIONS ALGORITHMS OF PROTEINS PROVIDING STRUCTURAL, SPATIAL, AND ALLOSTERY INFORMATION FROM MULTIPLE SEQUENCE ALIGNMENTS
Proteins perform innumerable cellular functions across all domains of life. Advances in the high-throughput collection and analysis of protein sequences have led to an ever-deeper understanding of biological pathways, evolution, and coding biases. Most functional and structural analysis of proteins, however, is carried out in vitro and is not amenable to high-throughput technologies. With the rapid growth of sequence data, we can further refine algorithms that work in silico, using the work done in vitro as a benchmark. There has been a renaissance in the study of proteins using new approaches made possible largely by the amount of data now available for analysis. The research in this dissertation investigates some of the new techniques in this field, identifying their limitations as well as improving upon them.
Chapter 1 presents an overview of generalized techniques at the disposal of researchers looking for links between protein sequence covariance and allostery. The most commonly used methods, including mutual information, chemical similarity matrices, phylogenetic perturbation, and chi-square analysis, are reviewed, as are the limits of such approaches in detecting allostery. Chapter 2 explores applying a recent phylogenetic correction, which has improved the efficacy of mutual information in predicting spatial contacts, to the other algorithm types introduced in the first chapter. Chapter 3 attempts to detect bias of covariance algorithms toward the rigid bodies found in protein structures. Chapter 4 describes a novel algorithm, termed COvariance By Sections (COBS), that in many ways combines the methodologies of Chapters 2 and 3: we leverage a phylogenetic correction on groups of MSA columns rather than on individual columns.
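The covariance measures surveyed above can be illustrated with the simplest of them, mutual information between alignment columns. Below is a minimal, illustrative sketch (the function name and toy MSA are my own, not from the dissertation) that estimates the mutual information, in bits, between two columns of a multiple sequence alignment from empirical residue frequencies:

```python
import math
from collections import Counter

def column_mi(col_a, col_b):
    """Mutual information (bits) between two alignment columns,
    estimated from empirical residue-pair frequencies."""
    n = len(col_a)
    pa = Counter(col_a)
    pb = Counter(col_b)
    pab = Counter(zip(col_a, col_b))
    mi = 0.0
    for (a, b), count in pab.items():
        # p(a,b) * log2( p(a,b) / (p(a) * p(b)) ), rearranged to avoid
        # three separate divisions by n
        mi += (count / n) * math.log2(count * n / (pa[a] * pb[b]))
    return mi

# Toy MSA: four sequences; columns 0 and 1 co-vary perfectly (A<->K, G<->R)
msa = ["AKLF", "AKIF", "GRLY", "GRIY"]
cols = list(zip(*msa))  # cols[i] is the i-th column across all sequences
print(column_mi(cols[0], cols[1]))  # prints 1.0 (one bit of covariation)
```

A phylogenetic correction of the kind discussed in Chapter 2 would then be subtracted from such raw scores, since shared ancestry alone inflates the apparent covariation between columns.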
A Process Model of Non-Relativistic Quantum Mechanics
A process model of quantum mechanics utilizes a combinatorial game to generate a discrete and finite causal space upon which a self-consistent quantum mechanics can be defined. An emergent space-time and continuous wave function arise through a uniform interpolation process. Standard non-relativistic quantum mechanics (at least for integer-spin particles) emerges in the limit of infinite information (the causal space grows to infinity) and infinitesimal scale (the separation between points goes to zero). This model is quasi-local, discontinuous, and quasi-non-contextual. The bridge between process and wave function is the process covering map, which reveals that the standard wave function formalism lacks important dynamical information related to the generation of the causal space. Reformulating several classical conundrums, such as wave-particle duality, Schrödinger's cat, and hidden-variable results, the model offers potential resolutions to all, retaining a high degree of locality and non-contextuality at the local level while exhibiting nonlocality and contextuality at the emergent level. The model remains computationally powerful.
Near-Term Quantum Computing Techniques: Variational Quantum Algorithms, Error Mitigation, Circuit Compilation, Benchmarking and Classical Simulation
Quantum computing is a game-changing technology for global academia, research
centers and industries including computational science, mathematics, finance,
pharmaceutical, materials science, chemistry and cryptography. Although it has
seen a major boost in the last decade, we are still a long way from reaching
the maturity of a full-fledged quantum computer. Consequently, we will be in
the Noisy Intermediate-Scale Quantum (NISQ) era for a long time, working on
quantum computing systems with dozens or even thousands of qubits. An outstanding
challenge, then, is to come up with an application that can reliably carry out
a nontrivial task of interest on the near-term quantum devices with
non-negligible quantum noise. To address this challenge, several near-term
quantum computing techniques, including variational quantum algorithms, error
mitigation, quantum circuit compilation and benchmarking protocols, have been
proposed to characterize and mitigate errors, and to implement algorithms with
a certain resistance to noise, so as to enhance the capabilities of near-term
quantum devices and explore the boundaries of their ability to realize useful
applications. Moreover, the development of near-term quantum devices is
inseparable from efficient classical simulation, which plays a vital role
in quantum algorithm design and verification, error-tolerant verification and
other applications. This review will provide a thorough introduction of these
near-term quantum computing techniques, report on their progress, and finally
discuss the future prospect of these techniques, which we hope will motivate
researchers to undertake additional studies in this field.
Comment: Please feel free to email He-Liang Huang with any comments,
questions, suggestions or concerns.
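The variational quantum algorithms surveyed in this review share a common loop: a classical optimizer tunes the parameters of a quantum circuit to minimize a measured cost. A minimal single-qubit sketch (purely illustrative, not taken from the review; the ansatz, Hamiltonian, and step size are my own choices) shows this loop with the parameter-shift rule used for gradients on real hardware:

```python
import numpy as np

# Toy VQE: ansatz |psi(theta)> = Ry(theta)|0>, Hamiltonian H = Z.
# The exact ground-state energy is -1, reached at theta = pi.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def ansatz(theta):
    # Ry(theta) applied to |0> gives (cos(theta/2), sin(theta/2))
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    psi = ansatz(theta)
    return psi @ Z @ psi  # <psi|Z|psi> = cos(theta)

def parameter_shift_grad(theta):
    # Exact gradient from two shifted energy evaluations -- the same
    # two-evaluation recipe usable on a real (noisy) device
    return 0.5 * (energy(theta + np.pi / 2) - energy(theta - np.pi / 2))

theta = 0.3  # arbitrary starting parameter
for _ in range(200):
    theta -= 0.2 * parameter_shift_grad(theta)  # classical gradient descent

print(round(energy(theta), 6))  # converges to the ground energy -1.0
```

On actual NISQ hardware each `energy` call is itself a noisy expectation value estimated from repeated measurements, which is where the error-mitigation techniques discussed in the review enter.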
Towards an Information Theoretic Framework for Evolutionary Learning
The vital essence of evolutionary learning consists of information flows between the environment and the entities differentially surviving and reproducing therein. Gain or loss of information in individuals and populations due to evolutionary steps should be considered in evolutionary algorithm theory and practice. Information theory has rarely been applied to evolutionary computation - a lacuna that this dissertation addresses, with an emphasis on objectively and explicitly evaluating the ensemble models implicit in evolutionary learning. Information theoretic functionals can provide objective, justifiable, general, computable, commensurate measures of fitness and diversity.
We identify information transmission channels implicit in evolutionary learning. We define information distance metrics and indices for ensembles. We extend Price's Theorem to non-random mating, give it an effective fitness interpretation, and decompose it to show the key factors influencing heritability and evolvability. We argue that heritability and evolvability of our information theoretic indicators are high. We illustrate use of our indices for reproductive and survival selection. We develop algorithms to estimate information theoretic quantities on mixed continuous and discrete data via the empirical copula and information dimension. We extend statistical resampling. We present experimental and real-world application results: chaotic time series prediction; parity; complex continuous functions; industrial process control; and small-sample social science data. We formalize conjectures regarding evolutionary learning and information geometry.
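The selection term of Price's Theorem, which the dissertation extends, can be checked numerically in a few lines. The sketch below (illustrative only; the function name and toy population are my own) computes the change in the mean of a trait z attributable to selection, cov(w, z) / mean(w), and verifies it against the directly computed change when transmission is unbiased:

```python
import numpy as np

def price_selection_term(w, z):
    """Selection term of Price's equation: cov(w, z) / mean(w),
    where w is fitness (offspring count) and z a trait value."""
    w = np.asarray(w, dtype=float)
    z = np.asarray(z, dtype=float)
    return np.cov(w, z, bias=True)[0, 1] / w.mean()

# Toy population: fitter individuals carry larger trait values,
# so selection alone should raise the mean trait.
w = [1.0, 1.0, 2.0, 2.0]  # offspring counts (fitness)
z = [0.0, 1.0, 2.0, 3.0]  # trait values

# Direct check: with no transmission bias, the offspring mean trait is the
# fitness-weighted parental mean, so the change equals the selection term.
offspring_mean = np.average(z, weights=w)
parent_mean = np.mean(z)
print(price_selection_term(w, z), offspring_mean - parent_mean)  # both 1/3
```

The second, transmission-bias term of the full theorem is zero here by construction; the dissertation's decomposition concerns precisely how that term behaves under non-random mating.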
MS FT-2-2 7 Orthogonal polynomials and quadrature: Theory, computation, and applications
Quadrature rules find many applications in science and engineering. Their analysis is a classical area of applied mathematics and continues to attract considerable attention. This seminar brings together speakers with expertise in a large variety of quadrature rules. The aim of the seminar is to provide an overview of recent developments in the analysis of quadrature rules. The computation of error estimates and novel applications are also described.
Generalized averaged Gaussian quadrature and applications
A simple numerical method for constructing the optimal generalized averaged Gaussian quadrature formulas will be presented. These formulas exist in many cases in which real positive Gauss-Kronrod formulas do not exist, and can be used as an adequate alternative to estimate the error of a Gaussian rule. We also investigate the conditions under which the optimal averaged Gaussian quadrature formulas and their truncated variants are internal.
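The error-estimation idea behind averaged Gaussian formulas can be sketched generically: evaluate the n-point Gauss rule alongside a companion rule of higher degree and take their difference as the error estimate. The sketch below (illustrative only; it uses an ordinary higher-order Gauss-Legendre rule as a stand-in for the averaged formulas constructed in the abstract) shows the mechanism:

```python
import numpy as np

def gauss_legendre(f, n):
    """n-point Gauss-Legendre approximation of the integral of f over [-1, 1]."""
    x, w = np.polynomial.legendre.leggauss(n)
    return w @ f(x)

def gauss_with_error_estimate(f, n):
    """Return the n-point Gauss value together with an error estimate
    obtained by comparing against a higher-degree companion rule
    (a stand-in for an averaged Gaussian or Gauss-Kronrod formula)."""
    q_n = gauss_legendre(f, n)
    q_hi = gauss_legendre(f, 2 * n + 1)  # companion rule of higher degree
    return q_n, abs(q_hi - q_n)

f = lambda x: np.exp(x)  # exact integral over [-1, 1] is e - 1/e
q, err_est = gauss_with_error_estimate(f, 3)
true_err = abs(q - (np.e - 1.0 / np.e))
print(q, err_est, true_err)  # err_est closely tracks the true error
```

The advantage of the averaged formulas over a plain higher-order Gauss rule is that they reuse the original nodes where possible and, as the abstract notes, exist in cases where real positive Gauss-Kronrod extensions do not.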