On a class of matrices with real eigenvalues
It is easy to prove that if A is a real irreducible square matrix and if a real nonsingular diagonal matrix D exists such that AD is symmetric and positive semidefinite, then for any real diagonal matrix Y, AY has only real eigenvalues. This paper proves the converse result that if no such D exists, then for some Y, AY will possess some nonreal eigenvalues.
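The forward direction of the claim is easy to check numerically. A minimal NumPy sketch (the construction of A, the matrix size, and the random distributions are illustrative choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Construct A so that the hypothesis holds: choose S symmetric positive
# semidefinite and a real nonsingular diagonal D, then set A = S D^{-1},
# so that AD = S is symmetric and positive semidefinite.
G = rng.standard_normal((n, n))
S = G @ G.T
D = np.diag(rng.uniform(0.5, 2.0, n) * rng.choice([-1.0, 1.0], n))
A = S @ np.linalg.inv(D)

# The theorem then predicts: AY has only real eigenvalues for ANY real
# diagonal Y. Check this on random diagonal matrices.
max_imag = 0.0
for _ in range(200):
    Y = np.diag(rng.uniform(-3.0, 3.0, n))
    max_imag = max(max_imag, np.abs(np.linalg.eigvals(A @ Y).imag).max())

print(max_imag)  # numerically zero, up to rounding
```

Other seeds, sizes, or sign patterns for D give the same picture; the converse direction proved in the paper is the nontrivial part.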
Backward error analysis and the substitution law for Lie group integrators
Butcher series are combinatorial devices used in the study of numerical
methods for differential equations evolving on vector spaces. More precisely,
they are formal series developments of differential operators indexed over
rooted trees, and can be used to represent a large class of numerical methods.
The theory of backward error analysis for differential equations has a
particularly nice description when applied to methods represented by Butcher
series. For the study of differential equations evolving on more general
manifolds, a generalization of Butcher series has been introduced, called
Lie--Butcher series. This paper presents the theory of backward error analysis
for methods based on Lie--Butcher series.
Comment: Minor corrections and additions. Final version.
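For orientation, a Butcher series on a vector space is the formal expansion indexed over rooted trees mentioned above; the display uses standard notation (elementary differentials F(τ), symmetry coefficients σ(τ)) and is not quoted from this paper:

```latex
% Butcher series with coefficients a : T ∪ {∅} → R and step size h:
B_h(a, y_0) \;=\; a(\emptyset)\, y_0
  \;+\; \sum_{\tau \in T} \frac{h^{|\tau|}}{\sigma(\tau)}\, a(\tau)\, F(\tau)(y_0)
% T    : the set of rooted trees,
% |τ|  : number of vertices of τ,
% σ(τ) : symmetry coefficient of τ,
% F(τ) : elementary differential of the vector field f.
```

A numerical method is represented by its coefficient map a, and composition and backward error analysis become algebraic operations on such maps.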
Numerical modelling of subglacial ribs, drumlins, herringbones, and mega-scale glacial lineations reveals their developmental trajectories and transitions
Initially a matter of intellectual curiosity, but now important for understanding ice-sheet dynamics, the formation of subglacial bedforms has been a subject of scientific enquiry for over a century. Here, we use a numerical model of the coupled flow of ice, water, and subglacial sediment to explore the formation of subglacial ribs (i.e., ribbed moraine), drumlins and mega-scale glacial lineations (MSGLs). The model produces instabilities at the ice–bed interface, which result in landforms resembling subglacial ribs and drumlins. We find that a behavioural trajectory is present. Initially subglacial ribs form, which can either develop into fields of organized drumlins or into herringbone-type structures misaligned with ice flow. We present potential examples of these misaligned bedforms in deglaciated landscapes, the presence of which means caution should be taken when interpreting cross-cutting bedforms to reconstruct ice flow directions. Under unvarying ice flow parameters, MSGLs failed to appear in our experiments. However, drumlin fields can elongate into MSGLs in our model if low ice–bed coupling conditions are imposed. The conditions under which drumlins elongate into MSGLs are analogous to those found beneath contemporary ice streams, providing the first mechanism, rather than just an association, for linking MSGLs with ice stream flow. We conclude that the instability theory, as realized in this numerical model, is sufficient to explain the fundamental mechanics and process-interactions that lead to the initiation of subglacial bedforms, the development of the distinctive types of bedform patterns, and their evolutionary trajectories. We therefore suggest that the first part of the longstanding ‘drumlin problem’ – how and why they come into existence – is now solved. However, much remains to be discovered regarding the exact sedimentary and hydrological processes involved.
Using the Hopf Algebra Structure of QFT in Calculations
We employ the recently discovered Hopf algebra structure underlying
perturbative Quantum Field Theory to derive iterated integral representations
for Feynman diagrams. We give two applications: to massless Yukawa theory and
quantum electrodynamics in four dimensions.
Comment: 28 pages, RevTeX, epsf for figures, minor changes, to appear in Phys. Rev.
On post-Lie algebras, Lie--Butcher series and moving frames
Pre-Lie (or Vinberg) algebras arise from flat and torsion-free connections on
differential manifolds. They have been studied extensively in recent years,
both from algebraic operadic points of view and through numerous applications
in numerical analysis, control theory, stochastic differential equations and
renormalization. Butcher series are formal power series founded on pre-Lie
algebras, used in numerical analysis to study geometric properties of flows on
Euclidean spaces. Motivated by the analysis of flows on manifolds and
homogeneous spaces, we investigate algebras arising from flat connections with
constant torsion, leading to the definition of post-Lie algebras, a
generalization of pre-Lie algebras. Whereas pre-Lie algebras are intimately
associated with Euclidean geometry, post-Lie algebras occur naturally in the
differential geometry of homogeneous spaces, and are also closely related to
Cartan's method of moving frames. Lie--Butcher series combine Butcher series
with Lie series and are used to analyze flows on manifolds. In this paper we
show that Lie--Butcher series are founded on post-Lie algebras. The functorial
relations between post-Lie algebras and their enveloping algebras, called
D-algebras, are explored. Furthermore, we develop new formulas for computations
in free post-Lie algebras and D-algebras, based on recursions in a magma, and
we show that Lie--Butcher series are related to invariants of curves described
by moving frames.
Comment: added discussion of post-Lie algebroid.
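For reference, the post-Lie axioms referred to above can be stated compactly; this is the standard form of the definition, assuming the usual conventions, and is not copied from this paper:

```latex
% A post-Lie algebra is a Lie algebra (g, [.,.]) with a product ▷ such that
x \triangleright [y, z] = [x \triangleright y, z] + [y, x \triangleright z],
\qquad
[x, y] \triangleright z
  = a_\triangleright(x, y, z) - a_\triangleright(y, x, z),
% where a_▷ denotes the associator of ▷:
a_\triangleright(x, y, z)
  = x \triangleright (y \triangleright z)
  - (x \triangleright y) \triangleright z.
% A pre-Lie algebra is recovered as the special case [.,.] = 0.
```

In this picture the bracket encodes the constant torsion of the connection, which is why the pre-Lie (flat, torsion-free) case corresponds to Euclidean space.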
IMP: Imperial Metagenomics Pipeline for high-throughput sequence data
We have developed an in-house pipeline for the processing and analyses of sequence data generated during Illumina technology-based metagenomic studies of the human gut microbiota. Each component of the pipeline has been selected following comparative analysis of available tools; however, the modular nature of the software facilitates replacement of any individual component with an alternative should a better tool become available in due course. The pipeline consists of quality analysis and trimming followed by taxonomic filtering of sequence data allowing reads associated with samples to be binned according to whether they represent human, prokaryotic (bacterial/archaeal), viral, parasite, fungal or plant DNA. Viral, parasite, fungal and plant DNA can be assigned to species level on a presence/absence basis, allowing – for example – identification of dietary intake of plant-based foodstuffs and their derivatives. Prokaryotic DNA is subject to taxonomic and functional analyses, with assignment to taxonomic hierarchies (kingdom, class, order, family, genus, species, strain/subspecies) and abundance determination. After de novo assembly of sequence reads, genes within samples are predicted and used to build a non-redundant catalogue of genes. From this catalogue, per-sample gene abundance can be determined after normalization of data based on gene length. Functional annotation of genes is achieved through mapping of gene clusters against KEGG proteins, and InterProScan. The pipeline is undergoing validation using the human faecal metagenomic data of Qin et al. (2014, Nature 513, 59–64). Outputs from the pipeline allow development of tools for the integration of metagenomic and metabolomic data, moving metagenomic studies beyond determination of gene richness and representation towards microbial-metabolite mapping. There is scope to improve the outputs from viral, parasite, fungal and plant DNA analyses, depending on the depth of sequencing associated with samples.
The pipeline can easily be adapted for the analyses of environmental and non-human animal samples, and for use with data generated via non-Illumina sequencing platforms.
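The gene-length normalization step described above can be sketched as follows; the function name and the normalize-to-proportions choice are illustrative assumptions, since the abstract does not pin down the exact scheme:

```python
def length_normalized_abundance(read_counts, gene_lengths):
    """Relative per-sample gene abundance after normalizing mapped-read
    counts by gene length (illustrative scheme, not the pipeline's
    actual implementation).

    read_counts:  dict gene_id -> reads mapped to the gene in one sample
    gene_lengths: dict gene_id -> gene length in base pairs
    """
    # Per-base coverage removes the bias that long genes attract more reads.
    per_base = {g: n / gene_lengths[g] for g, n in read_counts.items()}
    total = sum(per_base.values())
    return {g: v / total for g, v in per_base.items()}

# geneA is 3x longer and attracts 3x the reads purely by length;
# after normalization the two genes are equally abundant.
abundance = length_normalized_abundance(
    {"geneA": 300, "geneB": 100},
    {"geneA": 1500, "geneB": 500},
)
print(abundance)  # {'geneA': 0.5, 'geneB': 0.5}
```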
Assessing ice sheet models against the landform record: the Likelihood of Accordant Lineations Analysis (LALA) tool
Palaeo-ice sheets leave behind a rich database regarding their past behaviour, recorded in the landscape in the form of glacial geomorphology. The most numerous landforms created by these ice sheets are subglacial lineations, which generate snapshots of the direction of ice flow at fixed (yet typically unknown) points in time. Despite their relative density within the landform record, the information provided by subglacial lineations is currently underutilised in tests of numerical ice sheet models. To some extent, this is a consequence of ongoing debate regarding lineation formation, but predominantly, it reflects the lack of rigorous model-data comparison techniques that would enable lineation information to be properly integrated. Here, we present the Likelihood of Accordant Lineations Analysis (LALA) tool. LALA provides a statistically rigorous measure of the log-likelihood of a supplied ice sheet simulation through comparison of simulation output with both the location and direction of observed lineations. Given an ensemble of ice sheet simulations, LALA provides a formal, and statistically underpinned, quantitative assessment of each simulation's quality-of-fit to mapped lineations. This enables a comparison of each simulation's relative plausibility, including identification of the most likely ice sheet simulations amongst the ensemble. This is achieved by modelling lineation formation as a marked Poisson point process and by comparing observed with modelled flow directions using the von Mises distribution. LALA is flexible: users can adapt parameters to account for differing assumptions regarding lineation formation, and for variations in the level of precision required for differing model-data comparison experiments. We provide guidelines and rationale for assigning parameter values, including an assessment of the variability between users when mapping lineations.
Finally, we demonstrate the utility of LALA through application to an ensemble of simulations of the last British-Irish Ice Sheet. This comparison highlights the benefits of LALA over previous tools and demonstrates some of the experimental-design considerations required when identifying the fit between ice sheet model simulations and the landform record.
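The direction-comparison step can be illustrated with a toy von Mises log-likelihood. This sketch ignores the marked Poisson point process over lineation locations and the axial (180°-ambiguous) nature of lineations, and the function name and kappa value are invented, not taken from LALA:

```python
import numpy as np

def vonmises_loglik(observed, modelled, kappa):
    """Sum of von Mises log-densities of observed lineation directions
    given modelled ice-flow directions (angles in radians).

    kappa is the concentration parameter: larger kappa penalizes
    directional misfit more strongly.
    """
    observed = np.asarray(observed)
    modelled = np.asarray(modelled)
    # von Mises density: exp(kappa*cos(theta - mu)) / (2*pi*I0(kappa))
    return np.sum(kappa * np.cos(observed - modelled)
                  - np.log(2.0 * np.pi * np.i0(kappa)))

# A simulation whose flow directions match the mapped lineations closely
# scores a higher log-likelihood than a misaligned one.
obs = np.deg2rad([10.0, 12.0, 9.0])
good = np.deg2rad([11.0, 11.0, 11.0])
bad = np.deg2rad([80.0, 80.0, 80.0])
assert vonmises_loglik(obs, good, kappa=4.0) > vonmises_loglik(obs, bad, kappa=4.0)
```

Summing such per-lineation terms across an ensemble of simulations gives a relative ranking of plausibility, which is the spirit of the comparison described above.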
Dynamic modeling and simulation of leukocyte integrin activation through an electronic design automation framework
Model development and analysis of biological systems is recognized as a key requirement for integrating in-vitro and in-vivo experimental data. In-silico simulations of a biochemical model allow one to test different experimental conditions, helping in the discovery of the dynamics that regulate the system. Several characteristics and issues of biological system modeling are shared with electronic system modeling, such as concurrency, reactivity, abstraction levels, as well as state space explosion during verification. This paper proposes a modeling and simulation framework for discrete event-based execution of biochemical systems based on SystemC. SystemC is the reference language in the electronic design automation (EDA) field for modeling and verifying complex systems at different abstraction levels. SystemC-based verification is the de facto alternative to model checking when such a formal verification technique cannot deal with the state space complexity of the model. The paper presents how the framework has been applied to model the intracellular signalling network controlling integrin activation mediating leukocyte recruitment from the blood into the tissues, by handling the solution space complexity through different levels of simulation accuracy.
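SystemC itself is a C++ library; as a language-neutral illustration of the discrete event-based execution style described above, here is a minimal event-queue kernel in Python. The cascade (chemokine binding, an intermediate step, integrin activation) and all delays are invented placeholders, not the model from the paper:

```python
import heapq
from itertools import count

_tie = count()  # tie-breaker so simultaneous events never compare actions

def run(start_events, horizon):
    """Minimal discrete-event loop in the style of a SystemC kernel:
    pop the earliest event, execute it, let it schedule successors.

    start_events: list of (time, name, action); each action(t) returns a
    list of (delay, name, action) follow-up events.
    """
    queue = [(t, next(_tie), name, act) for t, name, act in start_events]
    heapq.heapify(queue)
    trace = []
    while queue:
        t, _, name, act = heapq.heappop(queue)
        if t > horizon:
            break
        trace.append((t, name))
        for delay, nname, nact in act(t):
            heapq.heappush(queue, (t + delay, next(_tie), nname, nact))
    return trace

# Toy signalling cascade (all names and delays are hypothetical):
def integrin_active(t):
    return []                                   # terminal event

def intermediate(t):
    return [(3, "integrin_active", integrin_active)]

def chemokine_binds(t):
    return [(2, "intermediate_on", intermediate)]

trace = run([(0, "chemokine_binds", chemokine_binds)], horizon=10)
print(trace)  # [(0, 'chemokine_binds'), (2, 'intermediate_on'), (5, 'integrin_active')]
```

Abstraction level is then a matter of how coarse the actions and delays are, which mirrors the accuracy levels mentioned in the abstract.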
Thermoelectric effect in molecular electronics
We provide a theoretical estimate of the thermoelectric current and voltage
over a phenyldithiol molecule. We also show that the thermoelectric voltage
(1) is easy to analyze, (2) is insensitive to the detailed coupling to the
contacts, (3) is large enough to be measured, and (4) gives valuable
information, not readily accessible through other experiments, on the location
of the Fermi energy relative to the molecular levels. The location of the
Fermi energy is poorly understood and controversial even though it is a
central factor in determining the nature of conduction (n- or p-type). We also
note that the thermoelectric voltage measured over guanine molecules with an
STM by Poler et al. indicates conduction through the HOMO level, i.e., p-type
conduction.
Comment: 4 pages, 3 figures
Hölder-continuous rough paths by Fourier normal ordering
We construct in this article an explicit geometric rough path over arbitrary
d-dimensional paths with finite 1/α-variation for any α ∈ (0,1). The method
may be coined as 'Fourier normal ordering', since
it consists in a regularization obtained after permuting the order of
integration in iterated integrals so that innermost integrals have highest
Fourier frequencies. In doing so, there appear non-trivial tree combinatorics,
which are best understood by using the structure of the Hopf algebra of
decorated rooted trees (in connection with the Chen or multiplicative property)
and of the Hopf shuffle algebra (in connection with the shuffle or geometric
property). Hölder continuity is proved by using Besov norms. The method is
well-suited in particular in view of applications to probability theory (see
the companion article \cite{Unt09} for the construction of a rough path over
multidimensional fractional Brownian motion with arbitrary Hurst index, or
\cite{Unt09ter} for a short survey in that case).
Comment: 50 pages, 6 figures