
    Extragalactic Foreground Contamination in Temperature-based CMB Lens Reconstruction

    We discuss the effect of unresolved point source contamination on estimates of the CMB lensing potential, from components such as the thermal Sunyaev-Zel'dovich effect, radio point sources, and the Cosmic Infrared Background. We classify the possible trispectra associated with such source populations, and construct estimators for the amplitude and scale-dependence of several of the major trispectra. We show how to propagate analytical models for these source trispectra to biases for lensing. We also construct a "source-hardened" lensing estimator which experiences significantly smaller biases when exposed to unresolved point sources than the standard quadratic lensing estimator. We demonstrate these ideas in practice using the sky simulations of Sehgal et al., for cosmic-variance-limited experiments designed to mimic ACT, SPT, and Planck.

    Pure xenon hexafluoride prepared for thermal properties studies

    Preparation of a xenon hexafluoride and sodium fluoride salt yields a sample of the highest possible purity for use in thermal measurements. The desired hexafluoride can easily be freed from the common contaminants, xenon tetrafluoride, xenon difluoride, and xenon oxide tetrafluoride, because none of these compounds reacts with sodium fluoride.

    Implementing vertex dynamics models of cell populations in biology within a consistent computational framework

    The dynamic behaviour of epithelial cell sheets plays a central role during development, growth, disease and wound healing. These processes occur as a result of cell adhesion, migration, division, differentiation and death, and involve multiple processes acting at the cellular and molecular level. Computational models offer a useful means by which to investigate and test hypotheses about these processes, and have played a key role in the study of cell–cell interactions. However, the necessarily complex nature of such models means that it is difficult to make accurate comparisons between different models, since it is often impossible to distinguish between differences in behaviour that are due to the underlying model assumptions, and those due to differences in the in silico implementation of the model. In this work, an approach is described for the implementation of vertex dynamics models, a discrete approach that represents each cell by a polygon (or polyhedron) whose vertices may move in response to forces. The implementation is undertaken in a consistent manner within a single open source computational framework, Chaste, which comprises fully tested, industrial-grade software that has been developed using an agile approach. This framework allows one to easily change assumptions regarding force generation and cell rearrangement processes within these models. The versatility and generality of this framework is illustrated using a number of biological examples. In each case we provide full details of all technical aspects of our model implementations, and in some cases provide extensions to make the models more generally applicable.
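    The core idea of a vertex dynamics model, as described above, can be sketched in a few lines: a cell is a polygon whose vertices move down the gradient of a mechanical energy. The sketch below is a reader's illustration, not the Chaste implementation; the energy (area elasticity plus perimeter contractility) is one common choice, and all parameter values are hypothetical.

```python
import numpy as np

def polygon_area(verts):
    # Shoelace formula for a 2-D polygon given as an (n, 2) array of vertices.
    x, y = verts[:, 0], verts[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def polygon_perimeter(verts):
    return np.sum(np.linalg.norm(np.roll(verts, -1, axis=0) - verts, axis=1))

def cell_energy(verts, a0=1.0, k_area=1.0, gamma=0.1):
    # Assumed energy: area elasticity about target area a0, plus perimeter contractility.
    return k_area * (polygon_area(verts) - a0) ** 2 + gamma * polygon_perimeter(verts) ** 2

def vertex_forces(verts, eps=1e-6):
    # Force on each vertex = minus the (numerical) gradient of the cell energy.
    f = np.zeros_like(verts)
    for i in range(verts.shape[0]):
        for j in range(2):
            vp, vm = verts.copy(), verts.copy()
            vp[i, j] += eps
            vm[i, j] -= eps
            f[i, j] = -(cell_energy(vp) - cell_energy(vm)) / (2 * eps)
    return f

# Overdamped dynamics: explicit Euler steps on a single perturbed unit square.
square = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
verts0 = square + 0.05 * np.array([[1., -1.], [-1., 1.], [1., 1.], [-1., -1.]])
verts = verts0.copy()
for _ in range(300):
    verts = verts + 0.02 * vertex_forces(verts)
```

    A full model couples many such cells through shared vertices and adds rearrangement rules (T1 swaps, division, extrusion), which is where a tested framework such as Chaste earns its keep.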

    Gamma-Ray Bursts observed by XMM-Newton

    Analyses of XMM-Newton observations have made a significant contribution to the study of Gamma-ray Burst (GRB) X-ray afterglows. The effective area, bandpass and resolution of the EPIC instrument permit the study of a wide variety of spectral features. In particular, strong, time-dependent, soft X-ray emission lines have been discovered in some bursts. The emission mechanism and energy source for these lines pose major problems for the current generation of GRB models. Other GRBs have intrinsic absorption, possibly related to the environment around the progenitor, or possible iron emission lines similar to those seen in GRBs observed with BeppoSAX. Further XMM-Newton observations of GRBs discovered by the Swift satellite should help unlock the origin of the GRB phenomenon over the next few years.
    Comment: To appear in proceedings of the "XMM-Newton EPIC Consortium meeting, Palermo, 2003 October 14-16", published in Memorie della Societa Astronomica Italiana.

    Colorectal Cancer Through Simulation and Experiment

    Colorectal cancer has continued to generate a huge amount of research interest over several decades, forming a canonical example of tumourigenesis since its use in Fearon and Vogelstein's linear model of genetic mutation. Over time, the field has witnessed a transition from solely experimental work to the inclusion of mathematical biology and computer-based modelling. The fusion of these disciplines has the potential to provide valuable insights into oncologic processes, but also presents the challenge of uniting many diverse perspectives. Furthermore, the cancer cell phenotype defined by the 'Hallmarks of Cancer' has been extended in recent times and provides an excellent basis for future research. We present a timely summary of the literature relating to colorectal cancer, addressing the traditional experimental findings, summarising the key mathematical and computational approaches, and emphasising the role of the Hallmarks in current and future developments. We conclude with a discussion of interdisciplinary work, outlining areas of experimental interest which would benefit from the insight that mathematical and computational modelling can provide.

    Distribution of Gaussian Process Arc Lengths

    We present the first treatment of the arc length of the Gaussian Process (GP) with more than a single output dimension. GPs are commonly used for tasks such as trajectory modelling, where path length is a crucial quantity of interest. Previously, only paths in one dimension have been considered, with no theoretical consideration of higher dimensional problems. We fill the gap in the existing literature by deriving the moments of the arc length for a stationary GP with multiple output dimensions. A new method is used to derive the mean of a one-dimensional GP over a finite interval, by considering the distribution of the arc length integrand. This technique is used to derive an approximate distribution over the arc length of a vector-valued GP in $\mathbb{R}^n$ by moment matching the distribution. Numerical simulations confirm our theoretical derivations.
    Comment: 10 pages, 4 figures. Accepted to The 20th International Conference on Artificial Intelligence and Statistics (AISTATS).
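    The kind of numerical check the abstract alludes to can be sketched by Monte Carlo: draw vector-valued GP paths on a grid and approximate each arc length polygonally. This is an independent illustration of the quantity studied, not the paper's moment-matching derivation; the squared-exponential kernel and its lengthscale are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)

def rbf_kernel(t, ell=0.5):
    # Squared-exponential covariance; the lengthscale ell is an assumption.
    d = t[:, None] - t[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

# Cholesky factor of the (jittered) covariance, used to draw correlated samples.
chol = np.linalg.cholesky(rbf_kernel(t) + 1e-6 * np.eye(t.size))

def sample_paths(n_samples, n_dims):
    # Independent GP draws per output dimension: shape (n_samples, len(t), n_dims).
    z = rng.standard_normal((n_samples, t.size, n_dims))
    return np.einsum('ij,sjd->sid', chol, z)

def arc_length(path):
    # Polygonal approximation of the arc length integral of ||f'(t)|| dt.
    return np.sum(np.linalg.norm(np.diff(path, axis=0), axis=1))

paths = sample_paths(500, 2)              # 500 draws of a GP path in R^2
lengths = np.array([arc_length(p) for p in paths])
mean_len, var_len = lengths.mean(), lengths.var()
```

    Comparing `mean_len` and `var_len` against analytically derived moments is exactly the kind of confirmation the abstract describes.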

    Validity of the Cauchy-Born rule applied to discrete cellular-scale models of biological tissues

    The development of new models of biological tissues that consider cells in a discrete manner is becoming increasingly popular as an alternative to PDE-based continuum methods, although formal relationships between the discrete and continuum frameworks remain to be established. For crystal mechanics, the discrete-to-continuum bridge is often made by assuming that local atom displacements can be mapped homogeneously from the mesoscale deformation gradient, an assumption known as the Cauchy-Born rule (CBR). Although the CBR does not hold exactly for non-crystalline materials, it may still be used as a first order approximation for analytic calculations of effective stresses or strain energies. In this work, our goal is to investigate numerically the applicability of the CBR to 2-D cellular-scale models by assessing the mechanical behaviour of model biological tissues, including crystalline (honeycomb) and non-crystalline reference states. The numerical procedure consists of prescribing an affine deformation on the boundary cells and computing the positions of internal cells, which are then compared with the prediction of the CBR; an average deviation is calculated in the strain domain. For centre-based models, we show that the CBR holds exactly when the deformation gradient is relatively small and the reference stress-free configuration is defined by a honeycomb lattice. We show further that the CBR may be used approximately when the reference state is perturbed from the honeycomb configuration. By contrast, for vertex-based models, a similar analysis reveals that the CBR does not provide a good representation of the tissue mechanics, even when the reference configuration is defined by a honeycomb lattice. The paper concludes with a discussion of the implications of these results for concurrent discrete/continuous modelling, adaptation of atom-to-continuum (AtC) techniques to biological tissues and model classification.
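    The comparison at the heart of the procedure above can be sketched concisely: the CBR predicts internal positions by applying the deformation gradient homogeneously, and the deviation is a mean distance from that prediction. This is an illustrative metric only; the paper's exact norm and its strain-domain averaging may differ.

```python
import numpy as np

rng = np.random.default_rng(3)

def cbr_prediction(ref_positions, F):
    # Cauchy-Born rule: map reference positions homogeneously by the
    # deformation gradient F (positions stored as rows, hence the transpose).
    return ref_positions @ F.T

def mean_cbr_deviation(ref_positions, observed_positions, F):
    # Mean distance between observed positions and the homogeneous prediction.
    diff = observed_positions - cbr_prediction(ref_positions, F)
    return np.mean(np.linalg.norm(diff, axis=1))

# Sanity check: for a purely affine deformation the deviation vanishes.
ref = rng.standard_normal((10, 2))           # hypothetical cell-centre positions
F = np.array([[1.1, 0.1], [0.0, 0.95]])      # a small stretch plus shear
dev = mean_cbr_deviation(ref, ref @ F.T, F)  # exactly affine "observations"
```

    In the study itself, `observed_positions` would come from relaxing a centre-based or vertex-based model with the affine map imposed only on boundary cells, so a nonzero `dev` measures how far the interior departs from the CBR.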

    Quantum Metropolis Sampling

    The original motivation to build a quantum computer came from Feynman, who envisaged a machine capable of simulating generic quantum mechanical systems, a task that is believed to be intractable for classical computers. Such a machine would have a wide range of applications in the simulation of many-body quantum physics, including condensed matter physics, chemistry, and high energy physics. Part of Feynman's challenge was met by Lloyd, who showed how to approximately decompose the time-evolution operator of interacting quantum particles into a short sequence of elementary gates, suitable for operation on a quantum computer. However, this left open the problem of how to simulate the equilibrium and static properties of quantum systems. This requires the preparation of ground and Gibbs states on a quantum computer. For classical systems, this problem is solved by the ubiquitous Metropolis algorithm, a method that basically acquired a monopoly for the simulation of interacting particles. Here, we demonstrate how to implement a quantum version of the Metropolis algorithm on a quantum computer. This algorithm permits direct sampling from the eigenstates of the Hamiltonian and thus evades the sign problem present in classical simulations. A small scale implementation of this algorithm can already be achieved with today's technology.
    Comment: revised version.
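    For contrast with the quantum algorithm, the classical Metropolis method the abstract invokes can be sketched for a 1-D Ising chain: propose a single spin flip, always accept energy decreases, and accept increases with a Boltzmann probability, so the chain samples the Gibbs state. The parameters below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)

def metropolis_ising(n_spins=50, beta=0.5, n_steps=20000):
    # Classical Metropolis sampling of the Gibbs state of a 1-D Ising chain
    # (coupling J = 1, periodic boundary) at inverse temperature beta.
    spins = rng.choice([-1, 1], size=n_spins)
    for _ in range(n_steps):
        i = rng.integers(n_spins)
        # Energy change from flipping spin i; Python's negative indexing
        # handles the i = 0 wrap-around automatically.
        dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % n_spins])
        # Accept downhill moves always, uphill moves with prob. exp(-beta * dE).
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i] = -spins[i]
    return spins

spins = metropolis_ising()
```

    The quantum version replaces the spin flip with a random unitary and the energy comparison with phase estimation, which is what lets it sample Hamiltonian eigenstates directly.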

    Sparse and stable Markowitz portfolios

    We consider the problem of portfolio selection within the classical Markowitz mean-variance framework, reformulated as a constrained least-squares regression problem. We propose to add to the objective function a penalty proportional to the sum of the absolute values of the portfolio weights. This penalty regularizes (stabilizes) the optimization problem, encourages sparse portfolios (i.e. portfolios with only few active positions), and allows one to account for transaction costs. Our approach recovers the no-short-positions portfolio as a special case, but also allows for a limited number of short positions. We implement this methodology on two benchmark data sets constructed by Fama and French. Using only a modest amount of training data, we construct portfolios whose out-of-sample performance, as measured by Sharpe ratio, is consistently and significantly better than that of the naive evenly-weighted portfolio which constitutes, as shown in recent literature, a very tough benchmark.
    Comment: Better emphasis of main result, new abstract, new examples and figures. New appendix with full details of algorithm. 17 pages, 6 figures.
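    The penalized regression at the core of the approach above can be sketched as an l1-regularised least-squares (lasso) problem solved by iterative soft-thresholding (ISTA). This sketch omits the budget constraint that the weights sum to one and uses synthetic data, so it illustrates only the sparsifying effect of the penalty, not the authors' full algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

def soft_threshold(x, t):
    # Proximal operator of the l1 penalty.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_weights(R, y, tau, n_iter=500):
    # ISTA (iterative soft-thresholding) for 0.5*||R w - y||^2 + tau*||w||_1.
    step = 1.0 / np.linalg.norm(R, 2) ** 2   # 1 / Lipschitz constant of gradient
    w = np.zeros(R.shape[1])
    for _ in range(n_iter):
        w = soft_threshold(w - step * (R.T @ (R @ w - y)), step * tau)
    return w

# Synthetic return matrix for 20 assets over 250 periods; y is a constant
# target-return stream. All numbers here are made up for illustration.
R = 0.001 + 0.02 * rng.standard_normal((250, 20))
y = np.full(250, 0.001)
w = lasso_weights(R, y, tau=1e-4)
```

    In the paper's formulation the weights are additionally constrained to sum to one, and taking the penalty weight large enough recovers the no-short-positions portfolio; larger tau zeroes out more positions, which is how sparsity and (implicitly) transaction costs are controlled.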