    Geometric Phase Integrals and Irrationality Tests

    Let F(x) be an analytic, real-valued function defined on a compact domain B ⊂ R. We prove that the problem of establishing the irrationality of F(x) evaluated at x0 ∈ B can be stated in terms of the convergence of the phase of a suitable integral I(h), defined on an open, bounded domain, as h goes to infinity. This is derived as a consequence of a similar equivalence, which establishes that systems of equations of analytic functions on compact real domains in R^p admit isolated solutions if and only if the phase of a suitable "geometric" complex phase integral I(h) converges for h→∞. We finally highlight how the method can be easily adapted to the study of the existence of rational or integer points on curves in bounded domains, and we sketch some potential theoretical developments of the method.
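
    The criterion above is stated in terms of the asymptotic phase of an oscillatory integral I(h). As a purely illustrative aid (not the paper's construction; the phase function and amplitude below are toy choices made here for demonstration), the following sketch shows what computing arg I(h) for increasing h looks like numerically:

```python
# Purely illustrative sketch (not the paper's construction): it shows numerically
# what "the phase of an oscillatory integral I(h) converges as h -> infinity"
# looks like for a toy phase function with one isolated stationary point.
import numpy as np

def phase_of_integral(f, g, a, b, h, n=400_000):
    """Approximate arg I(h), with I(h) = integral_a^b g(x) * exp(i*h*f(x)) dx."""
    x = np.linspace(a, b, n)
    vals = g(x) * np.exp(1j * h * f(x))
    dx = x[1] - x[0]
    integral = dx * (vals.sum() - 0.5 * (vals[0] + vals[-1]))   # trapezoid rule
    return np.angle(integral)

f = lambda x: (x - 0.3) ** 2          # toy phase: isolated stationary point at x = 0.3
g = lambda x: np.ones_like(x)         # toy amplitude

for h in [1e1, 1e2, 1e3, 1e4]:
    print(f"h = {h:7.0f}   arg I(h) = {phase_of_integral(f, g, 0.0, 1.0, h):+.4f}")
# Classical stationary-phase asymptotics predict that arg I(h) settles near
# h*f(0.3) + pi/4 (mod 2*pi); since f(0.3) = 0 here, the phase tends to about +0.785.
```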

    Stem-Like Adaptive Aneuploidy and Cancer Quasispecies

    We analyze and reinterpret experimental evidence from the literature to argue for an ability of tumor cells to self-regulate their aneuploidy rate. We conjecture that this ability is mediated by a diversification factor that exploits molecular mechanisms common to embryonic stem cells and, to a lesser extent, adult stem cells, and that is eventually reactivated in tumor cells. Moreover, we propose a direct application of the quasispecies model to cancer cells, based on their significant genomic instability (i.e. aneuploidy rate), by defining master sequence lengths as the sum of all copy numbers of physically distinct whole and fragmented chromosomes. We compute an approximate error threshold such that any aneuploidy rate larger than the threshold would lead to a loss of fitness of a tumor population, and we confirm that highly aneuploid cancer populations already function with aneuploidy rates close to the estimated threshold.
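
    The error-threshold computation described above is in the spirit of the classical Eigen quasispecies estimate, in which a master sequence of length L with selective superiority σ is maintained only if its per-unit error rate stays below roughly ln(σ)/L. A minimal sketch of that standard estimate, with placeholder values for L and σ rather than the paper's figures, is:

```python
# Minimal sketch of the classical Eigen quasispecies error-threshold estimate,
# in the spirit of the threshold computation described in the abstract.
# L and sigma below are illustrative placeholders, not the values used in the paper.
import math

def error_threshold(L, sigma):
    """Largest per-unit error rate p such that Q * sigma > 1, with Q = (1 - p)**L."""
    return 1.0 - sigma ** (-1.0 / L)

L = 10_000      # hypothetical master sequence length (coarse genomic units)
sigma = 2.0     # hypothetical selective superiority of the master sequence
p_max = error_threshold(L, sigma)
print(f"approximate error threshold: p_max = {p_max:.2e} per unit per replication")
print(f"ln(sigma)/L approximation:   {math.log(sigma) / L:.2e}")
# Any per-unit error (aneuploidy) rate above p_max would push the population past
# the threshold, i.e. toward loss of the fittest class (the "error catastrophe").
```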

    Augmented Sparse Reconstruction of Protein Signaling Networks

    The problem of reconstructing and identifying intracellular protein signaling and biochemical networks is of critical importance in biology today. We sought to develop a mathematical approach to this problem using, as a test case, one of the most well-studied and clinically important signaling networks in biology, the epidermal growth factor receptor (EGFR) driven signaling cascade. More specifically, we suggest a method, augmented sparse reconstruction, for the identification of links among nodes of ordinary differential equation (ODE) networks from a small set of trajectories with different initial conditions. Our method builds a system of representation by using a collection of integrals of all given trajectories and by attenuating blocks of terms in the representation itself. The system of representation is then augmented with random vectors, and minimization of the 1-norm is used to find sparse representations for the dynamical interactions of each node. Augmentation by random vectors is crucial, since sparsity alone is not able to handle the large errors-in-variables in the representation. Augmented sparse reconstruction makes it possible to consider potentially very large spaces of models, and it detects with high accuracy the few relevant links among nodes, even when moderate noise is added to the measured trajectories. After showing the performance of our method on a model of the EGFR protein network, we briefly sketch potential future therapeutic applications of this approach.
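
    As a rough illustration of the kind of procedure described above (not the authors' implementation; the toy network, noise level, random-column count, and Lasso penalty are assumptions made here), a sparse regression on integrals of simulated trajectories, with random columns appended to the representation, can be sketched as follows:

```python
# Illustrative sketch (not the authors' code) of L1-based recovery of ODE links from
# integrals of trajectories, with the representation augmented by random vectors.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Toy sparse linear network dx/dt = A x (3 nodes, few links).
A_true = np.array([[-1.0,  0.0,  0.5],
                   [ 0.8, -1.2,  0.0],
                   [ 0.0,  0.0, -0.5]])
t = np.linspace(0.0, 5.0, 400)
dt = t[1] - t[0]

def simulate(x0):
    x = np.empty((len(t), 3))
    x[0] = x0
    for k in range(1, len(t)):                 # simple explicit Euler integration
        x[k] = x[k - 1] + dt * A_true @ x[k - 1]
    return x + 0.01 * rng.standard_normal((len(t), 3))   # moderate measurement noise

X = simulate(np.array([1.0, -0.5, 2.0]))

# Integral representation: x_i(t) - x_i(0) = sum_j A_ij * integral_0^t x_j ds.
cumint = np.cumsum((X[1:] + X[:-1]) / 2 * dt, axis=0)     # cumulative trapezoid integrals
targets = X[1:] - X[0]                                    # left-hand sides

# Augment the representation with random columns; sparsity should leave them unused.
library = np.hstack([cumint, rng.standard_normal((cumint.shape[0], 20))])

for i in range(3):
    coef = Lasso(alpha=1e-3, fit_intercept=False, max_iter=50_000).fit(library, targets[:, i]).coef_
    print(f"node {i}: recovered links {np.round(coef[:3], 2)}  (true row {A_true[i]})")
```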

    Agnostic Structure of Data Science Methods

    In this paper we argue that data science is a coherent and novel approach to empirical problems that, in its most general form, does not build understanding about phenomena. Within the new type of mathematization at work in data science, mathematical methods are not selected because of any relevance for a problem at hand; mathematical methods are applied to a specific problem only by 'forcing', i.e. on the basis of their ability to reorganize the data for further analysis and the intrinsic richness of their mathematical structure. In particular, we argue that deep learning neural networks are best understood within the context of forcing optimization methods. We finally explore the broader question of the appropriateness of data science methods in solving problems. We argue that this question should not be interpreted as a search for a correspondence between phenomena and specific solutions found by data science methods; rather, it is the internal structure of data science methods that is open to precise forms of understanding.

    The Agnostic Structure of Data Science Methods

    In this paper we want to discuss the changing role of mathematics in science, as a way to discuss some methodological trends at work in big data science. More specifically, we will show how the role of mathematics has changed dramatically from its more classical approach. Classically, any application of mathematical techniques requires a prior understanding of the phenomena and of the mutual relations among the relevant data; modern data analysis appeals, instead, to mathematics in order to identify possible invariants uniquely attached to the specific questions we may ask about the phenomena of interest. In other words, the new paradigm for the application of mathematics does not require any understanding of the phenomenon, but rather relies on mathematics to organize data in such a way as to reveal possible invariants that may or may not provide further understanding of the phenomenon per se, but that nevertheless provide an answer to the relevant question.

    The Stationary Phase Method for Real Analytic Geometry

    We prove that the existence of isolated solutions of systems of equations of analytic functions on compact real domains in R^p is equivalent to the convergence of the phase of a suitable complex-valued integral I(h) for h→∞. As an application, we then use this result to prove that the problem of establishing the irrationality of the value of an analytic function F(x) at a point x0 can be rephrased in terms of a similar phase convergence.
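
    For background (the standard stationary-phase asymptotics, not a statement of the paper's result): when the phase function f has a single non-degenerate critical point x0 in the interior of the domain and the amplitude g is smooth and compactly supported, the oscillatory integral I(h) behaves as

```latex
I(h) \;=\; \int_{\mathbb{R}^p} g(x)\, e^{\,i h f(x)}\, dx
\;\sim\; \left(\frac{2\pi}{h}\right)^{p/2}
\frac{g(x_0)}{\lvert \det \nabla^2 f(x_0) \rvert^{1/2}}\,
e^{\,i\left( h f(x_0) + \frac{\pi}{4}\,\sigma(x_0) \right)},
\qquad h \to \infty,
```

    where σ(x0) is the signature of the Hessian of f at x0; the dependence of the asymptotic phase on the critical points of f is what ties the behaviour of arg I(h) as h→∞ to the existence of isolated solutions.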

    Isolated Objects and Their Evolution: A Derivation of the Propagator’s Path Integral for Spinless Elementary Particles

    We formalize the notion of isolated objects (units), and we build a consistent theory to describe their evolution and interaction. We further introduce a notion of indistinguishability of distinct spacetime paths of a unit, for which the evolution of the state variables of the unit is the same, and a generalization of the equivalence principle based on indistinguishability. Under a time reversal condition on the whole set of indistinguishable paths of a unit, we show that the quantization of motion of spinless elementary particles in a general potential field can be derived in this framework, in the limiting case of weak fields and low velocities. Extrapolating this approach to include weak relativistic effects, we explore possible experimental consequences. We conclude by suggesting a primitive ontology for the theory of isolated objects.
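
    For reference, the object named in the title is the standard non-relativistic propagator path integral for a spinless particle in a potential V (given here as background; the paper derives it within its own framework of units and indistinguishable paths):

```latex
K(x_b, t_b;\, x_a, t_a) \;=\;
\int_{x(t_a)=x_a}^{x(t_b)=x_b} \mathcal{D}[x(t)]\;
\exp\!\left( \frac{i}{\hbar} \int_{t_a}^{t_b}
\left[ \tfrac{1}{2}\, m\, \dot{x}^{2} - V(x, t) \right] dt \right)
```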

    Agnostic Science. Towards a Philosophy of Data Analysis

    Forthcoming on "Foundations of Science"Submitted to Foundations of SciencesInternational audienceIn this paper we will offer a few examples to illustrate the orientation of contemporary research in data analysis and we will investigate the corresponding role of mathematics. We argue that the modus operandi of data analysis is implicitly based on the belief that if we have collected enough and sufficiently diverse data, we will be able to answer most relevant questions concerning the phenomenon itself. This is a methodological paradigm strongly related, but not limited to, biology, and we label it the microarray paradigm. In this new framework, mathematics provides powerful techniques and general ideas which generate new computational tools. But it is missing any explicit isomorphism between a mathematical structure and the phenomenon under consideration. This methodology used in data analysis suggests the possibility of forecasting and analyzing without a structured and general understanding. This is the perspective we propose to call agnostic science, and we argue that, rather than diminishing or flattening the role of mathematics in science, the lack of isomorphisms with phenomena liberates mathematics, paradoxically making more likely the practical use of some of its most sophisticated ideas

    The Agnostic Structure of Data Science Methods

    In this paper we argue that data science is a coherent approach to empirical problems that, in its most general form, does not build understanding about phenomena. We start by exploring the broad structure of mathematization methods in data science, organized around the belief that if enough and sufficiently diverse data are collected regarding a certain phenomenon, it is possible to answer all relevant questions about it. We call this belief 'the microarray paradigm' and the approach to empirical phenomena based on it 'agnostic science'. Not all computational methods dealing with large data sets are properly within the domain of agnostic science, and we give an example of an algorithm, PageRank, that relies on large data processing, but such that the significance of its output is readily intelligible. Within the new type of mathematization at work in agnostic science, mathematical methods are not selected because of any particular relevance for a problem at hand. Rather, mathematical methods are applied to a specific problem only on the basis of their ability to reorganize the data for further analysis and the intrinsic richness of their mathematical structure. We refer to this type of mathematization as 'forcing'. We then show that optimization methods are used in data science by forcing them on problems. This is particularly significant since virtually all methods of data science can be reinterpreted as types of optimization methods. In particular, we argue that deep learning neural networks are best understood within the context of forcing optimality. We finally explore the broader question of the appropriateness of data science methods in solving problems. We argue that this question should not be interpreted as a search for a correspondence between phenomena and specific solutions found by data science methods. Rather, it is the internal structure of data science methods that is open to forms of understanding. As an example, we offer an analysis of ensemble methods, where distinct data science methods are combined in the search for the solution of a problem, and we speculate on the general structure of the data sets that are most appropriate for such methods.
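
    PageRank is cited above as a large-data algorithm whose output remains readily intelligible: each score is simply the stationary probability of a damped random walk on the link graph. A minimal power-iteration sketch of the standard formulation (the toy graph below is chosen here only for illustration) makes that point concrete:

```python
# Minimal power-iteration sketch of PageRank (standard formulation, included only to
# make the "readily intelligible output" point concrete; the graph is a toy example).
import numpy as np

def pagerank(adjacency, damping=0.85, tol=1e-10, max_iter=1000):
    """Stationary distribution of the damped 'random surfer' Markov chain."""
    A = np.asarray(adjacency, dtype=float)
    n = A.shape[0]
    out_degree = A.sum(axis=1)
    # Row-normalize out-links; pages with no out-links jump uniformly at random.
    T = np.where(out_degree[:, None] > 0, A / np.maximum(out_degree, 1)[:, None], 1.0 / n)
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_next = damping * r @ T + (1.0 - damping) / n
        if np.abs(r_next - r).sum() < tol:
            break
        r = r_next
    return r

# Toy 4-page web: 0 -> 1, 0 -> 2, 1 -> 2, 2 -> 0, 3 -> 2.
adj = [[0, 1, 1, 0],
       [0, 0, 1, 0],
       [1, 0, 0, 0],
       [0, 0, 1, 0]]
print(np.round(pagerank(adj), 3))   # page 2, with the most in-links, scores highest
```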