87,172 research outputs found

    A Survey on Continuous Time Computations

    We provide an overview of theories of continuous time computation. These theories allow us to understand both the hardness of questions related to continuous time dynamical systems and the computational power of continuous time analog models. We survey the existing models, summarize results, and point to relevant references in the literature.

    Comparison of POD reduced order strategies for the nonlinear 2D Shallow Water Equations

    This paper introduces tensorial calculus techniques in the framework of Proper Orthogonal Decomposition (POD) to reduce the computational complexity of the reduced nonlinear terms. The resulting method, named tensorial POD, can be applied to polynomial nonlinearities of any degree $p$. Such nonlinear terms have an on-line complexity of $\mathcal{O}(k^{p+1})$, where $k$ is the dimension of the POD basis, and are therefore independent of the full space dimension. However, the method is efficient only for quadratic nonlinear terms, since for higher-order nonlinearities standard POD proves to be less time consuming once the POD basis dimension $k$ is increased. Numerical experiments are carried out with a two-dimensional shallow water equations (SWE) test problem to compare the performance of tensorial POD, standard POD, and POD/Discrete Empirical Interpolation Method (DEIM). Numerical results show that tensorial POD decreases the computational cost of the on-line stage of standard POD by $76\times$ for configurations using more than 300,000 model variables. The tensorial POD SWE model was only $2$-$8\times$ slower than the POD/DEIM SWE model, but the implementation effort is considerably higher. Tensorial calculus was again employed to construct a new algorithm allowing the POD/DEIM shallow water equations model to compute its off-line stage faster than the standard and tensorial POD approaches.
    Comment: 23 pages, 8 figures, 5 tables
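    To make the $\mathcal{O}(k^{p+1})$ on-line cost concrete, here is a minimal sketch of the tensorial idea for an elementwise quadratic nonlinearity ($p = 2$), assuming a generic orthonormal POD basis; the array names (U, T, a) and the random stand-in basis are illustrative assumptions, not the paper's shallow water implementation.

```python
import numpy as np

# Offline stage: assume a POD basis U (N x k) is already available, e.g. from an
# SVD of solution snapshots. Everything below is an illustrative stand-in.
N, k = 10_000, 20                                   # full and reduced dimensions
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((N, k)))    # orthonormal stand-in basis

# For an elementwise quadratic nonlinearity f(x) = x * x, precompute the
# third-order tensor T[i,j,l] = sum_n U[n,i] U[n,j] U[n,l]  (offline, O(N k^3)).
T = np.einsum('ni,nj,nl->ijl', U, U, U)

# Online stage: the reduced nonlinear term costs O(k^{p+1}) = O(k^3) operations,
# independent of the full dimension N.
a = rng.standard_normal(k)                          # reduced state
reduced_quadratic = np.einsum('ijl,j,l->i', T, a, a)

# Sanity check against the direct, N-dependent evaluation U^T f(U a).
x = U @ a
assert np.allclose(reduced_quadratic, U.T @ (x * x))
```

    Only the on-line evaluation is independent of N; the off-line tensor assembly still scales with the full dimension.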

    Numerical algebraic geometry for model selection and its application to the life sciences

    Researchers working with mathematical models are often confronted by the related problems of parameter estimation, model validation, and model selection. These are all optimization problems, well-known to be challenging due to non-linearity, non-convexity and multiple local optima. Furthermore, the challenges are compounded when only partial data is available. Here, we consider polynomial models (e.g., mass-action chemical reaction networks at steady state) and describe a framework for their analysis based on optimization using numerical algebraic geometry. Specifically, we use probability-one polynomial homotopy continuation methods to compute all critical points of the objective function, then filter to recover the global optima. Our approach exploits the geometric structures relating models and data, and we demonstrate its utility on examples from cell signaling, synthetic biology, and epidemiology.
    Comment: References added, additional clarification
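    As a toy illustration of the workflow above (compute all critical points of the objective, keep the real ones, recover the global optimum), the sketch below uses a hypothetical one-parameter polynomial model and SymPy's algebraic solver in place of the probability-one homotopy continuation the paper relies on.

```python
import sympy as sp

# Hypothetical polynomial model y(t; theta) = theta**2 * t with made-up data.
theta = sp.symbols('theta')
data = [(1.0, 0.8), (2.0, 1.9), (3.0, 3.3)]
objective = sum((theta**2 * t - y)**2 for t, y in data)   # least-squares objective

# Critical points: solve the polynomial system d(objective)/d(theta) = 0.
# (The paper would solve this step with homotopy continuation instead.)
grad = sp.diff(objective, theta)
critical_points = sp.solve(grad, theta)

# Filter to real solutions and pick the global optimum among them.
real_pts = [c for c in critical_points if sp.im(c) == 0]
best = min(real_pts, key=lambda c: objective.subs(theta, c))
print('critical points:', critical_points)
print('global optimum: theta =', best, 'objective =', objective.subs(theta, best))
```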

    Supporting Regularized Logistic Regression Privately and Efficiently

    As one of the most popular statistical and machine learning models, logistic regression with regularization has found wide adoption in biomedicine, social sciences, information technology, and other fields. These domains often involve data on human subjects that are subject to strict privacy regulations. Increasing concerns over data privacy make it more and more difficult to coordinate and conduct large-scale collaborative studies, which typically rely on cross-institution data sharing and joint analysis. Our work here focuses on safeguarding regularized logistic regression, a machine learning model that is widely used across disciplines but has not yet been investigated from a data security and privacy perspective. We consider a common use scenario of multi-institution collaborative studies, such as research consortia or networks of the kind widely seen in genetics, epidemiology, and the social sciences. To make our privacy-enhancing solution practical, we demonstrate a non-conventional and computationally efficient method that leverages distributed computing and strong cryptography to provide comprehensive protection over individual-level and summary data. Extensive empirical evaluation on several studies validated the privacy guarantees, efficiency, and scalability of our proposal. We also discuss the practical implications of our solution for large-scale studies and applications in various disciplines, including genetic and biomedical studies, smart grid, and network analysis.
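    For reference, the underlying model is plain L2-regularized logistic regression, whose gradient decomposes into per-institution contributions. The sketch below shows only that unprotected aggregation on synthetic data; the function and variable names are illustrative, and the paper's cryptographic layer, under which the exchanged quantities would be encrypted or secret-shared rather than sent in the clear, is deliberately omitted.

```python
import numpy as np

def local_gradient(w, X, y, lam, n_total):
    """Gradient contribution of one institution's data for L2-regularized
    logistic regression (labels y in {0, 1}); the L2 term is spread across
    sites in proportion to their sample counts."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) + (lam / n_total) * len(y) * w

# Synthetic data for three institutions (illustrative only).
rng = np.random.default_rng(1)
sites = [(rng.standard_normal((200, 5)), rng.integers(0, 2, 200)) for _ in range(3)]
n_total = sum(len(y) for _, y in sites)

w, lam, lr = np.zeros(5), 1.0, 0.01
for _ in range(500):
    grads = [local_gradient(w, X, y, lam, n_total) for X, y in sites]
    w -= lr * sum(grads) / n_total   # coordinator aggregates; protected in practice
print('fitted coefficients:', w)
```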