
    Galerkin Approximation of Dynamical Quantities using Trajectory Data

    Understanding chemical mechanisms requires estimating dynamical statistics such as expected hitting times, reaction rates, and committors. Here, we present a general framework for calculating these dynamical quantities by approximating boundary value problems for dynamical operators with a Galerkin expansion. A specific choice of basis set in the expansion corresponds to estimation of dynamical quantities using a Markov state model. More generally, the boundary conditions impose restrictions on the choice of basis sets. We demonstrate how an alternative basis can be constructed using ideas from diffusion maps. In our numerical experiments, this basis gives results of accuracy comparable to or better than that of Markov state models. Additionally, we show that delay embedding can reduce the information lost when projecting the system's dynamics for model construction; this improves estimates of dynamical statistics considerably over the standard practice of increasing the lag time.
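
    The construction in this abstract can be made concrete with a short sketch. The following is a minimal illustration (not the authors' code) of a Galerkin committor estimate from trajectory data: basis functions that vanish on the two boundary sets are combined with a guess function satisfying the boundary conditions, and the expansion coefficients solve a linear system of trajectory averages. The names (`estimate_committor`, `phi_vals`, `in_A`, `in_B`) are assumptions for illustration, and for lag > 1 a faithful estimator uses the stopped process, which this sketch omits.

```python
import numpy as np

def estimate_committor(phi_vals, in_A, in_B, lag=1):
    """Hedged sketch: Galerkin estimate of the committor q(x), the
    probability of reaching set B before set A, from one long trajectory.

    phi_vals : (T, k) basis functions evaluated on the trajectory frames
    in_A, in_B : (T,) boolean masks marking frames inside each set
    """
    T, k = phi_vals.shape
    # Guess function obeying the boundary conditions: g = 1 on B, 0 on A.
    g = in_B.astype(float)
    # Zero the basis on A and B so q = g + phi @ c also obeys them
    # (the basis-set restriction the abstract mentions).
    phi = phi_vals * (~(in_A | in_B))[:, None].astype(float)
    x0, x1 = phi[:-lag], phi[lag:]
    g0, g1 = g[:-lag], g[lag:]
    # Galerkin linear system <phi_i, (S - I) phi_j> c = -<phi_i, (S - I) g>,
    # with S the transition operator estimated from lagged trajectory pairs.
    # Simplification: a faithful estimator stops trajectories on entering
    # A or B between frames; for lag = 1 the two versions coincide.
    A = x0.T @ (x1 - x0) / (T - lag)
    b = -x0.T @ (g1 - g0) / (T - lag)
    c = np.linalg.solve(A, b)
    return g + phi @ c
```

    Plugging in indicator functions over a partition of state space recovers the Markov-state-model estimate the abstract mentions; a diffusion-map basis drops into the same linear system.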

    Laws of 4D printing

    The main difference between 3D- and 4D-printed structures is one extra dimension: smart evolution over time. Currently, however, there is no general formula to model and predict this extra dimension. Here, starting from fundamental concepts, we derive and validate a universal bi-exponential formula for modeling and predicting the fourth dimension of 4D-printed multi-material structures. 4D printing is a new manufacturing paradigm that elaborates stimuli-responsive materials into multi-material structures for advanced manufacturing (and construction) of advanced products (and structures). It conserves the general attributes of 3D printing (such as the elimination of molds, dies, and machining) and further enables the fourth dimension of products and structures to provide intelligent behavior over time. This intelligent behavior is encoded (usually by an inverse mathematical problem) into stimuli-responsive multi-materials during printing and is enabled by stimuli after printing. Here, we delve into the fourth dimension and reveal three general laws that govern the time-dependent shape-shifting behaviors of almost all (photochemical-, photothermal-, solvent-, pH-, moisture-, electrochemical-, electrothermal-, ultrasound-, enzyme-, etc.-responsive) multi-material 4D structures. We demonstrate that two different types of time constants govern the shape-shifting behavior of almost all multi-material 4D-printed structures over time. Our results, starting from the most fundamental concepts and ending with governing equations, can serve as general design principles for future research in the 4D printing field, where time-dependent behaviors should be understood, modeled, and predicted correctly. Future software and hardware developments in 4D printing can also benefit from these results. Comment: This manuscript is currently under review in a journal.
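
    The abstract does not reproduce the formula itself, so purely as a hedged illustration: a bi-exponential law governed by two time constants, of the general kind described here, takes the form

```latex
% Generic bi-exponential relaxation with two time constants; the symbols
% and the specific form are illustrative assumptions, not the paper's
% derived law.
\theta(t) = \theta_{\infty} + A_1\, e^{-t/\tau_1} + A_2\, e^{-t/\tau_2}
```

    where theta(t) is a shape-shifting quantity (for instance, a bending angle), theta_infinity its steady-state value, and tau_1 and tau_2 the two types of time constants the abstract refers to.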

    Freely configurable quantum simulator based on a two-dimensional array of individually trapped ions

    A custom-built and precisely controlled quantum system may offer access to a fundamental understanding of another, less accessible system of interest. A universal quantum computer is currently out of reach, but an analog quantum simulator that makes the relevant observables, interactions, and states of a quantum model accessible could permit experimental insight into complex quantum dynamics that are intractable on conventional computers. Several platforms have been suggested and proof-of-principle experiments have been conducted. Here we characterise two-dimensional arrays of three ions trapped by radio-frequency fields in individually controlled harmonic wells forming equilateral triangles with side lengths of 40 and 80 micrometers. In our approach, which is scalable to arbitrary two-dimensional lattices, we demonstrate individual control of the electronic and motional degrees of freedom, preparation of a fiducial initial state with ion motion close to the ground state, as well as tuning of crucial couplings between ions within experimental sequences. Our work paves the way towards an analog quantum simulator of two-dimensional systems designed at will. Comment: 10 pages, 5 figures.

    Lurking Variable Detection via Dimensional Analysis

    Lurking variables represent hidden information, and preclude a full understanding of phenomena of interest. Detection is usually based on serendipity -- visual detection of unexplained, systematic variation. However, these approaches are doomed to fail if the lurking variables do not vary. In this article, we address these challenges by introducing formal hypothesis tests for the presence of lurking variables, based on Dimensional Analysis. These procedures utilize a modified form of the Buckingham Pi theorem to provide structure for a suitable null hypothesis. We present analytic tools for reasoning about lurking variables in physical phenomena, construct procedures to handle cases of increasing complexity, and present examples of their application to engineering problems. The results of this work enable algorithm-driven lurking variable detection, complementing a traditionally inspection-based approach. Comment: 28 pages; full simulation codes provided in an ancillary document for reproducibility.
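
    The Buckingham Pi construction underlying these tests is mechanical enough to sketch. Below is a minimal illustration (not the paper's code): the columns of a dimension matrix hold each variable's exponents in the base dimensions (M, L, T), and nullspace vectors of that matrix are exponent tuples of dimensionless Pi groups. The example variables (drag force, velocity, diameter, density, viscosity) are an assumption chosen for illustration.

```python
import sympy as sp

# Variables of a hypothetical drag problem and their (M, L, T) exponents.
vars_ = ["F", "v", "d", "rho", "mu"]
D = sp.Matrix([
    [1, 0, 0, 1, 1],     # mass exponents
    [1, 1, 1, -3, -1],   # length exponents
    [-2, -1, 0, 0, -1],  # time exponents
])

# Each nullspace vector gives the exponents of one dimensionless Pi group;
# here the matrix has rank 3, so 5 - 3 = 2 groups exist (Buckingham Pi).
for vec in D.nullspace():
    group = " * ".join(f"{v}^({e})" for v, e in zip(vars_, vec) if e != 0)
    print(group)
```

    For this example the two recovered groups rearrange into a drag coefficient and the Reynolds number; in the paper's setting, a lurking variable would show up as systematic variation the recovered Pi groups cannot explain.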

    Data-driven wireless network design: a multi-level modeling approach


    Netboost: Boosting-supported network analysis improves high-dimensional omics prediction in acute myeloid leukemia and Huntington's disease

    Background: State-of-the-art selection methods fail to identify weak but cumulative effects of features found in many high-dimensional omics datasets. Nevertheless, these features play an important role in certain diseases. Results: We present Netboost, a three-step dimension reduction technique. First, a boosting-based filter is combined with the topological overlap measure to identify the essential edges of the network. Second, sparse hierarchical clustering is applied to the selected edges to identify modules. Finally, module information is aggregated by the first principal components (see the sketch below). The primary analysis is then carried out on these summary measures instead of the original data. We demonstrate the application of the newly developed Netboost in combination with CoxBoost for survival prediction on DNA methylation and gene expression data from 180 acute myeloid leukemia (AML) patients and show, based on cross-validated prediction error curve estimates, its prediction superiority over variable selection on the full dataset as well as over an alternative clustering approach. The identified signature, related to chromatin-modifying enzymes, was replicated in an independent dataset of AML patients in the phase II AMLSG 12-09 study. In a second application, we combine Netboost with Random Forest classification and reduce the disease classification error in RNA-sequencing data of Huntington's disease mice. Conclusion: Netboost improves the definition of predictive variables for survival analysis and classification. It is a freely available Bioconductor R package for dimension reduction and hypothesis generation in high-dimensional omics applications.
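
    The final aggregation step lends itself to a compact sketch. Netboost itself ships as an R/Bioconductor package; purely for illustration, this numpy version (names assumed) summarizes each module by its first principal component, producing the summary measures on which the primary analysis is carried out.

```python
import numpy as np

def module_summaries(X, module_labels):
    """X: (n_samples, n_features) omics matrix;
    module_labels: (n_features,) integer module assignment.
    Returns an (n_samples, n_modules) matrix of first-PC scores."""
    summaries = []
    for m in np.unique(module_labels):
        Xm = X[:, module_labels == m]
        Xm = Xm - Xm.mean(axis=0)  # center each feature within the module
        # First right singular vector = leading PC loading of the module.
        _, _, vt = np.linalg.svd(Xm, full_matrices=False)
        summaries.append(Xm @ vt[0])
    return np.column_stack(summaries)

# Downstream, a survival model (CoxBoost in the paper's first application)
# or a random forest is fit on these summaries instead of the raw features.
```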

    Variational Latent Gaussian Process for Recovering Single-Trial Dynamics from Population Spike Trains

    When governed by underlying low-dimensional dynamics, the interdependence of a simultaneously recorded population of neurons can be explained by a small number of shared factors, or a low-dimensional trajectory. Recovering these latent trajectories, particularly from single-trial population recordings, may help us understand the dynamics that drive neural computation. However, due to the biophysical constraints and noise in the spike trains, inferring trajectories from data is in general a challenging statistical problem. Here, we propose a practical and efficient inference method, called the variational latent Gaussian process (vLGP). The vLGP combines a generative model with history-dependent point-process observations and a smoothness prior on the latent trajectories. The vLGP improves upon earlier methods for recovering latent trajectories, which assume either observation models inappropriate for point processes or linear dynamics. We compare and validate vLGP on both simulated datasets and population recordings from the primary visual cortex. In the V1 dataset, we find that vLGP achieves substantially higher performance than previous methods for predicting omitted spike trains, as well as for capturing both the toroidal topology of the visual stimulus space and the noise correlations. These results show that vLGP is a robust method with the potential to reveal hidden neural dynamics from large-scale neural recordings.
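
    As a hedged sketch of the model class being described (the notation here is an assumption, not the paper's), a latent trajectory with a Gaussian-process smoothness prior driving history-dependent point-process observations can be written as

```latex
% Sketch of a history-dependent point-process observation model with a
% GP prior on the latent trajectory; symbols are illustrative assumptions.
y_{t,n} \mid x_t \sim \operatorname{Poisson}\!\Big(\exp\big(a_n^{\top} x_t
    + b_n^{\top} h_{t,n} + d_n\big)\Big),
\qquad x_{\cdot,k} \sim \mathcal{GP}(0,\, \kappa_k),
```

    where y_{t,n} are spike counts, h_{t,n} collects the recent spike history of neuron n (the history dependence), a_n and d_n are per-neuron loadings and baselines, and kappa_k is a smooth kernel for each latent dimension; the "variational" in vLGP refers to approximate posterior inference over the trajectories x.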

    Desiree - a Refinement Calculus for Requirements Engineering

    The requirements elicited from stakeholders suffer from various afflictions, including informality, incompleteness, ambiguity, vagueness, inconsistencies, and more. It is the task of requirements engineering (RE) processes to derive from these an eligible (formal, complete enough, unambiguous, consistent, measurable, satisfiable, modifiable, and traceable) requirements specification that truly captures stakeholder needs. We propose Desiree, a refinement calculus for systematically transforming stakeholder requirements into an eligible specification. The core of the calculus is a rich set of requirements operators that iteratively transform stakeholder requirements by strengthening or weakening them, thereby reducing incompleteness, removing ambiguities and vagueness, eliminating unattainability and conflicts, and turning them into an eligible specification. The framework also includes an ontology for modeling and classifying requirements, a description-based language for representing requirements, as well as a systematic method for applying the concepts and operators. In addition, we define the semantics of the requirements concepts and operators, and develop a graphical modeling tool in support of the entire framework. To evaluate our proposal, we have conducted a series of empirical evaluations, including an ontology evaluation by classifying a large public requirements set, a language evaluation by rewriting that large set of requirements using our description-based syntax, a method evaluation through a realistic case study, and an evaluation of the entire framework through three controlled experiments. The results of our evaluations show that our ontology, language, and method are adequate for capturing requirements in practice, and offer strong evidence that, with sufficient training, our framework indeed helps people conduct more effective requirements engineering. Comment: PhD thesis, University of Trento, 235 pages, 26 figures. The second author supervised this work.

    A Factor-Adjusted Multiple Testing Procedure with Application to Mutual Fund Selection

    In this article, we propose a factor-adjusted multiple testing (FAT) procedure based on factor-adjusted p-values in a linear factor model involving some observable and unobservable factors, for the purpose of selecting skilled funds in empirical finance. The factor-adjusted p-values are obtained after extracting the latent common factors by the principal component method. Under some mild conditions, the false discovery proportion can be consistently estimated even if the idiosyncratic errors are allowed to be weakly correlated across units. Furthermore, by appropriately setting a sequence of threshold values approaching zero, the proposed FAT procedure enjoys model selection consistency. Extensive simulation studies and a real data analysis for selecting skilled funds in the U.S. financial market are presented to illustrate the practical utility of the proposed method. Supplementary materials for this article are available online.
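
    The factor-adjustment idea can be sketched compactly. The following is an illustrative simplification (not the authors' procedure in full): latent common factors are estimated from the panel by principal components and the common component is removed, so the adjusted test statistics are closer to independent before a standard multiple-testing rule is applied. The function names and the fixed factor count K are assumptions.

```python
import numpy as np
from scipy import stats

def factor_adjusted_pvalues(X, K=3):
    """X: (T, N) panel, e.g., fund excess returns with observable factors
    already regressed out. Tests H0: mean_i = 0 for each of the N units
    after removing K latent common factors (simplified illustration)."""
    T, N = X.shape
    Xc = X - X.mean(axis=0)
    # Principal component estimate of the rank-K common component.
    u, s, vt = np.linalg.svd(Xc, full_matrices=False)
    common = (u[:, :K] * s[:K]) @ vt[:K]
    U = X - common  # factor-adjusted series, weaker cross-unit correlation
    t_stat = np.sqrt(T) * U.mean(axis=0) / U.std(axis=0, ddof=1)
    return 2 * stats.norm.sf(np.abs(t_stat))

def bh_select(pvals, q=0.05):
    """Benjamini-Hochberg step-up at level q, a standard selection rule
    used here for illustration; the paper's FAT procedure instead uses a
    sequence of threshold values approaching zero."""
    m = len(pvals)
    order = np.argsort(pvals)
    passed = np.nonzero(pvals[order] <= q * np.arange(1, m + 1) / m)[0]
    return order[: passed[-1] + 1] if passed.size else np.array([], dtype=int)
```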

    Toward a social psychophysics of face communication

    As a highly social species, humans are equipped with a powerful tool for social communication—the face, which can elicit multiple social perceptions in others due to the rich and complex variations of its movements, morphology, and complexion. Consequently, identifying precisely what face information elicits different social perceptions is a complex empirical challenge that has largely remained beyond the reach of traditional research methods. More recently, the emerging field of social psychophysics has developed new methods designed to address this challenge. Here, we introduce and review the foundational methodological developments of social psychophysics, present recent work that has advanced our understanding of the face as a tool for social communication, and discuss the main challenges that lie ahead