    D'ya like DAGs? A Survey on Structure Learning and Causal Discovery

    Causal reasoning is a crucial part of science and human intelligence. In order to discover causal relationships from data, we need structure discovery methods. We provide a review of background theory and a survey of methods for structure discovery. We primarily focus on modern, continuous optimization methods and provide references to further resources such as benchmark datasets and software packages. Finally, we discuss the assumptive leap required to take us from structure to causality.
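    As a rough sketch of what the continuous optimization methods surveyed here look like, the toy code below scores a candidate weighted adjacency matrix with a NOTEARS-style acyclicity measure and an l1-penalised least-squares fit for a linear model. The data, names, and penalty weight are illustrative assumptions, not an implementation from the survey.

import numpy as np
from scipy.linalg import expm

def acyclicity(W: np.ndarray) -> float:
    # h(W) = tr(exp(W * W)) - d, which is zero exactly when W encodes a DAG
    # (NOTEARS-style characterisation; W * W is the element-wise square).
    d = W.shape[0]
    return float(np.trace(expm(W * W)) - d)

def score(W: np.ndarray, X: np.ndarray, lam: float = 0.1) -> float:
    # Penalised least-squares fit of the linear model X ≈ X W (lam is a toy choice).
    n = X.shape[0]
    return 0.5 / n * np.linalg.norm(X - X @ W) ** 2 + lam * np.abs(W).sum()

# Toy data generated from the chain x0 -> x1 -> x2.
rng = np.random.default_rng(0)
x0 = rng.normal(size=1000)
x1 = 2.0 * x0 + rng.normal(size=1000)
x2 = -1.5 * x1 + rng.normal(size=1000)
X = np.column_stack([x0, x1, x2])

W_chain = np.array([[0.0, 2.0, 0.0], [0.0, 0.0, -1.5], [0.0, 0.0, 0.0]])
print("fit:", score(W_chain, X), "acyclicity:", acyclicity(W_chain))  # acyclicity == 0 for a DAG

    A continuous structure-learning method minimises such a fit term while driving the acyclicity measure to zero via a smooth penalty.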

    Discovering Causal Relations and Equations from Data

    Physics is a field of science that has traditionally used the scientific method to answer questions about why natural phenomena occur and to make testable models that explain those phenomena. Discovering equations, laws and principles that provide invariant, robust and causal explanations of the world has been fundamental to the physical sciences throughout the centuries. Discoveries emerge from observing the world and, when possible, performing interventional studies on the system under study. With the advent of big data and the use of data-driven methods, the fields of causal and equation discovery have grown and made progress in computer science, physics, statistics, philosophy, and many applied fields. All these domains are intertwined and can be used to discover causal relations, physical laws, and equations from observational data. This paper reviews the concepts, methods, and relevant works on causal and equation discovery in the broad field of physics and outlines the most important challenges and promising future lines of research. We also provide a taxonomy for observational causal and equation discovery, point out connections, and showcase a complete set of case studies in Earth and climate sciences, fluid dynamics and mechanics, and the neurosciences. This review demonstrates that discovering fundamental laws and causal relations by observing natural phenomena is being revolutionised by the efficient exploitation of observational data, modern machine learning algorithms and their interaction with domain knowledge. Exciting times are ahead, with many challenges and opportunities to improve our understanding of complex systems.
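    To make the idea of equation discovery from data concrete, here is a minimal sparse-regression sketch in the spirit of SINDy-like methods: the right-hand side of an ODE is selected from a small library of candidate terms by thresholded least squares. The library, data, and threshold are illustrative assumptions, not taken from the paper.

import numpy as np

def discover_rhs(t: np.ndarray, x: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    # Fit dx/dt ≈ Θ(x) ξ and keep ξ sparse via sequential thresholded least squares.
    dxdt = np.gradient(x, t)
    theta = np.column_stack([np.ones_like(x), x, x**2, x**3])  # candidate library: 1, x, x^2, x^3
    xi, *_ = np.linalg.lstsq(theta, dxdt, rcond=None)
    for _ in range(10):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        big = ~small
        if big.any():
            xi[big], *_ = np.linalg.lstsq(theta[:, big], dxdt, rcond=None)
    return xi  # non-zero entries indicate the recovered terms

# Toy data from logistic growth, dx/dt = x - x^2.
t = np.linspace(0.0, 5.0, 500)
x = 1.0 / (1.0 + 9.0 * np.exp(-t))
print(discover_rhs(t, x))  # expected roughly [0, 1, -1, 0]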

    Exploring resource/performance trade-offs for streaming applications on embedded multiprocessors

    Embedded system design is challenged by the gap between ever-increasing customer demands and limited resource budgets. Tough competition demands ever-shorter time-to-market and product lifecycles. To solve, or at least alleviate, these issues, designers and manufacturers need model-based quantitative analysis techniques for early design-space exploration to study the trade-offs of different implementation candidates. Moreover, modern embedded applications, especially the streaming applications addressed in this thesis, face increasingly dynamic input content, and the platforms they run on are more flexible and allow runtime configuration. Quantitative analysis techniques for embedded system design have to be able to handle such dynamic, adaptable systems. This thesis makes the following contributions:
    - A resource-aware extension of the Synchronous Dataflow (SDF) model of computation.
    - Trade-off analysis techniques, both in the time domain and in the iteration domain (i.e., on an SDF iteration basis), with support for resource sharing.
    - Bottleneck-driven design-space exploration techniques for resource-aware SDF.
    - A game-theoretic approach to controller synthesis that guarantees performance under dynamic input.
    As a first contribution, we propose a new model, an extension of static synchronous dataflow graphs (SDF), that allows the explicit modeling of resources with consistency checking. The model is called resource-aware SDF (RASDF). The extension enables us to investigate resource sharing and to explore different scheduling options (ways to allocate the resources to the different tasks) using state-space exploration techniques. Consistent SDF and RASDF graphs have the property that execution proceeds in so-called iterations. An iteration typically corresponds to the processing of a meaningful piece of data, and it returns the graph to its initial state. On multiprocessor platforms, iterations may be executed in a pipelined fashion, which makes performance analysis challenging.
    As the second contribution, this thesis develops trade-off analysis techniques for RASDF, both in the time domain and in the iteration domain (i.e., on an SDF iteration basis), to dimension resources on platforms. The time-domain analysis allows interleaving of different iterations, but the size of the explored state space grows quickly. The iteration-based technique trades the potential of interleaving iterations for a compact iteration state space. An efficient bottleneck-driven design-space exploration technique for streaming applications, the third main contribution of this thesis, is derived from an analysis of the critical cycle of the state space, revealing the bottleneck resources that limit throughput. All techniques are based on state-space exploration. They enable system designers to tailor their platform to the required applications, based on their own specific performance requirements. Pruning techniques for efficient exploration of the state space have been developed: Pareto dominance in terms of performance and resource usage is used for exact pruning, and approximation techniques are used for heuristic pruning.
    Finally, the thesis investigates dynamic scheduling techniques to respond to dynamic changes in input streams. The fourth contribution is a game-theoretic approach to controller synthesis that selects appropriate schedules in response to dynamic inputs from the environment. The approach transforms the explored iteration state space of a scenario- and resource-aware SDF (SARA SDF) graph into a bipartite game graph and maps the controller synthesis problem to the problem of finding a winning positional strategy in a classical mean-payoff game. A winning strategy of the game can be used to synthesize a controller of schedules for the system that is guaranteed to satisfy the throughput requirement given by the designer.
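    The exact pruning step based on Pareto dominance can be illustrated with a small sketch. The design points below are hypothetical (throughput, processor count) pairs, not results from the thesis; a point is discarded when another point achieves at least the same throughput with no more processors.

from typing import List, Tuple

Point = Tuple[float, int]  # (throughput, number of processors); higher throughput and fewer processors are better

def dominates(a: Point, b: Point) -> bool:
    # a dominates b if it is at least as good in both objectives and strictly better in one.
    return a[0] >= b[0] and a[1] <= b[1] and (a[0] > b[0] or a[1] < b[1])

def pareto_front(points: List[Point]) -> List[Point]:
    # Keep only non-dominated design points (exact pruning).
    return [p for p in points if not any(dominates(q, p) for q in points)]

candidates = [(0.50, 2), (0.75, 3), (0.75, 4), (0.40, 1), (0.90, 4)]
print(pareto_front(candidates))  # (0.75, 4) is dominated and pruned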

    Convex reconstruction from structured measurements

    Convex signal reconstruction is the art of solving ill-posed inverse problems via convex optimization. It is applicable to a great number of problems from engineering, signal analysis, quantum mechanics and many more. The most prominent example is compressed sensing, where one aims to reconstruct sparse vectors from an under-determined set of linear measurements. In many cases, one can prove rigorous performance guarantees for these convex algorithms. The combination of practical importance and theoretical tractability has directed a significant amount of attention to this young field of applied mathematics. However, rigorous proofs are usually only available for certain "generic" cases, for instance situations where all measurements are represented by random Gaussian vectors. The focus of this thesis is to overcome this drawback by devising mathematical proof techniques that can be applied to more "structured" measurements. Here, structure can have various meanings: it could refer to the type of measurements that occur in a given concrete application, or, more abstractly, to a measurement ensemble that is small and exhibits rich geometric features.
    The main focus of this thesis is phase retrieval: the problem of inferring phase information from amplitude measurements. This task is ubiquitous, for instance, in crystallography, astronomy and diffraction imaging. Throughout this project, a series of increasingly better convex reconstruction guarantees has been established. On the one hand, we improved results for certain measurement models that mimic typical experimental setups in diffraction imaging. On the other hand, we identified spherical t-designs as a general-purpose tool for the derandomization of data recovery schemes. Loosely speaking, a t-design is a finite configuration of vectors that is "evenly distributed" in the sense that it reproduces the first 2t moments of the uniform measure. Such configurations have been studied, for instance, in algebraic combinatorics, coding theory, and quantum information. We have shown that spherical 4-designs already allow for proving close-to-optimal convex reconstruction guarantees for phase retrieval.
    The success of this program depends on explicit constructions of spherical t-designs. In this regard, we have studied the design properties of stabilizer states. These are configurations of vectors that feature prominently in quantum information theory. Mathematically, they can be related to objects in discrete symplectic vector spaces, a structure we use heavily. We have shown that these vectors form a spherical 3-design and are, in some sense, close to a spherical 4-design. Putting these efforts together, we establish tight bounds on phase retrieval from stabilizer measurements.
    While working on the derandomization of phase retrieval, I obtained a number of results on other convex signal reconstruction problems. These include compressed sensing from anisotropic measurements, non-negative compressed sensing in the presence of noise, and improved convex regularizers for low-rank matrix reconstruction. Going even further, the mathematical methods I used to tackle ill-posed inverse problems can be applied to a plethora of problems from quantum information theory. These include, in particular, the causal structure behind Bell inequalities, new ways to compare experiments to fault-tolerance thresholds in quantum error correction, a novel benchmark for quantum state tomography via Bayesian estimation, and the task of distinguishing quantum states.
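    For readers unfamiliar with the "generic" Gaussian setting that this thesis goes beyond, the sketch below recovers a sparse vector from under-determined Gaussian measurements via l1 minimisation (basis pursuit). It assumes the numpy and cvxpy packages; the dimensions and sparsity level are arbitrary toy choices, not values from the thesis.

import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n, m, k = 100, 40, 5  # ambient dimension, number of measurements, sparsity

x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)

A = rng.normal(size=(m, n)) / np.sqrt(m)  # random Gaussian measurement vectors as rows
y = A @ x_true

x = cp.Variable(n)
problem = cp.Problem(cp.Minimize(cp.norm1(x)), [A @ x == y])  # basis pursuit
problem.solve()

print("reconstruction error:", np.linalg.norm(x.value - x_true))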

    Evolvability-guided Optimization of Linear Deformation Setups for Evolutionary Design Optimization

    Richter A. Evolvability-guided Optimization of Linear Deformation Setups for Evolutionary Design Optimization. Bielefeld: Universität Bielefeld; 2019. Andreas Richter gratefully acknowledges the financial support from Honda Research Institute Europe (HRI-EU).
    This thesis targets efficient solutions for optimal representation setups for evolutionary design optimization problems. The representation maps the abstract parameters of an optimizer to a meaningful variation of the design model, e.g., the shape of a car. It thereby determines both the convergence speed and the quality of the final result, so engineers are eager to employ well-tuned representations to achieve high-quality design solutions. However, setting up optimal representations is a cumbersome process, because the setup procedure requires detailed knowledge about the objective functions, e.g., a fluid dynamics simulation, and about the parameters of the employed representation itself. We therefore target efficient routines that set up representations automatically, relieving engineers of this tedious, partly manual work. Inspired by the concept of evolvability, we present novel quality criteria for the evaluation of linear deformations, which are commonly applied as representations. We define and analyze the criteria variability, regularity, and improvement potential, which measure the expected quality and convergence speed of an evolutionary design optimization process based on the linear deformation setup. Moreover, we target the efficient optimization of deformation setups with respect to these three criteria. In dynamic design optimization scenarios, a suitable compromise between exploration and exploitation is crucial for efficient solutions. We discuss the construction of optimal compromises for these dynamic scenarios with our criteria, because they characterize exploration and exploitation. As a result, an engineer can use our methods to initialize and adjust the deformation setup for improved convergence speed of a design process and for enhanced quality of the design solutions.
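    The kind of representation discussed here can be sketched as follows: a linear deformation setup maps abstract optimizer parameters p to a design variation via a fixed basis, shape = shape0 + U p. The random basis and the condition-number check below are only illustrative assumptions; they are not the thesis' criteria definitions.

import numpy as np

rng = np.random.default_rng(2)
n_vertices, n_params = 200, 8

shape0 = rng.normal(size=n_vertices * 3)         # flattened base geometry (toy stand-in for a mesh)
U = rng.normal(size=(n_vertices * 3, n_params))  # linear deformation basis (toy stand-in)

def deform(p: np.ndarray) -> np.ndarray:
    # Map the optimizer's abstract parameters to a concrete design variation.
    return shape0 + U @ p

# A crude proxy for how evenly the setup spreads variation over parameter directions:
# a well-conditioned basis avoids wasted or overly dominant parameters.
s = np.linalg.svd(U, compute_uv=False)
print("condition number of the deformation basis:", s[0] / s[-1])
print("example variation norm:", np.linalg.norm(deform(rng.normal(size=n_params)) - shape0))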

    Discovering causal relations and equations from data

    Physics is a field of science that has traditionally used the scientific method to answer questions about why natural phenomena occur and to make testable models that explain those phenomena. Discovering equations, laws, and principles that are invariant, robust, and causal has been fundamental in the physical sciences throughout the centuries. Discoveries emerge from observing the world and, when possible, performing interventions on the system under study. With the advent of big data and data-driven methods, the fields of causal and equation discovery have developed and accelerated progress in computer science, physics, statistics, philosophy, and many applied fields. This paper reviews the concepts, methods, and relevant works on causal and equation discovery in the broad field of physics and outlines the most important challenges and promising future lines of research. We also provide a taxonomy for data-driven causal and equation discovery, point out connections, and showcase comprehensive case studies in Earth and climate sciences, fluid dynamics and mechanics, and the neurosciences. This review demonstrates that discovering fundamental laws and causal relations by observing natural phenomena is being revolutionised by the efficient exploitation of observational data and simulations, modern machine learning algorithms, and their combination with domain knowledge. Exciting times are ahead, with many challenges and opportunities to improve our understanding of complex systems.
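    As a small illustration of one building block of observational causal discovery, the sketch below tests conditional independence with partial correlation, as used by constraint-based methods such as the PC algorithm. The fork structure and sample sizes are illustrative assumptions, not a case study from the paper.

import numpy as np

def partial_corr(x: np.ndarray, y: np.ndarray, z: np.ndarray) -> float:
    # Correlation between x and y after linearly regressing out the conditioning variable z.
    Z = np.column_stack([np.ones_like(z), z])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

# Toy fork x <- z -> y: x and y are correlated, but conditionally independent given z.
rng = np.random.default_rng(3)
z = rng.normal(size=5000)
x = 1.5 * z + rng.normal(size=5000)
y = -0.8 * z + rng.normal(size=5000)

print("corr(x, y):        ", round(float(np.corrcoef(x, y)[0, 1]), 2))  # clearly non-zero
print("parcorr(x, y | z): ", round(float(partial_corr(x, y, z)), 2))    # close to zero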