
    Robots that can adapt like animals

    As robots leave the controlled environments of factories to operate autonomously in more complex, natural environments, they must cope with the inevitable fact that they will become damaged. However, while animals can quickly adapt to a wide variety of injuries, current robots cannot "think outside the box" to find a compensatory behavior when damaged: they are limited to their pre-specified self-sensing abilities, can diagnose only anticipated failure modes, and require a pre-programmed contingency plan for every type of potential damage, an impracticality for complex robots. Here we introduce an intelligent trial-and-error algorithm that allows robots to adapt to damage in less than two minutes, without requiring self-diagnosis or pre-specified contingency plans. Before deployment, a robot exploits a novel algorithm to create a detailed map of the space of high-performing behaviors: this map represents the robot's intuitions about what behaviors it can perform and their value. If the robot is damaged, it uses these intuitions to guide a trial-and-error learning algorithm that conducts intelligent experiments to rapidly discover a compensatory behavior that works in spite of the damage. Experiments reveal successful adaptations for a legged robot injured in five different ways, including damaged, broken, and missing legs, and for a robotic arm with joints broken in 14 different ways. This new technique will enable more robust, effective, autonomous robots, and suggests principles that animals may use to adapt to injury.
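    The adaptation loop the abstract describes can be illustrated with a minimal sketch: a precomputed behavior-performance map supplies priors, and after damage the robot repeatedly tests the behavior it currently expects to perform best. This is a deliberate simplification (the published algorithm also updates expectations over all untested behaviors with a Gaussian process after every trial); the behavior names and threshold below are illustrative only.

```python
def adapt(behavior_map, evaluate_on_robot, stop_threshold, max_trials=20):
    """Minimal trial-and-error adaptation: repeatedly test the behavior the
    prior map expects to perform best on the (possibly damaged) robot, and
    stop once a good-enough compensatory behavior is found.

    Simplified for illustration: in the published algorithm each trial also
    updates the expected performance of all untested behaviors via a Gaussian
    process; here a trial only rules out the tested behavior.
    """
    tried = {}
    candidates = dict(behavior_map)  # behavior -> prior (map-predicted) performance
    for _ in range(max_trials):
        if not candidates:
            break
        best = max(candidates, key=candidates.get)  # most promising untested behavior
        perf = evaluate_on_robot(best)              # one physical trial
        tried[best] = perf
        if perf >= stop_threshold:
            return best, perf                       # good-enough compensatory behavior
        del candidates[best]
    best = max(tried, key=tried.get)
    return best, tried[best]
```

    The Gaussian-process update omitted here is what makes the real algorithm so sample-efficient: each trial lowers the expected performance of behaviors similar to the one that just failed, so the robot does not waste trials on them.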

    Action and behavior: a free-energy formulation

    We have previously tried to explain perceptual inference and learning under a free-energy principle that pursues Helmholtz’s agenda to understand the brain in terms of energy minimization. It is fairly easy to show that making inferences about the causes of sensory data can be cast as the minimization of a free-energy bound on the likelihood of sensory inputs, given an internal model of how they were caused. In this article, we consider what would happen if the data themselves were sampled to minimize this bound. It transpires that the ensuing active sampling or inference is mandated by ergodic arguments based on the very existence of adaptive agents. Furthermore, it accounts for many aspects of motor behavior, from retinal stabilization to goal-seeking. In particular, it suggests that motor control can be understood as fulfilling prior expectations about proprioceptive sensations. This formulation can explain why adaptive behavior emerges in biological agents and suggests a simple alternative to optimal control theory. We illustrate these points using simulations of oculomotor control and then apply the same principles to cued and goal-directed movements. In short, the free-energy formulation may provide an alternative perspective on motor control that places it in an intimate relationship with perception.
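    The free-energy bound referred to above is commonly written as follows, where \(\tilde{s}\) denotes sensory data, \(\vartheta\) their hidden causes, \(m\) the agent's internal model, and \(q(\vartheta)\) the recognition density encoded by internal states; this is a standard presentation of the framework rather than a quotation from the article:

```latex
F = -\left\langle \ln p(\tilde{s}, \vartheta \mid m) \right\rangle_{q}
    - \mathrm{H}\!\left[ q(\vartheta) \right]
  = -\ln p(\tilde{s} \mid m)
    + D_{\mathrm{KL}}\!\left[ q(\vartheta) \,\middle\|\, p(\vartheta \mid \tilde{s}, m) \right]
  \;\ge\; -\ln p(\tilde{s} \mid m)
```

    Perception minimizes \(F\) with respect to \(q(\vartheta)\), which tightens the bound; active inference additionally lets action change the data that are sampled, \(a^{*} = \arg\min_{a} F(\tilde{s}(a), q)\), which is the sense in which behavior fulfils prior expectations about (proprioceptive) sensations.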

    Distinguishing cause from effect using observational data: methods and benchmarks

    The discovery of causal relationships from purely observational data is a fundamental problem in science. The most elementary form of such a causal discovery problem is to decide whether X causes Y or, alternatively, Y causes X, given joint observations of two variables X, Y. An example is to decide whether altitude causes temperature, or vice versa, given only joint measurements of both variables. Even under the simplifying assumptions of no confounding, no feedback loops, and no selection bias, such bivariate causal discovery problems are challenging. Nevertheless, several approaches for addressing those problems have been proposed in recent years. We review two families of such methods: Additive Noise Methods (ANM) and Information Geometric Causal Inference (IGCI). We present the benchmark CauseEffectPairs, which consists of data for 100 different cause-effect pairs selected from 37 datasets from various domains (e.g., meteorology, biology, medicine, engineering, economics), and motivate our decisions regarding the "ground truth" causal directions of all pairs. We evaluate the performance of several bivariate causal discovery methods on these real-world benchmark data and, in addition, on artificially simulated data. Our empirical results on real-world data indicate that certain methods are indeed able to distinguish cause from effect using only purely observational data, although more benchmark data would be needed to obtain statistically significant conclusions. One of the best performing methods overall is the additive-noise method originally proposed by Hoyer et al. (2009), which obtains an accuracy of 63 ± 10% and an AUC of 0.74 ± 0.05 on the real-world benchmark. As the main theoretical contribution of this work, we prove the consistency of that method.
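    The additive-noise idea can be sketched in a few lines: regress each variable on the other and prefer the direction whose residuals look independent of the input. The snippet below is a deliberately crude stand-in; the actual method of Hoyer et al. uses nonparametric regression and an HSIC independence test, whereas here a polynomial fit and a correlation-based dependence proxy substitute for both.

```python
import numpy as np

def anm_direction(x, y, deg=3):
    """Crude bivariate causal-direction test in the spirit of additive-noise
    methods: fit b = f(a) + n with a polynomial in each direction and prefer
    the direction whose residuals look independent of the input.

    Illustration only: Hoyer et al. use nonparametric regression and an HSIC
    independence test; a polynomial fit and the correlation between the
    squared input and the squared residuals stand in for both here.
    """
    def dependence(a, b):
        residual = b - np.polyval(np.polyfit(a, b, deg), a)
        # proxy for dependence: does the residual's magnitude vary with a?
        return abs(np.corrcoef(a**2, residual**2)[0, 1])
    return "x->y" if dependence(x, y) < dependence(y, x) else "y->x"
```

    In the true causal direction the residuals are (approximately) the independent noise term, so their magnitude carries no information about the input; in the anti-causal direction the residual variance is typically input-dependent, which this proxy picks up.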

    Robust aeroelastic design of composite plate wings


    Accommodating maintenance in prognostics

    [Error on title page: year of award is 2021.]
    Steam turbines are an important asset of nuclear power plants and are required to operate reliably and efficiently. Unplanned outages have a significant impact on the plant's ability to generate electricity. Condition-based maintenance (CBM) can therefore be used for predictive and proactive maintenance to avoid unplanned outages while reducing operating costs and increasing the reliability and availability of the plant. In CBM, the information gathered can be interpreted for prognostics (the prediction of failure time or remaining useful life (RUL)). The aim of this project was to address two challenges in prognostics, the selection of a predictive technique and the accommodation of post-maintenance effects, in order to improve the efficacy of prognostics. The selection of an appropriate predictive algorithm is a key activity in the effective development of prognostics. In this research, a formal approach for the evaluation and selection of predictive techniques is developed to support a methodical selection process by engineering experts, and is then applied to a case study provided by those experts. As a result of this formal evaluation, a probabilistic technique, Bayesian Linear Regression (BLR), and a non-probabilistic technique, Support Vector Regression (SVR), were selected for the prognostics implementation.
    This project also extends prognostics by incorporating post-maintenance effects. Maintenance aims to restore a machine to a state in which it is safe and reliable to operate while recovering its health. However, such activities introduce uncertainties into predictions through deviations in the degradation model, affecting the accuracy and efficacy of predictions. These vulnerabilities must therefore be addressed by incorporating information from maintenance events to obtain accurate and reliable predictions. This thesis presents two frameworks, adapted for probabilistic and non-probabilistic prognostic techniques respectively, to accommodate maintenance. Two case studies, a real-world case study from a nuclear power plant in the UK and a synthetic case study generated from the characteristics of the real-world one, are used to implement and validate the frameworks. The results hold promise for predicting remaining useful life while accommodating maintenance repairs, thereby ensuring increased asset availability with higher reliability, maintenance cost-effectiveness, and operational safety.
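    As a hypothetical illustration of the probabilistic (BLR) route, not the thesis's actual frameworks, a Bayesian linear regression over a health index yields a degradation trend, and the point where that trend crosses the failure threshold gives an RUL estimate; the precision values and threshold below are made up for the example.

```python
import numpy as np

def blr_rul(t, h, fail_level, prior_prec=1e-3, noise_prec=25.0):
    """Bayesian linear regression h = w0 + w1 * t over a degradation (health)
    index; the posterior-mean trend's crossing of the failure threshold gives
    a remaining-useful-life estimate from the last observation.

    Hypothetical illustration of the probabilistic route only: the thesis's
    frameworks additionally accommodate maintenance events, and the precisions
    here are made-up values, not tuned parameters.
    """
    X = np.column_stack([np.ones_like(t), t])
    S_inv = prior_prec * np.eye(2) + noise_prec * X.T @ X  # posterior precision
    w0, w1 = noise_prec * np.linalg.solve(S_inv, X.T @ h)  # posterior mean of weights
    t_fail = (fail_level - w0) / w1   # where the mean trend hits the threshold
    return t_fail - t[-1]             # remaining useful life after last observation
```

    Accommodating maintenance on top of such a model would mean adjusting the fitted trend (or restarting it) when a repair partially restores the health index, which is exactly the kind of post-maintenance effect the thesis's frameworks address.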

    MMGP: a Mesh Morphing Gaussian Process-based machine learning method for regression of physical problems under non-parameterized geometrical variability

    When learning simulations for modeling physical phenomena in industrial designs, geometrical variabilities are of prime interest. While classical regression techniques prove effective for parameterized geometries, practical scenarios often involve the absence of shape parametrization during the inference stage, leaving us with only mesh discretizations as available data. Learning simulations from such mesh-based representations poses significant challenges, with recent advances relying heavily on deep graph neural networks to overcome the limitations of conventional machine learning approaches. Despite their promising results, graph neural networks exhibit certain drawbacks, including their dependency on extensive datasets and limitations in providing built-in predictive uncertainties or handling large meshes. In this work, we propose a machine learning method that does not rely on graph neural networks. Complex geometrical shapes and variations with fixed topology are handled using well-known mesh morphing onto a common support, combined with classical dimensionality-reduction techniques and Gaussian processes. The proposed methodology can easily deal with large meshes without the need for explicit shape parameterization and provides crucial predictive uncertainties, which are essential for informed decision-making. In the considered numerical experiments, the proposed method is competitive with existing graph neural networks in terms of training efficiency and prediction accuracy.
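    The pipeline the abstract outlines (morph every mesh onto a common support, reduce the resulting fixed-size fields, regress the reduced coordinates with Gaussian processes) can be sketched as follows. This assumes the morphing step has already been done, so each sample's output field lives on the same nodes; it is a toy reconstruction of the idea, not the authors' MMGP code.

```python
import numpy as np

def rbf(A, B, length_scale=1.0):
    """Squared-exponential kernel between two sets of input points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def fit_predict(X, Y, Xq, n_modes=2, noise=1e-6):
    """Toy MMGP-style regression: after morphing, every sample's output field
    Y[i] lives on the same nodes, so PCA (via SVD) reduces the fields and a
    zero-mean GP regresses each modal coefficient against the inputs X.
    A sketch of the paper's idea, not the authors' implementation.
    """
    mu = Y.mean(axis=0)
    U, s, Vt = np.linalg.svd(Y - mu, full_matrices=False)
    coeffs = U[:, :n_modes] * s[:n_modes]       # modal coefficients per sample
    K = rbf(X, X) + noise * np.eye(len(X))      # regularized training kernel
    alpha = np.linalg.solve(K, coeffs)          # GP weights for every mode at once
    coeffs_q = rbf(Xq, X) @ alpha               # predicted coefficients at queries
    return mu + coeffs_q @ Vt[:n_modes]         # reconstructed predicted fields
```

    The GP posterior variance (omitted above for brevity) is what supplies the built-in predictive uncertainties the abstract emphasizes, and working in the reduced modal space is what keeps large meshes tractable.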

    The stochastic aeroelastic response analysis of helicopter rotors using deep and shallow machine learning

    This paper addresses the influence of manufacturing variability of a helicopter rotor blade on its aeroelastic responses. An aeroelastic analysis using finite elements in the spatial and temporal domains is used to compute the helicopter rotor frequencies, vibratory hub loads, power required, and stability in forward flight. The novelty of the work lies in the application of advanced data-driven machine learning (ML) techniques, such as convolutional neural networks (CNN), multi-layer perceptrons (MLP), random forests, support vector machines, and adaptive Gaussian processes (GP), for capturing the nonlinear responses of these complex spatio-temporal models, in order to develop an efficient physics-informed ML framework for stochastic rotor analysis. The work is of practical significance because it (i) accounts for manufacturing uncertainties, (ii) accurately quantifies their effects on the nonlinear response of the rotor blade, and (iii) makes the computationally expensive simulations viable through the use of ML. A rigorous performance assessment of these approaches is presented, with validation on the training dataset and prediction on the test dataset. The contribution of the study lies in the following findings: (i) Uncertainty in composite material and geometric properties can lead to significant variations in the rotor aeroelastic responses, highlighting that the consideration of manufacturing variability is crucial for assessing the behaviour of helicopter rotors in real-life scenarios. (ii) Specifically, uncertainty has a substantial effect on the six vibratory hub loads and the damping, with the highest impact on the yawing hub moment; a sufficient factor of safety should therefore be considered in the design to allow for perturbations in the simulation results. (iii) Although advanced ML techniques are harder to train, the optimal model configuration is capable of approximating the nonlinear response trends accurately; GP and CNN, followed by MLP, achieved satisfactory performance. The excellent accuracy achieved by these ML techniques demonstrates their potential for application in the optimization of rotors under uncertainty.
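    The use of a trained surrogate for uncertainty quantification, as in this study, follows a generic pattern: sample the uncertain manufacturing parameters, evaluate the cheap surrogate instead of the expensive aeroelastic solver, and read off response statistics. A minimal, hypothetical sketch (the surrogate and parameter statistics are placeholders, not values from the paper):

```python
import numpy as np

def propagate_uncertainty(surrogate, mean, cov, n_samples=20_000, seed=0):
    """Monte Carlo uncertainty propagation through a trained surrogate:
    sample the uncertain manufacturing parameters from a Gaussian, evaluate
    the cheap surrogate instead of the expensive aeroelastic solver, and
    return the response mean and standard deviation.

    Generic workflow sketch only; the surrogate, parameter statistics, and
    sample count are placeholders, not values from the paper.
    """
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(mean, cov, size=n_samples)
    response = surrogate(samples)               # vectorized surrogate evaluation
    return response.mean(), response.std()
```

    This is exactly the step that a direct aeroelastic simulation makes infeasible: tens of thousands of solver runs per response quantity, versus milliseconds per batch through an ML surrogate.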