8,666 research outputs found

    Robust aerodynamic design of variable speed wind turbine rotors

    This study focuses on the robust aerodynamic design of the bladed rotor of small horizontal-axis wind turbines. The optimization process also considers the effects of manufacturing and assembly tolerances on the yearly energy production, so the aerodynamic performance of the rotors designed in this way has reduced sensitivity to manufacturing and assembly errors. The geometric uncertainty affecting the rotor shape is represented by normal distributions of the pitch angle of the blades and of the twist angle and chord of their airfoils. The aerodynamic module is a blade element momentum theory code. Both a Monte Carlo-based approach and the Univariate Reduced Quadrature technique, a novel deterministic uncertainty propagation method, are used, and the performance of the two approaches is assessed both in terms of accuracy and computational speed. The adopted optimization method is based on a hybrid multi-objective evolutionary strategy. The presented results highlight that the sensitivity of the yearly production to geometric uncertainties can be reduced by lowering the rotational speed and increasing the aerodynamic blade loads.
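The uncertainty-propagation step lends itself to a short sketch. Below is a minimal Monte Carlo illustration in Python, assuming a placeholder annual_energy function in place of the paper's blade element momentum code and invented standard deviations for the pitch, twist, and chord tolerances; it shows only the Monte Carlo side of the comparison, not the Univariate Reduced Quadrature method.

```python
import numpy as np

def annual_energy(pitch_err, twist_err, chord_err):
    """Hypothetical stand-in for a BEM-based yearly energy model.

    In the study this would be a blade element momentum code evaluated over
    the site wind distribution; here it is only a smooth placeholder so the
    propagation loop can run.
    """
    return 100.0 - 2.0 * pitch_err**2 - 1.5 * twist_err**2 - 800.0 * chord_err**2

rng = np.random.default_rng(0)
n_samples = 10_000

# Geometric tolerances as zero-mean normal perturbations (standard
# deviations are illustrative, not values from the paper).
pitch = rng.normal(0.0, 0.3, n_samples)   # blade pitch error, degrees
twist = rng.normal(0.0, 0.5, n_samples)   # sectional twist error, degrees
chord = rng.normal(0.0, 0.01, n_samples)  # chord error, metres

aep = annual_energy(pitch, twist, chord)

# Robust design targets a high mean and a low variance of the yearly
# energy production under manufacturing and assembly uncertainty.
print(f"mean yearly energy: {aep.mean():.2f} MWh, std: {aep.std():.3f} MWh")
```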

    Have Econometric Analyses of Happiness Data Been Futile? A Simple Truth About Happiness Scales

    Econometric analyses in the happiness literature typically use subjective well-being (SWB) data to compare the mean of observed or latent happiness across samples. Recent critiques show that comparing the mean of ordinal data is only valid under strong assumptions that are usually rejected by SWB data. This raises the question of whether much of the empirical work in the economics of happiness literature has been futile. In order to salvage some of the prior results and avoid future issues, we suggest that regression analysis of SWB (and other ordinal data) should focus on the median rather than the mean. Median comparisons using parametric models such as the ordered probit and logit can be readily carried out with familiar statistical software such as STATA. We also show that estimating a semiparametric median ordered-response model, a task previously assumed to be impractical, is possible using a novel constrained mixed-integer optimization technique. We use GSS data to show that the famous Easterlin Paradox from the happiness literature holds for the US independently of any parametric assumption.
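The central point, that rankings by the mean of ordinal data depend on the arbitrary numeric labels attached to the categories while rankings by the median do not, can be illustrated in a few lines of Python. The response frequencies below are invented for the illustration and are not taken from the GSS.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated 3-category ordinal happiness responses for two groups
# (frequencies are invented, not GSS data).
group_a = rng.choice([1, 2, 3], size=1000, p=[0.00, 0.70, 0.30])
group_b = rng.choice([1, 2, 3], size=1000, p=[0.35, 0.10, 0.55])

# A monotone relabelling of the same scale: categories (1, 2, 3) -> (1, 2, 10).
# It preserves the ordering of the categories, so any conclusion that changes
# under it is not identified from the ordinal data alone.
labels = np.array([0, 1, 2, 10])          # index 0 unused
a_relab, b_relab = labels[group_a], labels[group_b]

print("means, scale 1/2/3 :", group_a.mean(), group_b.mean())          # A above B
print("means, scale 1/2/10:", a_relab.mean(), b_relab.mean())          # ranking flips
print("median categories  :", np.median(group_a), np.median(group_b))  # stable: 2 vs 3
```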

    Disentangling causal webs in the brain using functional Magnetic Resonance Imaging: A review of current approaches

    In the past two decades, functional Magnetic Resonance Imaging (fMRI) has been used to relate neuronal network activity to cognitive processing and behaviour. Recently, this approach has been augmented by algorithms that allow us to infer causal links between component populations of neuronal networks. Multiple inference procedures have been proposed to address this research question, but so far each method has limitations when it comes to establishing whole-brain connectivity patterns. In this work, we discuss eight ways to infer causality in fMRI research: Bayesian Nets, Dynamical Causal Modelling, Granger Causality, Likelihood Ratios, LiNGAM, Patel's Tau, Structural Equation Modelling, and Transfer Entropy. We conclude by formulating recommendations for future directions in this area.
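As an illustration of just one of the eight approaches listed, the sketch below runs a Granger causality test on two synthetic lagged time series with statsmodels. It is not the reviewed authors' pipeline, and real fMRI signals would need preprocessing (motion correction, filtering, region extraction) before any such test.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(2)
n = 500

# Two synthetic "BOLD-like" series in which region x drives region y with a
# one-sample lag (purely illustrative).
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * x[t - 1] + 0.3 * y[t - 1] + 0.5 * rng.standard_normal()

# Test whether the second column (x) Granger-causes the first column (y).
results = grangercausalitytests(np.column_stack([y, x]), maxlag=2, verbose=False)
f_stat, p_value, _, _ = results[1][0]["ssr_ftest"]
print(f"lag-1 F-test: F = {f_stat:.1f}, p = {p_value:.3g}")
```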

    Multi-disciplinary robust design of variable speed wind turbines

    This paper addresses the preliminary robust multi-disciplinary design of small wind turbines. The turbine to be designed is assumed to be connected to the grid by means of power electronic converters. The main input is the yearly wind distribution at the selected site, represented by means of a Weibull distribution. The objective function is the electrical energy delivered yearly to the grid. Aerodynamic and electrical characteristics are fully coupled and modelled by means of low- and medium-fidelity models. Uncertainty affecting the blade geometry is considered, and a multi-objective hybrid evolutionary algorithm is used to maximise the mean value of the yearly energy production and minimise its variance.
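The objective evaluation can be sketched as an expectation of the delivered electrical power over the Weibull wind distribution. The snippet below is a minimal illustration assuming a hypothetical piecewise power curve and invented Weibull parameters; the actual study couples BEM aerodynamics with generator and converter models rather than using a closed-form curve.

```python
from scipy.stats import weibull_min
from scipy.integrate import quad

# Illustrative Weibull wind-speed distribution for the site
# (shape k and scale c are placeholders, not values from the paper).
k, c = 2.0, 6.5                      # shape [-], scale [m/s]
wind_pdf = weibull_min(k, scale=c).pdf

def electrical_power(v):
    """Hypothetical power curve in kW: cut-in 3 m/s, rated 11 m/s, cut-out 25 m/s."""
    if v < 3.0 or v > 25.0:
        return 0.0
    if v >= 11.0:
        return 10.0                  # rated electrical power, kW
    return 10.0 * ((v - 3.0) / (11.0 - 3.0)) ** 3

# Yearly energy delivered to the grid = 8760 h * E[P(v)] under the Weibull pdf.
mean_power, _ = quad(lambda v: electrical_power(v) * wind_pdf(v),
                     0.0, 30.0, points=[3.0, 11.0, 25.0])
print(f"expected yearly energy ~ {8760.0 * mean_power:.0f} kWh")
```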

    A Survey on Compiler Autotuning using Machine Learning

    Since the mid-1990s, researchers have been trying to use machine-learning based approaches to solve a number of different compiler optimization problems. These techniques primarily enhance the quality of the obtained results and, more importantly, make it feasible to tackle two main compiler optimization problems: optimization selection (choosing which optimizations to apply) and phase-ordering (choosing the order of applying optimizations). The compiler optimization space continues to grow due to the advancement of applications, increasing number of compiler optimizations, and new target architectures. Generic optimization passes in compilers cannot fully leverage newly introduced optimizations and, therefore, cannot keep up with the pace of increasing options. This survey summarizes and classifies the recent advances in using machine learning for the compiler optimization field, particularly on the two major problems of (1) selecting the best optimizations and (2) the phase-ordering of optimizations. The survey highlights the approaches taken so far, the obtained results, the fine-grain classification among different approaches and finally, the influential papers of the field.
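One of the two problems named above, optimization selection, is often cast as supervised learning from program features to the best-performing configuration. The sketch below shows only that framing, with entirely synthetic features and labels; it does not reproduce any specific system from the survey.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)

# Synthetic "program features" (imagine loop count, memory-op density and
# branch density extracted from the IR) and, per program, the label of the
# flag setting that happened to run fastest. All values are fabricated.
n_programs = 400
features = rng.random((n_programs, 3))
best_flag = (features[:, 0] + 0.5 * features[:, 1] > 0.8).astype(int)  # 0: baseline, 1: aggressive

X_train, X_test, y_train, y_test = train_test_split(
    features, best_flag, test_size=0.25, random_state=0
)

# Optimization selection as classification: predict the best configuration
# for an unseen program from its static features.
model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```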

    Stat Med

    Creating statistical models that generate accurate predictions of infectious disease incidence is a challenging problem whose solution could benefit public health decision makers. We develop a new approach to this problem using kernel conditional density estimation (KCDE) and copulas. We obtain predictive distributions for incidence in individual weeks using KCDE and tie those distributions together into joint distributions using copulas. This strategy enables us to create predictions for the timing of and incidence in the peak week of the season. Our implementation of KCDE incorporates 2 novel kernel components: a periodic component that captures seasonality in disease incidence and a component that allows for a full parameterization of the bandwidth matrix with discrete variables. We demonstrate via simulation that a fully parameterized bandwidth matrix can be beneficial for estimating conditional densities. We apply the method to predicting dengue fever and influenza and compare to a seasonal autoregressive integrated moving average model and HHH4, a previously published extension to the generalized linear model framework developed for infectious disease incidence. The KCDE outperforms the baseline methods for predictions of dengue incidence in individual weeks. The KCDE also offers more consistent performance than the baseline models for predictions of incidence in the peak week and is comparable to the baseline models on the other prediction targets. Using the periodic kernel function led to better predictions of incidence. Our approach and extensions of it could yield improved predictions for public health decision makers, particularly in diseases with heterogeneous seasonal dynamics such as dengue fever.
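A stripped-down sketch of the KCDE idea with a periodic kernel on week of year is given below. It uses synthetic weekly counts rather than the dengue or influenza series, and it omits the copula step and the full bandwidth-matrix parameterization; the bandwidths are arbitrary choices made for the illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic weekly incidence with a seasonal peak (a stand-in for the
# real surveillance series used in the paper).
weeks = np.arange(520)                                    # ten years of weekly data
season = np.exp(3.0 * np.sin(2.0 * np.pi * weeks / 52.0))
incidence = rng.poisson(5.0 + 10.0 * season / season.max())

def periodic_kernel(w, w_train, bandwidth=0.2, period=52.0):
    """Kernel on week of year that wraps around the end of the season."""
    d = np.sin(np.pi * (w - w_train) / period)
    return np.exp(-2.0 * (d / bandwidth) ** 2)

def gaussian_kernel(y, y_train, bandwidth=2.0):
    return np.exp(-0.5 * ((y - y_train) / bandwidth) ** 2) / (bandwidth * np.sqrt(2.0 * np.pi))

def conditional_density(y_grid, week_of_year, weeks_train, y_train):
    """Nadaraya-Watson style estimate of p(incidence | week of year)."""
    w = periodic_kernel(week_of_year, weeks_train % 52)
    dens = np.array([np.sum(w * gaussian_kernel(y, y_train)) for y in y_grid])
    return dens / np.sum(w)

y_grid = np.arange(0, 40)
density = conditional_density(y_grid, week_of_year=30, weeks_train=weeks, y_train=incidence)
print("most likely incidence in week 30:", y_grid[np.argmax(density)])
```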

    Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    A review is given of the power processing systems planned for the next 20 years and of the state of the art in power processing design modeling and analysis techniques used to optimize power processing systems. A methodology for the modeling and analysis of power processing equipment and systems has been formulated to support future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented, so that meaningful results can be obtained each year to aid power processing system engineers and power processing equipment circuit designers in their conceptual and detailed design and analysis tasks.

    A Multivariate Water Quality Investigation of Select Drainage Ditches in the Arroyo Colorado River Watershed, Texas

    Drainage ditches are widely used in agricultural water management to help remove excess water from fields, which mitigates the effects of waterlogging and salinization. These ditches act as a direct hydraulic link between agricultural fields and streams and rivers. As such, there is increasing concern that drainage ditches can act as conduits for nutrient transport and, in conjunction with other point and nonpoint sources, can contribute to eutrophication and decreased dissolved oxygen levels in receiving water bodies. Studies have linked drainage ditches to hypoxia in the Gulf of Mexico and eutrophication of the Great Lakes (Dagg and Breed, 2003; Moore et al., 2010). However, there is also evidence suggesting that drainage ditches can help attenuate loadings of phosphorus and suspended sediments (Kröger et al., 2008) and thus foster water quality improvements at a watershed scale. There is growing interest in understanding nutrient behavior in drainage ditches, both in the United States (Bhattarai et al., 2009; Moore et al., 2010; Ahiablame et al., 2011) and in other parts of the world (Nguyen and Sukias, 2002; Leone et al., 2008; Bonaiti and Borin, 2010).
