
    LMI-BASED H2 AND H∞ STATE-FEEDBACK CONTROLLER DESIGN FOR FIN STABILIZER OF NONLINEAR ROLL MOTION OF A FISHING BOAT

    Get PDF
    This paper presents an analysis of the nonlinear roll response of a fishing boat in waves. In addition to the nonlinearity of the roll damping, the nonlinear roll restoring moment, modelled by a seventh-order equation, is taken into consideration for accurate control application. To cope with the nonlinearity and the effects of uncertainties, LMI (Linear Matrix Inequality)-based H2 and H∞ state-feedback control is applied to the fin roll stabilizer of the fishing boat. The fin characteristics are calculated with the Star-CCM+ software package. Finally, simulation results are presented to illustrate the feasibility and efficiency of the H2 and H∞ control methods and to demonstrate the performance of the fin roll stabilizer
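
    As a rough illustration of the kind of synthesis the abstract refers to, the sketch below solves a generic LMI-based H∞ state-feedback problem in Python with cvxpy. The plant matrices are hypothetical placeholders for a linearized roll model (they are not taken from the paper), and the LMI is the standard bounded-real-lemma formulation with the change of variables K = Y X⁻¹.

```python
import numpy as np
import cvxpy as cp

# Hypothetical linearized roll model (placeholder values, not from the paper):
# state x = [roll angle, roll rate], control u = fin command, disturbance w = wave moment.
A   = np.array([[0.0, 1.0], [-1.2, -0.3]])
B1  = np.array([[0.0], [1.0]])            # disturbance input
B2  = np.array([[0.0], [0.8]])            # control input
C1  = np.array([[1.0, 0.0], [0.0, 0.0]])  # performance output: roll angle ...
D12 = np.array([[0.0], [1.0]])            # ... plus control effort

n, m   = A.shape[0], B2.shape[1]
nw, nz = B1.shape[1], C1.shape[0]

X = cp.Variable((n, n), symmetric=True)   # Lyapunov-like variable
Y = cp.Variable((m, n))                   # Y = K X
gamma = cp.Variable(nonneg=True)          # H-infinity norm bound

# Bounded real lemma for the closed loop, after the substitution Y = K X.
M = cp.bmat([
    [A @ X + X @ A.T + B2 @ Y + Y.T @ B2.T, B1,                  (C1 @ X + D12 @ Y).T],
    [B1.T,                                  -gamma * np.eye(nw), np.zeros((nw, nz))],
    [C1 @ X + D12 @ Y,                      np.zeros((nz, nw)),  -gamma * np.eye(nz)],
])

eps = 1e-6
prob = cp.Problem(cp.Minimize(gamma),
                  [X >> eps * np.eye(n), M << -eps * np.eye(n + nw + nz)])
prob.solve(solver=cp.SCS)

K = Y.value @ np.linalg.inv(X.value)      # state-feedback gain, u = K x
print("gamma:", gamma.value)
print("K:", K)
```

    An H2 design would follow the same pattern, replacing the bounded-real-lemma block by the corresponding H2 LMIs and minimizing the trace of an auxiliary variable instead of gamma.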

    Robust LPV Control for Attitude Stabilization of a Quadrotor Helicopter under Input Saturations

    Get PDF
    This article investigates the robust stabilization of the rotational subsystem of a quadrotor against external inputs (disturbances, noises, and parametric uncertainties) using the LFT-based LPV technique. After establishing the LPV attitude model, a robust LPV controller is designed for the system. The weighting functions are tuned by Cuckoo Search, a meta-heuristic optimization algorithm. In addition, input saturations are taken into account through an anti-windup compensation technique. Simulation results show the robustness of the closed-loop system against disturbances, measurement noises, and parametric uncertainties
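
    The weighting-function tuning step can be pictured with a bare-bones Cuckoo Search. The Python sketch below is a generic version of the metaheuristic (Lévy flights generated with Mantegna's algorithm, a fraction pa of the worst nests abandoned at each iteration); the objective function is a hypothetical placeholder, not the closed-loop performance criterion actually used in the article.

```python
import numpy as np

def levy_step(beta=1.5, size=1):
    """Levy-distributed step lengths via Mantegna's algorithm."""
    from math import gamma, pi, sin
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma, size)
    v = np.random.normal(0.0, 1.0, size)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(cost, lb, ub, n_nests=15, pa=0.25, n_iter=200, alpha=0.01):
    """Minimize cost(x) over the box [lb, ub] with a basic Cuckoo Search."""
    dim = len(lb)
    nests = np.random.uniform(lb, ub, (n_nests, dim))
    fitness = np.array([cost(x) for x in nests])
    best = nests[fitness.argmin()].copy()
    for _ in range(n_iter):
        # Generate new solutions by Levy flights biased towards the current best.
        for i in range(n_nests):
            step = alpha * levy_step(size=dim) * (nests[i] - best)
            candidate = np.clip(nests[i] + step, lb, ub)
            f = cost(candidate)
            if f < fitness[i]:
                nests[i], fitness[i] = candidate, f
        # Abandon a fraction pa of the worst nests and rebuild them randomly.
        n_abandon = int(pa * n_nests)
        worst = fitness.argsort()[-n_abandon:]
        nests[worst] = np.random.uniform(lb, ub, (n_abandon, dim))
        fitness[worst] = [cost(x) for x in nests[worst]]
        best = nests[fitness.argmin()].copy()
    return best, fitness.min()

# Placeholder objective: in the article's setting this would be a closed-loop
# performance measure evaluated for candidate weighting-function parameters.
toy_cost = lambda w: np.sum((w - np.array([0.5, 2.0, 0.1])) ** 2)
w_best, f_best = cuckoo_search(toy_cost, lb=np.zeros(3), ub=np.full(3, 5.0))
print(w_best, f_best)
```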

    Global sensitivity analysis based on DIRECT-KG-HDMR and thermal optimization of pin-fin heat sink for the platform inertial navigation system

    Full text link
    In this study, a pin-fin heat sink with a staggered arrangement is designed to reduce the local high temperature of the platform inertial navigation system (PINS). To reduce the dimension of the inputs and improve the efficiency of the optimization, a global sensitivity analysis (GSA) based on Kriging High-Dimensional Model Representation with a DIviding RECTangles sampling strategy (DIRECT-KG-HDMR) is proposed. Compared with other GSA methods, the proposed method can indicate the effects of the structural and material parameters on the maximum temperature at the bottom of the heat sink by using both sensitivity and coupling coefficients. The GSA results show that the structural parameters have a greater effect on thermal performance than the material ones, and that the coupling intensities between the structural and material parameters are weak. Therefore, the structural parameters are selected to optimize the thermal performance of the heat sink, and several popular optimization algorithms such as GA, DE, TLBO, PSO and EGO are used for the optimization. The steady thermal response of the PINS with the optimized heat sink is also studied; the results show that the maximum temperature of the high-temperature region of the platform is reduced by 1.09 °C compared with the PINS without the heat sink. (Comment: 34 pages, 18 figures, 5 tables)
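
    The overall workflow, a design of experiments, a Kriging surrogate of the expensive simulation, and variance-based sensitivity indices, can be sketched as follows. This is a generic Kriging-plus-Sobol illustration in Python (scikit-learn surrogate, Saltelli/Jansen pick-freeze estimators), not the DIRECT-KG-HDMR method of the paper, and thermal_model is a hypothetical stand-in for the heat-sink simulation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# Hypothetical stand-in for the heat-sink simulation: inputs are normalized
# structural/material parameters in [0, 1], output is a maximum temperature.
def thermal_model(x):
    return 80 + 15 * x[:, 0] ** 2 + 8 * x[:, 1] + 2 * x[:, 0] * x[:, 2] + 0.5 * x[:, 3]

d, n_train = 4, 60
X_train = rng.random((n_train, d))
y_train = thermal_model(X_train)

# Kriging (Gaussian process) surrogate of the expensive simulation.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X_train, y_train)
f = lambda x: gp.predict(x)

# Variance-based sensitivity on the surrogate (pick-freeze estimators).
N = 4096
A, B = rng.random((N, d)), rng.random((N, d))
fA, fB = f(A), f(B)
var = np.var(np.concatenate([fA, fB]))
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]
    fABi = f(ABi)
    S1 = np.mean(fB * (fABi - fA)) / var          # first-order index (Saltelli)
    ST = 0.5 * np.mean((fA - fABi) ** 2) / var    # total-effect index (Jansen)
    print(f"x{i}: S1 = {S1:.3f}, ST = {ST:.3f}")
```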

    A Polyhedral Study of Mixed 0-1 Set

    Get PDF
    We consider a variant of the well-known single node fixed charge network flow set with constant capacities. This set arises from the relaxation of more general mixed integer sets such as lot-sizing problems with multiple suppliers. We provide a complete polyhedral characterization of the convex hull of the given set
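
    For orientation, the classical single-node fixed-charge flow set with constant capacities and its well-known flow cover inequality (in the spirit of Padberg, Van Roy and Wolsey) can be written as below; the precise variant and the complete polyhedral description studied in the paper are not reproduced here.

```latex
% Single-node fixed-charge flow set with constant capacity C:
X = \Bigl\{ (x, y) \in \mathbb{R}_+^n \times \{0,1\}^n \;:\;
      \sum_{j=1}^{n} x_j \le b, \quad x_j \le C\, y_j, \; j = 1, \dots, n \Bigr\}.

% Classical flow cover inequality: for S \subseteq \{1,\dots,n\} with
% \lambda = |S|\,C - b > 0 and \lambda < C,
\sum_{j \in S} x_j + (C - \lambda) \sum_{j \in S} (1 - y_j) \le b .
```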

    Functional-input metamodeling: an application to coastal flood early warning

    Get PDF
    Floods affect more people than any other hazard. In just the last decade of the 20th century, more than 1.5 billion people were affected. To mitigate the impact of this type of hazard, a strong scientific effort has been devoted to the development of computer codes that can be used as risk management tools. Available computer models now allow coastal flooding events to be modelled properly at fairly high resolution. Unfortunately, their use for early warning is strongly limited, as a simulation of a few hours of maritime dynamics takes several hours to days of processing time, even on multi-processor clusters. This thesis is part of the ANR RISCOPE project, which aims at addressing this limitation by means of surrogate modeling of the hydrodynamic computer codes. As a particular requirement of this application, the metamodel should be able to deal with functional inputs corresponding to time-varying maritime conditions. To this end, we focused on Gaussian process metamodels, originally developed for scalar inputs but now also available for functional inputs.
    The nature of the inputs gave rise to a number of questions about the proper way to represent them in the metamodel: (i) which functional inputs are worth keeping as predictors, (ii) which dimension reduction method (e.g., B-splines, PCA, PLS) is ideal, (iii) which projection dimension is suitable, and (iv) which distance is convenient for measuring similarities between functional input points within the kernel function. Some of these characteristics of the model, hereafter called structural parameters, and others such as the kernel family (e.g., Gaussian, Matérn 5/2) are often chosen arbitrarily a priori, or selected on the basis of other studies. As we have shown through experiments, these decisions can have a strong impact on the prediction capability of the resulting model. Thus, without losing sight of our final goal of contributing to the improvement of coastal flooding early warning, we undertook the construction of an efficient methodology to set up the structural parameters of the model. As a first solution, we proposed an exploration approach based on the Response Surface Methodology. It was effectively used to tune the metamodel for an analytic toy function, as well as for a simplified version of the code studied in RISCOPE. While relatively simple, the proposed methodology was able to find metamodel configurations of high prediction capability, with savings of up to 76.7% and 38.7% of the time spent by an exhaustive search approach in the analytic case and the coastal flooding case, respectively. The solution found by our methodology was optimal in most cases. We later developed a second prototype based on Ant Colony Optimization (ACO). This new approach is more powerful in terms of solution time and of flexibility in the model features it can explore: it smartly samples the solution space and progressively converges towards the optimal configuration.
    The collection of statistical tools used for metamodeling in this thesis motivated the development of the funGp R package, which is now available on GitHub and about to be submitted to CRAN. In an independent work, we studied the estimation of the covariance parameters of a transformed Gaussian process by Maximum Likelihood (ML) and Cross Validation. We showed that both estimators are consistent and asymptotically normal. In the case of ML, these results can be interpreted as a proof of robustness of Gaussian ML in the case of non-Gaussian processes
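
    To make the structural-parameter vocabulary concrete (choice of dimension reduction method, projection dimension and kernel family), here is a minimal Python sketch of a Gaussian process metamodel whose functional input is first projected onto a few principal components. It only illustrates the idea: it is neither the funGp package nor the RISCOPE code, and the data are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(1)

# Synthetic data: each observation has one functional input (a time series of
# length 100, e.g. a tide or surge signal) and one scalar output (e.g. a flood
# indicator). These are hypothetical stand-ins for the RISCOPE inputs/outputs.
n, T = 200, 100
t = np.linspace(0, 1, T)
F = np.array([np.sin(2 * np.pi * (t + rng.random())) * (1 + 0.3 * rng.random())
              for _ in range(n)])                 # functional inputs, shape (n, T)
y = F.max(axis=1) + 0.1 * F.mean(axis=1)          # toy output

# Structural choices illustrated here: PCA as the dimension reduction method,
# projection dimension p = 3, Matern 5/2 as the kernel family.
p = 3
pca = PCA(n_components=p)
Z = pca.fit_transform(F)                          # reduced representation

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(Z[:150], y[:150])

# Predictive accuracy on held-out curves (Q2 score).
y_hat = gp.predict(pca.transform(F[150:]))
q2 = 1 - np.sum((y[150:] - y_hat) ** 2) / np.sum((y[150:] - y[150:].mean()) ** 2)
print("Q2 on test set:", round(q2, 3))
```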

    Underwater Vehicles

    Get PDF
    Over the last twenty to thirty years, a significant number of AUVs have been created to address a wide spectrum of scientific and applied tasks in ocean development and research. In a short period, AUVs have demonstrated their efficiency in complex search and inspection work and have opened up a number of new and important applications. Initially, information about AUVs was mainly of a review and advertising character, but now more attention is paid to practical achievements, problems and systems technologies. AUVs are losing their prototype status and have become a fully operational, reliable and effective tool; modern multi-purpose AUVs represent a new class of underwater robotic objects with their own tasks and practical applications, particular features of technology, systems structure and functional properties

    Intertextuality in scientific publications (L'intertextualité dans les publications scientifiques)

    No full text
    The IEEE bibliographic database contains a number of confirmed duplications, with an indication of the copied originals. This corpus is used to test an authorship attribution method. Combining intertextual distance with a sliding window and various classification techniques makes it possible to identify these duplications with a very low risk of error. The experiment also shows that several factors blur the identity of the scientific author, notably research collectives of variable composition and a strong dose of intertextuality that is accepted or even sought after
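
    As a rough illustration of the notion of intertextual distance, the Python sketch below computes a simplified Labbé-style distance between two texts: the word frequencies of the longer text are rescaled to the length of the shorter one, and the absolute differences are summed and normalized to [0, 1]. The sliding window and the classification step used in the study are not reproduced.

```python
from collections import Counter
import re

def tokens(text):
    """Lowercase word tokens (a crude stand-in for proper lemmatization)."""
    return re.findall(r"[a-zàâçéèêëîïôûùüÿœ']+", text.lower())

def intertextual_distance(text_a, text_b):
    """Simplified Labbe-style distance in [0, 1]: 0 means identical vocabulary use."""
    ta, tb = tokens(text_a), tokens(text_b)
    # Rescale the longer text to the length of the shorter one.
    if len(ta) > len(tb):
        ta, tb = tb, ta
    na, nb = len(ta), len(tb)
    fa, fb = Counter(ta), Counter(tb)
    scale = na / nb
    diff = sum(abs(fa[w] - fb[w] * scale) for w in set(fa) | set(fb))
    return diff / (2 * na)

a = "la distance intertextuelle mesure la proximité lexicale entre deux textes"
b = "la distance intertextuelle permet de comparer le vocabulaire de deux textes"
print(round(intertextual_distance(a, b), 3))
```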

    Operational Research: Methods and Applications

    Get PDF
    Throughout its history, Operational Research has evolved to include a variety of methods, models and algorithms that have been applied to a diverse and wide range of contexts. This encyclopedic article consists of two main sections: methods and applications. The first aims to summarise the up-to-date knowledge and provide an overview of the state-of-the-art methods and key developments in the various subdomains of the field. The second offers a wide-ranging list of areas where Operational Research has been applied. The article is meant to be read in a nonlinear fashion. It should be used as a point of reference or first-port-of-call for a diverse pool of readers: academics, researchers, students, and practitioners. The entries within the methods and applications sections are presented in alphabetical order