75 research outputs found

    Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges

    Get PDF
    Most machine learning algorithms are configured by a set of hyperparameters whose values must be carefully chosen and which often considerably impact performance. To avoid a time-consuming and irreproducible manual process of trial-and-error to find well-performing hyperparameter configurations, various automatic hyperparameter optimization (HPO) methods—for example, based on resampling error estimation for supervised machine learning—can be employed. After introducing HPO from a general perspective, this paper reviews important HPO methods, from simple techniques such as grid or random search to more advanced methods like evolution strategies, Bayesian optimization, Hyperband, and racing. This work gives practical recommendations regarding important choices to be made when conducting HPO, including the HPO algorithms themselves, performance evaluation, how to combine HPO with machine learning pipelines, runtime improvements, and parallelization. This article is categorized under: Algorithmic Development > Statistics; Technologies > Machine Learning; Technologies > Prediction
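The random search baseline mentioned in this abstract can be sketched in a few lines. This is a hedged illustration, not the paper's method: `validation_loss` is a hypothetical stand-in for a real train/validate resampling cycle, and the hyperparameter names and ranges are invented for the example.

```python
import random

# Hypothetical validation-loss surface; in practice this would train a
# model with the given configuration and return its resampling error.
def validation_loss(learning_rate, num_trees):
    return (learning_rate - 0.1) ** 2 + (num_trees - 200) ** 2 / 1e5

def random_search(n_trials, seed=0):
    # Sample configurations uniformly from the search space and keep
    # the one with the lowest estimated validation loss.
    rng = random.Random(seed)
    best_cfg, best_loss = None, float("inf")
    for _ in range(n_trials):
        cfg = {
            "learning_rate": rng.uniform(0.001, 0.5),
            "num_trees": rng.randint(50, 500),
        }
        loss = validation_loss(**cfg)
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss

best_cfg, best_loss = random_search(200)
```

Grid search would replace the sampling loop with an exhaustive sweep over a fixed grid; the more advanced methods surveyed (Bayesian optimization, Hyperband) differ mainly in how the next configuration to evaluate is chosen.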

    Data-driven solutions to enhance planning, operation and design tools in Industry 4.0 context

    Get PDF
    This thesis proposes three data-driven solutions to be combined with state-of-the-art solvers and tools, primarily to enhance their computational performance. It tackles the problem of efficiently designing the open-sea floating platforms on which wind turbines can be mounted, as well as the tuning of a data-driven engine-monitoring tool for maritime transportation. Finally, the activities of SAT and ASP solvers are studied in depth, and a deep learning architecture is proposed to enhance the heuristics-based solving approach adopted by such software. The covered domains differ, and so do their respective targets. Nonetheless, the proposed Artificial Intelligence and Machine Learning algorithms are shared, as is the overall picture: promote Industrial AI and meet the constraints imposed by the Industry 4.0 vision. A reduced human-in-the-loop presence, a data-driven approach to discovering causalities otherwise ignored, special attention to the environmental impact of industrial emissions, and a real, efficient exploitation of the Big Data available today are just a subset of the latter. Hence, from a broader perspective, the experiments carried out within this thesis are driven towards the aforementioned targets, and the resulting outcomes are satisfactory enough to potentially convince the research community and industrialists that these are not just "visions" but can actually be put into practice. The work is still an introduction to the topic, however, and the developed models are at what can be described as a "pilot" stage. Nonetheless, the results are promising and pave the way towards further improvements and the consolidation of the dictates of Industry 4.0

    Simulator adaptation at runtime for component-based simulation software

    Get PDF
    Component-based simulation software can provide many opportunities to compose and configure simulators, resulting in an algorithm selection problem for the user of this software. This thesis aims to automate the selection and adaptation of simulators at runtime in an application-independent manner. Further, it explores the potential of tailored and approximate simulators, developed in this thesis concretely for the modeling language ML-Rules, to support the effectiveness of the adaptation scheme
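The runtime algorithm selection problem described here can be illustrated with a minimal portfolio-style sketch. This is an assumption-laden toy, not the thesis's adaptation scheme: the two "simulators" are hypothetical interchangeable components computing the same result, and selection is by measured trial runtime only.

```python
import time

# Hypothetical interchangeable simulator components; in a component-based
# framework these would be alternative algorithm configurations that
# produce equivalent results for the same model.
def simulator_a(steps):
    return sum(i * i for i in range(steps))

def simulator_b(steps):
    total = 0
    for i in range(steps):
        total += i * i
    return total

def select_simulator(candidates, trial_steps=10_000):
    # Run each candidate on a short trial and keep the one with the
    # lowest measured wall-clock runtime.
    timings = {}
    for name, sim in candidates.items():
        start = time.perf_counter()
        sim(trial_steps)
        timings[name] = time.perf_counter() - start
    best = min(timings, key=timings.get)
    return best, timings

best, timings = select_simulator({"a": simulator_a, "b": simulator_b})
```

A runtime-adaptive scheme would repeat such measurements as the model state evolves, so the selected simulator can change during a single run.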

    Development of a hybrid genetic programming technique for computationally expensive optimisation problems

    Get PDF
    The increasing computational power of modern computers has contributed to the advance of nature-inspired algorithms in the fields of optimisation and metamodelling. Genetic programming (GP) is a genetically-inspired technique that can be used for metamodelling purposes. GP's main strength is its ability to infer the mathematical structure of the best model fitting a given data set, relying exclusively on input data and on a set of mathematical functions given by the user. Model inference is based on an iterative or evolutionary process, which returns the model as a symbolic expression (text expression). As a result, model evaluation is inexpensive and the generated expressions can be easily deployed to other users. Although genetic programming has been used in many different branches of engineering, its diffusion on an industrial scale is still limited. The aims of this thesis are to investigate the intrinsic limitations of genetic programming, to provide a comprehensive review of how researchers have tackled genetic programming's main weaknesses, and to improve genetic programming's ability to extract accurate models from data. In particular, research has followed three main directions. The first has been the development of regularisation techniques to improve the generalisation ability of a model of a given mathematical structure, based on the use of a specific tuning algorithm when sinusoidal functions are among those composing the model. The second has been the analysis of the influence that prior knowledge regarding the function to approximate may have on genetic programming's inference process. The study has led to the introduction of a strategy that allows prior knowledge to be used to improve model accuracy. Thirdly, the mathematical structure of the models returned by genetic programming has been systematically analysed, leading to the conclusion that the linear combination is the structure most frequently returned by genetic programming runs. A strategy has been formulated to reduce the evolutionary advantage of linear combinations and to protect more complex classes of individuals throughout the evolution. The possibility of using genetic programming in industrial optimisation problems has also been assessed with the help of a new genetic programming implementation developed during the research activity. This implementation is an open-source project and is freely downloadable from http://www.personal.leeds.ac.uk/~cnua/mypage.html
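The core ingredients of GP-style symbolic regression, expression trees built from a user-supplied function set, evaluated against data by a fitness score, can be sketched briefly. This is a hedged toy using random generation in place of the full evolutionary loop (no crossover or mutation), with an invented target function and a deliberately tiny primitive set.

```python
import random

# Target data: samples of a hypothetical function a run should recover.
xs = [x / 10 for x in range(-20, 21)]
ys = [3 * x + x * x for x in xs]

# User-supplied function set (kept minimal for the sketch).
FUNCS = [("add", lambda a, b: a + b), ("mul", lambda a, b: a * b)]

def random_tree(rng, depth=3):
    # Leaves are the input variable or a small constant; internal
    # nodes apply a primitive from the function set.
    if depth == 0 or rng.random() < 0.3:
        return ("x",) if rng.random() < 0.7 else ("const", rng.uniform(-5, 5))
    name, fn = rng.choice(FUNCS)
    return (name, fn, random_tree(rng, depth - 1), random_tree(rng, depth - 1))

def evaluate(tree, x):
    if tree[0] == "x":
        return x
    if tree[0] == "const":
        return tree[1]
    return tree[1](evaluate(tree[2], x), evaluate(tree[3], x))

def fitness(tree):
    # Sum of squared errors over the data set (lower is better).
    return sum((evaluate(tree, x) - y) ** 2 for x, y in zip(xs, ys))

rng = random.Random(1)
best = min((random_tree(rng) for _ in range(5000)), key=fitness)
```

Note how easily `add` can assemble linear combinations of subtrees; this is one intuition behind the thesis's observation that linear combinations dominate GP outputs and may need to be evolutionarily handicapped.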

    Proceedings of the NASA Conference on Space Telerobotics, volume 2

    Get PDF
    These proceedings contain papers presented at the NASA Conference on Space Telerobotics held in Pasadena, January 31 to February 2, 1989. The theme of the Conference was man-machine collaboration in space. The Conference provided a forum for researchers and engineers to exchange ideas on the research and development required for application of telerobotics technology to the space systems planned for the 1990s and beyond. The Conference: (1) provided a view of current NASA telerobotic research and development; (2) stimulated technical exchange on man-machine systems, manipulator control, machine sensing, machine intelligence, concurrent computation, and system architectures; and (3) identified important unsolved problems of current interest which can be dealt with by future research

    Uncertainty modeling in higher dimensions

    Get PDF
    Modern design problems impose multiple major tasks an engineer has to accomplish. 1) The design should account for the designated functionalities. 2) It should be optimal with respect to a given design objective. 3) Ultimately, the design must be safeguarded against uncertain perturbations, which should not cause failure of the design. These tasks are united in the problem of robust design optimization, giving rise to the development of computational methods for simultaneous uncertainty modeling and design optimization. Methods for uncertainty modeling face some fundamental challenges: the computational effort should not exceed certain limits, and unjustified assumptions must be avoided as far as possible. The most critical issues, however, concern the handling of incomplete information and of high dimensionality. While the low-dimensional case is well studied and several methods exist to handle incomplete information, in higher dimensions only very few techniques are available. Imprecision and lack of sufficient information cause severe difficulties, but the situation is not hopeless. This dissertation shows how to transfer the high-dimensional case to the one-dimensional case by means of the potential clouds formalism. Using a potential function, this enables a worst-case analysis on confidence regions of relevant scenarios. The confidence regions are woven into an optimization problem formulation for robust design as safety constraints. This models an interaction between the optimization phase and the worst-case analysis which permits a posteriori adaptive information updating. Finally, the approach is applied in two case studies, in 24 and 34 dimensions, respectively
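The coupling of worst-case analysis and design optimization described above can be sketched in a simplified, scenario-based form. This is an assumption-heavy illustration, not the potential-clouds formalism: the performance model is invented, the confidence region is approximated by uniform samples of a single uncertain parameter, and the design space is a one-dimensional grid.

```python
import random

# Hypothetical performance model: cost depends on the design variable,
# perturbed by an uncertain scenario value.
def performance(design, scenario):
    return (design - 2.0) ** 2 + scenario * design

def worst_case(design, scenarios):
    # Worst-case cost over all scenarios in the confidence region.
    return max(performance(design, s) for s in scenarios)

rng = random.Random(0)
# Samples standing in for a confidence region of relevant scenarios.
scenarios = [rng.uniform(-0.5, 0.5) for _ in range(100)]

# Robust design: minimize the worst case over a grid of candidates.
candidates = [i / 100 for i in range(0, 401)]
robust_design = min(candidates, key=lambda d: worst_case(d, scenarios))
```

In the dissertation's scheme the confidence region enters the optimization as a safety constraint rather than directly in the objective, and the region itself is updated adaptively as information accumulates; the sketch only shows the basic min-max interaction.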
