
    Bayesian optimization in adverse scenarios

    Optimization problems with expensive-to-evaluate objective functions are ubiquitous in scientific and industrial settings. Bayesian optimization has gained widespread acclaim for optimizing expensive (and often black-box) functions due to its theoretical performance guarantees and empirical sample efficiency in a variety of settings. Nevertheless, many practical scenarios remain where prevailing Bayesian optimization techniques fall short. We consider four such scenarios. First, we formalize the optimization problem where the goal is to identify robust designs with respect to multiple objective functions that are subject to input noise. Such robust design problems frequently arise, for example, in manufacturing settings where fabrication can only be performed with limited precision. We propose a method that identifies a set of optimal robust designs, where each design provides probabilistic guarantees jointly on multiple objectives. Second, we consider sample-efficient high-dimensional multi-objective optimization. This line of research is motivated by the challenging task of designing optical displays for augmented reality to optimize visual quality and efficiency, where the designs are specified by high-dimensional parameterizations governing complex geometries. Our proposed trust-region-based algorithm yields order-of-magnitude improvements in sample complexity on this problem. Third, we consider multi-objective optimization of expensive functions with variable-cost, decoupled, and/or multi-fidelity evaluations and propose a Bayes-optimal, non-myopic acquisition function, which significantly improves sample efficiency in scenarios with incomplete information. We apply this to hardware-aware neural architecture search, where the objectives, on-device latency and model accuracy, can often be evaluated independently. Fourth, we consider the setting where the search space consists of discrete (and potentially continuous) parameters. We propose a theoretically grounded technique that uses a probabilistic reparameterization to transform the discrete or mixed inner optimization problem into a continuous one, leading to more effective Bayesian optimization policies. Together, this thesis provides a playbook for Bayesian optimization in several practical adverse scenarios.
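
    The core loop underlying all four scenarios is standard Bayesian optimization: fit a probabilistic surrogate to the evaluations collected so far, maximize an acquisition function over it, and evaluate the objective at the chosen point. The sketch below illustrates that loop with a Gaussian-process surrogate and expected improvement on a toy one-dimensional problem; it is a minimal generic example using scikit-learn, not the thesis's robust, trust-region, non-myopic, or reparameterization methods, and the objective, bounds, and budget are assumptions made only for illustration.

```python
# Minimal generic Bayesian optimization loop: GP surrogate + expected improvement.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expensive_objective(x):
    # Placeholder black-box function (illustrative assumption).
    return np.sin(3 * x) + 0.5 * x**2

def expected_improvement(mu, sigma, best):
    # EI for minimization: E[max(best - f, 0)] under the GP posterior.
    z = (best - mu) / np.maximum(sigma, 1e-9)
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(5, 1))          # small initial design
y = expensive_objective(X).ravel()

for _ in range(20):                          # sequential evaluation budget
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    cand = np.linspace(-2, 2, 501).reshape(-1, 1)
    mu, sigma = gp.predict(cand, return_std=True)
    x_next = cand[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_objective(x_next))

print("best x:", X[np.argmin(y)], "best f:", y.min())
```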

    Learning Discriminative Features and Structured Models for Segmentation in Microscopy and Natural Images

    Segmenting images is a significant challenge that has drawn a lot of attention from different fields of artificial intelligence and has many practical applications. One such challenge addressed in this thesis is the segmentation of electron microscope (EM) imaging of neural tissue. EM is one of the key tools used to analyze neural tissue and understand the brain, but the huge amounts of data it produces make automated analysis necessary. In addition to the challenges specific to EM data, the common problems encountered in image segmentation must also be addressed. These problems include extracting discriminative features from the data and constructing a statistical model using ground-truth data. Although complex models appear to be more attractive because they allow for more expressiveness, they also lead to a higher computational complexity. On the other hand, simple models come with a lower complexity but express the real world less faithfully. Therefore, one of the most challenging tasks in image segmentation is constructing models that are expressive enough while remaining tractable. In this work, we propose several automated graph partitioning approaches that address these issues. These methods reduce computational complexity by operating on supervoxels instead of voxels, incorporating features capable of describing the 3D shape of the target objects, and using structured models to account for correlations among output variables. One of the non-trivial issues with such models is that their parameters must be carefully chosen for optimal performance. A popular approach to learning model parameters is a maximum-margin approach called the Structured SVM (SSVM), which provides optimality guarantees but also suffers from two main drawbacks. First, SSVM-based approaches are usually limited to linear kernels, since more powerful nonlinear kernels cause the learning to become prohibitively expensive. In this thesis, we introduce an approach to “kernelize” the features so that a linear SSVM framework can leverage the power of nonlinear kernels without incurring their high computational cost. Second, the optimality guarantees are violated for complex models with strong inter-relations between the output variables. We propose a new subgradient-based method that is more robust, leads to improved convergence properties, and increases reliability. The different approaches presented in this thesis are applicable to both natural and medical images. They are able to segment mitochondria at a performance level close to that of a human annotator, and outperform state-of-the-art segmentation techniques while still benefiting from a low learning time.
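
    As a rough illustration of the max-margin learning setup, the sketch below shows subgradient updates for a structured SVM reduced to its simplest structured case: multiclass classification with a 0/1 task loss and loss-augmented inference. The thesis applies the same principle to graph partitioning over supervoxels with far richer joint features, so the feature map, hyperparameters, and toy data here are illustrative assumptions rather than the thesis's model.

```python
# Minimal sketch of max-margin structured learning via subgradient descent.
import numpy as np

def joint_feature(x, y, n_classes):
    # phi(x, y): place the input features in the block of class y.
    phi = np.zeros(n_classes * x.size)
    phi[y * x.size:(y + 1) * x.size] = x
    return phi

def loss_augmented_inference(w, x, y_true, n_classes):
    # argmax over y of [ Delta(y_true, y) + w . phi(x, y) ] with 0/1 task loss.
    scores = [(y != y_true) + w @ joint_feature(x, y, n_classes)
              for y in range(n_classes)]
    return int(np.argmax(scores))

def ssvm_subgradient(Xs, ys, n_classes, lam=0.01, lr=0.1, epochs=50):
    w = np.zeros(n_classes * Xs.shape[1])
    for _ in range(epochs):
        for x, y in zip(Xs, ys):
            y_hat = loss_augmented_inference(w, x, y, n_classes)
            # Subgradient of the regularized structured hinge loss.
            g = lam * w + joint_feature(x, y_hat, n_classes) - joint_feature(x, y, n_classes)
            w -= lr * g
    return w

# Toy usage: 30 two-dimensional points from 3 classes (illustrative data).
rng = np.random.default_rng(0)
Xs = rng.normal(size=(30, 2)) + np.repeat(np.eye(3, 2) * 3, 10, axis=0)
ys = np.repeat(np.arange(3), 10)
w = ssvm_subgradient(Xs, ys, n_classes=3)
```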

    Proactive-reactive, robust scheduling and capacity planning of deconstruction projects under uncertainty

    A project planning and decision support model is developed and applied to identify and reduce risk and uncertainty in deconstruction project planning. It allows building inventories to be calculated from sensor information and construction standards, and it computes robust project plans for different scenarios with multiple modes, constrained renewable resources, and locations. A reactive and flexible planning element is proposed for the case of schedule infeasibility during project execution.
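
    As a rough sketch of the kind of baseline (proactive) plan such a model builds on, the code below implements a simple serial schedule-generation scheme that places each job at its earliest precedence- and resource-feasible start time for a single renewable resource. The job data, capacity, and single-mode simplification are illustrative assumptions, not the model described above.

```python
# Serial schedule-generation scheme for one renewable resource (single-mode toy case).
def serial_sgs(durations, demands, preds, capacity):
    horizon = sum(durations)
    usage = [0] * (horizon + max(durations))        # per-period resource usage
    start, finish = {}, {}
    for j in range(len(durations)):                 # jobs assumed topologically ordered
        est = max((finish[p] for p in preds[j]), default=0)
        t = est
        # Delay the job until its demand fits within capacity in every period it occupies.
        while any(usage[t + k] + demands[j] > capacity for k in range(durations[j])):
            t += 1
        start[j], finish[j] = t, t + durations[j]
        for k in range(durations[j]):
            usage[t + k] += demands[j]
    return start, finish

# Example: four jobs sharing 4 units of one renewable resource (illustrative data).
starts, _ = serial_sgs(durations=[3, 2, 2, 4],
                       demands=[2, 3, 2, 1],
                       preds=[[], [0], [0], [1, 2]],
                       capacity=4)
print(starts)
```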

    Reconciliation, Restoration and Reconstruction of a Conflict Ridden Country

    Conflict has sadly been a constant part of history. Winning a conflict and making a lasting peace are often not the same thing. While a peace treaty ends a conflict and often dictates terms from the winners’ perspective, it may not create a lasting peace. Short of unconditional surrender, modern conflict ends with a negotiated cessation of hostilities. Such accords may include some initial reconstruction agreements, but Reconciliation, Restoration and Reconstruction (RRR) is a long-term process. This study maintains that to achieve a lasting peace: 1) the culture and beliefs of the conflict nation must be continuously considered, and 2) RRR is a long-term effort that will occur over years, not just in the immediate wake of signing a treaty or agreement. To ensure the inclusion of all stakeholders and gain the best results in dealing with this “wicked problem”, an array of Operations Research (OR) techniques can be used to support the long-term planning and execution of an RRR effort. The final decisions will always be political, but the analysis provided by an OR support team will guide the decision makers to better execute consensus decisions that consider all stakeholder needs. The development of the value hierarchy framework in this dissertation is a keystone of building a rational, OR-supported long-term plan for a successful RRR. The primary aim of the research is to propose a framework and associated set of guidelines derived from appropriate techniques of OR, Decision Analysis and Project Management (from development of a consensus-based value hierarchy to its implementation, feedback and steering corrections) that may be applied to help RRR efforts in any conflict-ridden country across the globe. The framework is applicable to any conflict-ridden country after incorporating changes particular to the country witnessing a prolonged conflict.
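
    One common Decision Analysis building block for a value hierarchy is an additive multi-attribute value model, in which each alternative is scored by weighted single-attribute values. The sketch below shows only that generic idea; the attributes, weights, and alternatives are illustrative assumptions and not the dissertation's actual hierarchy.

```python
# Additive multi-attribute value model: score = sum of weight * attribute value.
def additive_value(scores, weights):
    # scores: attribute -> value in [0, 1]; weights assumed to sum to 1.
    return sum(weights[a] * scores[a] for a in weights)

weights = {"security": 0.4, "governance": 0.3, "infrastructure": 0.3}
alternatives = {
    "plan_A": {"security": 0.7, "governance": 0.5, "infrastructure": 0.6},
    "plan_B": {"security": 0.5, "governance": 0.8, "infrastructure": 0.4},
}
ranked = sorted(alternatives,
                key=lambda p: additive_value(alternatives[p], weights),
                reverse=True)
print(ranked)
```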

    Data analysis and data assimilation of Arctic Ocean observations

    Thesis (Ph.D.) University of Alaska Fairbanks, 2019. Arctic-region observations are sparse and represent only a small portion of the physical state of nature. It is therefore essential to maximize the information content of observations and observation-conditioned analyses whenever possible, including the quantification of their accuracy. The four largely disparate works presented here emphasize observation analysis and assimilation in the context of the Arctic Ocean (AO). These studies focus on the relationship between observational data/products, numerical models based on physical processes, and the use of such data to constrain and inform those products/models to different ends. The first part comprises Chapters 1 and 2, which revolve around oceanographic observations collected during the International Polar Year (IPY) program of 2007-2009. Chapter 1 validates pan-Arctic satellite-based sea surface temperature and salinity products against these data to establish important estimates of product reliability in terms of bias and bias-adjusted standard errors. It establishes practical regional reliability for these products, which are often used in modeling and climatological applications, and provides some guidance for improving them. Chapter 2 constructs a gridded full-depth snapshot of the AO during the IPY to visually outline recent, previously documented AO watermass distribution changes by comparing it to a historical climatology of the late 20th century derived from private Russian data. It provides an expository review of literature documenting major AO climate changes and augments them with additional changes in freshwater distribution and sea surface height in the Chukchi and Bering Seas. The last two chapters present work focused on the application of data assimilation (DA) methodologies, and constitute the second part of this thesis, focused on the synthesis of numerical modeling and observational data. Chapter 3 presents a novel approach to sea ice model trajectory optimization whereby spatially variable sea ice rheology parameter distributions provide the additional model flexibility needed to assimilate observable components of the sea ice state. The study employs a toy 1D model to demonstrate the practical benefits of the approach and serves as a proof of concept to justify the considerable effort needed to extend the approach to 2D. Chapter 4 combines an ice-free model of the Chukchi Sea with a modified ensemble filter to develop a DA system suitable for operational forecasting and monitoring of the region in support of oil spill mitigation. The method improves the assimilation of non-Gaussian asynchronous surface current observations beyond the traditional approach.
    Chapter 1: Sea-surface temperature and salinity product comparison against external in situ data in the Arctic Ocean -- Chapter 2: Changes in Arctic Ocean climate evinced through analysis of IPY 2007-2008 oceanographic observations -- Chapter 3: Toward optimization of rheology in sea ice models through data assimilation -- Chapter 4: Ensemble-transform filtering of HFR & ADCP velocities in the Chukchi Sea -- General conclusion
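
    For readers unfamiliar with ensemble data assimilation, the sketch below shows a textbook stochastic ensemble Kalman filter analysis step (perturbed observations, gain computed from ensemble covariances). It illustrates the general class of methods discussed in Chapters 3 and 4, not the modified ensemble-transform filter developed in Chapter 4, and the state dimension, observation operator, and error levels are illustrative assumptions.

```python
# Stochastic ensemble Kalman filter analysis step (textbook form, toy dimensions).
import numpy as np

def enkf_analysis(X, y, H, r, rng):
    """X: (n_state, n_ens) forecast ensemble; y: (n_obs,) observations;
    H: (n_obs, n_state) linear observation operator; r: observation error variance."""
    n_obs, n_ens = y.size, X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)           # state anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)        # observed-space anomalies
    # Kalman gain from ensemble covariances: K = P H^T (H P H^T + R)^-1
    PHt = A @ HA.T / (n_ens - 1)
    S = HA @ HA.T / (n_ens - 1) + r * np.eye(n_obs)
    K = PHt @ np.linalg.inv(S)
    # Perturbed observations, one draw per ensemble member.
    Y = y[:, None] + rng.normal(0.0, np.sqrt(r), size=(n_obs, n_ens))
    return X + K @ (Y - HX)

rng = np.random.default_rng(1)
X = rng.normal(size=(10, 20))                        # 10 state variables, 20 members
H = np.zeros((3, 10)); H[[0, 1, 2], [2, 5, 8]] = 1   # observe three state variables
y = np.array([0.5, -0.2, 1.0])
Xa = enkf_analysis(X, y, H, r=0.1, rng=rng)
print(Xa.mean(axis=1))
```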