
    Optimization with Discrete Simultaneous Perturbation Stochastic Approximation Using Noisy Loss Function Measurements

    Discrete stochastic optimization considers the problem of minimizing (or maximizing) loss functions defined on discrete sets when only noisy measurements of the loss functions are available. The problem is widely applicable in practice, and many algorithms have been proposed to solve it. Motivated by the efficiency of simultaneous perturbation stochastic approximation (SPSA) for continuous stochastic optimization problems, we introduce the middle-point discrete simultaneous perturbation stochastic approximation (DSPSA) algorithm for the stochastic optimization of a loss function defined on a p-dimensional grid of points in Euclidean space. We show that the sequence generated by DSPSA converges to the optimal point under certain conditions. Consistent with other stochastic approximation methods, DSPSA formally accommodates noisy measurements of the loss function. We also analyze the rate of convergence of DSPSA by deriving an upper bound on the mean squared error of the generated sequence. To compare the performance of DSPSA with other algorithms, such as the stochastic ruler (SR) and stochastic comparison (SC) algorithms, we build a bridge between DSPSA and these two algorithms by comparing, in a big-O sense, the probability of not achieving the optimal solution. We present theoretical and numerical comparisons of DSPSA, SR, and SC. In addition, we consider an application of DSPSA to developing optimal public health strategies for containing the spread of influenza given limited societal resources.
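
    As a rough illustration of the middle-point DSPSA iteration described above, the following Python sketch assumes a user-supplied noisy_loss function defined on the integer grid and an illustrative decaying gain sequence; it is a minimal sketch of the general idea, not the authors' reference implementation.

```python
import numpy as np

def dspsa_minimize(noisy_loss, theta0, n_iter=1000, a=0.5, A=100.0, alpha=0.602, seed=0):
    """Minimal sketch of a middle-point DSPSA-style iteration (illustrative only)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(n_iter):
        a_k = a / (k + 1 + A) ** alpha                       # decaying gain sequence (illustrative values)
        pi = np.floor(theta) + 0.5                           # middle point of the unit hypercube containing theta
        delta = rng.choice([-1.0, 1.0], size=theta.shape)    # Bernoulli +/-1 simultaneous perturbation
        y_plus = noisy_loss((pi + 0.5 * delta).astype(int))  # both probe points lie on the integer grid
        y_minus = noisy_loss((pi - 0.5 * delta).astype(int))
        g_hat = (y_plus - y_minus) * delta                   # SPSA-style gradient estimate (1/delta_i = delta_i here)
        theta = theta - a_k * g_hat                          # update the (non-integer) iterate
    return np.round(theta).astype(int)                       # report the nearest grid point

# Example use on a noisy separable quadratic over the integer grid:
# est = dspsa_minimize(lambda x: float(np.sum((x - 3) ** 2)) + np.random.normal(), np.zeros(5))
```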

    VI Workshop on Computational Data Analysis and Numerical Methods: Book of Abstracts

    The VI Workshop on Computational Data Analysis and Numerical Methods (WCDANM) will be held on June 27-29, 2019, in the Department of Mathematics of the University of Beira Interior (UBI), Covilhã, Portugal. It is a unique opportunity to disseminate scientific research related to Mathematics in general, with particular relevance to Computational Data Analysis and Numerical Methods in both theoretical and practical fields, using new techniques and giving special emphasis to applications in Medicine, Biology, Biotechnology, Engineering, Industry, Environmental Sciences, Finance, Insurance, Management and Administration. The meeting will provide a forum for the discussion and debate of ideas of interest to the scientific community in general. New scientific collaborations among colleagues, namely in Masters and PhD projects, are expected to emerge from the meeting. The event is open to the entire scientific community (with or without a communication/poster).

    Simulation Optimization for Manufacturing System Design

    A manufacturing system, characterized by its stochastic nature, is defined by both qualitative and quantitative variables. Often, a performance measure of the system such as throughput, work-in-process, or cycle time needs to be optimized with respect to some decision variables. It is generally convenient to express a manufacturing system in the form of an analytical model in order to obtain solutions as quickly as possible. However, as the complexity of the system increases, it becomes more and more difficult to accommodate that complexity in an analytical model because of the uncertainty involved. In such situations, we resort to simulation modeling as an effective alternative.

    Equipment selection forms a separate class of problems in the domain of manufacturing systems. It is highly significant for capital-intensive industries, especially the semiconductor industry, where equipment costs account for a significant share of the total budget. For semiconductor wafer fabs that incorporate complex product flows of multiple product families, a reduction in cycle time through the choice of appropriate equipment could result in significant profits. This thesis focuses on the equipment selection problem, which selects tools for the workstations with a choice of different tool types at each workstation. The objective is to minimize the average cycle time of a wafer lot in a semiconductor fab, subject to throughput and budget constraints. To solve the problem, we implement five simulation-based algorithms and an analytical algorithm. The simulation-based algorithms comprise a hill-climbing algorithm, two gradient-based algorithms (biggest leap and safer leap), and two versions of the nested partitions algorithm. We compare the performance of the simulation-based algorithms against that of the analytical algorithm and discuss the advantages of prior knowledge of the problem structure for selecting a suitable algorithm.
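
    To make the simulation-based search concrete, the sketch below shows a generic hill-climbing loop of the kind listed above, assuming a hypothetical simulate_cycle_time replication function, per-workstation tool costs, and a budget constraint; the throughput constraint and the thesis's actual neighborhood definition are omitted, so this is an illustration rather than the thesis's implementation.

```python
import random

def hill_climb(simulate_cycle_time, x0, tool_cost, budget, n_reps=5, max_iter=50, seed=0):
    """Illustrative simulation-based hill climbing over tool counts per workstation."""
    rng = random.Random(seed)

    def mean_cycle_time(x):
        # Average several noisy simulation replications to damp the noise.
        return sum(simulate_cycle_time(x) for _ in range(n_reps)) / n_reps

    def total_cost(x):
        return sum(tool_cost[w] * n for w, n in x.items())

    best, best_ct = dict(x0), mean_cycle_time(x0)
    for _ in range(max_iter):
        # Neighbors: add or remove one tool at a single workstation, if the budget allows.
        neighbors = []
        for w in best:
            for step in (+1, -1):
                cand = dict(best)
                cand[w] = max(1, cand[w] + step)
                if cand != best and total_cost(cand) <= budget:
                    neighbors.append(cand)
        rng.shuffle(neighbors)
        improved = False
        for cand in neighbors:
            ct = mean_cycle_time(cand)
            if ct < best_ct:          # accept the first improving neighbor
                best, best_ct, improved = cand, ct, True
                break
        if not improved:
            break                     # local optimum (under simulation noise)
    return best, best_ct
```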

    Investigating Cellular Structures at the Nanoscale with Organic Fluorophores

    Super-resolution fluorescence imaging can provide insights into cellular structure and organization with a spatial resolution approaching that of electron microscopy. Among the different super-resolution methods, single-molecule-based localization microscopy could play an exceptional role in the future because it can provide quantitative information, for example the absolute number of biomolecules interacting in space and time. Here, small organic fluorophores are a decisive factor because they exhibit high fluorescence quantum yields and photostabilities, thus enabling their localization with nanometer precision. Despite past progress, problems with high-density and specific labeling, especially in living cells, together with the lack of suitable standards and of long-term continuous imaging methods with minimal photodamage, currently make it challenging to exploit the full potential of the method.
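
    The link between photon yield and localization precision mentioned above can be made concrete with the standard shot-noise-limited approximation (an assumption added here, not a claim of the abstract): the localization error scales roughly as the width of the point spread function divided by the square root of the number of detected photons.

```python
import math

# Shot-noise-limited approximation: sigma_loc ~ s / sqrt(N), where s is the
# standard deviation of the point spread function and N the detected photon count.
s_nm = 250.0 / 2.355                    # sigma of a ~250 nm FWHM diffraction-limited spot
for n_photons in (100, 1000, 5000):     # plausible photon yields of bright organic fluorophores
    print(f"N = {n_photons:5d} photons -> ~{s_nm / math.sqrt(n_photons):4.1f} nm localization precision")
```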

    Multi-fidelity modelling approach for airline disruption management using simulation

    Disruption to airline schedules is a key issue for the industry. There are various causes for disruption, ranging from weather events through to technical problems grounding aircraft. Delays can quickly propagate through a schedule, leading to high financial and reputational costs. Mitigating the impact of a disruption by adjusting the schedule is a high priority for the airlines. The problem involves rearranging aircraft, crew and passengers, often with large fleets and many uncertain elements. The multiple objectives of cost, delay and minimising schedule alterations create a trade-off. In addition, the new schedule should be achievable without over-promising. This thesis considers the rescheduling of aircraft, the Aircraft Recovery Problem.

    The Aircraft Recovery Problem is well studied, though the literature mostly focusses on deterministic approaches, capable of modelling the complexity of the industry but with limited ability to capture the inherent uncertainty. Simulation offers a natural modelling framework, handling both the complexity and the variability. However, the combinatorial aircraft allocation constraints are difficult for many simulation optimisation approaches, suggesting that a more tailored approach is required.

    This thesis proposes a two-stage multi-fidelity modelling approach, combining a low-fidelity Integer Program and a simulation. The deterministic Integer Program allocates aircraft to flights and gives an initial estimate of the delay of each flight. By solving in a multi-objective manner, it can quickly produce a set of promising solutions representing different trade-offs between disruption costs, total delay and the number of schedule alterations. The simulation is used to evaluate the candidate solutions and look for further local improvement. The aircraft allocation is fixed whilst a local search is performed over the flight delays, a continuous-valued problem, aiming to reduce costs. This is done by developing an adapted version of STRONG, a stochastic trust-region approach. The extension incorporates experimental design principles and projected gradient steps into STRONG to enable it to handle bound constraints (a generic sketch of such a projected step is given at the end of this abstract). This method is demonstrated and evaluated with computational experiments on a set of disruptions with different fleet sizes and different numbers of disrupted aircraft. The results suggest that this multi-fidelity combination can produce good solutions to the Aircraft Recovery Problem.

    A more theoretical treatment of the extended trust-region simulation optimisation is also presented. The conditions under which guarantees of the algorithm's asymptotic performance may be possible are identified, and a framework for proving these guarantees is presented. Some of the work towards this is discussed, and we highlight where further work is required.

    This multi-fidelity approach could be used to implement a simulation-based decision support system for real-time disruption handling. The use of simulation for operational decisions raises the issue of how to evaluate a simulation-based tool and its predictions. It is argued that this is not a straightforward question of the real-world result being good or bad, as natural system variability can mask the results. This problem is formalised and a method is proposed for detecting systematic errors that could lead to poor decision making. The method is based on the Probability Integral Transformation, using the simulation Empirical Cumulative Distribution Function and goodness-of-fit hypothesis tests for uniformity.
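
    As a rough illustration of that validation idea, the sketch below computes PIT values of real-world observations against the simulation's empirical CDF and applies a Kolmogorov-Smirnov test for uniformity; the function and data names are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np
from scipy import stats

def pit_uniformity_check(sim_outputs, real_observations):
    """Illustrative check for systematic simulation error via the Probability Integral Transform.

    sim_outputs       : simulation replications of the performance measure (e.g. total delay)
    real_observations : observed real-world values of the same measure
    """
    sim_sorted = np.sort(np.asarray(sim_outputs, dtype=float))
    # PIT value of each observation = simulation empirical CDF evaluated at that observation.
    pit = np.searchsorted(sim_sorted, real_observations, side="right") / sim_sorted.size
    # If the simulation matches the real system, the PIT values should look Uniform(0, 1).
    ks_stat, p_value = stats.kstest(pit, "uniform")
    return pit, ks_stat, p_value

# Example: a small p-value suggests a systematic discrepancy between simulation and reality.
# pit, ks, p = pit_uniformity_check(sim_outputs=np.random.gamma(2, 10, 1000),
#                                   real_observations=np.random.gamma(2, 12, 30))
```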
    This validation method is tested by applying it to the airline disruption problem previously discussed. Another simulation acts as a proxy real world, which deviates from the simulation model in its runway service times. The results suggest that the method has high power when the deviations have a high impact on the performance measure of interest (more than 20%), but low power when the impact is less than 5%.
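
    Finally, the bound-constrained local-search step referenced earlier relies on a gradient step that is limited to a trust region and then projected onto box constraints. The generic sketch below illustrates that single move with an assumed simulation-based gradient estimate; it is not the thesis's extended STRONG algorithm.

```python
import numpy as np

def projected_trust_region_step(x, grad_est, radius, lower, upper):
    """One gradient step restricted to a trust region and projected onto box constraints
    (a generic illustration; grad_est would come from simulation-based estimates)."""
    d = -np.asarray(grad_est, dtype=float)
    norm = np.linalg.norm(d)
    if norm > radius:
        d *= radius / norm                                            # shrink the step to the trust-region boundary
    return np.clip(np.asarray(x, dtype=float) + d, lower, upper)      # project onto the bound constraints

# Example: x_new = projected_trust_region_step(x=[5.0, 12.0], grad_est=[0.8, -2.0],
#                                              radius=1.0, lower=0.0, upper=30.0)
```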