
    Multivariable Iterative Learning Control Design Procedures: from Decentralized to Centralized, Illustrated on an Industrial Printer

    Iterative Learning Control (ILC) enables high control performance through learning from measured data, using only limited model knowledge in the form of a nominal parametric model. Robust stability requires robustness against modeling errors, which often stem from deliberate undermodeling. The aim of this paper is to develop a range of approaches for multivariable ILC, with specific attention to addressing interaction. The proposed methods address the interaction either in the nominal model or as uncertainty, i.e., through robust stability. The result is a range of techniques, including the use of the structured singular value (SSV) and Gershgorin bounds, that provide different trade-offs between modeling requirements, i.e., modeling effort and cost, and achievable performance. This allows control engineers to select the approach that fits the modeling budget and control requirements. The trade-off is demonstrated in a case study on an industrial flatbed printer.
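
    As an illustration of the Gershgorin-bound idea mentioned above, the sketch below checks frequency-wise diagonal dominance of a MIMO frequency response, a sufficient condition for treating interaction as a small perturbation. It is a minimal sketch only, not the paper's design procedure; the function name, the dominance criterion as coded, and the example plant are assumptions.

```python
import numpy as np

def gershgorin_dominance(G_freq):
    """Check row-wise diagonal dominance of a MIMO frequency response.

    G_freq has shape (n_freq, n, n) with complex matrices G(jw_k).
    Returns True if, at every frequency, each diagonal entry dominates
    the sum of the off-diagonal magnitudes in its row.
    """
    diag = np.abs(np.einsum('kii->ki', G_freq))     # |G_ii(jw_k)|
    off = np.sum(np.abs(G_freq), axis=2) - diag     # sum over j != i of |G_ij(jw_k)|
    return bool(np.all(diag > off))

# Illustrative 2x2 plant with weak coupling, evaluated on a frequency grid.
w = np.logspace(-1, 2, 50)
G = np.array([[[1 / (1j * wk + 1.0), 0.1 / (1j * wk + 2.0)],
               [0.05 / (1j * wk + 3.0), 1 / (1j * wk + 0.5)]] for wk in w])
print(gershgorin_dominance(G))   # True -> interaction is small in this sense
```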

    Optimization with multivariate conditional value-at-risk constraints

    For many decision-making problems under uncertainty, it is crucial to develop risk-averse models and specify the decision makers' risk preferences based on multiple stochastic performance measures (or criteria). Incorporating such multivariate preference rules into optimization models is a fairly recent research area. Existing studies focus on extending univariate stochastic dominance rules to the multivariate case. However, enforcing multivariate stochastic dominance constraints can often be overly conservative in practice. As an alternative, we focus on the widely applied risk measure conditional value-at-risk (CVaR), introduce a multivariate CVaR relation, and develop a novel optimization model with multivariate CVaR constraints based on polyhedral scalarization. To solve such problems for finite probability spaces, we develop a cut generation algorithm, where each cut is obtained by solving a mixed-integer problem. We show that a multivariate CVaR constraint reduces to finitely many univariate CVaR constraints, which proves the finite convergence of our algorithm. We also show that our results can be naturally extended to a wider class of coherent risk measures. The proposed approach provides a flexible and computationally tractable way of modeling preferences in stochastic multi-criteria decision making. We conduct a computational study for a budget allocation problem to illustrate the effect of enforcing multivariate CVaR constraints and to demonstrate the computational performance of the proposed solution methods.
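
    To make the univariate reduction concrete, the sketch below computes the CVaR of a scalarized loss c^T X for a finite probability space, the building block through which a multivariate CVaR requirement can be enforced. It is illustrative only and not the paper's cut generation algorithm; the function, scenario data, and weights are made up.

```python
import numpy as np

def cvar(losses, probs, alpha):
    """CVaR at level alpha of a discrete loss distribution:
    the average of the worst (1 - alpha) probability mass."""
    order = np.argsort(losses)[::-1]            # worst outcomes first
    l, p = losses[order], probs[order]
    tail = 1.0 - alpha                          # tail mass to average over
    cum = np.cumsum(p)
    w = np.clip(np.minimum(cum, tail) - np.concatenate(([0.0], cum[:-1])), 0.0, None)
    return float(np.dot(w, l) / tail)

# Scalarize a two-criterion outcome with weights c, then take univariate CVaR,
# mimicking the polyhedral-scalarization idea: a multivariate CVaR requirement
# is checked through CVaR of c^T X for scalarization vectors c.
X = np.array([[10.0, 2.0], [4.0, 8.0], [6.0, 6.0], [1.0, 1.0]])  # scenarios x criteria
p = np.array([0.25, 0.25, 0.25, 0.25])
c = np.array([0.5, 0.5])
print(cvar(X @ c, p, alpha=0.9))
```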

    Analysis of online advertisement performance using Markov chains

    Measuring and analyzing the performance of online marketing is far from simple because campaigns usually run in multiple channels whose results depend on each other. The outcome of a performance analysis can vary drastically depending on the attribution model used, so an attribution analysis is needed to make better decisions on where to allocate marketing budgets. This thesis aims to provide a framework for more optimal budget allocation by applying a data-driven attribution model to the case company's dataset and comparing the results with those of the de facto last-click attribution model. The framework is currently used in the case company to improve online marketing budget allocation and to gain a better understanding of the marketing efforts. The thesis begins with a literature review of online marketing, measurement techniques, and the attribution models most commonly used in the industry. The Markov attribution model was chosen for the analysis because of its promising results in other research and the ease of implementation with the available dataset. The dataset used in the analysis contains 582,111 user paths collected over a seven-month period from the case company's website. The analysis was conducted using the R programming language and the open-source ChannelAttribution package, which includes tools for fitting a k-order Markov model to a dataset and analyzing the results and the model's reliability. The performance of the attribution model was evaluated with a ROC curve to assess its prediction accuracy. The results indicate that the Markov model gives more reliable guidance on where to allocate the marketing budget than the last-click attribution model that is widely used in the industry. Overall, the objectives of this thesis were achieved, and the study provides a solid framework for marketing managers to analyze their marketing efforts and reallocate their marketing budgets in a more optimal way. However, more research is needed to improve the prediction accuracy of the model and to better understand the effects of budget reallocation.
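
    For intuition on data-driven attribution, the following sketch computes first-order Markov removal-effect attributions from a hand-written transition matrix. It is a toy illustration, not the thesis implementation (which used the R ChannelAttribution package); the states, transition probabilities, and helper functions are invented for the example.

```python
import numpy as np

def conversion_prob(P, states, start="start", conv="conversion"):
    """Probability of eventually reaching `conv` from `start`, via absorbing
    Markov chain analysis: solve (I - Q) x = r for the transient states."""
    absorbing = [conv, "null"]
    transient = [s for s in states if s not in absorbing]
    idx = {s: i for i, s in enumerate(states)}
    Q = np.array([[P[idx[a], idx[b]] for b in transient] for a in transient])
    r = np.array([P[idx[a], idx[conv]] for a in transient])
    x = np.linalg.solve(np.eye(len(transient)) - Q, r)
    return x[transient.index(start)]

def removal_effect_attribution(P, states, channels):
    """Attribute conversions to channels by how much removing each one
    reduces the overall conversion probability (normalized)."""
    base = conversion_prob(P, states)
    effects = {}
    for ch in channels:
        P_rm = P.copy()
        i = states.index(ch)
        P_rm[i, :] = 0.0
        P_rm[i, states.index("null")] = 1.0   # visits to the removed channel lead nowhere
        effects[ch] = 1.0 - conversion_prob(P_rm, states) / base
    total = sum(effects.values())
    return {ch: e / total for ch, e in effects.items()}

states = ["start", "search", "display", "conversion", "null"]
P = np.array([
    [0.0, 0.6, 0.4, 0.0, 0.0],   # start
    [0.0, 0.0, 0.3, 0.3, 0.4],   # search
    [0.0, 0.2, 0.0, 0.2, 0.6],   # display
    [0.0, 0.0, 0.0, 1.0, 0.0],   # conversion (absorbing)
    [0.0, 0.0, 0.0, 0.0, 1.0],   # null (absorbing)
])
print(removal_effect_attribution(P, states, ["search", "display"]))
```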

    Hydrologic analysis of a limestone quarry using EPA's HELP Version 3.08 Model

    Aggregates were historically a low-cost commodity, but with communities and governmental agencies reducing the amount of mining, their cost is increasing dramatically. Communities need to be made aware that aggregate production is necessary to maintain today's infrastructure; this can be accomplished by applying technologies proven in other areas to show that reclamation is feasible. A proposed mine reclamation, the Douglas Township quarry (DTQ) in Dakota Township, MN, was evaluated using the Visual Hydrologic Evaluation of Landfill Performance (HELP) model. HELP is commonly employed to estimate the water budget of a landfill; here it was applied to determine the water budget of the DTQ following mining. Using an environmental impact statement as the case study, modeling predictions indicated that the DTQ will adequately drain the water entering the system. The groundwater table will rise slightly due to the mining excavations, but no ponding will occur. The application of the HELP model determined the water budget of the DTQ and can serve as a viable option for mining companies to demonstrate how land can be reclaimed following mining operations.
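
    The water budget that a HELP run reports is, at its core, a balance of inflows and outflows. The minimal sketch below checks closure of such a balance; the function and all values are illustrative assumptions, not outputs of the DTQ study.

```python
# Annual water-balance closure check: inflow should equal outflows plus storage change.
def water_balance_residual(precip, evapotranspiration, runoff,
                           lateral_drainage, percolation, storage_change):
    """All terms in the same units (e.g. mm/yr); the residual should be
    close to zero if the simulated water budget closes."""
    outflows = evapotranspiration + runoff + lateral_drainage + percolation
    return precip - outflows - storage_change

print(water_balance_residual(precip=750.0, evapotranspiration=520.0,
                             runoff=60.0, lateral_drainage=40.0,
                             percolation=120.0, storage_change=10.0))
```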

    Application of As-built Data in Building Retrofit Decision Making Process

    With the growing need to improve building sustainability, an increasing number of existing buildings require renovation to meet stakeholders' expectations. In the pre-design phase, it is critical to make decisions that satisfy both the project budget and the performance standard. For new buildings, a whole-building energy simulation analysis is very helpful for this decision-making process because it can provide stakeholders with evaluation results for all alternative solutions. For existing buildings, however, the as-built data required for the building energy modeling process is not always available, and its manual collection is time-consuming and error-prone. This paper first reviews state-of-the-art methods of automated data collection and then introduces an automatic as-built BIM model creation process through a case study. The study also successfully demonstrates the interoperability between the created as-built model and a typical energy simulation tool. Finally, the limitations and challenges of the current state of practice are discussed to outline future directions.
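
    As a rough sketch of the pre-design trade-off described above, the snippet below picks the retrofit alternative with the best simulated energy savings that still fits the project budget. The alternatives, costs, and savings are invented for illustration and are not results from the case study.

```python
# Toy pre-design decision: best-performing retrofit alternative within budget.
alternatives = [
    {"name": "new glazing",     "cost": 120_000, "annual_energy_savings_kwh": 45_000},
    {"name": "HVAC upgrade",    "cost": 200_000, "annual_energy_savings_kwh": 90_000},
    {"name": "roof insulation", "cost": 60_000,  "annual_energy_savings_kwh": 25_000},
]

def best_within_budget(alternatives, budget):
    """Return the feasible alternative with the highest simulated savings."""
    feasible = [a for a in alternatives if a["cost"] <= budget]
    return max(feasible, key=lambda a: a["annual_energy_savings_kwh"], default=None)

print(best_within_budget(alternatives, budget=150_000))
```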

    Performance enhancement of a Neato XV-11 laser scanner applied to mobile robot localization: a stochastic modeling approach

    Laser scanners are widely used in mobile robotics localization systems, but despite their enormous potential, their high price tag is a major drawback, especially for hobbyist and educational robotics practitioners who usually have a limited budget. The Neato XV-11 laser scanner is a very low-cost alternative compared with currently available laser scanners, which is the main motivation for its use. Modeling a hacked Neato XV-11 laser scanner provides valuable information that can support better designs of robot localization systems based on this sensor. As an example, this paper presents the performance enhancement of a Neato XV-11 laser scanner applied to mobile robot self-localization, using as a case study the Perfect Match algorithm applied to the Robot@Factory competition. This work has been supported by FCT - Fundação para a Ciência e Tecnologia within the Project Scope: UIDB/05757/2020.
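
    A stochastic sensor model of the kind referred to above can be sketched as the true range plus zero-mean Gaussian noise with occasional dropouts. The parameters and function below are illustrative assumptions, not the error model identified for the Neato XV-11 in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_scan(true_ranges, sigma=0.02, dropout_prob=0.05, max_range=5.0):
    """Return simulated range readings (metres); dropped beams read 0."""
    noisy = true_ranges + rng.normal(0.0, sigma, size=true_ranges.shape)
    noisy = np.clip(noisy, 0.0, max_range)
    dropped = rng.random(true_ranges.shape) < dropout_prob
    noisy[dropped] = 0.0
    return noisy

# One beam per degree, a circular wall 1.5 m away.
true_ranges = np.full(360, 1.5)
print(simulate_scan(true_ranges)[:5])
```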

    Some considerations regarding the use of multi-fidelity Kriging in the construction of surrogate models

    Surrogate models or metamodels are commonly used to exploit expensive computational simulations within a design optimization framework. Multi-fidelity surrogate modeling approaches have recently been gaining ground due to the potential for further reductions in simulation effort over single-fidelity approaches. However, given a black-box problem, when exactly should a designer select a multi-fidelity approach over a single-fidelity approach, and vice versa? Using a series of analytical test functions and engineering design examples from the literature, this paper illustrates the potential pitfalls of choosing one technique over the other without careful consideration of the optimization problem at hand. These examples are then used to define and validate a set of guidelines for the creation of a multi-fidelity Kriging model. The resulting guidelines state that the different fidelity functions should be well correlated, that the amount of low-fidelity data in the model should be greater than the amount of high-fidelity data, and that between 10% and 80% of the total simulation budget should be spent on low-fidelity simulations in order for the resulting multi-fidelity model to perform better than a high-fidelity model of equivalent cost.
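
    The guidelines above can be turned into a quick sanity check of a proposed multi-fidelity sampling plan: verify that the fidelities are well correlated, that there are more low-fidelity than high-fidelity points, and that low-fidelity runs take between 10% and 80% of the budget. The sketch below is illustrative; the correlation threshold, data, and costs are assumptions, not values from the paper.

```python
import numpy as np

def check_mf_plan(y_lo, y_hi, n_lo, n_hi, cost_lo, cost_hi, corr_threshold=0.9):
    """y_lo, y_hi: low/high fidelity responses at common validation points;
    n_*: number of planned samples per fidelity; cost_*: cost per sample."""
    corr = np.corrcoef(y_lo, y_hi)[0, 1]
    lo_budget = n_lo * cost_lo
    frac_lo = lo_budget / (lo_budget + n_hi * cost_hi)
    return {
        "fidelities_correlated": corr >= corr_threshold,
        "more_lo_than_hi_points": n_lo > n_hi,
        "lo_budget_fraction_ok": 0.1 < frac_lo < 0.8,
    }

# Example with made-up data and costs.
x = np.linspace(0, 1, 20)
y_hi = np.sin(8 * x)
y_lo = 0.9 * np.sin(8 * x) + 0.05    # cheap, well-correlated approximation
print(check_mf_plan(y_lo, y_hi, n_lo=100, n_hi=20, cost_lo=1.0, cost_hi=20.0))
```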