    Inverse Optimization: Closed-form Solutions, Geometry and Goodness of fit

    In classical inverse linear optimization, one assumes a given solution is a candidate to be optimal. Real data is imperfect and noisy, so there is no guarantee this assumption is satisfied. Inspired by regression, this paper presents a unified framework for cost function estimation in linear optimization comprising a general inverse optimization model and a corresponding goodness-of-fit metric. Although our inverse optimization model is nonconvex, we derive a closed-form solution and present the geometric intuition. Our goodness-of-fit metric, ρ, the coefficient of complementarity, has similar properties to R² from regression and is quasiconvex in the input data, leading to an intuitive geometric interpretation. While ρ is computable in polynomial time, we derive a lower bound that possesses the same properties, is tight for several important model variations, and is even easier to compute. We demonstrate the application of our framework for model estimation and evaluation in production planning and cancer therapy.
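As a concrete illustration of the setting (not the paper's closed-form solution or its ρ metric), the sketch below imputes a unit-norm cost vector that makes a noisy observed decision as close to optimal as possible over a toy polytope; all numbers are invented:

```python
import numpy as np

# Hypothetical 2-D feasible region: the unit square, given by its vertices
# (for a polytope, a linear objective attains its optimum at a vertex).
vertices = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])

# A noisy observed decision: near the vertex (1, 1) but not exactly optimal.
x_obs = np.array([0.9, 0.95])

def optimality_gap(c, x_obs, vertices):
    """Gap between the true optimum max_x c.x over the vertices and the
    observed objective value (maximization convention)."""
    return np.max(vertices @ c) - c @ x_obs

# Brute-force search over unit-norm cost directions: keep the direction
# under which the observation is closest to optimal (smallest gap).
angles = np.linspace(0, 2 * np.pi, 3600, endpoint=False)
costs = np.column_stack([np.cos(angles), np.sin(angles)])
gaps = np.array([optimality_gap(c, x_obs, vertices) for c in costs])
c_best = costs[np.argmin(gaps)]
print("imputed cost direction:", c_best, "optimality gap:", gaps.min())
```

Here the imputed direction is (0, 1): the observation sits 0.05 below the optimal value 1 along that axis, and every other direction leaves a larger gap. The residual gap plays the role of the fitting error that a goodness-of-fit metric like ρ would normalize.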

    Using programming to optimize mineral processing

    Ore beneficiation at a mine can be described as complex and expensive, involving many balancing processes in which material flow rates, size, density and other factors must all be in balance if any degree of plant optimization is to be achieved. To determine the optimum setup for maximizing throughput at the final step in the beneficiation process, such as the dense media separation units, a mine optimizer is developed that maximizes production throughput as the objective function using constraint-based global optimization. The mine optimizer uses a search engine to find a set of operational conditions that achieves maximum production within all constraints, such as plant availability, the capacity of all press units, the change in material size and properties between crushers, and other operational conditions at the mineral processing plant. The result is that improving cheaper upstream processes, such as blasting, can significantly increase the throughput of expensive downstream processes, like crushing, through improved fragmentation of the ROM ore. For instance, if the ROM ore is not in the required size range, plant production is unbalanced and the mine could lose 10-20% of production, or up to 50% in the worst case. A finer ROM ore may reduce the production of both crushing and coarse separation by 50% while other process units, such as slimes and tailings dumping, run at 100% capacity. In addition, a finer ROM ore may destroy mineral value, as in the mining of coal, iron ore and diamond ore, where a higher price is paid for products of larger size.
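The kind of constraint-based throughput maximization described above can be sketched as a small linear program. The unit names and capacities below are hypothetical, and `scipy.optimize.linprog` stands in for the mine optimizer's search engine:

```python
from scipy.optimize import linprog

# Hypothetical plant: decision variables are tonnes/hour of coarse and fine
# material routed through the plant. All capacities below are made-up numbers.
# Maximize throughput = x_coarse + x_fine  ->  minimize its negative.
c = [-1.0, -1.0]

# Capacity constraints (A_ub @ x <= b_ub):
#   crusher handles the coarse stream only, capacity 120 t/h
#   dense media separation treats both streams, capacity 200 t/h
#   slimes/tailings handling loads mostly with the fine stream, capacity 90 t/h
A_ub = [[1.0, 0.0],
        [1.0, 1.0],
        [0.2, 1.0]]
b_ub = [120.0, 200.0, 90.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("optimal throughput:", -res.fun, "t/h, coarse/fine split:", res.x)
```

In this toy instance the optimizer saturates the crusher (120 t/h coarse) and then fills the tailings capacity with fines (66 t/h), for 186 t/h total; the binding constraints identify which unit to debottleneck next, which is the argument the abstract makes about upstream fragmentation.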

    Inverse Integer Optimization With an Application in Recommender Systems

    In typical (forward) optimization, the goal is to obtain optimal values for the decision variables given known values of optimization model parameters. However, in practice, it may be challenging to determine appropriate values for these parameters. Assuming the availability of historical observations that represent past decisions made by an optimizing agent, the goal of inverse optimization is to impute the unknown model parameters that would make these observations optimal (or approximately optimal) solutions to the forward optimization problem. Inverse optimization has many applications, including geology, healthcare, transportation, and production planning. In this dissertation, we study inverse optimization with integer observation(s), focusing on the cost coefficients as the unknown parameters. Furthermore, we demonstrate an application of inverse optimization to recommender systems. First, we address inverse optimization with a single imperfect integer observation. The aim is to identify the unknown cost vector so that it makes the given imperfect observation approximately optimal by minimizing the optimality error. We develop a cutting plane algorithm for this problem. Results show that the proposed cutting plane algorithm works well for small instances. To reduce computational time, we propose an LP relaxation heuristic method. Furthermore, to obtain an optimal solution in a shorter amount of time, we combine both methods into a hybrid approach by initializing the cutting plane algorithm with a solution from the heuristic method. In the second study, we generalize the previous approach to inverse optimization with multiple imperfect integer observations that are all feasible solutions to one optimization problem. A cutting plane algorithm is proposed and then compared with an LP heuristic method. The results show the value of using multiple data points instead of a single observation. 
    Finally, we apply the proposed methods in the setting of recommender systems. Using past user preferences, we apply inverse optimization to identify the unknown model parameters that minimize an aggregate of the optimality errors over multiple points. Once the unknown parameters are imputed, the recommender system can recommend the best items to the users. The advantage of using inverse optimization is that when users are optimizing their decisions, there is no need to access a large amount of data to impute the recommender system's model parameters. We demonstrate the accuracy of our approach on a real data set for a restaurant recommender system.
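A minimal sketch of the multiple-observation idea, assuming a tiny binary knapsack whose feasible set can be enumerated explicitly; brute-force grid search over candidate cost vectors stands in for the dissertation's cutting-plane and LP-heuristic machinery:

```python
import itertools
import numpy as np

# Hypothetical toy instance: a binary knapsack with 3 items, weights w,
# budget B. The feasible set is small enough to list exhaustively.
w, B = np.array([3, 4, 5]), 7
feasible = np.array([x for x in itertools.product([0, 1], repeat=3)
                     if w @ np.array(x) <= B])

# Past decisions made by the (imperfectly optimizing) agent.
observations = np.array([[1, 1, 0], [1, 0, 0]])

def aggregate_error(c):
    """Sum over observations of (optimal value - observed value) under c."""
    best = np.max(feasible @ c)
    return np.sum(best - observations @ c)

# Grid search over nonnegative cost vectors on the simplex (entries sum to 1).
grid = np.linspace(0, 1, 21)
candidates = [np.array([a, b, 1 - a - b])
              for a in grid for b in grid if a + b <= 1]
c_best = min(candidates, key=aggregate_error)
print("imputed cost vector:", c_best, "aggregate error:", aggregate_error(c_best))
```

Note that several cost vectors can achieve zero aggregate error here: the observations only pin the cost down up to the decisions they distinguish, which is why the dissertation's methods minimize the error rather than solving for a unique vector.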

    Robust inverse optimization

    Given an observation of a decision-maker’s uncertain behavior, we develop a robust inverse optimization model for imputing an objective function that is robust against mis-specifications of the behavior. We characterize the inversely optimized cost vectors for uncertainty sets that may or may not intersect the feasible region, and propose tractable solution methods for special cases. We demonstrate the proposed model in the context of diet recommendation.
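The robust variant can be sketched by minimizing the worst-case optimality gap over a box uncertainty set around the nominal observation; the feasible region, observation and box radius below are invented for illustration, and brute-force search stands in for the paper's characterization:

```python
import numpy as np

# Hypothetical feasible region: the unit square, given by its vertices.
vertices = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])

# The observed behavior is uncertain: modeled as a box x_nom + [-r, r]^2.
x_nom = np.array([0.85, 0.9])
radius = 0.05
corners = np.array([x_nom + np.array([sx, sy])
                    for sx in (-radius, radius) for sy in (-radius, radius)])

def worst_case_gap(c):
    """Largest optimality gap of any point in the uncertainty set under c
    (the gap is linear in x, so the worst case sits at a box corner)."""
    best = np.max(vertices @ c)
    return np.max(best - corners @ c)

# Search unit-norm cost directions: the robust imputation minimizes the
# worst-case gap instead of the gap at the nominal observation alone.
angles = np.linspace(0, 2 * np.pi, 3600, endpoint=False)
costs = np.column_stack([np.cos(angles), np.sin(angles)])
c_rob = min(costs, key=worst_case_gap)
print("robust cost direction:", c_rob, "worst-case gap:", worst_case_gap(c_rob))
```

Compared with fitting to the nominal point only, the robust objective charges each candidate cost for the least favorable realization of the behavior, which is the sense in which the imputed objective is insured against mis-specification.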