38 research outputs found

    Using machine learning to predict the number of alternative solutions to a minimum cardinality set covering problem

    Although the characterization of alternative optimal solutions for linear programming problems is well known, such characterizations for combinatorial optimization problems are essentially non-existent. This is the first article to qualitatively predict the number of alternative optima for a classic NP-hard combinatorial optimization problem, namely the minimum cardinality (also called unicost) set covering problem (MCSCP). For the MCSCP, a set must be covered by a minimum number of subsets selected from a specified collection of subsets of the given set. The MCSCP has numerous industrial applications that require that a secondary objective be optimized once the size of a minimum cover has been determined. Optimizing this secondary objective requires searching over the alternative minimum covers, so knowing the number of MCSCP solutions is important. In this article, for the first time, a machine learning methodology is presented to generate categorical regression trees that predict, qualitatively (extra-small, small, medium, large, or extra-large), the number of solutions to an MCSCP. Using the machine learning toolbox of MATLAB®, regression trees were constructed from 600,000 unique randomly generated MCSCPs. The prediction quality of these regression trees was then tested on 5000 different MCSCPs. For the five-output model, the average accuracy of being at most one category off from the predicted category was 94.2%.
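
    As a rough illustration of the MCSCP and of optimizing a secondary objective once the minimum cover size is known, the Python/Gurobi sketch below solves a tiny made-up instance in two stages; the subsets and the secondary weights w are hypothetical, and this is not the article's MATLAB regression-tree methodology.

        # Minimal sketch of the MCSCP plus a secondary objective, using gurobipy.
        # The tiny instance and the weights w are made-up illustrations.
        import gurobipy as gp
        from gurobipy import GRB

        elements = range(5)                                      # ground set {0,...,4}
        subsets = [{0, 1}, {1, 2, 3}, {0, 3}, {2, 4}, {3, 4}]    # collection of subsets
        w = [3, 1, 2, 5, 4]                                      # hypothetical secondary weights

        m = gp.Model("mcscp")
        x = m.addVars(len(subsets), vtype=GRB.BINARY, name="x")
        for e in elements:                                       # every element must be covered
            m.addConstr(gp.quicksum(x[j] for j, s in enumerate(subsets) if e in s) >= 1)

        # Stage 1: find the minimum cover size k*.
        m.setObjective(x.sum(), GRB.MINIMIZE)
        m.optimize()
        k_star = round(m.ObjVal)

        # Stage 2: among covers of size k*, optimize the secondary objective.
        m.addConstr(x.sum() == k_star)
        m.setObjective(gp.quicksum(w[j] * x[j] for j in range(len(subsets))), GRB.MINIMIZE)
        m.optimize()
        print("minimum cover size:", k_star, "secondary objective:", m.ObjVal)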

    Math Students help their Community develop Balanced Refuse Collection Routes

    In the fall of 2017, the Superintendent of Public Works for Kutztown Borough approached Kutztown University’s Department of Mathematics seeking help in “re-balancing” refuse collection routes in the Borough of Kutztown. Historically, refuse was collected two days a week on the south side of Main Street (Mondays and Thursdays) and two days a week on the north side of Main Street (Tuesdays and Fridays). Wednesdays were used for recycling collection. Over the years, new housing development occurred primarily on the north side of Main Street. As a result of this development, refuse collection time had become “unbalanced,” requiring more time for the north-side collection. During the spring semester of 2018, several math majors in their last semester at Kutztown University developed a new refuse collection strategy. This strategy balanced collection times over the four collection days and, just as importantly, minimized the modifications to the existing routes. Additionally, it impacted a minimum number of residents while accounting for future housing development. The strategy has been used successfully in the Borough of Kutztown since August 2018.
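
    The abstract does not detail the students’ strategy, so the minimal Python sketch below only illustrates the balancing idea with hypothetical street names and collection times: move as few streets as possible from the overloaded north-side collection days to the south-side days until the two sides’ collection times are roughly equal.

        # Illustrative-only greedy rebalance; all street data are hypothetical and
        # this is not the students' actual procedure.
        minutes_per_street = {"Elm": 25, "Oak": 40, "Walnut": 30, "Birch": 55,
                              "Maple": 35, "Chestnut": 20, "Spruce": 45}
        north = {"Elm", "Oak", "Walnut", "Birch", "Maple"}   # current north-side routes
        south = {"Chestnut", "Spruce"}                       # current south-side routes

        def side_time(streets):
            return sum(minutes_per_street[s] for s in streets)

        moved = []
        # Repeatedly move the single street that best reduces the imbalance,
        # stopping when no move helps, so few residents see a schedule change.
        while True:
            gap = side_time(north) - side_time(south)
            candidates = [s for s in north
                          if abs(gap - 2 * minutes_per_street[s]) < abs(gap)]
            if gap <= 0 or not candidates:
                break
            best = min(candidates, key=lambda s: abs(gap - 2 * minutes_per_street[s]))
            north.remove(best)
            south.add(best)
            moved.append(best)

        print("streets moved:", moved,
              "north minutes:", side_time(north), "south minutes:", side_time(south))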

    Using general-purpose integer programming software to generate bounded solutions for the multiple knapsack problem: a guide for OR practitioners

    An NP-hard combinatorial optimization problem with significant industrial applications is the Multiple Knapsack Problem. If approximate solution approaches are used to solve the Multiple Knapsack Problem, there are no guarantees on solution quality, and exact solution approaches can be intricate and challenging to implement. This article demonstrates the iterative use of general-purpose integer programming software (Gurobi) to generate solutions for test problems that are available in the literature. Using Gurobi on a standard PC, we generate, in a relatively straightforward manner, solutions to these problems that are guaranteed to be within 0.16% of the optimum, in an average of less than a minute. This algorithm, called the Simple Sequential Increasing Tolerance (SSIT) algorithm, iteratively increases tolerances in Gurobi to generate a solution that is guaranteed to be close to the optimum in a short time. This solution strategy generates bounded solutions in a timely manner without requiring the coding of a problem-specific algorithm. The approach is attractive to management for solving industrial problems because it is both cost- and time-effective and guarantees the quality of the generated solutions. Finally, comparing SSIT results for 480 large multiple knapsack problem instances to results from published multiple knapsack problem algorithms demonstrates that SSIT outperforms these specialized algorithms.
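
    The Python/gurobipy sketch below shows the iterative SSIT pattern described above; the particular (MIPGap, TimeLimit) schedule and the instance file name are assumptions for illustration, since the abstract does not list the exact tolerance sequence.

        # Sketch of an SSIT-style loop with Gurobi; the schedule is assumed.
        import gurobipy as gp
        from gurobipy import GRB

        def ssit(model, schedule=((0.0001, 30), (0.001, 60), (0.01, 120), (0.05, 300))):
            """Run Gurobi with progressively looser gap tolerances until one pass
            finishes within its time limit; the final gap bounds solution quality."""
            for gap_tol, time_limit in schedule:
                model.Params.MIPGap = gap_tol
                model.Params.TimeLimit = time_limit
                model.optimize()
                if model.Status == GRB.OPTIMAL:      # solved to within gap_tol
                    return model.ObjVal, model.MIPGap
            # Fell through the schedule: report the best incumbent found, if any.
            if model.SolCount > 0:
                return model.ObjVal, model.MIPGap
            return None, None

        # Usage: build or read a multiple knapsack model, then call ssit.
        # model = gp.read("mkp_instance.mps")   # hypothetical instance file
        # best_obj, proven_gap = ssit(model)

    Each pass either proves the requested gap (status OPTIMAL) or times out and moves on to a looser tolerance, so the gap reported at the end is a guaranteed bound on how far the returned solution can be from the optimum.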

    Generating bounded solutions for multi-demand multidimensional knapsack problems: a guide for operations research practitioners

    A generalization of the 0-1 knapsack problem that is hard to solve both theoretically (NP-hard) and in practice is the multi-demand multidimensional knapsack problem (MDMKP). Solving an MDMKP can be difficult because of its conflicting knapsack and demand constraints. Approximate solution approaches provide no guarantees on solution quality. Recently, with the use of classification trees, MDMKPs were partitioned into three general categories based on their expected performance using the integer programming option of the CPLEX® software package on a standard PC: Category A—relatively easy to solve, Category B—somewhat difficult to solve, and Category C—difficult to solve. However, no solution methods were associated with these categories. The primary contribution of this article is that it demonstrates, customized to each category, how general-purpose integer programming software (CPLEX in this case) can be used iteratively to efficiently generate bounded solutions for MDMKPs. Specifically, the simple sequential increasing tolerance (SSIT) methodology iteratively runs CPLEX with loosening tolerances to efficiently generate these bounded solutions. The real strength of this approach is that the SSIT methodology is customized to the particular category (A, B, or C) of the MDMKP instance being solved. This methodology is easy for practitioners to use because it requires no time-consuming coding of problem-specific algorithms. Statistical analyses compare the SSIT results to a single-pass execution of CPLEX in terms of execution time and solution quality.
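
    A small docplex (CPLEX) sketch of the MDMKP’s conflicting constraint structure and an SSIT-style tolerance loop is given below; the instance data, the tolerance schedule, and the per-pass time limit are made-up illustrations, and the article’s category-specific (A, B, or C) schedules are not reproduced here.

        # Toy MDMKP: knapsack rows cap resource use (<=) while demand rows force
        # minimum coverage (>=); these pull the binary choices in opposite directions.
        from docplex.mp.model import Model

        profits = [10, 7, 12, 4, 9]
        weights = [[3, 5, 4, 2, 6]]          # knapsack rows: resource usage
        capacities = [12]
        demands = [[2, 1, 3, 1, 2]]          # demand rows: minimum coverage required
        requirements = [4]

        mdl = Model(name="mdmkp")
        x = mdl.binary_var_list(len(profits), name="x")
        for row, cap in zip(weights, capacities):       # knapsack constraints (<=)
            mdl.add_constraint(mdl.sum(row[j] * x[j] for j in range(len(x))) <= cap)
        for row, req in zip(demands, requirements):     # demand constraints (>=)
            mdl.add_constraint(mdl.sum(row[j] * x[j] for j in range(len(x))) >= req)
        mdl.maximize(mdl.sum(profits[j] * x[j] for j in range(len(x))))

        for gap_tol in (0.001, 0.01, 0.05):             # SSIT-style loosening loop
            mdl.parameters.mip.tolerances.mipgap = gap_tol
            mdl.set_time_limit(30)                      # assumed per-pass time limit
            solution = mdl.solve()
            if solution is not None:
                print("gap tolerance", gap_tol, "objective", solution.objective_value)
                break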

    An undergraduate uses OR to improve Final Exam Schedules at her university

    Final examination scheduling is typically a complex problem that impacts students, faculty, and administrators at every university. In this paper, we describe how an undergraduate student, for her senior project at Kutztown University, analysed the final exam schedules at Kutztown University to see if she could improve them. Specifically, she wanted to see if she could reduce student conflicts, defined as a student having three exams scheduled on the same day. The approach that she developed, based on a balanced bin packing algorithm, was very appealing because it could be implemented manually by a staff member of the Registrar’s office, requiring at most 30 minutes to generate the schedule. Testing this approach on actual data from the Fall 2015 semester resulted in a 42% reduction in student conflicts. This approach, because of its simplicity and intuitive appeal, was widely accepted by the Kutztown University faculty and administrators and is being implemented for the Fall 2016 semester.
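
    The abstract describes the approach only as being “based on a balanced bin packing algorithm” simple enough to run by hand, so the Python sketch below shows one common balanced-packing heuristic with hypothetical exam blocks and seat counts, not the student’s actual procedure: assign the largest blocks first, always to the exam day with the lightest current load.

        # Balanced bin packing heuristic with hypothetical data: largest exam
        # block first, each placed on the day with the fewest seats so far.
        exam_days = ["Mon", "Tue", "Wed", "Thu", "Fri"]
        blocks = {"MWF 9:00": 820, "TR 10:30": 760, "MWF 11:00": 900, "TR 1:30": 540,
                  "MWF 1:00": 610, "TR 9:00": 480, "MWF 3:00": 350, "TR 3:00": 300}

        day_load = {d: 0 for d in exam_days}
        assignment = {}
        for block, seats in sorted(blocks.items(), key=lambda kv: kv[1], reverse=True):
            lightest = min(exam_days, key=day_load.get)   # day with the lightest load
            assignment[block] = lightest
            day_load[lightest] += seats

        for day in exam_days:
            print(day, day_load[day], [b for b, d in assignment.items() if d == day])

    Spreading seats evenly across the exam days lowers the chance that any one student ends up with three exams on the same day, which is the conflict the project measured.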

    Effects of Anacetrapib in Patients with Atherosclerotic Vascular Disease

    BACKGROUND: Patients with atherosclerotic vascular disease remain at high risk for cardiovascular events despite effective statin-based treatment of low-density lipoprotein (LDL) cholesterol levels. The inhibition of cholesteryl ester transfer protein (CETP) by anacetrapib reduces LDL cholesterol levels and increases high-density lipoprotein (HDL) cholesterol levels. However, trials of other CETP inhibitors have shown neutral or adverse effects on cardiovascular outcomes. METHODS: We conducted a randomized, double-blind, placebo-controlled trial involving 30,449 adults with atherosclerotic vascular disease who were receiving intensive atorvastatin therapy and who had a mean LDL cholesterol level of 61 mg per deciliter (1.58 mmol per liter), a mean non-HDL cholesterol level of 92 mg per deciliter (2.38 mmol per liter), and a mean HDL cholesterol level of 40 mg per deciliter (1.03 mmol per liter). The patients were assigned to receive either 100 mg of anacetrapib once daily (15,225 patients) or matching placebo (15,224 patients). The primary outcome was the first major coronary event, a composite of coronary death, myocardial infarction, or coronary revascularization. RESULTS: During the median follow-up period of 4.1 years, the primary outcome occurred in significantly fewer patients in the anacetrapib group than in the placebo group (1640 of 15,225 patients [10.8%] vs. 1803 of 15,224 patients [11.8%]; rate ratio, 0.91; 95% confidence interval, 0.85 to 0.97; P=0.004). The relative difference in risk was similar across multiple prespecified subgroups. At the trial midpoint, the mean level of HDL cholesterol was higher by 43 mg per deciliter (1.12 mmol per liter) in the anacetrapib group than in the placebo group (a relative difference of 104%), and the mean level of non-HDL cholesterol was lower by 17 mg per deciliter (0.44 mmol per liter), a relative difference of -18%. There were no significant between-group differences in the risk of death, cancer, or other serious adverse events. CONCLUSIONS: Among patients with atherosclerotic vascular disease who were receiving intensive statin therapy, the use of anacetrapib resulted in a lower incidence of major coronary events than the use of placebo. (Funded by Merck and others; Current Controlled Trials number, ISRCTN48678192; ClinicalTrials.gov number, NCT01252953; and EudraCT number, 2010-023467-18.)
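
    As a quick arithmetic check of the percentages quoted above, the crude event proportions can be recomputed from the raw counts; note that the published rate ratio of 0.91 comes from the trial’s time-to-event analysis, so the simple ratio of proportions below is only an approximate consistency check.

        # Recompute the quoted percentages from the raw counts; the crude ratio of
        # proportions only approximates the reported time-to-event rate ratio.
        anacetrapib_events, anacetrapib_n = 1640, 15225
        placebo_events, placebo_n = 1803, 15224
        p_ana = anacetrapib_events / anacetrapib_n    # about 0.108 -> 10.8%
        p_pla = placebo_events / placebo_n            # about 0.118 -> 11.8%
        print(f"{p_ana:.1%} vs {p_pla:.1%}, crude ratio {p_ana / p_pla:.2f}")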

    My Fair Share

    No full text