16 research outputs found

    Column Generation-Based Techniques for Intensity-Modulated Radiation Therapy (IMRT) and Volumetric Modulated Arc Therapy (VMAT) Treatment Planning

    ABSTRACT: Statistics estimated about 18.1 million cancer cases worldwide in 2018, a figure expected to rise to 24 million by 2035. Radiation therapy is one of the most important cancer treatment methods, which about 50% of patients receive during their illness. It works by damaging the genetic material within cancerous cells, destroying their ability to reproduce.
However, normal cells are also affected by radiation; therefore, the treatment should maximize the dose of radiation to tumors while minimizing the adverse effects of radiation on healthy tissues. Optimization techniques are used to determine where and how much radiation should be delivered to the patient's body. In this project, we address intensity-modulated radiation therapy (IMRT), a widely used external radiotherapy method, and a newer form called volumetric modulated arc therapy (VMAT). In IMRT, a finite number of directions are determined for the beam radiation, while in VMAT the linear accelerator rotates around the patient's body while the beam is on. These technologies make it possible to change the beam shape and the dose rate dynamically during treatment. The treatment planning problem consists of selecting a delivery sequence of beam shapes, optimizing the dose rate of the beam, and determining the rotation speed of the gantry, if required. In this research, we take advantage of the column generation technique, a leading optimization method especially for large-scale problems, to improve the treatment time and the nonlinear, non-convex clinical objectives in VMAT treatment planning, and we also develop a new multi-objective column generation framework for IMRT. In the first essay, we develop a novel column generation algorithm that optimizes the trade-off between delivery time and treatment quality for VMAT treatment planning. To this end, simultaneous column-and-row generation is developed to relate the configuration of beam apertures in the columns to the treatment-time restriction in the rows of the model. Moreover, we propose a modified clustering technique to aggregate similar volume elements of the patient's body and efficiently reduce the number of constraints in the model. The computational results show that a high-quality treatment is achievable using a four-thread CPU.
In the second essay, we develop an automatic planning approach integrating dose-volume histogram (DVH) criteria, the most common method of treatment evaluation in practice, for VMAT treatment planning. We take advantage of the iterative procedure of column generation to adjust the model parameters during aperture generation and meet nonlinear DVH criteria without imposing hard constraints in the model. The results on clinical cases show that our methodology significantly improves on simple VMAT optimization, obtaining clinically acceptable plans without human intervention. In addition, a comparison with an existing commercial treatment planning system shows that the quality of the plans obtained by the proposed method, especially for healthy tissues, is significantly better, while the computational time is lower. In the third essay, we address IMRT treatment planning, formulated as a large-scale convex optimization problem with a simplex feasible region. We first integrate a novel Frank-Wolfe-based solution approach, called Blended Conditional Gradients, into the column generation to improve the computational performance of the method. We then propose a multi-objective column generation technique to directly obtain apertures that approximate an efficient non-dominated set of treatment plans. To this end, we find lower and upper bounds for the Pareto front and generate a column with a pre-assigned or new weight vector of the objectives, reducing the maximum distance between the two bounds. We prove that this algorithm converges to the Pareto front. The results in a two-dimensional objective space, finding trade-off plans between treating the target volumes and sparing the healthy structures, show the efficiency of the algorithm in approximating the Pareto front with deliverable treatment plans in about 3 minutes, while avoiding a large number of columns.
This method is also applicable to other classes of convex optimization problems requiring both column generation and multi-objective optimization.
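The multi-objective procedure in the third essay works with scalarized subproblems and bounds on the Pareto front. As a loose, self-contained illustration of the underlying idea (not the thesis algorithm itself), the sketch below approximates a two-objective Pareto front by weighted-sum minimization over a small, invented set of candidate plans; all plan data and objective names are made up for the example.

```python
# Hedged sketch: approximate the non-dominated (Pareto) set of a small,
# invented set of candidate plans with two objectives to minimize:
# (target underdose, healthy-tissue dose). Not the thesis algorithm.

def weighted_sum_argmin(plans, w):
    """Return the plan minimizing w[0]*f1 + w[1]*f2 (a scalarization)."""
    return min(plans, key=lambda p: w[0] * p[0] + w[1] * p[1])

def approximate_pareto_front(plans, n_weights=11):
    """Sweep weight vectors (w, 1-w) and collect the distinct minimizers.
    For a convex front this recovers supported non-dominated points."""
    front = set()
    for i in range(n_weights):
        w = i / (n_weights - 1)
        front.add(weighted_sum_argmin(plans, (w, 1.0 - w)))
    # keep only mutually non-dominated points, sorted by the first objective
    return sorted(p for p in front
                  if not any(q != p and q[0] <= p[0] and q[1] <= p[1]
                             for q in front))

plans = [(1.0, 9.0), (2.0, 5.0), (4.0, 2.0), (8.0, 1.0), (5.0, 5.0)]
print(approximate_pareto_front(plans))
# → [(1.0, 9.0), (2.0, 5.0), (4.0, 2.0), (8.0, 1.0)]
```

The dominated plan (5.0, 5.0) is never selected by any weight vector, while the four supported points are each recovered by some scalarization, which is the behavior the weight-vector selection in the thesis exploits.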

    Optimization via Benders' Decomposition

    In a period when optimization has entered almost every facet of our lives, this thesis is designed to establish an understanding of a rather contemporary optimization technique: Benders' Decomposition. It can be roughly stated as a method for problems with complicating variables which, when temporarily fixed, yield a problem much easier to solve. We examine the classical Benders' Decomposition algorithm in depth, followed by a mathematical argument verifying its correctness; state how the convergence of the algorithm depends on the formulation of the problem; identify its relationship to other well-known decomposition methods for Linear Programming problems; and discuss some real-world examples. We present extensions of the method that allow its application to a wider range of problems. We also present a classification of acceleration strategies centered around the key sections of the algorithm. We conclude by illustrating the shortcomings, trends, and potential research directions.
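To make the "complicating variables" idea concrete, here is a minimal, self-contained toy (invented for illustration, not taken from the thesis): Benders' Decomposition applied to min x + 2y subject to y ≥ 2 − x, y ≥ 0, x ∈ {0,1,2,3}. Fixing x makes the subproblem trivial, its dual multiplier yields an optimality cut, and the tiny master is solved by enumeration.

```python
# Toy Benders' Decomposition (illustrative only):
#   min  x + 2*y   s.t.  y >= 2 - x,  y >= 0,  x in {0,1,2,3}
# Fixing the complicating variable x leaves the subproblem
#   min 2*y  s.t.  y >= 2 - x, y >= 0,
# whose dual multiplier is lam = 2 if 2 - x > 0 else 0, giving the
# Benders optimality cut  theta >= lam * (2 - x).

def solve_subproblem(x):
    """Subproblem value and the dual multiplier of the constraint y >= 2 - x."""
    value = 2.0 * max(0.0, 2.0 - x)
    lam = 2.0 if 2.0 - x > 0 else 0.0
    return value, lam

def solve_master(cuts):
    """Enumerate the tiny master: min x + theta s.t. theta >= lam*(2 - x)."""
    best = None
    for x in range(4):
        theta = max([0.0] + [lam * (2.0 - x) for lam in cuts])
        if best is None or x + theta < best[0]:
            best = (x + theta, x)
    return best  # (lower bound, incumbent x)

def benders(tol=1e-9):
    cuts, upper = [], float("inf")
    while True:
        lower, x = solve_master(cuts)
        sub_value, lam = solve_subproblem(x)
        upper = min(upper, x + sub_value)
        if upper - lower <= tol:   # bounds meet: incumbent is optimal
            return x, upper
        cuts.append(lam)           # add a Benders optimality cut

print(benders())  # → (2, 2.0)
```

The run converges in two iterations: the first cut θ ≥ 4 − 2x steers the master to x = 2, where the subproblem costs nothing and the bounds meet.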

    On the minimum cardinality problem in intensity modulated radiotherapy

    The thesis examines an optimisation problem that appears in the treatment planning of intensity modulated radiotherapy. An approach is presented that solves the optimisation problem in question and is extended to execute in a massively parallel environment. The performance of the approach is among the fastest available.

    A novel combination of Case-Based Reasoning and Multi-Criteria Decision Making approach to radiotherapy dose planning

    In this thesis, a set of novel approaches has been developed by integrating Case-Based Reasoning (CBR) and Multi-Criteria Decision Making (MCDM) techniques. Its purpose is to design a support system that assists oncologists with decisions about dose planning for radiotherapy treatment, with a focus on radiotherapy for prostate cancer. CBR, an artificial intelligence approach, is a general paradigm for reasoning from past experiences. It retrieves previous cases similar to a new case and exploits the successful past solutions to suggest a solution for the new case. The case pool used in this research is a dataset consisting of features and details related to successfully treated patients at Nottingham University Hospital. In a typical run of simple CBR for prostate cancer radiotherapy, a new case is selected, the case most similar to it (based on the features available in our dataset) is retrieved, and that case's solution is prescribed to the new case. However, this approach has a number of deficiencies. Firstly, in a real-life scenario the medical team considers multiple factors rather than just the similarity between two cases, and the most similar case does not always provide the most appropriate solution. Thus, in this thesis, the cases with high similarity to a new case are evaluated using the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). This approach takes multiple criteria besides similarity into account when prescribing a final solution. Moreover, the obtained dose plans were optimised through a Goal Programming mathematical model to improve the results. By incorporating oncologists' experience of violating the conventionally available dose limits, a system was devised to manage the trade-off between the treatment risk for sensitive organs and the actions necessary to effectively eradicate cancer cells.
Additionally, the success rate of the treatment, the two-year cancer-free probability, plays a vital role in the efficiency of the prescribed solutions. To account for the success rate, as well as the uncertainty involved in human judgment about the values of different radiotherapy features, Data Envelopment Analysis (DEA) based on grey numbers was used to assess the efficiency of different treatment plans in an input-output framework. To deal with DEA's limitations regarding the number of inputs and outputs, we presented an approach for Factor Analysis based on Principal Components that works with the grey numbers. Finally, to improve the CBR base of the system, we applied Grey Relational Analysis and Gaussian-distance-based CBR, along with feature-weight selection through a Genetic Algorithm, to better handle the non-linearity existing within the problem features and the high number of features. The efficiency of each system has been validated through a leave-one-out strategy on the real dataset. The results demonstrated the efficiency of the proposed approaches and the capability of the system to assist the medical planning team. Furthermore, the integrated approaches developed within this thesis can also be applied to solve real-life problems in domains other than healthcare, such as supply chain management, manufacturing, business success prediction, and performance evaluation.
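TOPSIS itself is a standard, compact procedure: vector-normalize the decision matrix, weight it, locate the ideal-best and ideal-worst points, and rank alternatives by relative closeness to the ideal. A minimal pure-Python sketch follows; the dose-plan data, weights, and criterion names are invented for illustration and are not the hospital dataset used in the thesis.

```python
# Minimal TOPSIS (Technique for Order of Preference by Similarity to
# Ideal Solution). All data below are invented for illustration.
from math import sqrt

def topsis(matrix, weights, benefit):
    """Score alternatives: rows = alternatives, cols = criteria.
    benefit[j] is True if criterion j should be maximized."""
    n = len(matrix[0])
    # 1) vector-normalize each column, then apply the criterion weights
    norms = [sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n)] for row in matrix]
    # 2) ideal-best and ideal-worst value per criterion
    best = [max(col) if benefit[j] else min(col)
            for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col)
             for j, col in enumerate(zip(*v))]
    # 3) relative closeness to the ideal: d- / (d+ + d-), higher is better
    scores = []
    for row in v:
        d_best = sqrt(sum((row[j] - best[j]) ** 2 for j in range(n)))
        d_worst = sqrt(sum((row[j] - worst[j]) ** 2 for j in range(n)))
        scores.append(d_worst / (d_best + d_worst))
    return scores

# Two invented plans scored on (target coverage: maximize, organ dose: minimize).
# Plan 0 dominates plan 1, so it gets the higher closeness score.
scores = topsis([[0.95, 20.0], [0.80, 35.0]],
                weights=[0.6, 0.4], benefit=[True, False])
print(scores)  # → [1.0, 0.0]
```

The dominance case is deliberately degenerate (scores 1 and 0); with genuinely conflicting criteria the scores land strictly between 0 and 1 and the extra criteria beyond similarity drive the ranking, which is the role TOPSIS plays in the thesis.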

    A Polyhedral Study of Mixed 0-1 Set

    We consider a variant of the well-known single-node fixed-charge network flow set with constant capacities. This set arises from the relaxation of more general mixed-integer sets such as lot-sizing problems with multiple suppliers. We provide a complete polyhedral characterization of the convex hull of the given set.

    Quayside Operations Planning Under Uncertainty


    On High-Performance Benders-Decomposition-Based Exact Methods with Application to Mixed-Integer and Stochastic Problems

    ABSTRACT: Stochastic integer programming (SIP) combines the difficulty of uncertainty and non-convexity and constitutes a class of extremely challenging problems to solve. Efficiently solving SIP problems is of high importance due to their vast applicability. Therefore, the primary focus of this dissertation is on solution methods for SIPs. We consider two-stage SIPs and present several enhanced decomposition algorithms for solving them. Our main goal is to develop new decomposition schemes and several acceleration techniques to enhance the classical decomposition methods, which can lead to efficiently solving various SIP problems to optimality. In the first essay of this dissertation, we present a state-of-the-art survey of the Benders decomposition algorithm. We provide a taxonomy of the algorithmic enhancements and acceleration strategies of this algorithm to synthesize the literature and to identify shortcomings, trends, and potential research directions. In addition, we discuss the use of Benders decomposition to develop efficient (meta-)heuristics, describe the limitations of the classical algorithm, and present extensions enabling its application to a broader range of problems. Next, we develop various techniques to overcome some of the main shortfalls of the Benders decomposition algorithm. We propose the use of cutting planes, partial decomposition, heuristics, stronger cuts, and warm-start strategies to alleviate the numerical challenges arising from instabilities, primal inefficiencies, weak optimality/feasibility cuts, and weak linear relaxation. We test the proposed strategies on benchmark instances of stochastic network design problems.
Numerical experiments illustrate the computational efficiency of the proposed techniques. In the third essay of this dissertation, we propose a new high-performance decomposition approach, called the Benders dual decomposition method. The development of this method is based on a specific reformulation of the Benders subproblems, where local copies of the master variables are introduced and then priced out into the objective function. We show that the proposed method significantly alleviates the primal and dual shortfalls of the Benders decomposition method and that it is closely related to the Lagrangian dual decomposition method. Computational results on various SIP problems show the superiority of this method compared to the classical decomposition methods as well as CPLEX 12.7. Finally, we study the parallelization of the Benders decomposition method. The available parallel variants of this method implement a rigid synchronization between the master and slave processors and thus suffer from significant load imbalance when applied to SIP problems. This is mainly due to a hard mixed-integer master problem that can take hours to optimize. We therefore propose an asynchronous parallel Benders method in a branch-and-cut framework. However, relaxing the synchronization requirements entails convergence and efficiency problems, which we address by introducing several acceleration techniques and search strategies. In particular, we propose the use of artificial subproblems, cut generation, cut aggregation, cut management, and cut propagation. The results indicate that our algorithm reaches higher speedup rates than the conventional synchronized methods and is several orders of magnitude faster than CPLEX 12.7.
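The "local copies priced out into the objective" idea can be sketched in generic two-stage notation. This is a hedged sketch of the standard Lagrangian-cut construction that the abstract alludes to, not the dissertation's exact formulation; the symbols (c, q, T, W, h) are the usual two-stage data.

```latex
% Two-stage problem (generic notation, a sketch only):
%   min  c^T x + Q(x),  with  Q(\hat{x}) = min { q^T y : W y >= h - T \hat{x}, y >= 0 }.
\begin{align*}
\text{classical dual subproblem:}\quad
  & \max_{\pi \ge 0}\ \pi^{\top} (h - T\hat{x})
    \quad \text{s.t. } W^{\top}\pi \le q,\\
\text{classical Benders optimality cut:}\quad
  & \theta \ \ge\ \pi^{*\top} (h - Tx),\\[4pt]
\text{local-copy subproblem (copy $z$ of $x$,}& \text{ constraint } z=\hat{x} \text{ relaxed with multiplier } \lambda\text{):}\\
  & \min_{z,\, y \ge 0}\ q^{\top} y + \lambda^{\top}(\hat{x} - z)
    \quad \text{s.t. } Tz + Wy \ge h,\\
\text{resulting Lagrangian cut:}\quad
  & \theta \ \ge\ \lambda^{\top} x \;+\; \min_{z,\, y \ge 0}\bigl\{\, q^{\top} y - \lambda^{\top} z : Tz + Wy \ge h \,\bigr\}.
\end{align*}
```

Because the local-copy subproblem optimizes over z rather than pinning it to the master proposal, the resulting cuts can close part of the duality gap that the classical linear-programming-based cuts leave open, which is consistent with the improvements the abstract reports.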