
    Forested Watersheds and Water Supply: Exploring Effects of Wildfires, Silviculture, and Climate Change on Downstream Waters

    Drinking water supplies for much of society originate in forests. To preserve the capability of these forests to produce clean and easily treatable water, source water supply and protection strategies focus in particular on potential disturbances to the landscape, which include prescribed forest harvesting and wildfires of varying intensity. While decades of work have revealed relationships between forest harvesting and streamflow response, there is a considerable lack of synthesis disentangling the interactions of climate, wildfires, streamflow, and water quality. Revealing the mechanisms by which harvesting and wildfire affect downstream waters will greatly improve land and water management. In this dissertation, I combined synthesis of previously published or available data, novel mathematical analyses, and deterministic modeling to disentangle various disturbance effects and further our understanding of processes in forested watersheds. I broadly sought to explore how streamflow and water quality change after forest disturbances, and how new methods and analyses can provide insight into the biogeochemical and ecohydrologic processes changing during disturbances. First, I examined the effect of wildfire on hydrology and developed a novel Budyko decomposition method to separate climatic and disturbance effects on streamflow. Using a set of 17 watersheds in southern California, I showed that while traditional metrics like changes in flow or runoff ratio might not detect a disturbance effect from wildfire due to confounding climate signals, the Budyko framework can be used successfully for statistical change detection. The method was used to estimate hydrologic recovery timescales that varied between 5 and 45 years, with an increase of about 4 years of recovery time per 10% of the watershed burned. Next, in Chapter 3, I used a meta-analysis approach to examine the effect of wildfire on water quality, using data from 121 catchments around the world. Analyzing the changes in concentrations of stream water nutrients, including carbon, nitrogen, and phosphorus, I showed that concentrations generally increased after fire. While a large amount of variability existed in the data, we found concurrent increases in these constituents after fire, highlighting the tight coupling of the biogeochemical cycles. Most interestingly, we found fire to increase the concentrations of biologically active nutrients like nitrate and phosphate at a greater rate than total nitrogen (TN) and total phosphorus (TP), with median increases of 40-60% in the nitrate:TN and soluble reactive phosphorus (SRP):TP ratios. Next, in Chapter 4, I analyzed water quality and hydrology together after fire, using a set of 29 wildfire-impacted watersheds in the United States. Concentration-discharge relationships can be used to reveal pathways and sources of elements exported from watersheds, and my overall hypothesis was that these relationships change in post-fire landscapes. I developed a new methodology, using k-means clustering, to classify watersheds as chemostatic, dilution, mobilization, or chemodynamic, and explored how their positions within these clusters changed in post-fire landscapes. I found that the behavior of nitrate and ammonium became increasingly chemostatic after fire, while the behavior of total nitrogen, phosphorus, and organic phosphorus shifted toward mobilization.
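As a rough illustration of the Chapter 4 approach described above, the sketch below clusters synthetic watersheds by two common concentration-discharge (C-Q) descriptors, the slope and fit of log(C) versus log(Q). The synthetic data, the feature choice, and the number of clusters are illustrative assumptions, not the dissertation's exact method.

```python
# Illustrative sketch: cluster watersheds by concentration-discharge (C-Q)
# behavior using k-means on log-log regression features. Synthetic data and
# feature definitions are assumptions for illustration only.
import numpy as np
from scipy import stats
from sklearn.cluster import KMeans

def cq_features(conc, flow):
    """Slope and R^2 of log10(C) versus log10(Q) for one watershed."""
    slope, _, r, _, _ = stats.linregress(np.log10(flow), np.log10(conc))
    return slope, r ** 2

rng = np.random.default_rng(0)
features = []
for true_slope in (-0.4, -0.4, -0.35, 0.0, 0.05, -0.05, 0.5, 0.45, 0.4):
    q = 10 ** rng.normal(0.0, 0.5, 200)                        # daily discharge
    c = 10 ** (true_slope * np.log10(q) + rng.normal(0.0, 0.2, 200))
    features.append(cq_features(c, q))

X = np.array(features)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for (slope, r2), k in zip(features, labels):
    print(f"slope={slope:+.2f}  R2={r2:.2f}  cluster={k}")
```

In practice the cluster centroids would be inspected to label groups, for example chemostatic (near-zero slope), dilution (negative slope), mobilization (positive slope), or chemodynamic (poor log-log fit), and pre- and post-fire positions compared.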
Finally, I developed a coupled hydrology-vegetation-biogeochemistry model to simulate and elucidate the processes controlling the impact of harvesting on downstream waters. I focused on the Turkey Lakes watershed, where a significant amount of data has been collected on vegetation and soil nutrient dynamics in addition to traditional streamflow and water quality metrics, and developed a novel multi-part calibration process that used measured data on stream, forest, and soil characteristics and dynamics. Future work would involve using the model to explore the data-driven relationships developed in the earlier chapters of the dissertation. The work presented in this dissertation highlights new small- and large-scale relationships between disturbances in forested watersheds and effects on downstream waters. With these threats predicted to escalate and overlap in the coming years, the novel results and methodologies presented here should contribute to improving land and water management.
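The multi-part calibration mentioned above can be pictured, in a very simplified way, as a composite objective that scores one model run against several observation types at once. The component names, weights, and the use of Nash-Sutcliffe efficiency here are assumptions for illustration, not the calibration actually used for the Turkey Lakes model.

```python
# Minimal sketch of a multi-part calibration objective that scores a model
# against stream, forest, and soil observations together. Component names,
# weights, and the error metric are illustrative assumptions only.
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency (1 is perfect, below 0 is worse than the mean)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def multi_part_score(simulated, observed, weights):
    """Weighted sum of NSE values across observation types."""
    return sum(weights[k] * nse(observed[k], simulated[k]) for k in observed)

# Hypothetical observation/simulation pairs for three calibration targets.
obs = {"streamflow": [1.2, 3.4, 2.1], "soil_n": [0.8, 0.9, 1.1], "leaf_area": [2.0, 2.4, 2.2]}
sim = {"streamflow": [1.0, 3.1, 2.3], "soil_n": [0.7, 1.0, 1.0], "leaf_area": [2.1, 2.3, 2.5]}
weights = {"streamflow": 0.5, "soil_n": 0.25, "leaf_area": 0.25}
print(f"composite score = {multi_part_score(sim, obs, weights):.3f}")
```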

    Structural optimization in steel structures, algorithms and applications

    The abstract is provided in the attachment.

    Quality Indicators for Preference-based Evolutionary Multi-objective Optimization Using a Reference Point: A Review and Analysis

    Some quality indicators have been proposed for benchmarking preference-based evolutionary multi-objective optimization algorithms using a reference point. Although a systematic review and analysis of the quality indicators are helpful for both benchmarking and practical decision-making, neither has been conducted. In this context, first, this paper reviews existing regions of interest and quality indicators for preference-based evolutionary multi-objective optimization using the reference point. We point out that each quality indicator was designed for a different region of interest. Then, this paper investigates the properties of the quality indicators. We demonstrate that an achievement scalarizing function value is not always consistent with the distance from a solution to the reference point in the objective space. We observe that the regions of interest can be significantly different depending on the position of the reference point and the shape of the Pareto front. We identify undesirable properties of some quality indicators. We also show that the ranking of preference-based evolutionary multi-objective optimization algorithms depends on the choice of quality indicators.
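A tiny numeric example of the inconsistency noted above, using the weighted Chebyshev achievement scalarizing function; the particular objective vectors, weights, and reference point are made up for illustration.

```python
# Small illustration: the achievement scalarizing function (ASF) ranking and
# the Euclidean distance to the reference point can disagree. Values and
# weights are illustrative assumptions.
import numpy as np

def asf(f, ref, w):
    """Weighted Chebyshev achievement scalarizing function (minimization)."""
    return np.max(w * (np.asarray(f) - np.asarray(ref)))

ref = np.array([0.0, 0.0])          # decision maker's reference point
w = np.array([1.0, 1.0])            # equal weights
a, b = np.array([0.9, 0.1]), np.array([0.7, 0.7])

for name, f in (("a", a), ("b", b)):
    print(name, "ASF =", round(float(asf(f, ref, w)), 3),
          " distance =", round(float(np.linalg.norm(f - ref)), 3))
# ASF prefers b (0.7 < 0.9), while the distance prefers a (0.906 < 0.990).
```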

    Dynamic multi-objective optimization using evolutionary algorithms

    Dynamic Multi-objective Optimization Problems (DMOPs) offer an opportunity to examine and solve challenging real-world scenarios where trade-off solutions between conflicting objectives change over time. Definition of benchmark problems allows modelling of industry scenarios across transport, power and communications networks, manufacturing and logistics. Recently, significant progress has been made in the variety and complexity of DMOP benchmarks and the incorporation of realistic dynamic characteristics. However, significant gaps still exist in standardised methodology for DMOPs, specific problem domain examples and in the understanding of the impacts and explanations of dynamic characteristics. This thesis provides major contributions on these three topics within evolutionary dynamic multi-objective optimization. Firstly, experimental protocols for DMOPs are varied, which limits the applicability and relevance of results produced and conclusions made in the field. A major source of the inconsistency lies in the parameters used to define the specific problem instances being examined. The uninformed selection of these parameters has historically held back understanding of their impacts and the standardisation of experimental approaches in the multi-objective problem domain. Using the frequency and severity (or magnitude) of change events, a more informed approach to DMOP experimentation is conceptualized, implemented and evaluated. A baseline performance expectation is established and analyzed across a comprehensive range of dynamic instances for well-studied DMOP benchmarks. To maximize relevance, these profiles are composed from the performance of evolutionary algorithms commonly used for baseline comparisons and those with simple dynamic responses. Comparison and contrast with the coverage of parameter combinations in the sampled literature highlights the importance of these contributions. Secondly, the provision of useful and realistic DMOPs in the combinatorial domain is limited in previous literature. A novel dynamic benchmark problem is presented by extending the Travelling Thief Problem (TTP) to include a variety of realistic and contextually justified dynamic changes. Investigation of problem-information exploitation and its potential application as a dynamic response is a key output of these results, and context is provided through comparison with results obtained by adapting existing TTP heuristics. Observation-driven iterative development prompted the investigation of multi-population island model strategies, together with improvements in the approaches used to accurately describe and compare the performance of algorithm models for DMOPs, a contribution that is applicable beyond the dynamic TTP. Thirdly, the purpose of DMOPs is to reconstruct realistic scenarios, or features from them, to allow for experimentation and development of better optimization algorithms. However, numerous important characteristics from real systems still require implementation and will drive research and development of algorithms and mechanisms to handle these industrially relevant problem classes. The novel challenges associated with these implementations are significant and diverse, even for a simple development such as consideration of DMOPs with multiple time dependencies. Real-world systems with dynamics are likely to contain multiple temporally changing aspects, particularly in the energy and transport domains.
Problems with more than one dynamic component allow for asynchronous changes and differing severities between components, which leads to an explosion in the size of the possible dynamic instance space. Both continuous and combinatorial problem domains require structured investigation into best practices for experimental design, algorithm application, and performance measurement, comparison and visualization. The challenges, the key requirements for effective progress, and recommendations on experimentation are explored here.
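As context for the frequency and severity parameters discussed above, the sketch below uses the parameterization common in the DMOP benchmark literature, in which the environment index advances every tau_t generations in steps of 1/n_t; whether the thesis adopts exactly this form is an assumption here.

```python
# Sketch of the frequency/severity parameterization used by many DMOP
# benchmarks: the environment "time" t advances every tau_t generations
# (frequency) and the step size 1/n_t controls how severe each change is
# (larger n_t means milder changes).
import math

def environment_time(generation, n_t, tau_t):
    """t = (1/n_t) * floor(generation / tau_t)."""
    return (1.0 / n_t) * math.floor(generation / tau_t)

# Frequent, severe changes versus rare, mild ones over 50 generations.
for n_t, tau_t in ((5, 5), (20, 25)):
    times = {environment_time(g, n_t, tau_t) for g in range(50)}
    print(f"n_t={n_t:2d} tau_t={tau_t:2d} -> {len(times)} distinct states, "
          f"step size {1.0 / n_t:.2f}")
```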

    Modelling and Optimizing Supply Chain Integrated Production Scheduling Problems

    Globalization and advanced information technologies (e.g., the Internet of Things) have considerably impacted supply chains (SCs) by persistently forcing original equipment manufacturers (OEMs) to switch production strategies from make-to-stock (MTS) to make-to-order (MTO) to survive in competition. Generally, an OEM follows the MTS strategy for products with steady demand. In contrast, the MTO strategy exists under a pull system with irregular demand, in which the received customer orders are scheduled and launched into production. In comparison to MTS, MTO has the primary challenges of ensuring timely delivery at the lowest possible cost, satisfying the demands of high customization, and guaranteeing the accessibility of raw materials throughout the production process. These challenges are increasing substantially since industrial production is becoming more flexible, diversified, and customized. Besides, making production scheduling decisions independently of the other stages of these SCs often yields sub-optimal results, creating substantial challenges to fulfilling demands in a timely and cost-effective manner. Since adequately managing these challenges asynchronously is difficult, constructing optimization models that integrate SC decisions, such as customer requirements, supply portfolio (supplier selection and order allocation), delivery batching decisions, and inventory portfolio (inventory replenishment, consumption, and availability), with shop-floor scheduling under deterministic and dynamic environments is essential to fulfilling customer expectations at the least possible cost. These optimization models are computationally intractable. Consequently, designing algorithms that schedule or reschedule promptly is also highly challenging for these time-sensitive, operationally integrated optimization models. Thus, this thesis focuses on modelling and optimizing SC-integrated production scheduling problems, named SC scheduling problems (SCSPs). The objective of optimizing job shop scheduling problems (JSSPs) is to ensure that the requisite resources are accessible when required and that their utilization is maximally efficient. Although numerous algorithms have been devised, they can sometimes become computationally exorbitant and yield sub-optimal outcomes, rendering production systems inefficient. These shortcomings can be due to a variety of causes, such as an imbalance in population quality over generations, recurrent generation and evaluation of identical schedules, and permitting an under-performing method to conduct the evolutionary process. Consequently, this study designs two methods, a sequential approach (Chapter 2) and a multi-method approach (Chapter 3), to address the aforementioned issues and to obtain competitive results in finding optimal or near-optimal solutions for JSSPs in a single-objective setting. The devised algorithms for JSSPs optimize workflows for each job by accurately mapping between related resources, generating better results than existing algorithms. Production scheduling cannot be accomplished precisely without simultaneously considering supply and delivery decisions and customer requirements. Thus, a few recent studies have operationally integrated SCs to accurately predict process insights for executing, monitoring, and controlling the planned production. However, these studies are limited to simple shop-floor configurations and provide little flexibility to address MTO-based SC challenges.
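To make the job-shop core of these models concrete, the sketch below decodes an operation sequence into a schedule and reports its makespan for a toy three-job, three-machine instance; it is a generic decoder for illustration, not the sequential or multi-method algorithms developed in the thesis.

```python
# A standard job-shop decoder: given a repeated job sequence (each job id
# appears once per operation), schedule operations greedily in that order and
# report the makespan. Instance data are a toy example.
# jobs[j] = [(machine, processing_time), ...] in technological order.
jobs = {
    0: [(0, 3), (1, 2), (2, 2)],
    1: [(0, 2), (2, 1), (1, 4)],
    2: [(1, 4), (2, 3), (0, 1)],
}

def makespan(sequence, jobs):
    job_ready = {j: 0 for j in jobs}       # when each job's next op may start
    machine_ready = {}                      # when each machine is free again
    next_op = {j: 0 for j in jobs}          # index of each job's next operation
    for j in sequence:
        machine, duration = jobs[j][next_op[j]]
        start = max(job_ready[j], machine_ready.get(machine, 0))
        job_ready[j] = machine_ready[machine] = start + duration
        next_op[j] += 1
    return max(job_ready.values())

sequence = [0, 1, 2, 1, 0, 2, 2, 1, 0]      # one candidate operation order
print("makespan =", makespan(sequence, jobs))
```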
Thus, this study formulates a bi-objective optimization model that integrates the supply portfolio into a flexible job shop scheduling environment with a customer-imposed delivery window to cost-effectively meet customized and on-time delivery requirements (Chapter 4). Compared to the job shop, which is limited to sequence flexibility only, the flexible job shop is advantageous due to its capacity to provide increased scheduling flexibility (both process and sequence flexibility). To optimize the model, the performance of the multi-objective particle swarm optimization algorithm has been enhanced, with the results providing decision-makers with an increased degree of flexibility, offering a larger number of Pareto solutions, more varied and consistent frontiers, and reasonable computation times for MTO-based SCs. Environmental sustainability is increasingly spotlighted owing to growing environmental awareness and follow-up regulations. Consequently, the related factors strongly regulate the supply portfolio for sustainable development, which has remained unexplored in SCSPs because those criteria are primarily qualitative (e.g., green production, green product design, corporate social responsibility, and waste disposal systems). These absences may lead to an unacceptable supply portfolio. Thus, this study overcomes the problem by integrating VIKORSORT into the proposed solution methodology of the extended SCSP. In addition, forming delivery batches of heterogeneous customer orders is challenging, as one order can lead to another being delayed. Therefore, the previous optimization model is extended by integrating supply, manufacturing, and delivery batching decisions and concurrently optimizing them in response to heterogeneous customer requirements with time window constraints, considering both economic and environmental sustainability for the supply portfolio (Chapter 5). Since the proposed optimization model is an extension of the flexible job shop, it can be classified as an NP-hard problem, which cannot be solved by conventional optimization techniques in reasonable time, particularly for larger instances. Therefore, a reinforcement learning-based hyper-heuristic (HH) has been designed, in which four solution-updating heuristics are intelligently guided to deliver the best possible results compared to existing algorithms. The optimization model furnishes a set of comprehensive schedules that integrate the supply portfolio, production portfolio (work-center/machine assignment and customer order sequencing), and batching decisions. This provides numerous meaningful managerial insights and operational flexibility prior to the execution phase. Recently, SCs have been experiencing unprecedented and massive disruptions caused by an abrupt outbreak, making it difficult for OEMs to recover from a disrupted demand-supply equilibrium. Hence, this study proposes a multi-portfolio (supply, production, and inventory portfolios) approach for a proactive-reactive scheme, which addresses the SCSP with complex multi-level products while simultaneously including unpredictably dynamic supply, demand, and shop-floor disruptions (Chapter 6). This study considers fabrication and assembly in a multi-level product structure. To effectively address this time-sensitive model based on real-time data, a Q-learning-based multi-operator differential evolution algorithm within an HH framework has been designed to handle disruptive events and generate a timely rescheduling plan.
The numerical results and analyses demonstrate the proposed model's capability to effectively address single and multiple disruptions, thus providing significant managerial insights and ensuring SC resilience.
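The reinforcement-learning hyper-heuristic idea described in Chapters 5 and 6 can be sketched, in highly simplified form, as a stateless Q-learning loop that chooses among low-level move operators and rewards those that improve the incumbent solution. The operators, the toy objective, and the learning parameters below are assumptions for illustration, not the thesis's algorithm.

```python
# Minimal sketch of a reinforcement-learning hyper-heuristic: a Q-table picks
# among low-level operators, rewarding those that improve the solution.
import random

OPERATORS = ["swap", "insert", "reverse", "shuffle"]

def apply_operator(op, solution):
    s = solution[:]
    i, j = sorted(random.sample(range(len(s)), 2))
    if op == "swap":
        s[i], s[j] = s[j], s[i]
    elif op == "insert":
        s.insert(j, s.pop(i))
    elif op == "reverse":
        s[i:j + 1] = reversed(s[i:j + 1])
    else:                                    # shuffle a small segment
        segment = s[i:j + 1]
        random.shuffle(segment)
        s[i:j + 1] = segment
    return s

def cost(s):                                 # toy objective: distance from sorted order
    return sum(abs(v - k) for k, v in enumerate(s))

random.seed(1)
solution, q = list(range(20)), {op: 0.0 for op in OPERATORS}
random.shuffle(solution)
alpha, epsilon = 0.1, 0.2
for _ in range(2000):
    op = (random.choice(OPERATORS) if random.random() < epsilon
          else max(q, key=q.get))
    candidate = apply_operator(op, solution)
    reward = cost(solution) - cost(candidate)      # positive if it improved
    q[op] += alpha * (reward - q[op])              # stateless Q-value update
    if reward > 0:
        solution = candidate
print("final cost:", cost(solution), " learned Q:",
      {k: round(v, 2) for k, v in q.items()})
```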

    Many-objectives optimization: a machine learning approach for reducing the number of objectives

    Solving real-world multi-objective optimization problems with multi-objective optimization algorithms becomes difficult when the number of objectives is high, since the types of algorithms generally used to solve these problems are based on the concept of non-dominance, which ceases to work as the number of objectives grows. This problem is known as the curse of dimensionality. Simultaneously, the existence of many objectives, a characteristic of practical optimization problems, makes choosing a solution to the problem very difficult. Different approaches are being used in the literature to reduce the number of objectives required for optimization. This work proposes a machine learning methodology, designated FS-OPA, to tackle this problem. The proposed methodology was assessed using the DTLZ benchmark problems suggested in the literature and compared with similar algorithms, showing good performance. In the end, the methodology was applied to a difficult real problem in polymer processing, showing its effectiveness. The proposed algorithm has some advantages when compared with a similar machine-learning-based algorithm in the literature (NL-MVU-PCA), namely, the possibility of establishing variable–variable and objective–variable relations (not only objective–objective), and the elimination of the need to define/choose a kernel or to optimize algorithm parameters. Collaboration with the DM(s) allows explainable solutions to be obtained. This research was funded by POR Norte under the PhD Grant PRT/BD/152192/2021. The authors also acknowledge the funding by FEDER funds through the COMPETE 2020 Programme and National Funds through FCT (Portuguese Foundation for Science and Technology) under the projects UIDB/05256/2020 and UIDP/05256/2020, the Center for Mathematical Sciences Applied to Industry (CeMEAI) and the support from the São Paulo Research Foundation (FAPESP grant No 2013/07375-0), the Center for Artificial Intelligence (C4AI-USP), the support from the São Paulo Research Foundation (FAPESP grant No 2019/07665-4) and the IBM Corporation.
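For readers unfamiliar with objective reduction, the sketch below shows the general idea on synthetic data: objectives whose values are strongly correlated across a sample of solutions are largely redundant, so one representative per correlated group can be kept. This is a generic correlation-based illustration, not the FS-OPA or NL-MVU-PCA methods compared in the paper; the 0.95 threshold and the synthetic objectives are assumptions.

```python
# Generic illustration of objective reduction via correlation analysis (not
# the paper's FS-OPA or NL-MVU-PCA): drop objectives that are nearly
# duplicates of one already kept.
import numpy as np

rng = np.random.default_rng(2)
n = 200
f1 = rng.random(n)
f2 = 1.0 - f1 + 0.02 * rng.standard_normal(n)    # conflicts with f1
f3 = f1 + 0.02 * rng.standard_normal(n)          # nearly redundant with f1
F = np.column_stack([f1, f2, f3])

corr = np.corrcoef(F, rowvar=False)
keep = []
for j in range(F.shape[1]):
    if not any(corr[j, k] > 0.95 for k in keep): # drop near-duplicates
        keep.append(j)
print("correlation matrix:\n", np.round(corr, 2))
print("objectives kept:", [f"f{j + 1}" for j in keep])
```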

    Algorithms and Methods for Designing and Scheduling Smart Manufacturing Systems

    This book, as a Special Issue, is a collection of some of the latest advancements in designing and scheduling smart manufacturing systems. The smart manufacturing concept is undoubtedly considered a paradigm shift in manufacturing technology. This concept is part of the Industry 4.0 strategy, or equivalent national policies, and brings new challenges and opportunities for companies that are facing tough global competition. Industry 4.0 should not only be perceived as one of many possible strategies for manufacturing companies, but also as an important practice within organizations. The main focus of Industry 4.0 implementation is to combine production, information technology, and the internet. The presented Special Issue consists of ten research papers presenting the latest work in the field. The papers cover various topics, which can be divided into three categories: (i) designing and scheduling manufacturing systems (seven articles), (ii) machining process optimization (two articles), and (iii) digital insurance platforms (one article). Most of the research problems in these articles are solved using genetic algorithms, the harmony search algorithm, the hybrid bat algorithm, the combined whale optimization algorithm, and other optimization and decision-making methods. The above-mentioned groups of articles are briefly described in this order in this book.

    Advances in Optimization and Nonlinear Analysis

    The present book focuses on that part of calculus of variations, optimization, nonlinear analysis and related applications which combines tools and methods from partial differential equations with geometrical techniques. More precisely, this work is devoted to nonlinear problems coming from different areas, with particular reference to those introducing new techniques capable of solving a wide range of problems. The book is a valuable guide for researchers, engineers and students in the field of mathematics, operations research, optimal control science, artificial intelligence, management science and economics.

    Applied Metaheuristic Computing

    For decades, Applied Metaheuristic Computing (AMC) has been a prevailing optimization technique for tackling perplexing engineering and business problems, such as scheduling, routing, ordering, bin packing, assignment, and facility layout planning. This is partly because classic exact methods are constrained by prior assumptions, and partly because heuristics are problem-dependent and lack generalization. AMC, by contrast, guides low-level heuristics to search beyond the local optimality that limits traditional computational methods. This topic series has collected quality papers proposing cutting-edge methodology and innovative applications that drive the advances of AMC.
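As a small, self-contained illustration of that guidance, the sketch below uses simulated annealing, one classic AMC technique, to let a simple low-level move occasionally go uphill and so escape local optima that would trap a plain hill climber. The test function and cooling schedule are arbitrary choices for the example.

```python
# Simulated annealing guiding a simple low-level move: worse neighbours are
# sometimes accepted, so the search can escape local optima of a multimodal
# function. Function and schedule are arbitrary illustrative choices.
import math
import random

def f(x):                                      # multimodal 1-D test function
    return x * x + 10.0 * math.sin(3.0 * x)

random.seed(0)
x, best, temperature = 4.0, 4.0, 5.0
for _ in range(5000):
    candidate = x + random.uniform(-0.5, 0.5)                # low-level move
    delta = f(candidate) - f(x)
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate                                        # accept, maybe uphill
        if f(x) < f(best):
            best = x
    temperature = max(1e-3, temperature * 0.999)             # cool slowly
print(f"best found: x = {best:.3f}, f(x) = {f(best):.3f}")
```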