
    Network-based dimensionality reduction and analysis

    Network analysis opens new horizons for data analysis. Representing data points as nodes and the relationships between them as edges yields a "data network", so the toolkit of the rapidly developing field of network science can be integrated into classical data analysis techniques. This paper presents the generalized network-based dimensional analysis (GNDA) method, a non-parametric procedure that estimates the number of latent variables in the case of model reduction and the number of cluster centres in the case of data reduction. The main contributions are as follows: (1) The proposed GNDA method handles high-dimensional, low-sample-size datasets; compared with existing methods, only GNDA is shown to adequately estimate the number of latent variables, and the method is robust in that it can identify variable groups from few observations and data groups from few variables. (2) GNDA accommodates any symmetric or non-symmetric similarity function between indicators (i.e., variables or observations) when specifying latent variables. The proposed method is compared with traditional dimensionality reduction methods on various simulated and real-world datasets. A validated implementation is available as an R package on the official CRAN site.
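    The core network idea can be sketched in a few lines (a minimal reconstruction for illustration, not the validated R/CRAN package; the 0.5 correlation threshold and the use of greedy modularity community detection are my assumptions): variables become nodes, strong pairwise correlations become weighted edges, and the number of network communities estimates the number of latent variables.

        # Minimal GNDA-style sketch: correlation network + community detection.
        # Assumptions: |r| > 0.5 edge threshold, modularity-based communities.
        import numpy as np
        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        rng = np.random.default_rng(0)
        latent = rng.normal(size=(100, 2))              # two hidden factors
        # six observed variables: 0-2 load on factor 0, 3-5 on factor 1
        X = np.column_stack([latent[:, i // 3] + 0.3 * rng.normal(size=100)
                             for i in range(6)])

        R = np.corrcoef(X, rowvar=False)                # pairwise correlations
        G = nx.Graph()
        G.add_nodes_from(range(X.shape[1]))
        for i in range(X.shape[1]):
            for j in range(i + 1, X.shape[1]):
                if abs(R[i, j]) > 0.5:                  # keep strong links only
                    G.add_edge(i, j, weight=abs(R[i, j]))

        groups = list(greedy_modularity_communities(G, weight="weight"))
        print(len(groups), "latent variable groups:", groups)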

    Interactive Optimization With Parallel Coordinates: Exploring Multidimensional Spaces for Decision Support

    Interactive optimization methods are particularly suited to letting human decision-makers learn about a problem while a computer learns about their preferences to generate relevant solutions. For interactive optimization methods to be adopted in practice, computational frameworks are required that can handle and visualize many objectives simultaneously and provide optimal solutions quickly and representatively, all while remaining simple and intuitive for practitioners to use and understand. Addressing these issues, this work introduces SAGESSE (Systematic Analysis, Generation, Exploration, Steering and Synthesis Experience), a decision support methodology that relies on interactive multiobjective optimization. Its innovative aspects reside in the combination of (i) parallel coordinates as a means to simultaneously explore and steer the underlying alternative-generation process, (ii) a Sobol sequence to efficiently sample the points to explore in the objective space, and (iii) on-the-fly application of multiattribute decision analysis, cluster analysis and other data visualization techniques linked to the parallel coordinates. An illustrative example demonstrates the applicability of the methodology to a large, complex urban planning problem.
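    The Sobol-sequence component is straightforward to reproduce with SciPy's quasi-Monte Carlo module; the sketch below is illustrative only (the three objectives and their bounds are placeholders, not SAGESSE's actual configuration). Low-discrepancy points cover the objective space more evenly than pseudo-random ones, so fewer samples are needed for a representative exploration.

        # Illustrative Sobol sampling of a 3-objective space (placeholder bounds).
        from scipy.stats import qmc

        sampler = qmc.Sobol(d=3, scramble=True, seed=7)
        unit = sampler.random_base2(m=5)         # 2**5 = 32 points in [0,1)^3
        points = qmc.scale(unit, l_bounds=[0, 0, 0], u_bounds=[10, 5, 1])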

    A Posteriori and Interactive Approaches for Decision-Making with Multiple Stochastic Objectives

    Computer simulation is a popular method often used as a decision support tool in industry to estimate the performance of systems too complex for analytical solutions. It assists decision-makers in improving organizational performance and achieving performance objectives, since simulated conditions can be randomly varied so that critical situations can be investigated without real-world risk. Due to the stochastic nature of many of the input process variables in simulation models, the output from simulation experiments is random: experimental runs yield only estimates of the values of performance objectives, and these estimates are themselves random variables. Most real-world decisions involve the simultaneous optimization of multiple, often conflicting, objectives. Researchers and practitioners use various approaches to solve these multiobjective problems. Many approaches that integrate simulation models with stochastic multiobjective optimization algorithms have been proposed, many of which use Pareto-based approaches that generate a finite set of compromise, or tradeoff, solutions. Nevertheless, identifying the most preferred solution can be a daunting task for the decision-maker, and it is an order of magnitude harder in the presence of stochastic objectives. To the best of this researcher's knowledge, there has been no focused effort to reduce the number of tradeoff solutions while considering the stochastic nature of a set of objective functions. In this research, two approaches that consider multiple stochastic objectives when reducing the set of tradeoff solutions are designed and proposed. The first is an a posteriori approach, which uses a given set of Pareto optima as input. The second is an interactive approach that articulates decision-maker preferences during the optimization process. A detailed description of both approaches is given, and computational studies are conducted to evaluate their efficacy. The computational results show the promise of the proposed approaches, in that each effectively reduces the set of compromise solutions to a reasonably manageable size for the decision-maker. This is a significant step beyond current applications of the decision-making process in the presence of multiple stochastic objectives and should serve as an effective approach to support decision-making under uncertainty.
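    One plausible shape for the a posteriori reduction step (a sketch under assumed data layouts, not the dissertation's algorithm): estimate each tradeoff solution's objectives by averaging simulation replications, cluster the estimates, and keep the solution nearest each cluster centre.

        # Reduce a stochastic Pareto set to k representatives (illustrative).
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(42)
        # F[i, r, :] = objective vector of solution i on simulation replication r
        F = rng.normal(size=(200, 30, 3))
        F_mean = F.mean(axis=1)                  # point estimates per solution

        km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(F_mean)
        # keep the member closest to each centroid as the cluster representative
        reps = [int(np.argmin(((F_mean - c) ** 2).sum(axis=1)))
                for c in km.cluster_centers_]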

    Development of sustainable groundwater management methodologies to control saltwater intrusion into coastal aquifers with application to a tropical Pacific island country

    Saltwater intrusion due to the over-exploitation of groundwater in coastal aquifers is a critical challenge facing groundwater-dependent coastal communities throughout the world. Sustainable management of coastal aquifers, maintaining abstracted groundwater quality within permissible salinity limits, is an important groundwater management problem that urgently requires reliable and optimal management methodologies. This study focuses on the development and evaluation of groundwater salinity prediction tools, coastal aquifer multi-objective management strategies, and adaptive management strategies using new prediction models, coupled simulation-optimization (S/O) models, and monitoring network design, respectively. Predicting the extent of saltwater intrusion into coastal aquifers in response to existing and changing pumping patterns is a prerequisite of any groundwater management framework. This study investigates the feasibility of using support vector machine regression (SVMR), an artificial intelligence-based machine learning algorithm, to predict salinity at monitoring wells in an illustrative aquifer under variable groundwater pumping conditions. For evaluation purposes, the prediction results of SVMR are compared with well-established genetic programming (GP) based surrogate models. The prediction capabilities of the two learning machines are evaluated using several measures to ensure their practicality and generalisation ability. Also, a sensitivity analysis methodology is proposed for assessing the impact of pumping rates on salt concentrations at monitoring locations. The performance evaluations suggest that the predictive capability of SVMR is superior to that of GP models. The sensitivity analysis identifies a subset of the most influential pumping rates, which is used to construct new SVMR surrogate models with improved predictive capabilities. The improved predictive capability and generalisation ability of SVMR models, together with the ability to improve prediction accuracy by refining the training dataset, make SVMR models the more attractive choice. Coupled S/O models are efficient tools for designing multi-objective coastal aquifer management strategies. This study applies a regional-scale coupled S/O methodology with a Pareto-front clustering technique to prescribe optimal groundwater withdrawal patterns from the Bonriki aquifer in the Pacific island country of Kiribati. A numerical simulation model is developed, calibrated and validated using field data from the Bonriki aquifer. For computational feasibility, SVMR surrogate models are trained and tested using input-output datasets generated with the flow and transport numerical simulation model. The developed surrogate models are externally coupled with a multi-objective genetic algorithm optimization (MOGA) model as a substitute for the numerical model. The study area contains freshwater pumping wells for extracting groundwater; pumping from barrier wells installed along the coastline is also considered as a management option to hydraulically control saltwater intrusion. The objective of the multi-objective management model is to maximise pumping from production wells and minimise pumping from barrier wells (which provide a hydraulic barrier) while ensuring that water quality at different monitoring locations remains within pre-specified limits. The executed multi-objective coupled S/O model generated 700 Pareto-optimal solutions.
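    The surrogate step can be sketched as follows (illustrative only; the synthetic input-output pairs stand in for runs of the calibrated flow and transport simulation model): train an SVMR model mapping pumping rates to salinity at a monitoring well, then check generalisation on held-out runs.

        # Illustrative SVMR surrogate for a simulation model (synthetic data).
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import r2_score
        from sklearn.svm import SVR

        rng = np.random.default_rng(0)
        X = rng.uniform(0, 100, size=(300, 12))          # pumping rates, 12 wells
        y = X @ rng.uniform(0.1, 1.0, 12) + rng.normal(0, 5, 300)  # stand-in salinity

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        surrogate = SVR(kernel="rbf", C=100.0, epsilon=0.5).fit(X_tr, y_tr)
        print("held-out R^2:", r2_score(y_te, surrogate.predict(X_te)))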
    Analysing a large set of Pareto-optimal solutions is a challenging task for decision-makers, so the k-means clustering technique was utilized to reduce the large Pareto-optimal solution set and help solve the large-scale saltwater intrusion problem in the Bonriki aquifer. The S/O-based management models delivered optimal saltwater intrusion management strategies. However, uncertainties in the numerical simulation model due to uncertain aquifer parameters are often not incorporated into management models. The present study explicitly incorporates aquifer parameter uncertainty into a multi-objective management model for the optimal design of groundwater pumping strategies for the unconfined Bonriki aquifer. To achieve computational efficiency and feasibility of the management model, the calibrated numerical simulation model in the S/O model is replaced with ensembles of SVMR surrogate models. Each standalone SVMR surrogate model in the ensemble is constructed using datasets from a different numerical simulation model with different hydraulic conductivity and porosity values. These ensemble SVMR models were coupled to the MOGA model to solve the Bonriki aquifer management problem, ensuring sustainable withdrawal rates that maintain specified salinity limits. The executed optimization model presented a Pareto front with 600 non-dominated optimal trade-off pumping solutions. The reliability of the management model, established after validation of the optimal solution results, suggests that the constraints of the optimization problem were satisfied; i.e., the salinities at monitoring locations remained within the pre-specified limits. The correct implementation of a prescribed optimal management strategy based on the coupled S/O model is always a concern for decision-makers. The management strategy actually implemented in the field sometimes deviates from the recommended optimal strategy, resulting in field-level deviations. Monitoring such field-level deviations during actual implementation of the recommended optimal management strategy, and sequentially updating the strategy using feedback information, is an important step towards adaptive management of coastal groundwater resources. In this study, a three-phase adaptive management framework for a coastal aquifer subjected to saltwater intrusion is applied and evaluated for a regional-scale coastal aquifer study area. The methodology comprises three sequential components. First, an optimal management strategy (consisting of groundwater extraction from production and barrier wells) is derived and implemented for the optimal management of the aquifer; this strategy is obtained by solving a homogeneous ensemble-based coupled S/O model. Second, a regional-scale optimal monitoring network is designed for the aquifer system, considering possible user noncompliance with a recommended management strategy and uncertainty in aquifer parameter estimates. The monitoring network design is formulated to ensure that candidate monitoring wells are placed at high-risk (highly contaminated) locations; in addition, a k-means clustering methodology is utilized to select candidate monitoring wells in areas representative of the entire model domain. Finally, feedback information in the form of salinity measurements at monitoring wells is used to sequentially modify pumping strategies for future time periods in the management horizon.
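    The uncertainty treatment admits a similarly compact sketch (again illustrative; each synthetic dataset stands in for a simulation model built with different hydraulic conductivity and porosity values): one surrogate is trained per parameter realization, and the ensemble mean, together with its spread, feeds the management model.

        # Illustrative ensemble of SVMR surrogates across parameter realizations.
        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(1)
        n_real, n_train, n_wells = 5, 200, 12            # hypothetical sizes

        ensemble = []
        for k in range(n_real):
            # stand-in data for one aquifer-parameter realization
            X = rng.uniform(0, 100, size=(n_train, n_wells))
            y = X @ rng.uniform(0.1, 1.0, n_wells) + rng.normal(0, 5, n_train)
            ensemble.append(SVR(kernel="rbf", C=100.0).fit(X, y))

        x_new = rng.uniform(0, 100, size=(1, n_wells))
        preds = [m.predict(x_new)[0] for m in ensemble]
        print("ensemble mean:", np.mean(preds), "spread:", np.std(preds))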
    The developed adaptive management framework is evaluated by applying it to the Bonriki aquifer system. Overall, the results suggest that the implemented adaptive management strategy has the potential to address practical implementation issues arising from user noncompliance, deviations between predicted and actual consequences of implementing a management strategy, and uncertainty in aquifer parameters. Ensemble prediction models are known to be more accurate than standalone prediction models. The present study develops and utilises homogeneous and heterogeneous ensemble models based on several standalone learning algorithms, including artificial neural networks (ANN), GP, SVMR and Gaussian process regression (GPR). These models are used to predict groundwater salinity in the Bonriki aquifer. Standalone and ensemble prediction models are trained and validated using identical pumping and salinity-concentration datasets generated by solving 3D transient density-dependent coastal aquifer flow and transport numerical simulation models. After validation, the ensemble models are used to predict salinity concentration at selected monitoring wells in the modelled aquifer under variable groundwater pumping conditions. The predictive capabilities of the developed ensemble models are quantified using standard statistical procedures. The performance evaluation results suggest that the predictive capabilities of the standalone prediction models (ANN, GP, SVMR and GPR) are comparable to those of the variable-density groundwater flow and salt transport numerical simulation model. However, GPR standalone models had better predictive capabilities than the other standalone models, and SVMR and GPR standalone models were more efficient (in terms of computational training time) than the others. Among the ensemble models, the performance of the homogeneous GPR ensemble was found to be superior to that of the other homogeneous and heterogeneous ensembles. Employing data-driven predictive models as replacements for complex groundwater flow and transport models enables the prediction of future scenarios and saves computational time, effort and resources when developing optimal coastal aquifer management strategies based on coupled S/O models. In this study, a further data-driven approach, the group method of data handling (GMDH), is developed and utilized to predict salinity concentration in a coastal aquifer and, simultaneously, to determine the input predictor variables (pumping rates) that have the most impact on the outcomes (salinity at monitoring locations). To confirm the importance of variables, three tests are conducted in which new GMDH models are constructed using subsets of the original datasets. In TEST 1, new GMDH models are constructed using only the most influential variables. In TEST 2, a subset of 20 variables (the 10 most and 10 least influential) is used to develop new GMDH models. In TEST 3, a subset of the least influential variables is used. A performance evaluation demonstrates that the GMDH models developed using the entire dataset have reasonable predictive accuracy and efficiency, and a comparison of the three tests highlights the importance of appropriately selecting input pumping rates when developing predictive models.
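    A heterogeneous ensemble of the kind described can be assembled with scikit-learn's VotingRegressor (a sketch on synthetic data; the thesis's models, tuning and training datasets differ): ANN, SVMR and GPR learners averaged into a single predictor.

        # Illustrative heterogeneous ensemble (ANN + SVMR + GPR), synthetic data.
        import numpy as np
        from sklearn.ensemble import VotingRegressor
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.neural_network import MLPRegressor
        from sklearn.svm import SVR

        rng = np.random.default_rng(2)
        X = rng.uniform(0, 100, size=(300, 8))
        y = 50 * np.sin(X[:, 0] / 20) + 0.3 * X[:, 1] + rng.normal(0, 2, 300)

        ens = VotingRegressor([
            ("ann", MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000)),
            ("svmr", SVR(C=100.0)),
            ("gpr", GaussianProcessRegressor()),
        ]).fit(X, y)
        print(ens.predict(X[:3]))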
    These results suggest that incorporating the least influential variables decreases model accuracy; thus, considering only the most influential variables in salinity prediction models is beneficial and appropriate. This study also investigated the efficiency and viability of using artificial freshwater recharge (AFR) to increase fresh groundwater pumping rates from production wells. First, the effect of AFR on the inland encroachment of saline water is quantified for existing scenarios; specifically, groundwater head and salinity differences at monitoring locations before and after artificial recharge are presented. Second, a multi-objective management model incorporating groundwater pumping and AFR is implemented to control groundwater salinization in an illustrative coastal aquifer system. A coupled SVMR-MOGA model is developed for prescribing optimal management strategies that incorporate AFR and groundwater pumping wells. The Pareto-optimal front obtained from the SVMR-MOGA optimization model presents a set of optimal solutions for the sustainable management of the coastal aquifer. The pumping strategies obtained as Pareto-optimal solutions with and without freshwater recharge show that saltwater intrusion is sensitive to AFR, and the hydraulic head lenses created by AFR can be used as one practical option to control it. The developed 3D saltwater intrusion model, the predictive capabilities of the developed SVMR models, and the feasibility of the proposed coupled multi-objective SVMR-MOGA optimization model make the methodology potentially suitable for solving large-scale regional saltwater intrusion management problems. Overall, the development and evaluation of the various numerical simulation models, predictive models, multi-objective management strategies and adaptive methodologies will provide decision-makers with tools for the sustainable management of coastal aquifers. It is envisioned that the outcomes of this research will provide useful information to groundwater managers and stakeholders, and offer potential resolutions to policy-makers regarding the sustainable management of groundwater resources. The real-life case study of the Bonriki aquifer provides the scientific community with a broader understanding of groundwater resource issues in coastal aquifers and establishes the practical utility of the developed management strategies.
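    All of these management models culminate in a Pareto front; a generic non-dominated filter (my utility, not the MOGA implementation used in the thesis) makes the membership test explicit, assuming all objectives are minimized.

        # Generic Pareto filter: keep rows of F not dominated by any other row.
        import numpy as np

        def pareto_front(F):
            """Indices of non-dominated rows of F (all objectives minimized)."""
            keep = np.ones(len(F), dtype=bool)
            for i in range(len(F)):
                # j dominates i if j <= i in every objective and < in at least one
                dominated = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
                keep[i] = not dominated.any()
            return np.where(keep)[0]

        F = np.random.default_rng(3).normal(size=(50, 2))
        print(pareto_front(F))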

    Spatially optimised sustainable urban development

    Tackling urbanisation and climate change requires more sustainable and resilient cities, which in turn requires planners to develop a portfolio of measures to manage climate risks such as flooding, meet energy and greenhouse gas reduction targets, and prioritise development on brownfield sites to preserve greenspace. However, the policies, strategies and measures put in place to meet such objectives can frequently conflict with each other or deliver unintended consequences, hampering long-term sustainability. For example, the densification of cities in order to reduce transport energy use can increase urban heat island effects and surface water flooding from extreme rainfall events. In order to make coherent decisions in the presence of such complex multi-dimensional spatial conflicts, urban planners require sophisticated planning tools to identify and manage potential trade-offs between the spatial strategies necessary to deliver sustainability. To achieve this aim, this research has developed a multi-objective spatial optimisation framework for the spatial planning of new residential development within cities. The framework develops spatial strategies for required new residential development that minimise conflicts between multiple sustainability objectives arising from planning policy and climate-change-related hazards. Five key sustainability objectives are investigated, namely: (i) minimising risk from heat waves, (ii) minimising risk from flood events, (iii) minimising travel costs in order to reduce transport emissions, (iv) minimising urban sprawl and (v) preventing development on existing greenspace. A review identified two optimisation algorithms as suitable for this task. Simulated Annealing (SA) is a traditional optimisation algorithm that uses a probabilistic approach to seek out a global optimum by iteratively assessing a wide range of spatial configurations against the objectives under consideration; gradual 'cooling', or reducing the probability of jumping to a different region of the objective space, helps the SA converge on globally optimal spatial patterns. Genetic Algorithms (GA) evolve successive generations of solutions, by recombining attributes of and randomly mutating previous generations, to search for and converge towards superior spatial strategies. The framework works towards, and outputs, a series of Pareto-optimal spatial plans that outperform all other plans in at least one objective, giving planners a range of best trade-off plans to choose from. Both SA and GA were evaluated for an initial case study in Middlesbrough, in the North East of England, and were able to identify strategies which significantly improve upon the local authority's development plan. For example, the GA approach identified a spatial strategy that reduces the travel-to-work distance between new development and the central business district by 77.5% whilst nullifying the flood risk to the new development. A comparison of the two optimisation approaches for the Middlesbrough case study revealed that the GA is the more effective approach: it is better able to escape local optima, on average outperforms the SA by 56% in the Pareto fronts discovered, and discovers double the number of multi-objective Pareto-optimal spatial plans.
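    The SA mechanics described above fit in a few lines (an illustrative skeleton; the neighbour and cost arguments stand in for the thesis's spatial-plan mutation and multi-objective evaluation):

        # Illustrative simulated annealing skeleton with Metropolis acceptance.
        import math, random

        def anneal(plan, neighbour, cost, t0=1.0, cooling=0.995, steps=10_000):
            best = current = plan
            t = t0
            for _ in range(steps):
                cand = neighbour(current)
                d = cost(cand) - cost(current)
                # accept improvements always; worse plans with decaying probability
                if d < 0 or random.random() < math.exp(-d / t):
                    current = cand
                    if cost(current) < cost(best):
                        best = current
                t *= cooling                     # gradual 'cooling'
            return best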
    On the basis of the initial Middlesbrough case study, the GA approach was applied to the significantly larger, and more computationally complex, problem of optimising spatial development plans for London in the UK, a total area of 1,572 km². The framework identified optimal strategies in fewer than 400 generations. The analysis showed, for example, that strategies providing the lowest heat risk (among the feasible spatial plans found) can be achieved whilst also using 85% brownfield land to locate new development. The framework was further extended to investigate the impact of different development and density regulations. This enabled the identification of optimised strategies, albeit at lower building density, that completely prevent any increase in urban sprawl whilst also improving the heat-risk objective by 60% against a business-as-usual development strategy. Conversely, restricting development to brownfield land reduces the ability of the spatial plan to optimise future heat risk by 55.6% against the business-as-usual strategy. The results of both case studies demonstrate the potential of spatial optimisation to provide planners with optimal spatial plans in the presence of conflicting sustainability objectives, and the resulting diagnostic information provides an analytical appreciation of the sensitivity between conflicts and therefore of the overall robustness of a plan to uncertainty. With the inclusion of further objectives, and of qualitative information unsuitable for this type of analysis, spatial optimisation can constitute a powerful decision support tool to help planners identify spatial development strategies that satisfy multiple sustainability objectives and provide an evidence base for better decision-making.