Precision-Aware application execution for Energy-optimization in HPC node system
Power consumption is a critical consideration in high-performance computing
systems, and it is becoming the limiting factor in building and operating Petascale
and Exascale systems. When studying the power consumption of existing systems
running HPC workloads, we find that power, energy and performance are closely
related, which opens the possibility of optimizing energy consumption without
sacrificing performance (much, or at all). In this paper, we propose an HPC
system running a GNU/Linux OS with a Real-Time Resource Manager (RTRM) that
monitors the health of the platform. On this system, an application for
disaster management runs. The application can run with different QoS levels
depending on the situation, and we define two main situations. In normal
execution, there is no risk of a disaster, but the system must still run to
look ahead in case the situation changes suddenly. In the second scenario, the
probability of a disaster is very high; here, the allocation of additional
resources to improve precision, together with the need for human
decision-making, has to be taken into account. The paper shows that, at design
time, it is possible to describe different optimal operating points that are
then used at runtime by the RTOS together with the application. This
environment helps the system, which must run 24/7, save energy at the cost of
some precision. The paper shows a model execution that improves the precision
of results by 65% on average when the number of iterations is increased from
1e3 to 1e4. This also produces an order-of-magnitude longer execution time,
which leads to the need for a multi-node solution. The optimal trade-off
between precision and execution time is computed by the RTOS with a time
overhead of less than 10% compared to native execution.
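The design-time characterisation of operating points described in the abstract can be sketched as a simple runtime selection problem: given profiled (iterations, precision gain, execution time) tuples, pick the most precise configuration that fits the current time budget. A minimal illustrative sketch in Python — the profile numbers are hypothetical, loosely mirroring the 1e3 → 1e4 iteration step from the abstract, and are not taken from the paper:

```python
# Hypothetical design-time profile: (iterations, precision gain vs. baseline,
# execution time in seconds). The 1e3 -> 1e4 step loosely mirrors the paper's
# observation of ~65% better precision for ~10x longer execution time.
PROFILE = [
    (1_000, 0.00, 30.0),
    (5_000, 0.40, 150.0),
    (10_000, 0.65, 300.0),
]

def select_operating_point(time_budget_s):
    """Return the most precise configuration that fits the time budget,
    falling back to the cheapest one if nothing fits."""
    feasible = [p for p in PROFILE if p[2] <= time_budget_s]
    if not feasible:
        return PROFILE[0]
    return max(feasible, key=lambda p: p[1])

# Normal situation: tight budget -> cheap, low-precision run.
print(select_operating_point(60.0))    # (1000, 0.0, 30.0)
# High disaster risk: larger budget -> allocate more iterations.
print(select_operating_point(400.0))   # (10000, 0.65, 300.0)
```

A real RTRM would of course combine this with live monitoring data rather than a static table; the sketch only shows the shape of the design-time/runtime split.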
Development of sustainable groundwater management methodologies to control saltwater intrusion into coastal aquifers with application to a tropical Pacific island country
Saltwater intrusion due to the over-exploitation of groundwater in coastal aquifers is a critical challenge facing groundwater-dependent coastal communities throughout the world. Sustainable management of coastal aquifers for maintaining abstracted groundwater quality within permissible salinity limits is regarded as an important groundwater management problem, urgently necessitating reliable and optimal management methodologies. This study focuses on the development and evaluation of groundwater salinity prediction tools, coastal aquifer multi-objective management strategies, and adaptive management strategies using new prediction models, coupled simulation-optimization (S/O) models, and monitoring network design, respectively.
Predicting the extent of saltwater intrusion into coastal aquifers in response to existing and changing pumping patterns is a prerequisite of any groundwater management framework. This study investigates the feasibility of using support vector machine regression (SVMR), an innovative artificial intelligence-based machine learning algorithm, to predict salinity at monitoring wells in an illustrative aquifer under variable groundwater pumping conditions. For evaluation purposes, the prediction results of SVMR are compared with well-established genetic programming (GP) based surrogate models. The prediction capabilities of the two learning machines are evaluated using several measures to ensure their practicality and generalisation ability. Also, a sensitivity analysis methodology is proposed for assessing the impact of pumping rates on salt concentrations at monitoring locations. The performance evaluations suggest that the predictive capability of SVMR is superior to that of GP models. The sensitivity analysis identifies a subset of the most influential pumping rates, which is used to construct new SVMR surrogate models with improved predictive capabilities. The improved predictive capability and generalisation ability of SVMR models, together with the ability to improve the accuracy of prediction by refining the dataset used for training, make the use of SVMR models more attractive.
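As a concrete illustration of this kind of surrogate modelling, the sketch below fits a support vector regression model to synthetic pumping-rate/salinity data standing in for numerical-model output. It assumes scikit-learn is available; the well count, data-generating function and hyperparameters are all illustrative inventions, not values from the study:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for simulation-model output: salinity at one monitoring
# well as a noisy function of pumping rates at 5 hypothetical wells.
X = rng.uniform(0.0, 100.0, size=(200, 5))           # pumping rates
weights = np.array([0.5, 0.3, 0.1, 0.05, 0.05])      # invented well influences
y = X @ weights + rng.normal(0.0, 1.0, 200)          # salinity proxy

# Train on 150 patterns, hold out 50 to check generalisation ability.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0))
model.fit(X[:150], y[:150])

score = model.score(X[150:], y[150:])  # R^2 on held-out pumping patterns
print(f"held-out R^2: {score:.2f}")
```

In the study itself the training data come from a density-dependent flow and transport simulation model rather than a synthetic formula; the sketch only shows the surrogate-fitting step.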
Coupled S/O models are efficient tools that are used for designing multi-objective coastal aquifer management strategies. This study applies a regional-scale coupled S/O methodology with a Pareto front clustering technique to prescribe optimal groundwater withdrawal patterns from the Bonriki aquifer in the Pacific Island of Kiribati. A numerical simulation model is developed, calibrated and validated using field data from the Bonriki aquifer. For computational feasibility, SVMR surrogate models are trained and tested utilizing input-output datasets generated using the flow and transport numerical simulation model. The developed surrogate models were externally coupled with a multi-objective genetic algorithm optimization (MOGA) model, as a substitute for the numerical model. The study area consisted of freshwater pumping wells for extracting groundwater. Pumping from barrier wells installed along the coastlines is also considered as a management option to hydraulically control saltwater intrusion. The objective of the multi-objective management model was to maximise pumping from production wells and minimize pumping from barrier wells (which provide a hydraulic barrier) to ensure that the water quality at different monitoring locations remains within pre-specified limits. The executed multi-objective coupled S/O model generated 700 Pareto-optimal solutions. Analysing a large set of Pareto-optimal solutions is a challenging task for the decision-makers. Hence, the k-means clustering technique was utilized to reduce the large Pareto-optimal solution set and help solve the large-scale saltwater intrusion problem in the Bonriki aquifer.
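The k-means reduction of a large Pareto set can be sketched as follows. The 700-solution count mirrors the study, but the front itself is synthetic, and scikit-learn is assumed; everything else is illustrative:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Synthetic stand-in for 700 Pareto-optimal solutions: each point is
# (total production pumping, total barrier pumping) on a trade-off front,
# where the two objectives conflict.
production = rng.uniform(1000.0, 5000.0, 700)
barrier = 6000.0 - production + rng.normal(0.0, 50.0, 700)
front = np.column_stack([production, barrier])

# Reduce the front to a handful of representative strategies that a
# decision-maker can actually compare.
k = 5
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(front)
representatives = km.cluster_centers_

print(f"{len(front)} solutions reduced to {k} representative strategies")
for p, b in sorted(representatives.tolist()):
    print(f"production ~ {p:7.1f}, barrier ~ {b:7.1f}")
```

The cluster centres act as summary strategies; in practice one would present the Pareto member nearest each centre rather than the centre itself, so that every recommended strategy is actually feasible.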
The S/O-based management models have delivered optimal saltwater intrusion management strategies. However, at times, uncertainties in the numerical simulation model due to uncertain aquifer parameters are not incorporated into the management models. The present study explicitly incorporates aquifer parameter uncertainty into a multi-objective management model for the optimal design of groundwater pumping strategies from the unconfined Bonriki aquifer. To achieve computational efficiency and feasibility of the management model, the calibrated numerical simulation model in the S/O model is replaced with ensembles of SVMR surrogate models. Each SVMR standalone surrogate model in the ensemble is constructed using datasets from different numerical simulation models with different hydraulic conductivity and porosity values. These ensemble SVMR models were coupled to the MOGA model to solve the Bonriki aquifer management problem for ensuring sustainable withdrawal rates that maintain specified salinity limits. The executed optimization model presented a Pareto-front with 600 non-dominated optimal trade-off pumping solutions. The reliability of the management model, established after validation of the optimal solution results, suggests that the implemented constraints of the optimization problem were satisfied; i.e., the salinities at monitoring locations remained within the pre-specified limits.
The correct implementation of a prescribed optimal management strategy based on the coupled S/O model is always a concern for decision-makers. The management strategy actually implemented in the field sometimes deviates from the recommended optimal strategy, resulting in field-level deviations. Monitoring such field-level deviations during actual implementation of the recommended optimal management strategy and sequentially updating the strategy using feedback information is an important step towards adaptive management of coastal groundwater resources. In this study, a three-phase adaptive management framework for a coastal aquifer subjected to saltwater intrusion is applied and evaluated for a regional-scale coastal aquifer study area. The methodology adopted includes three sequential components. First, an optimal management strategy (consisting of groundwater extraction from production and barrier wells) is derived and implemented for the optimal management of the aquifer. The implemented management strategy is obtained by solving a homogeneous ensemble-based coupled S/O model. Second, a regional-scale optimal monitoring network is designed for the aquifer system, which considers possible user noncompliance of a recommended management strategy and uncertainty in aquifer parameter estimates. A new monitoring network design is formulated to ensure that candidate monitoring wells are placed at high risk (highly contaminated) locations. In addition, a k-means clustering methodology is utilized to select candidate monitoring wells in areas representative of the entire model domain. Finally, feedback information in the form of salinity measurements at monitoring wells is used to sequentially modify pumping strategies for future time periods in the management horizon. The developed adaptive management framework is evaluated by applying it to the Bonriki aquifer system. 
Overall, the results of this study suggest that the implemented adaptive management strategy has the potential to address practical implementation issues arising due to user noncompliance, as well as deviations between predicted and actual consequences of implementing a management strategy, and uncertainty in aquifer parameters.
The use of ensemble prediction models is known to be more accurate than the use of standalone prediction models. The present study develops and utilises homogeneous and heterogeneous ensemble models based on several standalone machine-learning algorithms, including artificial neural networks (ANN), GP, SVMR and Gaussian process regression (GPR). These models are used to predict groundwater salinity in the Bonriki aquifer. Standalone and ensemble prediction models are trained and validated using identical pumping and salinity concentration datasets generated by solving 3D transient density-dependent coastal aquifer flow and transport numerical simulation models. After validation, the ensemble models are used to predict salinity concentration at selected monitoring wells in the modelled aquifer under variable groundwater pumping conditions. The predictive capabilities of the developed ensemble models are quantified using standard statistical procedures. The performance evaluation results suggest that the predictive capabilities of the standalone prediction models (ANN, GP, SVMR and GPR) are comparable to those of the groundwater variable-density flow and salt transport numerical simulation model. However, GPR standalone models had better predictive capabilities than the other standalone models. Also, SVMR and GPR standalone models were more efficient (in terms of computational training time) than other standalone models. In terms of ensemble models, the performance of the homogeneous GPR ensemble model was found to be superior to that of the other homogeneous and heterogeneous ensemble models.
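The homogeneous-ensemble idea can be sketched as a bagging-style average of several identically-typed regressors. The sketch below uses SVR members on bootstrap resamples of synthetic data; the data, member count and hyperparameters are hypothetical stand-ins, and scikit-learn is assumed:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(2)

# Synthetic pumping -> salinity data standing in for numerical-model output.
X = rng.uniform(0.0, 100.0, size=(300, 4))
y = np.sin(X[:, 0] / 20.0) * 10.0 + 0.1 * X[:, 1] + rng.normal(0.0, 0.5, 300)
X_train, y_train, X_test, y_test = X[:200], y[:200], X[200:], y[200:]

# Homogeneous ensemble: several members of the same model type, each trained
# on a bootstrap resample; the ensemble prediction is the member average.
members = []
for seed in range(10):
    idx = np.random.default_rng(seed).integers(0, len(X_train), len(X_train))
    members.append(SVR(kernel="rbf", C=10.0).fit(X_train[idx], y_train[idx]))

ensemble_pred = np.mean([m.predict(X_test) for m in members], axis=0)
print("ensemble MSE:", mean_squared_error(y_test, ensemble_pred))
```

A heterogeneous ensemble differs only in that the members are of different model types (e.g. ANN, GP, SVMR and GPR, as in the study) rather than ten copies of one type.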
Employing data-driven predictive models as replacements for complex groundwater flow and transport models enables the prediction of future scenarios and also helps save computational time, effort and requirements when developing optimal coastal aquifer management strategies based on coupled S/O models. In this study, a new data-driven model, namely the Group Method of Data Handling (GMDH) approach, is developed and utilized to predict salinity concentration in a coastal aquifer and, simultaneously, determine the input predictor variables (pumping rates) that have the most impact on the outcomes (salinity at monitoring locations). To confirm the importance of variables, three tests are conducted, in which new GMDH models are constructed using subsets of the original datasets. In TEST 1, new GMDH models are constructed using a set of most influential variables only. In TEST 2, a subset of 20 variables (10 most and 10 least influential variables) is used to develop new GMDH models. In TEST 3, a subset of the least influential variables is used to develop GMDH models. A performance evaluation demonstrates that the GMDH models developed using the entire dataset have reasonable predictive accuracy and efficiency. A comparison of the performance evaluations of the three tests highlights the importance of appropriately selecting input pumping rates when developing predictive models. These results suggest that incorporating the least influential variables decreases model accuracy; thus, only considering the most influential variables in salinity prediction models is beneficial and appropriate.
This study also investigated the efficiency and viability of using artificial freshwater recharge (AFR) to increase fresh groundwater pumping rates from production wells. First, the effect of AFR on the inland encroachment of saline water is quantified for existing scenarios. Specifically, groundwater head and salinity differences at monitoring locations before and after artificial recharge are presented. Second, a multi-objective management model incorporating groundwater pumping and AFR is implemented to control groundwater salinization in an illustrative coastal aquifer system. A coupled SVMR-MOGA model is developed for prescribing optimal management strategies that incorporate AFR and groundwater pumping wells. The Pareto-optimal front obtained from the SVMR-MOGA optimization model presents a set of optimal solutions for the sustainable management of the coastal aquifer. The pumping strategies obtained as Pareto-optimal solutions with and without freshwater recharge show that saltwater intrusion is sensitive to AFR. Also, the hydraulic head lenses created by AFR can be used as one practical option to control saltwater intrusion. The developed 3D saltwater intrusion model, the predictive capabilities of the developed SVMR models, and the feasibility of using the proposed coupled multi-objective SVMR-MOGA optimization model make the proposed methodology potentially suitable for solving large-scale regional saltwater intrusion management problems.
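The recurring "maximise production pumping, minimise barrier pumping, subject to a salinity limit" structure of these management models can be illustrated with a minimal non-dominated filter over hypothetical candidate strategies — a crude stand-in for what a MOGA explores far more efficiently; all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical candidate strategies: each row is
# (total production pumping, total barrier pumping, peak salinity proxy),
# all normalised to [0, 1] for illustration.
candidates = rng.uniform(0.0, 1.0, size=(500, 3))

# Constraint: keep only strategies whose salinity proxy stays within a limit.
SALINITY_LIMIT = 0.5
feasible = candidates[candidates[:, 2] <= SALINITY_LIMIT]

def is_dominated(row, others):
    """A strategy is dominated if another feasible strategy is at least as
    good on both objectives (maximise production, minimise barrier pumping)
    and strictly better on at least one."""
    at_least_as_good = (others[:, 0] >= row[0]) & (others[:, 1] <= row[1])
    strictly_better = (others[:, 0] > row[0]) | (others[:, 1] < row[1])
    return np.any(at_least_as_good & strictly_better)

pareto = np.array([r for r in feasible if not is_dominated(r, feasible)])
print(f"{len(feasible)} feasible strategies -> {len(pareto)} on the Pareto front")
```

A genetic algorithm replaces the random candidate set with an evolving population, but the feasibility check and dominance relation are the same.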
Overall, the development and evaluation of various groundwater numerical simulation models, predictive models, multi-objective management strategies and adaptive methodologies will provide decision-makers with tools for the sustainable management of coastal aquifers. It is envisioned that the outcomes of this research will provide useful information to groundwater managers and stakeholders, and offer potential resolutions to policy-makers regarding the sustainable management of groundwater resources. The real-life case study of the Bonriki aquifer presented in this study provides the scientific community with a broader understanding of groundwater resource issues in coastal aquifers and establishes the practical utility of the developed management strategies.
Developing Real-Time Emergency Management Applications: Methodology for a Novel Programming Model Approach
Gabriele Mencagli and Marco Vanneschi, Department of Computer Science, University of Pisa, Italy
Recent years have been characterized by the emergence of highly distributed computing
platforms composed of heterogeneous computing and communication resources, including
centralized high-performance computing architectures (e.g. clusters or large shared-memory
machines) as well as multi-/many-core components integrated into mobile nodes
and network facilities. The emergence of computational paradigms such as Grid and Cloud
Computing provides potential solutions for integrating such platforms with data systems, natural
phenomena simulations, knowledge discovery and decision support systems, responding to a
dynamic demand for remote computing and communication resources and services.
In this context, time-critical applications, notably emergency management systems, are
composed of complex sets of application components specialized for executing specific
computations, which cooperate to achieve a global goal in a distributed manner.
In recent years the scientific community has been engaged with the programming
issues of distributed systems, aimed at the definition of applications
featuring increasing complexity in the number of distributed components, in the spatial
distribution and cooperation of the interested parties, and in their degree of heterogeneity.
Over the last decade, research in distributed computing has focused on
a crucial objective. The wide-ranging composition of distributed platforms in terms of
different classes of computing nodes and network technologies, together with the strong
diffusion of applications that require real-time elaboration and online compute-intensive
processing, as in the case of emergency management systems, leads to a pronounced tendency
of systems towards properties like self-management, self-organization, self-control and,
in short, adaptivity.
Adaptivity implies the development, deployment, execution and management of applications
that, in general, are dynamic in nature. Dynamicity concerns the number and the specific
identification of cooperating components, the deployment and composition of the most
suitable versions of software components on processing and networking resources and
services, i.e., both the quantity and the quality of the application components to achieve
the needed Quality of Service (QoS). In time-critical applications the QoS specification
can dynamically vary during the execution, according to the user intentions and the
information produced by sensors and services, as well as according to the monitored state
and performance of networks and nodes.
The general reference point for this kind of system is the Grid paradigm which, by
definition, aims to enable the access, selection and aggregation of a variety of distributed and
heterogeneous resources and services. However, though notable advancements have been
achieved in recent years, current Grid technology is not yet able to supply software
tools with the features of high adaptivity, ubiquity, proactivity, self-organization, scalability
and performance, interoperability, fault tolerance and security required by emerging
applications.
For this reason in this chapter we will study a methodology for designing high-performance
computations able to exploit the heterogeneity and dynamicity of distributed environments
by expressing adaptivity and QoS-awareness directly at the application level. An effective
approach needs to address issues like QoS predictability of different application configurations
as well as the predictability of reconfiguration costs. Moreover, adaptation strategies need to
be developed that assure properties like the stability of a reconfiguration decision and
execution optimality (i.e. selecting reconfigurations that account for proper trade-offs among
different QoS objectives). In this chapter we will present the basic points of a novel approach that lays
the foundations for future programming model environments for time-critical applications
such as emergency management systems.
The organization of this chapter is the following. In Section 2 we will compare the existing
research works for developing adaptive systems in critical environments, highlighting their
drawbacks and inefficiencies. In Section 3, in order to clarify the application scenarios that
we are considering, we will present an emergency management system in which the run-time
selection of proper application configuration parameters is of great importance for meeting the
desired QoS constraints. In Section 4 we will describe the basic points of our approach in terms
of how compute-intensive operations can be programmed, how they can be dynamically
modified and how adaptation strategies can be expressed. In Section 5 our approach will
be contextualized in the definition of an adaptive parallel module, which is a building block
for composing complex and distributed adaptive computations. Finally in Section 6 we will
describe a set of experimental results that show the viability of our approach and in Section 7
we will give the concluding remarks of this chapter.
Decision Analysis for Management of Natural Hazards
Losses from natural hazards, including geophysical and hydrometeorological hazards, have been increasing worldwide. This review focuses on the process by which scientific evidence about natural hazards is applied to support decision making. Decision analysis typically involves estimating the probability of extreme events; assessing the potential impacts of those events from a variety of perspectives; and evaluating options to plan for, mitigate, or react to events. We consider issues that affect decisions made across a range of natural hazards, summarize decision methodologies, and provide examples of applications of decision analysis to the management of natural hazards. We conclude that there is potential for further exchange of ideas and experience between natural hazard research communities on decision analysis approaches. Broader application of decision methodologies to natural hazard management and evaluation of existing decision approaches can potentially lead to more efficient allocation of scarce resources and more efficient risk management.
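A minimal worked example of the kind of expected-loss comparison used in such decision analyses — all probabilities, costs and the annualisation horizon are invented for illustration, not drawn from the review:

```python
# Hypothetical expected-cost comparison of natural-hazard mitigation options.
annual_event_prob = 0.02  # assumed probability of the extreme event per year

# Each option: one-off upfront cost and the residual loss if the event occurs.
options = {
    "do nothing":    {"upfront": 0,       "residual_loss": 10_000_000},
    "mitigate":      {"upfront": 150_000, "residual_loss": 2_000_000},
    "fully protect": {"upfront": 500_000, "residual_loss": 100_000},
}

def expected_annual_cost(opt):
    """Upfront cost naively annualised over a 30-year horizon, plus the
    expected annual loss (event probability x residual loss)."""
    return opt["upfront"] / 30.0 + annual_event_prob * opt["residual_loss"]

best = min(options, key=lambda name: expected_annual_cost(options[name]))
for name, opt in options.items():
    print(f"{name:>13}: expected annual cost ~ {expected_annual_cost(opt):,.0f}")
print("preferred option:", best)   # prints "preferred option: fully protect"
```

Real analyses would add discounting, multiple impact dimensions and deep uncertainty in the event probability itself; the sketch only shows the basic expected-value mechanics.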