
    Computing, a powerful tool for improving the parameters simulation quality in flood prediction

    Floods have caused widespread damage throughout the world. Modelling and simulation provide tools that enable us to forecast floods and take the necessary steps toward prevention. One problem that physical-system simulators must handle is input-parameter uncertainty and its impact on output results, which causes prediction errors. In this paper, we address input-parameter uncertainty by providing a methodology to tune a flood simulator and achieve a lower error between simulated and observed results. The tuning methodology, through a parametric simulation technique, implements a first stage that finds an adjusted set of critical parameters, which is then used to validate the predictive capability of the simulator and reduce the disagreement between observed data and simulated results. We concentrate our experiments on three significant monitoring stations located in the lower basin of the Paraná River in Argentina; the improvement over the original simulator values ranges from 33% to 60%.
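    The parametric first stage described above can be sketched as a sweep over candidate parameter sets, keeping the set that minimizes the simulated-vs-observed error. This is a minimal illustration only: the `simulate` function, the parameter names (`manning_n`, `inflow_scale`) and all values are hypothetical stand-ins, not the authors' flood model or data.

```python
import itertools
import math

def simulate(manning_n, inflow_scale, times):
    # Hypothetical stand-in for the flood simulator: river level as a
    # simple function of roughness and inflow scale (NOT the real model).
    return [inflow_scale * math.sqrt(t + 1) / manning_n for t in times]

def rmse(simulated, observed):
    # Root-mean-square error between simulated and observed levels.
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed)) / len(observed))

def tune(observed, times, manning_grid, inflow_grid):
    """Parametric sweep: return (error, params) for the candidate set
    that best matches the observed series."""
    best = None
    for n, q in itertools.product(manning_grid, inflow_grid):
        err = rmse(simulate(n, q, times), observed)
        if best is None or err < best[0]:
            best = (err, n, q)
    return best

times = list(range(10))
observed = [2.0 * math.sqrt(t + 1) / 0.03 for t in times]  # synthetic "gauge" data
best_err, best_n, best_q = tune(observed, times,
                                manning_grid=[0.02, 0.03, 0.04],
                                inflow_grid=[1.5, 2.0, 2.5])
```

    In the real methodology the adjusted set found here would then be validated against further observed scenarios before being trusted for prediction.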


    Dynamic Data Driven approach to improve the performance of a river simulation

    In this research we incorporate contributions from dynamic data driven systems development, which is based on the possibility of feeding data obtained in real time into an executing application, in particular a simulation. This paper reports on the first phase of our research, in which we have used this idea to enhance the simulation quality of a river flow simulator through dynamic data inputs during the computational execution. We presented an optimization methodology for this simulator model in previous work; here, we handle those time periods when a sudden level change takes place in the river, improving the forecast. These results are the path towards the development of an automatic calibration framework fed with real-time data.
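    The dynamic-data-driven idea of correcting a running simulation with real-time measurements can be sketched as a simple nudging step. The gain, threshold, and values below are illustrative assumptions, not the authors' assimilation scheme.

```python
def dddas_step(simulated_level, observed_level, gain=0.5, threshold=0.2):
    """One assimilation step in the dynamic-data-driven style: when the
    simulated river level drifts past a threshold from the latest gauge
    reading, pull it toward the measurement by a fixed gain."""
    residual = observed_level - simulated_level
    if abs(residual) > threshold:
        return simulated_level + gain * residual
    return simulated_level

# Sudden level change: observations jump while the free-running model lags.
model = 3.0
for obs in [3.05, 3.8, 4.2]:  # hypothetical real-time gauge readings
    model = dddas_step(model, obs)
```

    A small residual leaves the simulation untouched, while a sudden level change (as in the abstract's scenario) triggers corrections that track the river toward the observed state.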

    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    The aim of this paper is to present the development of a multidisciplinary project carried out in cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was training in geospatial data acquisition and processing for students attending Architecture and Engineering courses, in order to start up a team of "volunteer mappers". Indeed, the project aims to document the environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in the activities connected with geospatial data collection, integration and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, registered in the World Heritage List since 1997. The area was affected by a flood on the 25th of October 2011. Following other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial Lidar, close-range and aerial photogrammetry, topographic and GNSS instruments, etc.; or by non-conventional systems and instruments such as UAVs, mobile mapping, etc. The ultimate goal is to implement a WebGIS platform to share all the collected data with local authorities and the Civil Protection.

    Big data analytics: Computational intelligence techniques and application areas

    Big Data has a significant impact in developing functional smart cities and supporting modern societies. In this paper, we investigate the importance of Big Data in modern life and economy, and discuss challenges arising from Big Data utilization. Different computational intelligence techniques have been considered as tools for Big Data analytics. We also explore the powerful combination of Big Data and Computational Intelligence (CI) and identify a number of areas where novel applications in real-world smart city problems can be developed by utilizing these powerful tools and techniques. We present a case study for intelligent transportation in the context of a smart city, and a novel data modelling methodology based on a biologically inspired universal generative modelling approach called the Hierarchical Spatial-Temporal State Machine (HSTSM). We further discuss various implications of policy, protection, valuation and commercialization related to Big Data, its applications and deployment.

    Causative factors of construction and demolition waste generation in Iraq Construction Industry

    The construction industry harms the environment through the waste generated during construction activities, which calls for serious measures to determine the causative factors of the construction waste generated. There are limited studies on the factors causing construction and demolition (C&D) waste generation, and these studies have focused only on the quantification of construction waste. This study identifies the causative factors of C&D waste generation, determines the risk level of each causal factor, and identifies the most important minimization methods for avoiding waste generation. The study was carried out using a quantitative approach. A total of 39 factors causing construction waste generation were identified from the literature review and clustered into 4 groups. The questionnaire was improved during a pilot study with 38 construction experts (consultants, contractors and clients). In the actual survey, a total of 380 questionnaires were distributed, with a response rate of 83.3%. Data analysis was performed using SPSS software. Ranking analysis using the mean-score approach found the five most significant causative factors to be poor site management, poor planning, lack of experience, rework and poor controlling. The results also indicated that the majority of the identified factors have a high risk level; in addition, the best minimization method is environmental awareness. A structural model was developed based on the 4 groups of causative factors using the Partial Least Squares-Structural Equation Modelling (PLS-SEM) technique. The model fits well according to the goodness of fit (GOF = 0.658 ≥ 0.36, substantial). Based on the outcome of this study, 39 factors are relevant to the generation of construction and demolition waste in Iraq; these groups of factors should be avoided during construction work to reduce the waste generated.
The findings of this study are helpful to authorities and stakeholders in formulating laws and regulations. Furthermore, they provide opportunities for future researchers to conduct additional research on the factors that contribute to construction waste generation.
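    The mean-score ranking step used in the study above is straightforward: average each factor's survey ratings across respondents and sort. The factor names match the abstract, but the scores below are invented placeholders, not the survey data.

```python
def rank_factors(responses):
    """Mean-score ranking: average each factor's Likert ratings across
    respondents and sort in descending order of mean score."""
    means = {f: sum(scores) / len(scores) for f, scores in responses.items()}
    return sorted(means.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative ratings on a 1-5 Likert scale (three hypothetical respondents).
responses = {
    "poor site management": [5, 5, 4],
    "poor planning":        [5, 4, 4],
    "rework":               [3, 4, 3],
}
ranking = rank_factors(responses)
```

    With real survey data the same sort would reproduce the study's top-five list; the PLS-SEM modelling of the four factor groups is a separate, heavier analysis done in dedicated software.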

    Short Papers of the 8th Conference on Cloud Computing Conference, Big Data & Emerging Topics (JCC-BD&ET 2020)

    Compilation of the short papers presented at the 8th Conference on Cloud Computing, Big Data & Emerging Topics (JCC-BD&ET 2020), held virtually in September 2020 and organized by the Instituto de Investigación en Informática LIDI (III-LIDI) and the Graduate Office of the Facultad de Informática of the UNLP, in collaboration with universities from Argentina and abroad.

    Precision-Aware application execution for Energy-optimization in HPC node system

    Power consumption is a critical consideration in high performance computing systems, and it is becoming the limiting factor in building and operating Petascale and Exascale systems. When studying the power consumption of existing systems running HPC workloads, we find that power, energy and performance are closely related, which opens the possibility of optimizing energy consumption without sacrificing performance (much, or at all). In this paper, we propose an HPC system running a GNU/Linux OS and a Real Time Resource Manager (RTRM) that is aware of and monitors the health of the platform. On the system, an application for disaster management runs with different QoS levels depending on the situation. We define two main scenarios. In normal execution, there is no risk of a disaster, though the system must still run to look ahead in case the situation changes suddenly. In the second scenario, the probability of a disaster is very high; then the allocation of more resources to improve precision, and the human decision, have to be taken into account. The paper shows that at design time it is possible to describe different optimal points that are then used at runtime by the RTOS together with the application. This environment helps a system that must run 24/7 to save energy, with the trade-off of losing precision. The paper shows a model execution in which increasing the number of iterations from 1e3 to 1e4 improves the precision of results by 65% on average. This also produces an execution time one order of magnitude longer, which leads to the need for a multi-node solution. The optimal trade-off between precision and execution time is computed by the RTOS with a time overhead of less than 10% compared to native execution.
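    The design-time/runtime split described above can be sketched as a table of operating points selected by a simple runtime policy. The point values and the selection rule are illustrative assumptions, not the paper's measured figures or the RTRM's actual algorithm.

```python
# Design-time operating points: (iterations, relative precision, time units).
# Values are illustrative placeholders inspired by the 1e3 -> 1e4 example.
OPERATING_POINTS = [
    (1_000,  0.60, 1.0),
    (10_000, 0.99, 10.0),
]

def select_point(risk_high, time_budget):
    """Runtime policy in the spirit of the paper's resource manager:
    under high disaster risk, take the most precise point that fits the
    time budget; otherwise take the cheapest point to save energy."""
    feasible = [p for p in OPERATING_POINTS if p[2] <= time_budget]
    if not feasible:
        # Nothing fits: fall back to the fastest point available.
        feasible = [min(OPERATING_POINTS, key=lambda p: p[2])]
    key = (lambda p: p[1]) if risk_high else (lambda p: -p[2])
    return max(feasible, key=key)
```

    Under normal operation the cheap 1e3-iteration point is chosen to save energy; when risk is high and the time budget allows, the policy switches to the precise 1e4-iteration point, mirroring the precision-versus-time trade-off the paper quantifies.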