2,954 research outputs found

    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    The aim of this paper is to present the development of a multidisciplinary project carried out in cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was to train students attending Architecture and Engineering courses in geospatial data acquisition and processing, in order to start up a team of "volunteer mappers". The project aims to document the environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in geospatial data collection, integration and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, registered in the World Heritage List since 1997 and affected by a flood on 25 October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial LiDAR, close-range and aerial photogrammetry, and topographic and GNSS instruments, or by non-conventional systems and instruments such as UAVs and mobile mapping. The ultimate goal is to implement a WebGIS platform to share all the collected data with local authorities and the Civil Protection.

    Workshop sensing a changing world : proceedings workshop November 19-21, 2008


    Survey of the research of ICT Applications in the AEC Industry: a view from two mainstream journals

    The application of information and communication technology (ICT) in the Architecture, Engineering and Construction (AEC) industry has attracted much attention from researchers in recent years. However, a comprehensive review is still missing from the existing literature. This paper aims to provide a comprehensive overview of the state-of-the-art research on ICT applications in the AEC industry. A total of 432 articles, published during 2011-2015 in two mainstream journals, namely Automation in Construction and Journal of Computing in Civil Engineering, are selected and analyzed. This review is conducted from three different views: 1) the view of the construction project lifecycle, aiming to investigate the distribution of research on ICT application in different stages; 2) the view of ICT technologies, aiming to identify the popular ICTs in the AEC industry that researchers focus on; and 3) the view of ICT application areas, aiming to identify the areas in the AEC industry in which ICTs are applied. Throughout this review, the distribution of ICT research across four different stages (i.e., design stage, construction stage, operation & maintenance stage, and multi-stage) is first investigated. A total of 24 types of ICTs, categorized into 8 groups, and 19 ICT application areas are then identified and analyzed. In addition, limitations of this review are also discussed.

    Characterization of extreme weather events on Italian roads

    According to climate modellers, the probability, frequency, duration and intensity (severity) of extreme weather events (extreme temperatures and rainfall) are increasing, and such events will be more frequent in the future. The former will lead to higher surface runoff and flood events, while the latter will cause landslide phenomena and disruption of the road network. The impact of such events depends greatly on the physical, hydraulic and mechanical properties of soils. The increasing number of extreme events in winter in recent years has demonstrated the paramount importance of effective and integrated management of land resources in the protection of the environment and of the road network. In Italy more than 10% of the territory has been classified as having a high or very high hydro-geological risk, affecting 80% of Italian municipalities. The impacts on the population and the economic damages are considerable: in Italy over the last 20 years, floods and landslides have affected more than 70 000 people and caused economic damage of at least 11 billion euro. Since 2000, the Italian Ministry for the Environment has entrusted ISPRA with the task of monitoring the programmes of emergency measures to reduce hydrogeological risk (the ReNDiS project, a database of mitigation measures against floods and landslides).

    The role of big data in smart city

    The expansion of big data and the evolution of Internet of Things (IoT) technologies have played an important role in the feasibility of smart city initiatives. Big data offer the potential for cities to obtain valuable insights from a large amount of data collected through various sources, and the IoT allows the integration of sensors, radio-frequency identification, and Bluetooth in the real-world environment using highly networked services. The combination of the IoT and big data is an unexplored research area that has brought new and interesting challenges for achieving the goal of future smart cities. These new challenges focus primarily on problems related to business and technology that enable cities to actualize the vision, principles, and requirements of the applications of smart cities by realizing the main smart environment characteristics. In this paper, we describe the existing communication technologies and smart-based applications used within the context of smart cities. The visions of big data analytics to support smart cities are discussed by focusing on how big data can fundamentally change urban populations at different levels. Moreover, a future business model that can manage big data for smart cities is proposed, and the business and technological research challenges are identified. This study can serve as a benchmark for researchers and industries for the future progress and development of smart cities in the context of big data.

    Review of Web Mapping: Eras, Trends and Directions

    Web mapping and the use of geospatial information online have evolved rapidly over the past few decades. Almost everyone in the world uses mapping information, whether or not they realize it. Almost every mobile phone now has location services, and every event and object on the earth has a location. The use of this geospatial location data has expanded rapidly, thanks to the development of the Internet. Huge volumes of geospatial data are available and are captured online daily, and are used in web applications and maps for viewing, analysis, modeling and simulation. This paper reviews the development of web mapping from the first static online map images to the current highly interactive, multi-sourced web mapping services that have increasingly moved to cloud computing platforms. The whole environment of web mapping captures the integration and interaction between three components found online, namely geospatial information, people and functionality. In this paper, the trends and interactions among these components are identified and reviewed in relation to technology developments. The review concludes by exploring some of the opportunities and directions.

    A Hybrid Analytic Network Process and Artificial Neural Network (ANP-ANN) model for urban Earthquake vulnerability assessment

    Vulnerability assessment is one of the prerequisites for risk analysis in disaster management. Vulnerability to earthquakes, especially in urban areas, has increased over the years due to the presence of complex urban structures and rapid development. Urban vulnerability is a result of human behavior which describes the extent of susceptibility or resilience of social, economic, and physical assets to natural disasters. The main aim of this paper is to develop a new hybrid framework using Analytic Network Process (ANP) and Artificial Neural Network (ANN) models for constructing a composite social, economic, environmental, and physical vulnerability index. This index was then applied to Tabriz City, located in a seismic-prone province in the northwestern part of Iran with recurring devastating earthquakes and consequent heavy casualties and damages. A Geographical Information Systems (GIS) analysis was used to identify and evaluate quantitative vulnerability indicators for generating an earthquake vulnerability map. The classified and standardized indicators were subsequently weighted and ranked using an ANP model to construct the training database. Then, standardized maps coupled with the training site maps were presented as input to a Multilayer Perceptron (MLP) neural network for producing an Earthquake Vulnerability Map (EVM). Finally, an EVM was produced for Tabriz City and the level of vulnerability in various zones was obtained. The south and southeast regions of Tabriz City show low to moderate vulnerability, while some zones of the northeastern tract are under critical vulnerability conditions. Furthermore, the impact of the vulnerability of Tabriz City on its population during an earthquake was included in this analysis for risk estimation. A comparison of the result produced by the EVM with the Population Vulnerability (PV) of Tabriz City corroborated the validity of the results obtained by ANP-ANN. The findings of this paper are useful for decision-makers and government authorities to obtain a better knowledge of a city's vulnerability dimensions, and to adopt preparedness strategies for Tabriz City in the future. The developed hybrid framework of ANP and ANN models can easily be replicated and applied to other urban regions around the world for sustainability and environmental management.
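    As a rough illustration of the ANP-ANN coupling described above (not the authors' implementation; the indicator names, weights, and network size below are invented for the example), a composite index can be built from weighted, standardized indicators and an MLP trained to reproduce it:

    ```python
    # Minimal sketch, assuming four standardized indicator groups per spatial unit
    # and hypothetical ANP-style priority weights; not the paper's actual pipeline.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Each row is one spatial unit (e.g. a city zone); columns are standardized
    # social, economic, environmental and physical indicators in [0, 1].
    indicators = rng.random((500, 4))

    # Hypothetical ANP-derived priority weights for the four indicator groups.
    anp_weights = np.array([0.35, 0.25, 0.15, 0.25])

    # Composite vulnerability index used here as the training target.
    target = indicators @ anp_weights

    # MLP that learns to map raw indicator layers to the composite score.
    mlp = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
    mlp.fit(indicators, target)

    # Predicted vulnerability for new zones; mapping these values back to their
    # polygons in a GIS would yield an earthquake vulnerability map (EVM).
    new_zones = rng.random((5, 4))
    print(mlp.predict(new_zones))
    ```

    In the paper the training targets come from ANP-ranked training sites rather than a direct linear combination, so this sketch only conveys the general data flow from weighted indicators to an MLP-produced vulnerability surface.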

    The Analysis of Open Source Software and Data for Establishment of GIS Services Throughout the Network in a Mapping Organization at National or International Level

    Federal agencies and their partners collect and manage large amounts of geospatial data, but these data are often not easily found when needed, and sometimes data are collected or purchased multiple times. In short, the best government data are not always organized and managed efficiently to support decision making in a timely and cost-effective manner. National mapping agencies and the various departments responsible for the collection of different types of geospatial data cannot, for very long, continue to operate as they did a few years ago, like people living on an island. Leaders need to look at what is now possible that was not possible before, considering capabilities such as cloud computing, crowd-sourced data collection, openly available remotely sensed data and the multi-source information vital to decision-making, as well as new Web-accessible services that are provided, sometimes at no cost. Many of these services previously could be obtained only from local GIS experts. These authorities need to consider the available solutions and gather information about new capabilities, reconsider agency missions and goals, review and revise policies, make budget and human resource decisions, and evaluate new products, cloud services, and cloud service providers. To do so, we need to choose the right tools to reach the above-mentioned goals. Data collection is the most costly part of mapping and of establishing a Geographic Information System, not only because of the cost of the data collection task itself but also because of the damage caused by delays and the time it takes to provide the user with the information necessary for decision making, from the field up to the user's hand. In fact, the time a project consumes for the collection, processing and presentation of geospatial information has an even greater effect on the cost of larger projects such as disaster management, construction, city planning, the environment, etc., under the assumption that all the necessary information from existing sources is delivered directly to the user's computer. The best description of a good GIS project optimization or improvement is finding a methodology to reduce time and cost and to increase data and service quality (meaning accuracy, up-to-dateness, completeness, consistency, suitability, information content, integrity, integration capability, and fitness for use, as well as the user's specific needs and conditions, which must be addressed with special attention). Every one of the above-mentioned issues must be addressed individually and, at the same time, the whole solution must be provided in a global manner considering all the criteria. In this thesis, we first discuss the problem we are facing and what needs to be done to establish a National Spatial Data Infrastructure (NSDI), its definition and related components. Thereafter, we look for available open source software solutions to cover the whole process: data collection, database management, data processing, and finally data services and presentation. The first distinction among software packages is whether they are open source and free, or commercial and proprietary. It is important to note that, in order to make this distinction, it is necessary to define a clear specification for the categorization.
It is somewhat difficult to establish, from a legal point of view, which software belongs to which class, and it is therefore necessary to clarify what is meant by the various terms. With reference to this concept there are two global distinctions; then, inside each group, we distinguish a further classification regarding the functionalities and applications the software is made for in GIScience. Based on the outcome of the second chapter, which is the technical process for selecting suitable and reliable software according to the characteristics of the users' needs and the required components, we move to the next chapter. In Chapter 3, we elaborate on the details of the GeoNode software as our best candidate tool to take responsibility for the issues stated before. In Chapter 4, we discuss the globally available open source data, with predefined data quality criteria (such as theme, data content, scale, licensing, and coverage) according to the metadata statements inside the datasets, by means of bibliographic review, technical documentation and web search engines. In Chapter 5 we discuss further data quality concepts and consequently define a set of protocols for the evaluation of all datasets according to the tasks for which a mapping organization, in general, is responsible towards its probable users in different disciplines, such as reconnaissance, city planning, topographic mapping, transportation, environmental control, disaster management, etc. In Chapter 6, all the data quality assessments and protocols are applied to the pre-filtered, proposed datasets. In the final scores and ranking, each dataset receives a value corresponding to its quality according to the set of rules defined in the previous chapter. In the last step, a vector of weights is derived from questions that have to be answered by the user with reference to the project at hand, in order to finalize the most appropriate selection of free and open source data; this data quality preference is defined by identifying a weight vector, which is then applied to the quality matrix to obtain the final quality scores and ranking. At the end of this chapter there is a section presenting the use of the datasets in various projects, such as "Early Impact Analysis" and "Extreme Rainfall Detection System (ERDS) - version 2", performed by ITHACA. Finally, in the conclusion, the important criteria as well as future trends in GIS software are discussed, and recommendations are presented.
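    The weighted scoring step described above reduces to applying a user-defined weight vector to a quality matrix. The sketch below assumes an illustrative matrix (datasets × criteria, already filled in by the evaluation protocols) and a normalized weight vector; the dataset names, criteria and numbers are placeholders, not values from the thesis:

    ```python
    # Minimal sketch of weight-vector scoring and ranking of candidate datasets.
    import numpy as np

    criteria = ["accuracy", "currency", "completeness", "licensing", "coverage"]
    datasets = ["OpenStreetMap", "SRTM DEM", "Sentinel-2 land cover"]

    # Hypothetical quality scores (0-5) assigned by the evaluation protocols.
    quality = np.array([
        [4, 5, 3, 5, 4],
        [3, 2, 5, 5, 5],
        [4, 4, 4, 4, 5],
    ], dtype=float)

    # Hypothetical weight vector derived from the user's answers, summing to 1.
    weights = np.array([0.3, 0.2, 0.2, 0.1, 0.2])

    scores = quality @ weights          # final quality score per dataset
    ranking = np.argsort(scores)[::-1]  # best first

    for idx in ranking:
        print(f"{datasets[idx]:<25} {scores[idx]:.2f}")
    ```

    Normalizing the weights to sum to one keeps the final scores on the same 0-5 scale as the individual criteria, which makes the ranking easy to interpret against the original protocol scores.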

    Emerging Informatics

    This book on emerging informatics brings together new concepts and applications that help define and outline problem-solving methods and features in designing business and human systems. It covers international aspects of information systems design in which many relevant technologies are introduced for the welfare of human and business systems. This initiative can be viewed as an emergent area of informatics that helps to better conceptualise and design new world-class solutions. The book provides four flexible sections that accommodate a total of fourteen chapters. The sections specify learning contexts in emerging fields. Each chapter presents a clear basis through the problem conception and its applicable technological solutions. I hope this will help further exploration of knowledge in the informatics discipline.

    A Haze Removal Technique For Satellite Remote Sensing Data Based On Spectral And Statistical Methods

    Haze originating from forest fires burning in Indonesia has become a problem for South-east Asian countries including Malaysia. Haze affects data recorded by satellites due to the attenuation of solar radiation by haze constituents. This causes problems for remote sensing data users who require continuous data, particularly for land cover mapping. There are a number of haze removal techniques, but they suffer from limitations since they are developed and designed for particular regions, i.e. mid-latitude and high-latitude countries. Almost no haze removal techniques are developed and designed for countries within the equatorial region, where Malaysia is located. This study is meant to identify the effects of haze on remote sensing data, to develop a haze removal technique suitable for the equatorial region, especially Malaysia, and to evaluate and test it. Initially, spectral and statistical analyses of simulated haze datasets are carried out to identify the effects of haze on remote sensing data. Land cover classification using a support vector machine (SVM) is carried out in order to investigate the haze effects on different land covers. The outcomes of the analyses are used in designing and developing the haze removal technique. Haze radiances due to radiation attenuation are removed by making use of pseudo-invariant features (PIFs) selected among reflective objects within the study area. Spatial filters are subsequently used to remove the remaining noise caused by haze variability. The technique is applied to a simulated hazy dataset for performance evaluation and then tested on a real hazy dataset. It is revealed that the technique is able to remove haze and improve data usability for visibility ranging from 6 to 12 km. Haze removal is not necessary for data with visibility of more than 12 km, because such data already produce classification accuracy of more than 85%, i.e. the acceptable accuracy. Nevertheless, for data with visibility below 6 km, the technique is unable to improve the accuracy to the acceptable level due to the severe modification of the spectral and statistical properties caused by haze.
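    A minimal sketch of the two correction steps described in the abstract (PIF-based haze offset estimation followed by spatial filtering), using synthetic single-band data; the additive-haze model, the PIF threshold and the filter size are illustrative assumptions, not the thesis parameters:

    ```python
    # Minimal sketch: estimate a per-band haze offset from pseudo invariant
    # features (PIFs) and subtract it, then median-filter the residual noise.
    import numpy as np
    from scipy.ndimage import median_filter

    rng = np.random.default_rng(1)

    clear_band = rng.uniform(0.05, 0.4, size=(100, 100))              # haze-free reference
    hazy_band = clear_band + 0.08 + rng.normal(0, 0.01, (100, 100))   # additive haze + noise

    # PIFs: bright, temporally stable targets (e.g. roofs, bare rock) whose
    # reflectance should not change between the reference and the hazy scene.
    pif_mask = clear_band > 0.35

    # Haze radiance estimated as the mean offset over the PIFs, then removed.
    haze_offset = (hazy_band[pif_mask] - clear_band[pif_mask]).mean()
    corrected = hazy_band - haze_offset

    # Spatial (median) filtering to suppress remaining noise caused by haze variability.
    corrected = median_filter(corrected, size=3)

    print(f"estimated haze offset: {haze_offset:.3f}")
    ```

    In practice the PIFs would be selected per spectral band from bright, stable targets in the scene and the correction repeated band by band before classification.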