Evaluating the Differences of Gridding Techniques for Digital Elevation Models Generation and Their Influence on the Modeling of Stony Debris Flows Routing: A Case Study From Rovina di Cancia Basin (North-Eastern Italian Alps)
Debris flows are among the most hazardous phenomena in mountain areas. To cope with debris flow hazard, it is common to delineate the risk-prone areas through routing models. The most important input to debris flow routing models is the
topographic data, usually in the form of Digital Elevation Models (DEMs). The quality
of DEMs depends on the accuracy, density, and spatial distribution of the sampled
points; on the characteristics of the surface; and on the applied gridding methodology.
Therefore, the choice of the interpolation method affects the realistic representation
of the channel and fan morphology, and thus potentially the debris flow routing
modeling outcomes. In this paper, we initially investigate the performance of common
interpolation methods (i.e., linear triangulation, natural neighbor, nearest neighbor,
Inverse Distance to a Power, ANUDEM, Radial Basis Functions, and ordinary kriging)
in building DEMs with the complex topography of a debris flow channel located
in the Venetian Dolomites (North-eastern Italian Alps), by using small footprint full-
waveform Light Detection And Ranging (LiDAR) data. The investigation is carried
out through a combination of statistical analysis of vertical accuracy, algorithm
robustness, and spatial clustering of vertical errors, and multi-criteria shape reliability
assessment. After that, we examine the influence of the tested interpolation algorithms
on the performance of a Geographic Information System (GIS)-based cell model for
simulating stony debris flow routing. In detail, we investigate both the correlation between the uncertainty in DEM heights resulting from the gridding procedure and the uncertainty in the corresponding simulated erosion/deposition depths, and the effect of the interpolation algorithms on simulated areas, erosion and deposition volumes, solid-liquid discharges, and channel morphology after the event. The comparison among the tested
interpolation methods highlights that the ANUDEM and ordinary kriging algorithms
are not suitable for building DEMs with complex topography. Conversely, the linear
triangulation, the natural neighbor algorithm, and the thin-plate spline with tension and completely regularized spline functions ensure the best trade-off between accuracy and shape reliability. Nevertheless, the evaluation of the effects of gridding techniques on debris flow routing modeling reveals that the choice of the interpolation algorithm does not significantly affect the model outcomes.
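Among the gridding techniques compared above, Inverse Distance to a Power is the simplest to state: each grid node receives a weighted mean of the sample heights, with weights decaying as a power of distance. A minimal sketch (toy coordinates and heights, not the paper's LiDAR data):

```python
import numpy as np

def idw_grid(xy, z, grid_xy, power=2.0, eps=1e-12):
    """Inverse Distance to a Power gridding: each grid node gets a
    weighted mean of the sample heights z, with weights 1/d**power."""
    # distances from every grid node to every sample point
    d = np.linalg.norm(grid_xy[:, None, :] - xy[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power      # eps avoids division by zero
    return (w @ z) / w.sum(axis=1)

# toy example: four sample points at the corners of the unit square
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([10.0, 12.0, 14.0, 16.0])
grid = np.array([[0.5, 0.5], [0.0, 0.0]])
heights = idw_grid(xy, z, grid)
```

A node equidistant from all samples gets their plain mean, while a node coinciding with a sample reproduces that sample's height, which is why IDW is an exact interpolator.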
Efficient Algorithms for Coastal Geographic Problems
The increasing performance of computers has made it possible to solve problems algorithmically for which manual and possibly inaccurate methods have previously been used. Nevertheless, one must still pay attention to the performance of an algorithm if huge datasets are used or if the problem is computationally difficult.
Two geographic problems are studied in the articles included in this thesis. In the first problem the goal is to determine distances from points, called study points, to shorelines in predefined directions. Together with other information, mainly related to wind, these distances can be used to estimate wave exposure at different areas. In the second problem the input consists of a set of sites where water quality observations have been made and of the results of the measurements at the different sites. The goal is to select a subset of the observational sites in such a manner that water quality is still measured with sufficient accuracy when monitoring at the other sites is stopped to reduce economic cost.
Most of the thesis concentrates on the first problem, known as the fetch length problem. The main challenge is that the two-dimensional map is represented as a set of polygons with millions of vertices in total, and the distances may also be computed for millions of study points in several directions. Efficient algorithms are developed for the problem, one of them approximate and the others exact except for rounding errors. The solutions also differ in that three of them are targeted for serial operation or for a small number of CPU cores, whereas one, together with its further developments, is also suitable for parallel machines such as GPUs.
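The fetch length problem amounts to casting a ray from a study point in a fixed direction and taking the distance to the nearest shoreline segment it hits. A brute-force sketch (an illustration only; the thesis's efficient algorithms avoid scanning every polygon edge per query):

```python
import math

def fetch_length(px, py, angle, segments):
    """Distance from study point (px, py) along direction `angle`
    (radians) to the nearest shoreline segment; math.inf if none is hit."""
    dx, dy = math.cos(angle), math.sin(angle)
    best = math.inf
    for (x1, y1), (x2, y2) in segments:
        ex, ey = x2 - x1, y2 - y1
        denom = dx * ey - dy * ex          # cross product of ray and segment
        if abs(denom) < 1e-12:             # parallel: no intersection
            continue
        # solve p + t*d = s1 + u*e for ray parameter t and segment parameter u
        t = ((x1 - px) * ey - (y1 - py) * ex) / denom
        u = ((x1 - px) * dy - (y1 - py) * dx) / denom
        if t > 0 and 0.0 <= u <= 1.0:
            best = min(best, t)
    return best
```

With millions of study points and vertices, this O(points x directions x segments) scan is exactly what makes the problem computationally demanding.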
In the water quality problem, the given set of stations has a large number of possible subsets. In addition, the task involves time-consuming operations such as linear regression, which further limits how many subsets can be examined. The solution therefore relies on heuristics, which do not necessarily produce an optimal result.
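One plausible form such a heuristic could take (a hypothetical illustration, not necessarily the thesis's actual method) is backward elimination: repeatedly drop the station whose readings the remaining stations predict best by least-squares linear regression:

```python
import numpy as np

def greedy_drop(obs, n_keep):
    """Backward-elimination heuristic. `obs` has shape (time, stations).
    Repeatedly drop the station whose series is best predicted (lowest
    RMSE) by a least-squares fit on the remaining stations."""
    kept = list(range(obs.shape[1]))
    while len(kept) > n_keep:
        best_j, best_err = None, np.inf
        for j in kept:
            rest = [k for k in kept if k != j]
            X = np.column_stack([obs[:, rest], np.ones(len(obs))])
            coef, *_ = np.linalg.lstsq(X, obs[:, j], rcond=None)
            rmse = np.sqrt(np.mean((X @ coef - obs[:, j]) ** 2))
            if rmse < best_err:
                best_j, best_err = j, rmse
        kept.remove(best_j)   # this station is nearly redundant
    return kept
```

The regression inside the inner loop is what makes each candidate subset expensive to evaluate, motivating heuristics over exhaustive search.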
Semi-automated modeling approaches to route selection in GIS
As an alternative to traditional graphical intuitive approaches (GIA), a semi-automated modeling approach (SMA) can more efficiently identify linear routes by using powerful iterative and automated methods. In this research, two case studies were investigated to examine critical issues relating to the accuracy and effectiveness of raster-defined algorithmic approaches to linear route location. The results illustrate that different shortest-path algorithms do not necessarily result in markedly different linear routes. However, differing results can occur when using different neighboring-cell links in the cell-based route network construction. Cell-based algorithmic approaches in both Arc/Info and IDRISI software generate very similar results, which are comparable to linear modeling with greater than eight neighboring-cell links. Given a specific shortest-path algorithm and its route searching technique, the use of a finer spatial resolution only results in a narrower and smoother route corridor. Importantly, cost surface models can be generated to represent differing cumulative environmental 'costs' or impacts, in which different perceptions of environmental cost can be simulated and evaluated.
Three different simulation techniques, comprising Ordered Weighted Combination models (OWC), Dynamic Decision Space (DDS), and Gateway-based approaches, were used to address problems associated with concurrent and dynamic changes in multi-objective decision space. These approaches provide efficient and flexible simulation capability within a dynamic and changing decision space. When aggregation data models were used within a Gateway approach, the match of resulting routes between GIA and SMA analyses is close. The effectiveness of SMA is greatly limited when confronted by extensive linear and impermeable barriers or where data is sparse.
Overall, achieving consensus on environmental cost surface generation and criteria selection is a prerequisite for a successful SMA outcome. It is concluded that SMA has several advantages that complement a GIA in linear route siting and spatial decision-making.
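A cell-based shortest-path search of the kind discussed above can be sketched with Dijkstra's algorithm over a raster cost surface using eight neighboring-cell links (a generic illustration; the accumulation rules in Arc/Info and IDRISI differ in detail, e.g. in how diagonal moves are costed):

```python
import heapq

def least_cost_path_cost(cost, start, goal):
    """Dijkstra over a raster cost surface with 8 neighboring-cell links.
    Entering a cell adds that cell's cost (a common simplification)."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    pq = [(dist[start], start)]
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
            (0, 1), (1, -1), (1, 0), (1, 1)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue                       # stale queue entry
        for dr, dc in nbrs:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")
```

Restricting `nbrs` to the four orthogonal links reproduces the coarser route networks whose results were found to differ from richer link sets.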
Route Planning in Transportation Networks
We survey recent advances in algorithms for route planning in transportation
networks. For road networks, we show that one can compute driving directions in
milliseconds or less even at continental scale. A variety of techniques provide
different trade-offs between preprocessing effort, space requirements, and
query time. Some algorithms can answer queries in a fraction of a microsecond,
while others can deal efficiently with real-time traffic. Journey planning on
public transportation systems, although conceptually similar, is a
significantly harder problem due to its inherent time-dependent and
multicriteria nature. Although exact algorithms are fast enough for interactive
queries on metropolitan transit systems, dealing with continent-sized instances
requires simplifications or heavy preprocessing. The multimodal route planning
problem, which seeks journeys combining schedule-based transportation (buses,
trains) with unrestricted modes (walking, driving), is even harder, relying on
approximate solutions even for metropolitan inputs.
Comment: This is an updated version of the technical report MSR-TR-2014-4, previously published by Microsoft Research. This work was mostly done while the authors Daniel Delling, Andrew Goldberg, and Renato F. Werneck were at Microsoft Research Silicon Valley.
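One concrete instance of the preprocessing/query-time trade-off surveyed here is the ALT approach: store shortest-path distances from a few landmarks during preprocessing, then use triangle-inequality lower bounds as an A* heuristic at query time. A single-landmark sketch for an undirected graph:

```python
import heapq

def dijkstra_all(graph, src):
    """Plain Dijkstra from src; graph maps node -> list of (nbr, weight)."""
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return dist

def alt_query(graph, s, t, lm_dist):
    """A* with a landmark lower bound (the ALT idea): on an undirected
    graph, |d(L,t) - d(L,v)| lower-bounds d(v,t) by the triangle
    inequality, steering the search toward the target."""
    def h(v):
        return abs(lm_dist[t] - lm_dist[v])
    dist = {s: 0}
    pq = [(h(s), s)]
    while pq:
        f, u = heapq.heappop(pq)
        if u == t:
            return dist[t]
        if f - h(u) > dist.get(u, float("inf")):
            continue                       # stale queue entry
        for v, w in graph[u]:
            nd = dist[u] + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd + h(v), v))
    return float("inf")
```

Here the preprocessing (`dijkstra_all` per landmark) trades space for query speed; production systems use many landmarks and far more aggressive techniques.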
Surface motion prediction and mapping for road infrastructures management by PS-InSAR measurements and machine learning algorithms
This paper introduces a methodology for predicting and mapping surface motion beneath road pavement structures caused by environmental factors. Persistent Scatterer Interferometric Synthetic Aperture Radar (PS-InSAR) measurements, geospatial analyses, and Machine Learning Algorithms (MLAs) are employed for achieving the purpose. Two single learners, i.e., Regression Tree (RT) and Support Vector Machine (SVM), and two ensemble learners, i.e., Boosted Regression Trees (BRT) and Random Forest (RF), are utilized for estimating the surface motion ratio in terms of mm/year over the Province of Pistoia (Tuscany Region, central Italy, 964 km²), in which strong subsidence phenomena have occurred. The interferometric processing of 210 Sentinel-1 images from 2014 to 2019 allows exploiting the average displacements of 52,257 Persistent Scatterers as output targets to predict. A set of 29 environment-related factors are preprocessed by SAGA-GIS, version 2.3.2, and ESRI ArcGIS, version 10.5, and employed as input features. Once the dataset has been prepared, three wrapper feature selection approaches (backward, forward, and bi-directional) are used for recognizing the set of most relevant features to be used in the modeling. A random splitting of the dataset into 70% and 30% is implemented to identify the training and test sets. Through a Bayesian Optimization Algorithm (BOA) and a 10-Fold Cross-Validation (CV), the algorithms are trained and validated. Then, the predictive performance of the MLAs is evaluated and compared by plotting a Taylor Diagram. Outcomes show that SVM and BRT are the most suitable algorithms; in the test phase, BRT has the highest Correlation Coefficient (0.96) and the lowest Root Mean Square Error (0.44 mm/year), while SVM has the lowest difference between the standard deviation of its predictions (2.05 mm/year) and that of the reference samples (2.09 mm/year). Finally, the algorithms are used for mapping surface motion over the study area.
We propose three case studies on critical stretches of two-lane rural roads for evaluating the reliability of the procedure. Road authorities could consider the proposed methodology for their monitoring, management, and planning activities.
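The model comparison via a Taylor diagram rests on three statistics per model: the correlation coefficient, the centered root-mean-square error, and the standard deviations of predictions and reference samples, linked by a law-of-cosines identity. A minimal sketch (illustrative values, not the paper's data):

```python
import math

def taylor_stats(pred, obs):
    """Statistics summarized by a Taylor diagram: correlation
    coefficient, centered RMSE, and the two standard deviations."""
    n = len(pred)
    mp, mo = sum(pred) / n, sum(obs) / n
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred) / n)
    so = math.sqrt(sum((o - mo) ** 2 for o in obs) / n)
    cov = sum((p - mp) * (o - mo) for p, o in zip(pred, obs)) / n
    corr = cov / (sp * so)
    # law of cosines linking the three axes of the diagram
    crmse = math.sqrt(max(0.0, sp**2 + so**2 - 2 * sp * so * corr))
    return corr, crmse, sp, so
```

A model like the paper's SVM, whose prediction spread nearly matches the reference spread, sits close to the reference-standard-deviation arc of the diagram.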
Testing the Temporal Stability of Accessibility Value in Residential Hedonic Prices
Purpose – This paper bridges the gap between, on the one hand, supply-driven (urban form and transportation networks) and demand-driven (action-based) accessibility to urban amenities and, on the other hand, house price dynamics as captured through panel hedonic modelling. It aims at assessing temporal changes in the valuation of accessibility, while ordering households' priorities among access to the labour market, schools and shopping outlets. Design/methodology/approach – Several indexes are built using a methodology developed by Thériault et al. (2005, published in the Journal of Property Investment and Finance). They integrate car-based travel time on the road network (using GIS), the distribution of opportunities (activity places) within the city, and the willingness of persons to travel in order to reach specific types of activity places (mobility behaviour). While some measure centrality (potential attractiveness considering travel time, population and opportunities), others consist of action-based indexes using fuzzy logic and capture the willingness to travel in order to reach actual specific activity places (work places, schools, shopping centres, groceries). They summarise suitable opportunities available from each neighbourhood. Rescaled indices (from worst to 100 = best) are inserted simultaneously into a multiplicative hedonic model of single-family houses sold in Quebec City during the years 1986, 1991 and 1996 (10,269 transactions). Manipulations of accessibility indexes are developed for ordering their relative impact on sale prices and isolating the effects of each index on the variation of sale price, thus providing proxies of households' priorities. Moreover, a panel-like modelling approach is used to control for changes in the valuation of each property-specific, taxation or accessibility attribute during the study period.
Findings – This original approach proves efficient in isolating the cross-effects of urban centrality from accessibility to several types of amenities, while controlling for multicollinearity and heteroscedasticity. Results are in line with expectations. While only a few property-specific attributes experience a change in their marginal contribution to house value during the study period, all accessibility indexes do. Every single accessibility index has a much stronger effect on house values than centrality (which is still marginally significant). When buying their home, households put more emphasis on access to schools than on access to the labour market, which in turn prevails over accessibility to either shopping centres or, finally, groceries. The ordering is rather stable, but the actual valuation of a specific amenity may change over time. Practical implications – Better understanding the effect of accessibility to amenities on house values provides guidelines for choosing among a set of new neighbourhoods to develop in order to generate optimal fiscal effects for municipalities. It could also provide guidelines for decision making when improving transportation networks or locating new activity centres.
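A multiplicative hedonic model of the kind used here is typically estimated by regressing log prices on the attributes, so each coefficient reads as an approximate percentage effect on price. A toy sketch with one hypothetical accessibility index (not the paper's specification):

```python
import numpy as np

def hedonic_fit(prices, attrs):
    """Multiplicative hedonic model: regress log(price) on attributes;
    returns attribute coefficients followed by the intercept."""
    X = np.column_stack([attrs, np.ones(len(prices))])
    coef, *_ = np.linalg.lstsq(X, np.log(prices), rcond=None)
    return coef

# toy data: price = 100000 * exp(0.02 * accessibility_index),
# i.e. each index point is worth roughly a 2% price premium
idx = np.array([10.0, 30.0, 50.0, 70.0, 90.0])
prices = 100000.0 * np.exp(0.02 * idx)
coef = hedonic_fit(prices, idx[:, None])
```

With several rescaled indexes as columns of `attrs`, the fitted coefficients can be compared directly, which is the basis for ordering households' priorities across amenities.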
Multi-sensors integrated system for landslide monitoring: critical issues in system setup and data management
This paper discusses critical issues related to the reliability of topographic monitoring systems such as ATS (Automated Total Stations), GNSS (Global Navigation Satellite System) and Ground-Based InSAR, focusing on controlling the stability of the network infrastructure, which influences data correction procedures but is often taken for granted, and on integrating results in a GIS (Geographic Information System), under a common reference framework and with respect to open-access ancillary data. The novelty of the paper lies in the demonstration of the efficiency obtained by a proper implementation of the system. The discussion refers to an active landslide monitored by ATS, GNSS and Ground-Based InSAR in continuous and periodic mode.