Canopy height estimation from lidar data using open source software compared with commercial software
The goal of this study is to analyze the performance of Open Source Software (OSS) in generating a Digital Terrain Model (DTM) and a Digital Surface Model (DSM), and subsequently in estimating canopy height, using Light Detection and Ranging (LIDAR) data. Accurate generation of the DTM and DSM is essential in this research to ensure that canopy height can be modeled well. The DTM and DSM are commonly known as digital representations of the earth's surface topography, where the DTM represents only the ground surface while the DSM represents all features, including buildings and trees. Much software capable of generating DTMs and DSMs has been developed recently; however, most of it is commercial and therefore expensive to own. Advances in technology have led to the emergence of a growing body of OSS, software that can be downloaded for free via the internet. Taking the forest area of Pekan, Pahang, as the study site, LIDAR data for that area were processed using the OSS Geographic Resources Analysis Support System (GRASS). To determine the effectiveness and capability of GRASS in DTM and DSM generation, the same data were processed using commercial software, TerraScan, so that the results could be compared and a better canopy height model obtained.
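The core operation behind both workflows is the same: once the DTM and DSM are gridded, canopy height is their per-cell difference. A minimal NumPy sketch of that step, with small synthetic elevation grids standing in for the LIDAR-derived surfaces (the values and grid size are illustrative assumptions, not GRASS or TerraScan output):

```python
import numpy as np

# Minimal sketch of canopy height model (CHM) derivation, assuming the
# DTM and DSM have already been gridded to the same extent and resolution
# (here: toy 3x3 elevation grids in metres).
dtm = np.array([[10.0, 10.2, 10.4],
                [10.1, 10.3, 10.5],
                [10.2, 10.4, 10.6]])  # bare-earth elevations
dsm = np.array([[22.0, 10.2, 18.4],
                [10.1, 25.3, 10.5],
                [19.2, 10.4, 10.6]])  # first-return surface elevations

# Canopy height is the per-cell difference; clip small negatives that
# can arise from interpolation noise at the ground surface.
chm = np.clip(dsm - dtm, 0.0, None)
print(chm.max())  # height of the tallest canopy element in the grid
```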
Effect of the Resolution and Accuracy of DTM produced with Aerial Photogrammetry and Terrestrial Laser Scanning on Slope- and Catchment-scale Erosion Assessment in a Recently Burnt Forest Area: a Case Study
Wildfires are a common phenomenon in Portugal, affecting on average 100,000 ha of rural areas per year and up to 400,000 ha in dramatic years like 2003 and 2005. Wildfires can strongly enhance the hydrological response and associated sediment losses in recently burnt forest catchments and, thereby, negatively affect the land-use sustainability of the affected terrains as well as the ecosystem functioning of downstream aquatic habitats. Therefore, the EROSFIRE-I and -II projects aim at developing a GIS tool for predicting soil erosion hazard following wildfire and, ultimately, for assessing the implications of alternative post-fire land management practices.
Assessment of runoff and soil erosion rates critically depends on accurate estimates of the corresponding runoff areas. In the case of catchments, as well as of unbounded erosion plots (arguably the only practical solution for slope-scale measurements), delineation of the runoff area requires a Digital Terrain Model (DTM) with adequate resolution and accuracy. The DTM that was available for the EROSFIRE-II study area of Colmeal, located in the Lousã mountain range in central Portugal, was that of the 1:25,000 topographic map produced by the Military Geographic Institute. Since the Colmeal area involves a rather small experimental catchment of roughly 10 ha and relatively short study slopes of less than 100 m, two different data acquisition techniques were used to produce a high-resolution, high-accuracy DTM: aerial photogrammetry and terrestrial laser scanning. To produce a DTM by photogrammetric means, a dedicated digital aerial photography mission was carried out, yielding images with a pixel size of 10 cm. Manual measurements of breaklines were complemented by automatic measurements, producing a DTM in TIN format, which was further converted to grid format using the ArcGIS software system. Signalised control points allowed the DTM to be obtained in the same global reference system as that employed for the terrestrial laser scanning. The terrestrial laser scanning was done using a Riegl LMS Z360I stationed at 8 points within the area to provide complete coverage. The resulting dense point cloud was filtered, by the company carrying out the scanning mission, to remove non-terrain points (in particular vegetation). Several grids of different cell sizes were produced (0.10 x 0.10, 0.20 x 0.20, 0.50 x 0.50, 1 x 1 and 2 x 2 m2).
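One simple way to derive the coarser grids from the finest one is block averaging: each coarse cell takes the mean of the fine cells it covers. The sketch below is an illustrative assumption about the regridding step, not the study's actual procedure (a GIS resampler would normally be used), and it assumes the fine grid's dimensions are divisible by the aggregation factor:

```python
import numpy as np

# Sketch of regridding a fine DTM (e.g. 0.10 m cells) to the coarser
# grids used in the study (0.20, 0.50, 1 and 2 m) by block averaging.
def block_average(dtm, factor):
    rows, cols = dtm.shape
    # Fold each (factor x factor) block into its own axes, then average.
    return dtm.reshape(rows // factor, factor,
                       cols // factor, factor).mean(axis=(1, 3))

fine = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 fine DTM
coarse = block_average(fine, 2)                  # 2x2 grid of block means
print(coarse)
```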
This work will study the effect on runoff and erosion rates, at the slope and catchment scale, of DTMs with different resolutions but produced from data collected with the same acquisition technique, and of DTMs with the same resolution but produced from data collected with the two different acquisition techniques. The study is being carried out in ArcGIS using DTMs in grid format. Preliminary results suggest that TIN-to-grid conversion in ArcGIS produces results that depend on the procedure being applied. Therefore, the different algorithms available in ArcGIS for TIN-to-grid conversion are currently being tested using an artificially produced DTM. This testing includes various interpolation techniques for grid generation, and will be extended to different algorithms for the computation of drainage flow direction.
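The classic drainage-direction algorithm on grid DTMs is D8: each cell drains to whichever of its eight neighbours gives the steepest descent. The sketch below illustrates that idea for a single interior cell; it is an illustrative assumption about the family of algorithms being compared, not the ArcGIS implementation:

```python
import numpy as np

# D8 flow direction for one interior cell: pick the neighbour with the
# steepest downward drop, dividing by a longer distance for diagonals.
def d8_direction(dem, r, c, cellsize=1.0):
    best, best_drop = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            dist = cellsize * (2 ** 0.5 if dr and dc else 1.0)
            drop = (dem[r, c] - dem[r + dr, c + dc]) / dist
            if drop > best_drop:
                best_drop, best = drop, (dr, dc)
    return best  # None means a pit or flat cell

dem = np.array([[5.0, 4.0, 5.0],
                [5.0, 3.0, 5.0],
                [5.0, 2.0, 5.0]])
print(d8_direction(dem, 1, 1))  # the centre cell drains to the cell below
```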
Position criticality in chess endgames
Some 50,000 Win Studies in Chess challenge White to find an effectively unique route to a win. Judging the impact of less than absolute uniqueness requires both technical analysis and artistic judgment. Here, for the first time, an algorithm is defined to help analyse uniqueness in endgame positions objectively. The key idea is to examine how critical certain positions are to White in achieving the win. The algorithm uses sub-n-man endgame tables (EGTs) for both Chess and relevant, adjacent variants of Chess. It challenges authors of EGT generators to generalise them to create EGTs for these chess variants. It has already proved efficient and effective in an implementation for Starchess, itself a variant of chess. The approach also addresses a number of similar questions arising in endgame theory, games and compositions.
Depth to mate and the 50-move rule
The most popular endgame tables (EGTs) documenting ‘DTM’ (Depth to Mate) in chess endgames are those of Eugene Nalimov, but these do not recognise the FIDE 50-move rule ‘50mr’. This paper marks the creation by the first author of EGTs for sub-6-man (s6m) chess and beyond which give DTM as affected by the ply count pc. The results are put into the context of previous work recognising the 50mr and are compared with the original unmoderated DTM results. The work is also notable for being the first EGT generation work to use the functional programming language Haskell.
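The interaction between a DTM value and the ply count can be captured in a toy form: the 50-move rule lets a draw be claimed once 100 plies pass without a capture or pawn move, so a mate that is d plies away (assuming, for illustration, that no zeroing move occurs along the mating line) is only realisable if d fits in the remaining ply budget. This is a simplified sketch of the moderation the paper studies, not its EGT algorithm:

```python
# Toy sketch: can a DTM win still be realised under the 50-move rule?
# dtm_plies: plies to mate with best play.
# ply_count: plies already played since the last capture or pawn move.
# Assumes no zeroing move occurs on the mating line (illustrative only).
def dtm_win_under_50mr(dtm_plies, ply_count):
    return dtm_plies <= 100 - ply_count

print(dtm_win_under_50mr(30, 60))  # True: 30 plies fit in the 40 remaining
print(dtm_win_under_50mr(59, 60))  # False: mate would arrive too late
```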
KQQKQQ and the Kasparov-World Game
The 1999 Kasparov-World game for the first time enabled anyone to join a team playing against a World Chess Champion via the web. It included a surprise in the opening, complex middle-game strategy and a deep ending. As the game headed for its mysterious finale, the World Team requested a KQQKQQ endgame table and was provided with two by the authors. This paper describes their work, compares the methods used, examines the issues raised and summarises the concepts involved for the benefit of future workers in the endgame field. It also notes the contribution of this endgame to chess itself.
Executing large orders in a microscopic market model
In a recent paper, Alfonsi, Fruth and Schied (AFS) propose a simple order-book-based model for the impact of large orders on stock prices. They use this model to derive optimal strategies for the execution of large orders. We apply these strategies to an agent-based stochastic order book model that was recently proposed by Bovier, Černý and Hryniv, but already the calibration fails. In particular, in our simulations the recovery speed of the market after a large order clearly depends on the order size, whereas the AFS model assumes a constant speed. For this reason, we propose a generalization of the AFS model, the GAFS model, that incorporates this dependency, and prove the optimal investment strategies. As a corollary, we find that we can derive the "correct" constant resilience speed for the AFS model from the GAFS model, such that the optimal strategies of the AFS and the GAFS model coincide. Finally, we show that the costs of applying the optimal strategies of the GAFS model to the artificial market environment still differ significantly from the model predictions, indicating that even the improved model does not capture all of the relevant details of a real market.
Comment: 32 pages, 12 figures
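The flavour of the AFS setup can be illustrated with a stylised block-shaped order book: a trade of size x pushes the extra spread up by x/q, costs the spread walked through, and the spread then decays at a constant resilience speed rho. All symbols and the cost formula below are illustrative assumptions in the spirit of the block-shape model, not the authors' exact parametrisation:

```python
import math

# Stylised order-execution cost under constant exponential resilience.
# q   = order book depth (block shape), so impact per trade is x / q
# rho = resilience speed; impact decays by exp(-rho * tau) between trades
def execution_cost(trades, tau, q=1.0, rho=1.0):
    cost, D = 0.0, 0.0          # D = current extra spread from past trades
    for x in trades:
        cost += x * D + x * x / (2.0 * q)       # walk the book from D
        D = (D + x / q) * math.exp(-rho * tau)  # add impact, then decay
    return cost

# Splitting a large order evenly is cheaper than one block trade:
one_shot = execution_cost([10.0], tau=1.0)
split = execution_cost([2.0] * 5, tau=1.0)
print(one_shot, split, split < one_shot)
```

The simulation-versus-model gap described in the abstract amounts to rho not being constant in the agent-based market: larger orders recover at a different speed, which is what the GAFS generalisation incorporates.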
A domain-specific language and matrix-free stencil code for investigating electronic properties of Dirac and topological materials
We introduce PVSC-DTM (Parallel Vectorized Stencil Code for Dirac and Topological Materials), a library and code generator based on a domain-specific language tailored to implement the specific stencil-like algorithms that can describe Dirac and topological materials, such as graphene and topological insulators, in a matrix-free way. The generated hybrid-parallel (MPI+OpenMP) code is fully vectorized using Single Instruction Multiple Data (SIMD) extensions. It is significantly faster than matrix-based approaches on the node level and performs in accordance with the roofline model. We demonstrate the chip-level performance and distributed-memory scalability of basic building blocks such as sparse matrix-(multiple-)vector multiplication on modern multicore CPUs. As an application example, we use the PVSC-DTM scheme to (i) explore the scattering of a Dirac wave on an array of gate-defined quantum dots, (ii) calculate a set of interior eigenvalues for strong topological insulators, and (iii) discuss the photoemission spectra of a disordered Weyl semimetal.
Comment: 16 pages, 2 tables, 11 figures
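The matrix-free idea can be seen in miniature: for a tight-binding-like Hamiltonian, the matrix-vector product is a fixed stencil applied to the wavefunction, so the sparse matrix never needs to be stored. The 1D nearest-neighbour chain below is an illustrative assumption, not the library's DSL or its graphene/topological-insulator stencils:

```python
import numpy as np

# Matrix-free application of a 1D tight-binding Hamiltonian:
# (H psi)[i] = -t * (psi[i-1] + psi[i+1]) + v * psi[i], open boundaries.
def apply_h(psi, t=1.0, v=0.0):
    out = v * psi.copy()
    out[:-1] += -t * psi[1:]   # hop from the right neighbour
    out[1:]  += -t * psi[:-1]  # hop from the left neighbour
    return out

n = 6
psi = np.random.default_rng(0).standard_normal(n)

# The same product via the explicitly stored matrix, for comparison only;
# the stencil version never materialises H.
H = -1.0 * (np.eye(n, k=1) + np.eye(n, k=-1))
assert np.allclose(apply_h(psi), H @ psi)
```

The stencil form touches only psi and out, which is why such kernels are memory-bandwidth bound and amenable to the roofline analysis mentioned above.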
Evaluation of a hydrographic technique to measure on-farm water storage volumes
Digital terrain models of on-farm water storages are required to assist in accurately measuring the on-farm water balance and water-use efficiency components, including storage capacity, inflow, seepage, evaporation and discharge volumes. A hydrographic surveying system combining a high-precision global positioning system (GPS) and a low-cost depth sounder was developed to facilitate the creation of a digital terrain model. The system was validated by comparing the hydrographic terrain model and volume measurements against both a traditional real-time kinematic (RTK) land-based survey and independent lead-line depth measurements. Flat-bottomed storage volumes were measured with errors of less than 1 percent using the hydrographic survey technique. A major proportion of the error in small storages was found to be associated with the ability to accurately identify the inflection point between the banks and the floor of the storage. For larger storages, however, errors were primarily related to the density of sampling points within the storage floor area. Recommendations are provided regarding appropriate measurement procedures, including sampling-point density, for a range of storage sizes.
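The volume computation such a terrain model enables is straightforward once the soundings are gridded: storage volume is the sum of per-cell depths times the cell area. A minimal sketch with a toy flat-bottomed storage (the grid size and depths are illustrative assumptions, not the study's storages):

```python
import numpy as np

# Toy flat-bottomed storage on a 10 m grid, 2 m deep below full-supply
# level, with one dry row of cells along a bank.
cell = 10.0                          # grid cell size, m
depth = np.full((20, 30), 2.0)       # depth below full-supply level, m
depth[0, :] = 0.0                    # dry cells along one bank
volume = depth.sum() * cell * cell   # stored volume, m^3
print(volume)
```

The abstract's error sources map directly onto this formula: misplacing the bank/floor inflection point mis-assigns depths near the edge cells, while sparse soundings on the floor make the interpolated depth grid itself inaccurate.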
