
    Tunneling Appropriate Computational Models from Laser Scanning Data

    Tunneling projects often require computational models of existing structures. To this end, this paper demonstrates the viability of automatically and robustly reconstructing an individual building model from laser scanning data for further computational modeling, without any manual intervention. The resulting model is suitable for immediate importation into a commercial finite element method (FEM) program. The method combines a voxel-based technique with an angle criterion. Initially, the voxelization model is used to represent the façade, while the angle criterion is implemented to determine the boundaries of the façade and its openings (doors and windows). The algorithm overcomes common problems of occlusions and artefacts that arise during data acquisition. The resulting relative errors in the overall dimensions and opening areas of the geometric models were less than 2% and 6%, respectively, which is generally within industry standards for this type of building modeling.
    Funding: Science Foundation Ireland (SFI/PICA/I850); European Union Grant ERC StG 2012-307836-RETURN
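    The angle criterion itself is not reproduced in the abstract, but the underlying idea (a boundary point's neighbors lie mostly to one side, leaving a large angular gap) can be sketched as follows. This is a minimal 2D illustration with assumed names and threshold (`is_boundary_point`, a 90 degree gap), not the authors' implementation.

```python
import numpy as np

def is_boundary_point(point, neighbors, gap_threshold_deg=90.0):
    """Angle-criterion sketch: flag a point as a boundary point if the
    largest angular gap between directions to its neighbors exceeds a
    threshold. Interior points have neighbors all around (small gaps);
    points on facade or opening boundaries leave a wide empty sector."""
    directions = neighbors - point                      # vectors to neighbors
    angles = np.sort(np.arctan2(directions[:, 1], directions[:, 0]))
    gaps = np.diff(angles)                              # gaps between successive directions
    wraparound = 2 * np.pi - (angles[-1] - angles[0])   # gap across the +/- pi seam
    max_gap = max(gaps.max(), wraparound)
    return np.degrees(max_gap) > gap_threshold_deg

# Example: a point at the edge of a planar facade patch
point = np.array([0.0, 0.0])
neighbors = np.array([[1, 0], [1, 1], [0, 1], [-1, 1], [-1, 0]])  # all on one side
print(is_boundary_point(point, neighbors))  # True: ~180 degree empty sector below
```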

    A 6-DIMENSIONAL HILBERT APPROACH TO INDEX FULL WAVEFORM LiDAR DATA IN A DISTRIBUTED COMPUTING ENVIRONMENT

    Laser scanning data are increasingly available across the globe. Maximizing the data's usability requires proper storage and indexing. While significant research has been invested in developing storage and indexing solutions for laser scanning point clouds (i.e. the discrete form of the data), little attention has been paid to equivalent solutions for full waveform (FWF) laser scanning data, especially in a distributed computing environment. Given the growing availability of FWF sensors and datasets, FWF data management solutions are increasingly needed. This paper presents a step towards a scalable solution for handling large FWF datasets by introducing a distributed computing solution for FWF data. The work involves an FWF database built atop HBase, the distributed database system running on Hadoop commodity clusters. By combining a 6-dimensional (6D) Hilbert spatial code and a temporal index into a compound indexing key, the database system is capable of supporting multiple spatial, temporal, and spatio-temporal queries. Such queries are important for FWF data exploration and dissemination. The proposed spatial decomposition, at a fine resolution of 0.05 m, allows each LiDAR FWF measurement (i.e. pulse, waves, and points) to be stored on a single row of the database, thereby providing full capabilities to add, modify, and remove each measurement record atomically. While the feasibility and capabilities of the 6D Hilbert solution are evident, the Hilbert decomposition is not straightforward, owing to complications arising from the combination of the data's high dimensionality, fine resolution, and large spatial extent. These factors lead to a complex set of both attractive attributes and limitations in the proposed solution, which are described in this paper based on experimental tests using a 1.1 billion pulse LiDAR scan of a portion of Dublin, Ireland.
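    As an illustration of the compound-key idea (not the authors' implementation), the sketch below composes a row key from an interleaved spatial code and a timestamp. A Morton (Z-order) code stands in for the 6D Hilbert code, since both map multi-dimensional cells to a single sortable integer; the 0.05 m cell size follows the abstract, while the key layout, local origin, and field widths are assumptions.

```python
import struct

CELL_SIZE = 0.05                          # spatial resolution from the paper (metres)
AREA_ORIGIN = (315000.0, 234000.0, 0.0)   # assumed local origin for the scanned tile

def morton_encode(cells, bits=21):
    """Interleave the bits of six cell indices into a single sortable
    integer. A Morton (Z-order) code is used as a simpler stand-in for
    the paper's 6D Hilbert code; both map 6D cells to a 1D,
    locality-preserving index."""
    code = 0
    for bit in range(bits):
        for dim, c in enumerate(cells):
            code |= ((c >> bit) & 1) << (bit * len(cells) + dim)
    return code

def row_key(first_xyz, last_xyz, gps_time):
    """Compound row key: a 6D spatial code built from the first and last
    return coordinates of a pulse, followed by a coarse temporal bucket
    (key layout and field widths are assumptions, not the paper's schema)."""
    coords = [v - o for v, o in zip(first_xyz, AREA_ORIGIN)] + \
             [v - o for v, o in zip(last_xyz, AREA_ORIGIN)]
    cells = [int(v / CELL_SIZE) for v in coords]   # quantize to 0.05 m cells
    spatial = morton_encode(cells)                 # up to 126 bits for 6 x 21-bit cells
    temporal = int(gps_time)                       # 1-second temporal bucket (assumed)
    return struct.pack(">QQI", spatial >> 64, spatial & (2**64 - 1), temporal)

key = row_key((315000.40, 234210.10, 12.35),
              (315000.42, 234210.15, 0.80), 504891234.7)
print(key.hex())
```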

    Predicting Tunneling-Induced Ground Movement

    Cost-effective and permissible tunneling can occur only if ground movement prediction is refined to accommodate changes in both the urban environment and tunneling technology. As cities age, tunnels are being installed closer to existing structures and in increasingly complicated belowground conditions. The reality of stacked tunnels, abandoned facilities, and more extensive use of underground space raises the question of whether relationships derived for single open-shield tunnels in free-field conditions can adequately predict ground movement for modern tunneling techniques under more complicated site conditions. Traditional empirical methods to predict maximum surface settlements and the percentage of lost ground are evaluated for paired tunnels constructed with the new Austrian tunneling method (NATM) in noncohesive soils. Predictions are compared with field measurements for grouted and ungrouted sections. Results showed that the estimated maximum settlement values of an NATM tunnel were highly similar to those of an open-shield tunnel for both the grouted and ungrouted sections, although in some cases the Gaussian shape significantly underestimated the depth of the settlement trough in the outer 30% to 40%. Grouting substantially altered the amount of settlement: the average percentage of volume of lost ground was 1.6% with grouting, versus 5.2% where no grouting occurred. The empirical methods typically generated a fairly reasonable set of responses for an NATM tunnel.
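    The Gaussian settlement trough referred to above is the classical empirical form (after Peck, and O'Reilly and New), in which the transverse settlement is S(x) = S_max exp(-x^2 / 2i^2) and the settlement volume per metre of tunnel is V_s = sqrt(2 pi) i S_max. A minimal sketch follows; the trough-width parameter and geometry are illustrative values, not the paper's measurements.

```python
import numpy as np

def gaussian_trough(x, s_max, i):
    """Transverse surface settlement S(x) = S_max * exp(-x^2 / (2 i^2))."""
    return s_max * np.exp(-x**2 / (2 * i**2))

# Illustrative values (not from the paper)
z0 = 20.0          # tunnel axis depth (m)
K = 0.35           # trough width parameter, typical for noncohesive soils
i = K * z0         # distance to the inflection point (m)
D = 6.0            # tunnel diameter (m)
V_L = 0.016        # volume loss ratio, e.g. 1.6% (the grouted case in the abstract)

V_s = V_L * np.pi * D**2 / 4              # settlement volume per metre of tunnel
s_max = V_s / (np.sqrt(2 * np.pi) * i)    # from V_s = sqrt(2*pi) * i * S_max

x = np.linspace(-3 * i, 3 * i, 7)
print(f"S_max = {s_max * 1000:.1f} mm")
print(np.round(gaussian_trough(x, s_max, i) * 1000, 2))  # settlements in mm
```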

    Numerical modelling options for cracked masonry buildings

    9th International Masonry Conference 2014, Guimarães, Portugal, 7-9 July 2014.
    In most numerical modelling of buildings, there is an assumption that the structure is undamaged. However, with historic buildings, defects often exist. Failing to incorporate such damage may produce an unconservative estimate of a building's response. Nowhere is this more critical than in urban tunnelling, where hundreds of unreinforced masonry structures may be impacted by ground movements. This paper examines the effectiveness and limitations of four numerical approaches for modelling existing discontinuities, in the form of masonry cracking, as compared to traditional finite element methods. The comparative methods include a micro-poly method, a distinct element method, a discontinuity deformation method, and a combined continuum-interface method. Particular attention is paid to the ease of model implementation, the availability of input data, the applicability of crack modelling, and the ability to define the initial state of the structure as part of the model. The methods are compared to each other and to finite element modelling, and relative qualitative assessments are provided as to their applicability for modelling damaged masonry.
    Funding: European Research Council

    Nonlinear Analysis of Isotropic Slab Bridges under Extreme Traffic Loading

    Probabilistic analysis of traffic loading on a bridge traditionally involves an extrapolation from measured or simulated load effects to a characteristic maximum value. In recent years, Long Run Simulation, whereby thousands of years of traffic are simulated, has allowed researchers to gain new insights into the nature of the traffic scenarios that govern at the limit state. For example, mobile cranes and low-loaders, sometimes accompanied by a common articulated truck, have been shown to govern in most cases. In this paper, the extreme loading scenarios identified in the Long Run Simulation are applied to a nonlinear, two-dimensional (2D) plate finite element model. For the first time, the loading scenarios that govern in 2D nonlinear analyses are found and compared to those that govern for 2D linear and 1D linear/nonlinear analyses. Results show that, for an isotropic slab, the governing loading scenarios are similar to those that govern in simple one-dimensional (beam) models. Furthermore, there are only slight differences in the critical positions of the vehicles. It is also evident that the load effects causing failure in the 2D linear elastic plate models are significantly lower; that is, 2D linear elastic analysis is more conservative than both 2D nonlinear and 1D linear/nonlinear analyses.
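    The extrapolation mentioned in the opening sentence is conventionally done by fitting an extreme value distribution to block maxima and reading off a characteristic return level. The sketch below illustrates that generic step on synthetic data; the distribution, return period, and numbers are placeholders, not the paper's simulations or results.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Stand-in data: annual-maximum load effects (e.g. mid-span moment, kNm).
# In a Long Run Simulation these would come from thousands of simulated years.
annual_maxima = rng.gumbel(loc=2500.0, scale=120.0, size=5000)  # synthetic

# Fit a generalized extreme value (GEV) distribution to the block maxima.
shape, loc, scale = stats.genextreme.fit(annual_maxima)

# Characteristic value: 5% probability of exceedance in 50 years,
# i.e. roughly a 975-year return period (Eurocode convention).
return_period = 975.0
char_value = stats.genextreme.ppf(1 - 1 / return_period, shape, loc=loc, scale=scale)
print(f"Characteristic load effect ~ {char_value:.0f} kNm")
```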

    Classification of hardened cement and lime mortar using short-wave infrared spectrometry data

    This paper evaluated the feasibility of using spectrometry data in the short-wave infrared range (1,300-2,200 nm) to distinguish lime mortar from Type S cement mortar, using 42 lab samples (21 lime-based, 21 cement-based), each 40 × 40 × 40 mm. A Partial Least Squares Discriminant Analysis model was developed using the mean spectra of 28 specimens as the calibration set, and tested on the mean spectra of the remaining 14 specimens as a validation set. The results showed that spectrometry data were able to fully distinguish modern mortars (made with cement) from historic lime mortars, with 100% classification accuracy, which can be very useful in archaeological and architectural conservation applications. Specifically, being able to distinguish mortar composition in situ can provide critical information about the construction history of a structure, as well as inform an appropriate intervention scheme when historic material needs to be repaired or replaced.
    Funding for this work was provided by New York University's Center for Urban Science and Progress. Dr. Gowen acknowledges funding from the European Research Council (ERC) under the starting grant programme, ERC-2013-StG call, Proposal No. 335508, BioWater.
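    PLS-DA is commonly implemented as a PLS regression against class labels with a decision threshold; the sketch below shows that pattern on synthetic spectra using the abstract's 28/14 calibration/validation split. The spectra, component count, and 0.5 threshold are assumptions, not the paper's model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Stand-in spectra: 42 samples x 200 SWIR wavelengths (synthetic, not the
# paper's data). Labels: 0 = lime mortar, 1 = cement mortar.
n_wavelengths = 200
lime = rng.normal(0.40, 0.05, (21, n_wavelengths))
cement = rng.normal(0.55, 0.05, (21, n_wavelengths))
X = np.vstack([lime, cement])
y = np.array([0] * 21 + [1] * 21)

# 28-sample calibration set / 14-sample validation set, as in the abstract.
idx = rng.permutation(42)
cal, val = idx[:28], idx[28:]

# PLS-DA = PLS regression on the class label, thresholded at 0.5.
plsda = PLSRegression(n_components=5)
plsda.fit(X[cal], y[cal])
pred = (plsda.predict(X[val]).ravel() > 0.5).astype(int)
accuracy = (pred == y[val]).mean()
print(f"Validation accuracy: {accuracy:.2f}")
```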

    Point cloud voxel classification of aerial urban LiDAR using voxel attributes and random forest approach

    The opportunities now afforded by increasingly available, dense, aerial urban LiDAR point clouds (greater than 100 pts/m²) are arguably stymied by their sheer size, which precludes the effective use of many tools designed for point cloud data mining and classification. This paper introduces the point cloud voxel classification (PCVC) method, an automated, two-step solution for classifying terabytes of data without overwhelming the computational infrastructure. First, the point cloud is voxelized to reduce the number of points that must be processed sequentially. Next, descriptive voxel attributes are assigned to aid in further classification. These attributes describe the point distribution within each voxel and the voxel's geo-location, and comprise 5 point descriptors (density, standard deviation, clustered points, fitted plane, and plane's angle) and 2 voxel position attributes (elevation and neighbors). A random forest algorithm is then used for final classification of the object within each voxel into four categories: ground, roof, wall, and vegetation. The proposed approach was evaluated using a 297,126,417-point dataset from a 1 km² area in Dublin, Ireland and a 50% denser, 13,912,692-point dataset from a 150 m² area of New York City. PCVC's main advantage is scalability, achieved through a 99% reduction in the number of points that must be sequentially categorized. Additionally, PCVC demonstrated strong classification results (precision of 0.92, recall of 0.91, and F1-score of 0.92) compared to previous work on the same dataset (precision of 0.82-0.91, recall of 0.86-0.89, and F1-score of 0.85-0.90).
    This work was funded by the National Science Foundation, award 1940145.
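    The voxelize, attribute, classify pipeline can be sketched as follows. The attribute set here (point count, height spread, fitted-plane tilt, mean elevation) is a simplified stand-in for the 7 attributes named in the abstract, and the scene, labels, and voxel size are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

VOXEL = 1.0  # voxel edge length in metres (assumed)

def voxel_attributes(points):
    """Group points into voxels and compute a small attribute vector per
    voxel: point count, z standard deviation, fitted-plane angle, and
    mean elevation (a simplified stand-in for PCVC's 7 attributes)."""
    keys = np.floor(points / VOXEL).astype(int)
    voxels = {}
    for key, p in zip(map(tuple, keys), points):
        voxels.setdefault(key, []).append(p)
    feats = []
    for pts in voxels.values():
        pts = np.array(pts)
        if len(pts) >= 3:
            # Fit a plane via the smallest principal component of the covariance.
            cov = np.cov((pts - pts.mean(axis=0)).T)
            normal = np.linalg.eigh(cov)[1][:, 0]          # least-variance direction
            angle = np.degrees(np.arccos(abs(normal[2])))  # tilt from horizontal
        else:
            angle = 0.0
        feats.append([len(pts), pts[:, 2].std(), angle, pts[:, 2].mean()])
    return np.array(feats)

# Synthetic scene: a flat ground patch (class 0) and a vertical wall (class 1).
rng = np.random.default_rng(1)
ground = np.c_[rng.uniform(0, 10, (500, 2)), rng.normal(0.0, 0.02, 500)]
wall = np.c_[rng.uniform(0, 10, 500), rng.normal(5.0, 0.02, 500), rng.uniform(0, 6, 500)]
g_feats, w_feats = voxel_attributes(ground), voxel_attributes(wall)
X = np.vstack([g_feats, w_feats])
y = np.r_[np.zeros(len(g_feats)), np.ones(len(w_feats))]

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.score(X, y))  # training accuracy on the toy scene
```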

    Use of negative stiffness in failure analysis of concrete beams

    Pre-print.
    A new concrete analysis method is presented, entitled continuous, incremental-only, tangential analysis (CITA). CITA employs a piece-wise linear stress-strain curve and a tangent elasticity modulus to calculate stiffness, including portions with negative values. Because indefinite structural stiffness matrices generally indicate instability, they have traditionally been avoided. However, since CITA introduces damage in steps, the full range of concrete behaviour, including the softening portion under tensile cracking, can be addressed. Herein, CITA is verified against numerical and experimental results for concrete beams, showing faster solutions for nonlinear problems than sequentially-linear analysis, while reducing self-imposed restrictions against negative stiffness.
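    The core ingredient, a tangent modulus read off a piece-wise linear stress-strain curve that goes negative on the softening branch, can be sketched as follows; the breakpoint values are illustrative, not the paper's material model.

```python
import numpy as np

# Illustrative piece-wise linear tensile stress-strain curve for concrete:
# linear elastic up to cracking, then a linear softening branch to zero stress.
STRAIN = np.array([0.0, 1.0e-4, 8.0e-4])   # breakpoints (-)
STRESS = np.array([0.0, 3.0,    0.0])      # corresponding stresses (MPa)

def tangent_modulus(strain):
    """Tangent elasticity modulus E_t = d(sigma)/d(epsilon) on the segment
    containing `strain`. Negative on the softening branch, which CITA
    retains in the stiffness matrix instead of excluding it."""
    seg = np.searchsorted(STRAIN, strain, side="right") - 1
    seg = min(max(seg, 0), len(STRAIN) - 2)
    return (STRESS[seg + 1] - STRESS[seg]) / (STRAIN[seg + 1] - STRAIN[seg])

print(tangent_modulus(5.0e-5))  # pre-cracking: +30000 MPa
print(tangent_modulus(4.0e-4))  # softening branch: ~ -4286 MPa (negative stiffness)
```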