
    Predicting Tunneling-Induced Ground Movement

    Cost-effective and permissible tunneling can occur only if ground movement prediction is refined to accommodate changes in both the urban environment and tunneling technology. As cities age, tunnels are being installed closer to existing structures and in increasingly complicated belowground conditions. The reality of stacked tunnels, abandoned facilities, and more extensive use of underground space raises the question of whether relationships derived for single open-shield tunnels in free-field conditions can adequately predict ground movement for modern tunneling techniques under more complicated site conditions. Traditional empirical methods to predict maximum surface settlements and the percentage of lost ground for paired tunnels of the new Austrian tunneling method (NATM) in noncohesive soils are evaluated. Predictive data are compared with field measurements for grouted and ungrouted sections. Results showed that the estimated maximum settlement values of an NATM tunnel were highly similar to those of an open-shield tunnel for both the grouted and ungrouted sections, although in some cases the Gaussian shape significantly underestimated the depth of the settlement trough in the outer 30% to 40%. Grouting substantially altered the amount of settlement: the average percentage of volume of lost ground with grouting was 1.6%, whereas the value was 5.2% where no grouting occurred. The empirical methods typically generated a fairly reasonable set of responses for an NATM tunnel.
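
    A minimal illustration (in Python, with assumed values) of the empirical Gaussian settlement-trough relationship that such predictions typically rest on, in which the surface settlement at transverse offset x is S(x) = S_max·exp(-x²/2i²) and the volume of lost ground per metre of tunnel is V_s = √(2π)·i·S_max. The trough-width parameter i and the tunnel diameter below are illustrative assumptions, not values from the study; only the 1.6% and 5.2% volume-loss figures come from the abstract.

        import numpy as np

        def gaussian_settlement(x, s_max, i):
            """Surface settlement at transverse offset x (Gaussian trough)."""
            return s_max * np.exp(-x**2 / (2.0 * i**2))

        def s_max_from_volume_loss(vl_percent, tunnel_diameter, i):
            """Maximum settlement implied by a volume-loss percentage (free-field assumption)."""
            tunnel_area = np.pi * tunnel_diameter**2 / 4.0
            v_s = vl_percent / 100.0 * tunnel_area      # settled volume per metre of tunnel
            return v_s / (np.sqrt(2.0 * np.pi) * i)     # from V_s = sqrt(2*pi) * i * S_max

        # Illustrative geometry (not from the paper): 6 m diameter tunnel, i = 5 m
        for vl in (1.6, 5.2):                           # grouted vs. ungrouted volume loss
            s_max = s_max_from_volume_loss(vl, tunnel_diameter=6.0, i=5.0)
            x = np.linspace(-15.0, 15.0, 7)
            trough = np.round(gaussian_settlement(x, s_max, 5.0) * 1000.0, 1)
            print(f"VL = {vl}%  S_max = {s_max * 1000.0:.1f} mm  trough (mm): {trough}")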

    Numerical modelling options for cracked masonry buildings

    9th International Masonry Conference 2014, Guimarães, Portugal, 7-9 July 2014. In most numerical modelling of buildings, there is an assumption that the structure is undamaged. However, with historic buildings, defects often exist. Failing to incorporate such damage may cause an unconservative estimation of a building’s response. Nowhere is this more critical than in the case of urban tunnelling, where hundreds of unreinforced masonry structures may be impacted by ground movements. This paper examines the effectiveness and limitations of four numerical approaches to modelling existing discontinuities in the form of masonry cracking, as compared to traditional finite element methods. The comparative methods include a micro-poly method, a distinct element method, a discontinuity deformation method, and a combined continuum-interface method. Particular attention is paid to the ease of model implementation, the availability of input data, the applicability of crack modelling, and the ability to define the initial state of the structure as part of the model. The methods are compared to each other and to finite element modelling. Relative qualitative assessments are provided as to their applicability for modelling damaged masonry.

    Use of negative stiffness in failure analysis of concrete beams

    A new concrete analysis method is presented, entitled continuous, incremental-only, tangential analysis (CITA). CITA employs a piece-wise linear stress-strain curve and a tangent elasticity modulus to calculate stiffness, including portions with negative values. Since indefinite structural stiffness matrices generally indicate instability, they have traditionally been avoided. However, because CITA introduces damage in steps, the full range of concrete behaviour, including the softening portion under tensile cracking, can be addressed. Herein, CITA is verified against numerical and experimental results for concrete beams, showing faster solutions for non-linear problems than sequentially-linear analysis while reducing self-imposed restrictions against negative stiffness.
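
    A much-simplified sketch of the core idea of tangent-stiffness stepping along a piece-wise linear, softening stress-strain curve, here reduced to a single material point in uniaxial tension (the curve values below are assumed for illustration; CITA itself operates on full beam models with indefinite structural stiffness matrices, not a single point):

        import numpy as np

        # Illustrative piece-wise linear tensile stress-strain curve for concrete (values assumed):
        # linear elastic up to cracking, then a softening branch with a negative tangent modulus.
        strain_pts = np.array([0.0, 1.0e-4, 6.0e-4])   # strains at the curve break points
        stress_pts = np.array([0.0, 3.0e6, 0.0])       # stresses in Pa (3 MPa peak, softening to zero)

        def tangent_modulus(eps):
            """Tangent modulus of the piece-wise linear curve (negative on the softening branch)."""
            k = np.searchsorted(strain_pts, eps, side="right") - 1
            k = min(max(k, 0), len(strain_pts) - 2)
            return (stress_pts[k + 1] - stress_pts[k]) / (strain_pts[k + 1] - strain_pts[k])

        # Incremental-only stepping: stress is accumulated from tangent moduli, so the
        # negative-stiffness (softening) branch is used directly instead of being avoided.
        d_eps = 2.0e-5
        eps, sigma = 0.0, 0.0
        for _ in range(25):
            E_t = tangent_modulus(eps)
            sigma += E_t * d_eps
            eps += d_eps
            print(f"eps={eps:.1e}  E_t={E_t: .2e} Pa  sigma={sigma / 1e6:5.2f} MPa")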

    Dynamic Analysis of an Annular Plate Resting on the Surface of an Elastic Half-Space with Distributive Properties

    This work presents a semi-analytical approach for the dynamic analysis of a plate of annular shape resting on the surface of an elastic half-space with distributive properties. Such calculations have been associated with significant mathematical challenges, often leading to unrealizable computing processes. Consequently, the dynamic analysis of beams and plates interacting with the surfaces of elastic foundations has to date not been completely solved. To advance this work, the deflections of the plate are determined by the Ritz method, and the displacements of the surface of the elastic half-space are determined using Green's function. The coupling of these two solutions is achieved by a mixed method, which allows determination of the reactive forces in the contact zone and, hence, of other physical quantities. Natural frequencies, natural mode shapes, and the dynamic response of the plate due to external harmonic excitation are determined. Validation against a Winkler foundation model illustrates the effects of the distributive properties on the results of the dynamic analysis.
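
    In schematic form (generic notation, not the paper's own symbols), the coupling described above can be written as a Ritz expansion of the plate deflection, a Green's-function representation of the half-space surface displacement, and a compatibility condition over the contact zone that closes the system:

        w(r,\theta,t) = \sum_{k=1}^{N} q_k(t)\,\varphi_k(r,\theta)
            \quad \text{(Ritz expansion of the plate deflection)}

        u(r,\theta,t) = \int_{\Omega_c} G(r,\theta;\rho,\phi)\,p(\rho,\phi,t)\,\mathrm{d}\Omega
            \quad \text{(half-space surface displacement via Green's function)}

        w = u \ \text{on } \Omega_c
            \;\Longrightarrow\;
            \left(\mathbf{K} - \omega^{2}\mathbf{M} + \mathbf{S}(\omega)\right)\mathbf{q} = \mathbf{F}

    Here K and M denote the plate's Ritz stiffness and mass matrices, S(ω) collects the foundation contribution obtained from the contact pressures p, and F is the generalized load vector; natural frequencies follow from the determinant condition with F = 0.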

    Determining disaster data management needs in a multi-disaster context

    In the last four decades, the economic loss from natural hazard disasters has increased ten-fold. The increasing human and economic impacts of disasters have intensified efforts at the global, national, state, and local levels to find ways to reduce these impacts. Improved collection and management of disaster data can help support planning and decision-making by first responders and emergency managers during all phases of the disaster cycle. The goal of this report is to establish what disaster-related data are needed in the planning, response, and recovery phases for multiple types of disasters, with a focus on the data needs of the state of North Carolina.

    There is a vast amount of information available from all phases of a disaster. Unfortunately, without proper collection, documentation, and storage, that information is either completely lost or is not transformed into functional data. Often, data that are critical for developing better mitigation efforts are not collected, because much of the information is short-lived and is lost prior to collection. Increased use of instrumentation, such as water level gauges and data collection and analysis software, can aid in collecting and disseminating real-time critical disaster data. The deployment of rapid-response data collection teams immediately after a disaster event can also improve the quantity and quality of data obtained during a disaster. Disaster management systems help first responders and emergency managers formulate and disseminate their decisions before, during, and after a disaster and can therefore serve as a way to organize, analyze, and disseminate critical disaster data.

    Groups of researchers and emergency management professionals in NC are trying to improve the collection and dissemination of disaster data in order to improve disaster preparation and response. Researchers at North Carolina State University (NCSU) have been examining all phases of data collection in a multi-disaster context. Another group, the North Carolina Institute of Disaster Studies, hosted two previous workshops to better coordinate collaboration between emergency responders and academics throughout the state. These efforts, as well as the disaster data collection research of the North Carolina Emergency Management Division, pointed to a need to gather members of the academic and emergency management communities together to obtain a more accurate picture of multi-disaster data collection and use, and to develop the foundation for a consensus on the areas of disaster data management that needed improvement.

    A Disaster Data Workshop, held at NCSU on November 4-5, 2004, was chosen as one way to address the data collection and dissemination issues in a context of broad, statewide participation. The workshop planning committee determined that the approximately 30-40 workshop participants would discuss four different disasters in depth. The four disasters chosen were hurricane and tornado wind, flood, ice storm, and intentional explosion; the first three are the most frequent natural disasters in NC, while intentional explosion was included so that an intentional man-made disaster would be represented. The five objectives of the NCSU Disaster Data Workshop on “Determining Disaster Data Needs in a Multi-Disaster Context” were as follows.
    • Evaluate the applicability of a general multi-disaster model,
    • Understand local data needs and opportunities,
    • Establish clear models of organizational participation in collection and use,
    • Define a common data set for multiple disasters, and
    • Lay the groundwork for establishment of data collection teams.

    The workshop’s structure was based on meeting the five workshop objectives within the available time. The five sessions of the workshop were data needs, data resources, data dissemination, common data set, and data collection teams. From the participants’ discussions on disaster data during the workshop sessions, some common themes emerged. The emerging themes on data needs, resources, and data dissemination were used by the workshop planning committee to create and implement a multi-disaster data model. The discussions on data needs and resources also led to the identification of data items that participants in each of the four disaster groups indicated were needed for their assigned disaster. These needed data items form a common data set for the four disasters investigated by the workshop, and possibly for other disasters not investigated. Also generated from the workshop discussions was a set of disaster data collection and management priorities for NC.

    From this research study, from the NCSU Disaster Data Workshop results, and from previous workshops and disaster management systems efforts, several conclusions can be drawn about disaster data and its management. Existing data collection and management efforts focus primarily on inventory data, since this information is available regardless of a disaster event. The development of data collection teams and a data repository in NC is needed and would contribute to disaster research and emergency management efforts. The four-areas model developed from the workshop allows every data item identified by the workshop participants to be assigned to a data area. The common data set model developed from the workshop is, however, biased toward the data needs of NC and may need to be modified for application in other regions. A disaster data collection and management cycle was also developed from the workshop discussions; this cycle can serve as an agenda for the development and operation of both disaster data collection teams and a common disaster data repository.

    Recommendations from this study for NC include more research in the area of ice storms, an additional workshop to discuss the further development of data collection teams and coordinated data management in the state, and the development of a common disaster data repository in NC. Broader recommendations in the area of disaster data management include prioritizing data set development based on how critical the data set is to a region’s disaster preparedness and response, ensuring that disaster data collection teams are self-reliant, investigating more disaster types to better understand their data needs and resources, and improving data collection efforts through increased use of instrumentation and cooperation between emergency management organizations and the managers of infrastructure systems such as transportation and utilities.

    Nonlinear Analysis of Isotropic Slab Bridges under Extreme Traffic Loading

    Probabilistic analysis of traffic loading on a bridge traditionally involves an extrapolation from measured or simulated load effects to a characteristic maximum value. In recent years, Long Run Simulation, whereby thousands of years of traffic are simulated, has allowed researchers to gain new insights into the nature of the traffic scenarios that govern at the limit state. For example, mobile cranes and low-loaders, sometimes accompanied by a common articulated truck, have been shown to govern in most cases. In this paper, the extreme loading scenarios identified in the Long Run Simulation are applied to a non-linear, two-dimensional (2D) plate finite element model. For the first time, the loading scenarios that govern in 2D nonlinear analyses are found and compared to those that govern for 2D linear and 1D linear/nonlinear analyses. Results show that, for an isotropic slab, the governing loading scenarios are similar to those that govern in simple one-dimensional (beam) models. Furthermore, there are only slight differences in the critical positions of the vehicles. It is also evident that the load effects causing failure in the 2D linear elastic plate models are significantly lower, i.e., 2D linear elastic analysis is more conservative than both 2D nonlinear and 1D linear/nonlinear analyses.
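
    For orientation, a minimal sketch of the conventional extrapolation step mentioned in the first sentence (not of the Long Run Simulation itself): annual-maximum load effects from a traffic simulation are fitted with an extreme-value distribution and extrapolated to a characteristic return period. The synthetic data, the Gumbel model, and the 1000-year return period below are assumptions for illustration only.

        import numpy as np
        from scipy.stats import gumbel_r

        rng = np.random.default_rng(0)

        # Stand-in for annual-maximum load effects (e.g. mid-span moment) taken from a
        # long-run traffic simulation; here they are synthetic Gumbel-distributed values.
        annual_maxima = gumbel_r.rvs(loc=1000.0, scale=80.0, size=200, random_state=rng)

        # Fit an extreme-value distribution to the annual maxima ...
        loc, scale = gumbel_r.fit(annual_maxima)

        # ... and extrapolate to the characteristic value for a chosen return period.
        return_period_years = 1000
        p_non_exceedance = 1.0 - 1.0 / return_period_years
        characteristic = gumbel_r.ppf(p_non_exceedance, loc=loc, scale=scale)
        print(f"{return_period_years}-year characteristic load effect: {characteristic:.0f} (units of the simulated effect)")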

    Effects of sludge pH and treatment time on the electrokinetic removal of aluminum from potable water treatment sludge

    Algeria's municipal sewage treatment plants generate around 10⁶ m³ of sewage sludge annually. Recently, the rapid expansion of wastewater treatment plants without equal attention to the treatment of the produced sludge has generated increasing concern. While the sludge is usually incinerated or used as an agricultural fertilizer, and may contain numerous nutrients, it may also contain harmful substances that complicate sludge management. Hence, the removal of pollutants from the sludge is necessary before further use. This paper discusses the characteristics of a potable water treatment sludge containing a high aluminum content. Furthermore, an electrokinetic treatment is proposed to remove aluminum from this sludge by varying the type of solution contained in the cathode compartment and modifying the treatment time to optimize the efficiency of the process. Successful results were achieved, with 60% of the aluminum collected on the cathode side at an energy consumption of around 1000–2000 kWh kg⁻¹ of sludge.
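
    As a small arithmetic sketch of how a specific energy consumption figure is obtained (all cell values below are hypothetical and are not intended to reproduce the magnitude reported in the study), the energy drawn by the electrokinetic cell is the time integral of voltage times current, normalised by the mass of sludge treated:

        # Hypothetical electrokinetic cell, for illustration only.
        voltage_V = 30.0        # applied DC voltage across the electrodes
        current_A = 0.05        # average cell current
        time_h = 72.0           # treatment time in hours
        sludge_mass_kg = 0.10   # mass of sludge treated in the cell

        energy_kWh = voltage_V * current_A * time_h / 1000.0   # E = U * I * t
        specific_energy = energy_kWh / sludge_mass_kg          # kWh per kg of sludge
        print(f"Specific energy consumption: {specific_energy:.2f} kWh/kg")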

    Prediction of low level vibration induced settlement

    A model for predicting vibration-induced settlement was developed for small to intermediate vibration levels (0.25–1.78 cm/s). Seven factors affecting vibration-induced settlement were considered: vibration amplitude, deviatoric stress, confining pressure, soil gradation, duration of vibration, relative density, and moisture content. A special vibratory frame was designed to shake a soil sample within a triaxial cell. An experimental program was devised using a multi-factorial experimental design method, which allowed the investigation of many factors influencing settlement with a relatively small number of experiments. The settlements from case histories matched the settlements calculated from the model, demonstrating the potential usefulness of a mathematical model for the evaluation and prediction of the vibration-induced, in-situ settlement of sands.
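
    A minimal sketch of the kind of multi-factor model such an experimental programme supports: each triaxial shaking test supplies one row of factor settings, and a first-order model is fitted by least squares. The factor ranges, the synthetic response, and the model form are hypothetical; the abstract does not state the actual form of the prediction model.

        import numpy as np

        rng = np.random.default_rng(1)

        # Columns: vibration amplitude (cm/s), deviatoric stress, confining pressure,
        # gradation coefficient, duration (s), relative density, moisture content.
        # Each row is one triaxial shaking experiment (values hypothetical).
        X = rng.uniform(low=[0.25, 10, 50, 2, 60, 0.3, 0.02],
                        high=[1.78, 80, 200, 8, 600, 0.9, 0.15],
                        size=(32, 7))
        # Hypothetical measured response (cm), driven mainly by amplitude and duration.
        settlement = 0.3 * X[:, 0] + 0.002 * X[:, 4] + rng.normal(0.0, 0.02, size=32)

        # Fit a first-order model: settlement = b0 + sum(b_i * factor_i)
        A = np.column_stack([np.ones(len(X)), X])
        coeffs, *_ = np.linalg.lstsq(A, settlement, rcond=None)
        print("intercept and factor coefficients:", np.round(coeffs, 4))

        # Predict settlement for a new combination of factor settings.
        x_new = np.array([1.0, 40.0, 100.0, 4.0, 300.0, 0.6, 0.08])
        pred = np.append(1.0, x_new) @ coeffs
        print("predicted settlement (cm):", round(float(pred), 3))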

    Point cloud voxel classification of aerial urban LiDAR using voxel attributes and random forest approach

    The opportunities now afforded by increasingly available, dense, aerial urban LiDAR point clouds (greater than 100 pts/m²) are arguably stymied by their sheer size, which precludes the effective use of many tools designed for point cloud data mining and classification. This paper introduces the point cloud voxel classification (PCVC) method, an automated, two-step solution for classifying terabytes of data without overwhelming the computational infrastructure. First, the point cloud is voxelized to reduce the number of points that need to be processed sequentially. Next, descriptive voxel attributes are assigned to aid in further classification. These attributes describe the point distribution within each voxel and the voxel's geo-location, and include five point descriptors (density, standard deviation, clustered points, fitted plane, and the plane's angle) and two voxel position attributes (elevation and neighbors). A random forest algorithm is then used for final classification of the object within each voxel using four categories: ground, roof, wall, and vegetation. The proposed approach was evaluated using a 297,126,417-point dataset from a 1 km² area in Dublin, Ireland and a 50% denser dataset of New York City comprising 13,912,692 points (150 m²). PCVC's main advantage is scalability, achieved through a 99% reduction in the number of points that needed to be sequentially categorized. Additionally, PCVC demonstrated strong classification results (precision of 0.92, recall of 0.91, and F1-score of 0.92) compared to previous work on the same data set (precision of 0.82-0.91, recall of 0.86-0.89, and F1-score of 0.85-0.90). This work was funded by National Science Foundation award 1940145.
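
    A condensed sketch of the two-step idea described above: voxelize the cloud, compute per-voxel attributes, then classify each voxel with a random forest. The attribute set is a simplified stand-in for the paper's seven descriptors, the point cloud is synthetic, and the training labels would in practice come from a separately annotated area; random labels are used here only so the example runs end to end.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def voxelize(points, voxel_size=1.0):
            """Group points into voxels and return one attribute row per voxel."""
            keys = np.floor(points / voxel_size).astype(np.int64)
            _, inverse = np.unique(keys, axis=0, return_inverse=True)
            inverse = inverse.ravel()
            features = []
            for v in range(inverse.max() + 1):
                pts = points[inverse == v]
                centroid = pts.mean(axis=0)
                # Simplified voxel attributes: point density, height spread, elevation,
                # and the vertical component of a fitted-plane normal (via SVD).
                if len(pts) >= 3:
                    _, _, vt = np.linalg.svd(pts - centroid, full_matrices=False)
                    normal_z = abs(vt[-1, 2])      # ~1 for ground/roof planes, ~0 for walls
                else:
                    normal_z = 1.0                 # too few points to fit a plane
                features.append([len(pts), pts[:, 2].std(), centroid[2], normal_z])
            return np.asarray(features), inverse

        # Synthetic stand-in for an x, y, z point cloud (N x 3 array).
        points = np.random.default_rng(2).uniform(0.0, 20.0, size=(5000, 3))
        X_vox, point_to_voxel = voxelize(points, voxel_size=2.0)

        # Labels: 0=ground, 1=roof, 2=wall, 3=vegetation (random placeholders here).
        y_vox = np.random.default_rng(3).integers(0, 4, size=len(X_vox))

        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_vox, y_vox)
        print("per-voxel class predictions:", clf.predict(X_vox[:5]))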
