Evaluation and Seismically Isolated Substructure Redesign of a Typical Multi-Span Pre-Stressed Concrete Girder Highway Bridge
Seismic considerations greatly influence the lateral and vertical design of a structure, often necessitating larger elements than would otherwise be required. Seismic isolation greatly reduces the demands on a structure due to earthquake loading, allowing the use of smaller, more efficient members and foundations. This case study illustrates the theory and procedure of evaluating the response of a recently built multi-span highway bridge using the most recent (2009) AASHTO code. Based on this response, an equivalent structure was designed to incorporate a seismic isolation system, and the substructure of the isolated bridge was redesigned to meet the reduced demands more economically. The reduction in demands was quantified, and the member demands and overall responses of the two designs were compared. An overview of isolator design for the common isolator types available in the United States, with examples specific to the isolated structure that was designed, is included as an addendum.
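For readers unfamiliar with the effective-stiffness approach that AASHTO-type isolation design typically relies on, the sketch below (Python) iterates the effective period, equivalent damping, and displacement demand of a bilinear isolation system until they are mutually consistent. It is a minimal illustration only: the weight, spectral acceleration, isolator properties, and damping-reduction factor are placeholders, not values or formulas taken from the bridge in this study.

```python
# Minimal sketch of the iterative "simplified method" for a bilinear
# seismic isolation system; all numerical inputs are placeholders.
import math

G_IN = 386.4  # gravity, in/s^2

def effective_props(d, Qd, kd, dy):
    """Secant stiffness and equivalent viscous damping at displacement d (bilinear loop)."""
    k_eff = Qd / d + kd
    beta = 2.0 * Qd * (d - dy) / (math.pi * k_eff * d**2)  # hysteretic damping ratio
    return k_eff, beta

def damping_coefficient(beta):
    # Illustrative damping-reduction factor; a real design would use the code's table.
    return max(1.0, (beta / 0.05) ** 0.3)

def simplified_method(W, S_D1, Qd, kd, dy, d0=4.0, tol=1e-3):
    """Iterate displacement until K_eff, T_eff, and the demand d agree."""
    d = d0
    for _ in range(50):
        k_eff, beta = effective_props(d, Qd, kd, dy)
        T_eff = 2.0 * math.pi * math.sqrt(W / (G_IN * k_eff))
        B = damping_coefficient(beta)
        d_new = G_IN * S_D1 * T_eff / (4.0 * math.pi**2 * B)  # about 9.79*S_D1*T_eff/B, in inches
        if abs(d_new - d) < tol:
            return d_new, T_eff, k_eff * d_new  # displacement (in), period (s), base shear
        d = d_new
    raise RuntimeError("simplified method did not converge")

# Placeholder example: 1200-kip tributary weight, S_D1 = 0.45 g,
# total characteristic strength 90 kip, post-yield stiffness 25 kip/in.
print(simplified_method(W=1200.0, S_D1=0.45, Qd=90.0, kd=25.0, dy=0.4))
```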
MHTool User's Guide - Software for Manufactured Housing Structural Design
Since the late 1990s, the Department of Energy's Idaho National Laboratory (INL) has worked with the US Department of Housing and Urban Development (HUD), the Manufactured Housing Institute (MHI), the National Institute of Standards and Technology (NIST), the National Science Foundation (NSF), and an industry committee to measure the response of manufactured housing to both artificial and natural wind loads and to develop a computational desktop tool to optimize the structural performance of manufactured housing under HUD Code loads. MHTool is the result of an 8-year intensive testing and verification effort using single- and double-section homes. MHTool is the first fully integrated structural analysis software package specifically designed for manufactured housing. To use MHTool, industry design engineers will enter information (geometries, materials, connection types, etc.) describing the structure of a manufactured home, creating a base model. Windows, doors, and interior walls can be added to the initial design. Engineers will input the loads required by the HUD Code (wind, snow, interior live loads, etc.) and run an embedded finite element solver to find walls or connections where stresses are either excessive or very low. The designer could, for example, substitute a less expensive and easier-to-install connection in areas with very low stress, then re-run the analysis for verification. If forces and stresses are still within HUD Code requirements, construction costs would be saved without sacrificing quality. Manufacturers can easily change geometries or component properties to optimize designs of various floor plans, then submit MHTool input and output in place of calculations for DAPIA review. No change in the regulatory process is anticipated. MHTool, while not yet complete, is now ready for demonstration. The pre-BETA version (Build-16) was displayed at the 2005 National Congress & Expo for Manufactured & Modular Housing. Additional base models and an extensive material library still need to be developed; output displays and listings will need to be expanded and model-checking capability added. When completed, MHTool will lead to new manufactured housing designs that meet or exceed the HUD Code for quality, durability, and safety while reducing labor and materials, thereby lowering cost and increasing home ownership for the traditional manufactured housing market of first-time or low-income buyers. MHTool uses the freeware solver FElt, modified specifically for manufactured housing by researchers at Washington State University and INL. Input data, material properties, and results verification are based on full-scale testing conducted by INL and others. See Section 7 for a collection of references.
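Purely as an illustration of the workflow described above (build a base model, apply HUD Code loads, run the embedded solver, flag over- and under-stressed components), the following Python pseudo-workflow mimics those steps. The class and method names, the stand-in "solver", and all numbers are hypothetical; they are not MHTool's actual interface or data.

```python
# Hypothetical pseudo-workflow mirroring the MHTool steps described above;
# names and numbers are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Member:
    name: str
    allowable_stress: float   # psi
    demand_stress: float = 0.0

@dataclass
class HomeModel:
    members: list = field(default_factory=list)
    loads: dict = field(default_factory=dict)

    def add_member(self, name, allowable_stress):
        self.members.append(Member(name, allowable_stress))

    def apply_hud_loads(self, wind_psf, roof_snow_psf, floor_live_psf):
        self.loads = {"wind": wind_psf, "snow": roof_snow_psf, "live": floor_live_psf}

    def solve(self):
        # Stand-in for the embedded finite element solver: assigns a fake
        # demand proportional to the total applied load.
        total = sum(self.loads.values())
        for m in self.members:
            m.demand_stress = 5.0 * total

    def utilization_report(self, low=0.3, high=1.0):
        """Flag components that are over-stressed or lightly loaded."""
        for m in self.members:
            ratio = m.demand_stress / m.allowable_stress
            if ratio > high:
                print(f"{m.name}: OVERSTRESSED (ratio {ratio:.2f})")
            elif ratio < low:
                print(f"{m.name}: candidate for a lighter, cheaper detail (ratio {ratio:.2f})")

model = HomeModel()
model.add_member("sidewall shear panel", allowable_stress=900.0)
model.add_member("marriage-line connection", allowable_stress=450.0)
model.add_member("roof truss chord", allowable_stress=2500.0)
model.apply_hud_loads(wind_psf=39.0, roof_snow_psf=20.0, floor_live_psf=40.0)
model.solve()
model.utilization_report()
```

The iterate-and-re-check loop in the abstract corresponds to editing the model (for example, swapping a connection type) and calling `solve()` and `utilization_report()` again.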
Earthquake swarm near Denio, Nevada, February to April, 1973
An investigation of historic earthquake activity in northwest Nevada shows that earthquake swarms are typical. Evidence suggests that these swarms are associated with geothermal activity. An earthquake swarm occurred during February, March, and April 1973, 20 kilometers south of Denio on the Nevada/Oregon border. The largest event of the sequence was a magnitude 5.3 shock on 3 March. Fault plane solutions indicate right-lateral oblique-slip motion on a plane striking N11°W and dipping 60°E. This mechanism is very similar to those of the 1954 Fairview Peak and other earthquakes in the western Basin and Range, and is consistent with regional extension in a WNW-ESE direction. During March and April, a small tripartite array recorded more than 1,500 events of this sequence, and 221 of these were selected for detailed analysis. Epicenters of these events fall in a north-south trending zone, 8 kilometers long and 2 kilometers wide; focal depths range from 5 1/2 to 8 1/2 kilometers. The b-value for this sequence is 1.00, considerably higher than the value of 0.81 found for northwest Nevada as a whole. High b-values have been found in laboratory experiments for heterogeneous materials and for rocks under low to moderate stress.
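For context on the b-value comparison, the Gutenberg-Richter b-value is commonly estimated with Aki's maximum-likelihood formula, b = log10(e) / (M̄ − (Mc − ΔM/2)); the thesis does not state which estimator was used, and the catalog below is synthetic.

```python
# Hedged illustration of a b-value estimate (Aki's maximum-likelihood form);
# the magnitudes are made up, not data from the Denio swarm.
import math

def b_value_mle(magnitudes, m_min, bin_width=0.1):
    """b = log10(e) / (mean(M) - (m_min - bin_width/2)) for events with M >= m_min."""
    m = [x for x in magnitudes if x >= m_min]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (m_min - bin_width / 2.0))

catalog = [1.1, 1.3, 1.2, 1.6, 1.4, 1.9, 1.3, 1.5, 2.2, 1.2, 1.4, 1.7]
print(round(b_value_mle(catalog, m_min=1.0), 2))   # about 0.81 for this toy catalog
```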
Automating Risk Assessments of Hazardous Material Shipments for Transportation Routes and Mode Selection
The METEOR project at Idaho National Laboratory (INL) successfully addresses the difficult problem in risk assessment analyses of combining the results of bounding deterministic simulations with probabilistic (Monte Carlo) risk assessment techniques. This paper describes a software suite designed to perform sensitivity and cost/benefit analyses on selected transportation routes and vehicles to minimize the risk associated with the shipment of hazardous materials. METEOR uses Monte Carlo techniques to estimate the probability of an accidental release of a hazardous substance along a proposed transportation route. A METEOR user selects the mode of transportation and the origin and destination points, and charts the route using interactive graphics. Inputs to METEOR (many selections are built in) include crash rates for the specific aircraft, soil/rock type and population densities over the proposed route, and bounding limits for potential accident types (velocity, temperature, etc.). New vehicle, material, and location data are added as they become available. If the risk estimates are unacceptable, the risks associated with alternate transportation modes or routes can be quickly evaluated and compared. Systematic optimization methods will provide the user with the route and vehicle selection identified with the lowest risk of hazardous material release. The effects of a selected range of potential accidents, such as vehicle impact, fire, fuel explosions, excessive containment pressure, and flooding, are evaluated primarily using hydrocodes capable of accurately simulating the material response of critical containment components. Bounding conditions that represent credible accidents (i.e., for an impact event: velocity, orientation, and soil conditions) are used as input parameters to the hydrocode models, yielding correlation functions that relate accident parameters to component damage. The Monte Carlo algorithms use random number generators to make selections at the various decision points, such as crash occurrence and location. For each pass through the routines, when a crash is randomly selected, the crash parameters are used to determine whether failure has occurred, using external look-up tables, correlation functions from deterministic calculations, or built-in data libraries. The effectiveness of the software was recently demonstrated in safety analyses of the transportation of radioisotope systems for the US Department of Energy. These methods are readily adaptable to estimating the risks associated with a variety of hazardous shipments, such as spent nuclear fuel, explosives, and chemicals.
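As a rough sketch of the Monte Carlo logic described above, the Python snippet below samples whether a crash occurs on each route segment, draws accident parameters, and checks a stand-in damage correlation. The route data, crash rates, and failure criterion are invented for illustration; they are not METEOR's code or data.

```python
# Schematic Monte Carlo release-probability estimate; all inputs are invented.
import random

# (segment_length_miles, crashes_per_vehicle_mile, zone label)
route = [(120.0, 1.0e-7, "rural"), (30.0, 4.0e-7, "suburban"), (10.0, 8.0e-7, "urban")]

def containment_fails(velocity_mph, fire):
    """Stand-in for a hydrocode-derived correlation function or look-up table."""
    return velocity_mph > 95.0 or (fire and velocity_mph > 60.0)

def simulate_trip():
    """Return True if this simulated shipment ends in a release."""
    for length, rate, _zone in route:
        if random.random() < rate * length:          # crash on this segment?
            velocity = random.uniform(20.0, 120.0)   # sampled impact severity
            fire = random.random() < 0.2             # chance of a post-crash fire
            return containment_fails(velocity, fire)
    return False

def release_probability(n_trials=1_000_000):
    releases = sum(simulate_trip() for _ in range(n_trials))
    return releases / n_trials

print(f"estimated per-shipment release probability: {release_probability():.2e}")
```

In METEOR the equivalent failure check would come from the hydrocode-derived correlation functions or data libraries mentioned above rather than a hard-coded threshold, and the population-density input would feed the consequence side of the risk estimate.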
Analysis of the ATR fuel element swaging process
This report documents a detailed evaluation of the swaging process used to connect fuel plates to side plates in Advanced Test Reactor (ATR) fuel elements. Swaging is a mechanical process that begins with fitting a fuel plate into grooves in the side plates. Once a fuel plate is positioned, a lip on each of two side plate grooves is pressed into the fuel plate using swaging wheels to form the joints. Each connection must have a specified strength (measured in terms of a pullout force capacity) to assure that these joints do not fail during reactor operation. The purpose of this study is to analyze the swaging process and associated procedural controls, and to provide recommendations to assure that the manufacturing process produces swaged connections that meet the minimum strength requirement. The current fuel element manufacturer, Babcock and Wilcox (B&W) of Lynchburg, Virginia, follows established procedures that include quality inspections and process controls in swaging these connections. The procedures have been approved by Lockheed Martin Idaho Technologies and are designed to assure repeatability of the process and structural integrity of each joint. Prior to July 1994, ATR fuel elements were placed in the Hydraulic Test Facility (HTF) at the Idaho National Engineering Laboratory (INEL), Test Reactor Area (TRA), for application of a Boehmite (aluminum oxide) film and for checking structural integrity before placement of the elements into the ATR. The results presented in this report demonstrate that the pullout strength of the swaged connections is assured by the current manufacturing process (with several recommended enhancements) without the need for testing each element in the HTF.
Evaluation of strain levels in the IFSF rack
An evaluation has been performed on strain levels determined for the IFSF fuel storage rack under seismic loading. The storage rack had been previously analyzed by a consulting company, Advanced Engineering Consultants (AEC), which reported significant strain levels in several members of the rack. The purpose of the study conducted herein was to refine the method for calculating strain levels and then to assess the acceptability of the refined strain values. This was accomplished by modifying AEC's model to more realistically represent plastic behavior in all locations where the material yields. An analysis was performed in which this modified model was subjected to the same seismic loadings as applied in AEC's analysis. It was expected that the more realistic representation of plastic behavior in the modified model would result in reduced maximum calculated strains for the rack. Results of the analysis showed that the more realistic representation of plastic behavior in rack members did reduce the calculated maximum strains from those reported by AEC. These modified strains were evaluated for acceptability according to the ductility criteria of the governing standard (i.e., ANSI/AISC N690-1994, as specified by the project Criteria Application Document). This evaluation showed that the strains meet these acceptance criteria. The analysis described herein was performed only to investigate this issue; AEC's analysis stands as the analysis of record for the rack.
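As a hedged illustration of the kind of ductility acceptance check described above, the snippet below compares a demand ductility ratio (peak strain over yield strain) against an allowable value; the yield strain, computed strains, and the limit itself are placeholders, not figures from the IFSF rack analysis or from ANSI/AISC N690-1994.

```python
# Placeholder strain/ductility acceptance check; all values are assumed.
YIELD_STRAIN = 0.0012          # assumed Fy/E for the member material
ALLOWABLE_DUCTILITY = 10.0     # placeholder; the real limit comes from the governing standard

def check_member(name, max_strain):
    mu = max_strain / YIELD_STRAIN          # demand ductility ratio
    status = "OK" if mu <= ALLOWABLE_DUCTILITY else "EXCEEDS LIMIT"
    print(f"{name}: strain {max_strain:.4f}, ductility {mu:.1f} -> {status}")

for member, strain in {"brace A": 0.0065, "post B": 0.0031}.items():
    check_member(member, strain)
```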
Manufactured Home Testing in Simulated and Naturally Occurring High Winds
A typical double-wide manufactured home was tested in simulated and naturally occurring high winds to understand structural behavior and improve performance during severe windstorms. Seven lateral load tests were conducted on a double-wide manufactured home at a remote field test site in Wyoming. An extensive instrumentation package monitored the overall behavior of the home and collected data vital to validating computational software for the manufactured housing industry. The tests were designed to approach the design load of the home without causing structural damage, thus allowing the behavior of the home to be assessed when the home was later exposed to high winds (up to 80 mph). The data generally show near-linear initial system response with significant non-linear behavior as the applied loads increase. Load transfer across the marriage line is primarily compression. Racking, while present, is very small. Interface slip and shear displacement along the marriage line are nearly insignificant. Horizontal global displacements reached 0.6 inch. These tests were designed primarily to collect data necessary to calibrate a desktop analysis and design software tool, MHTool, under development at the Idaho National Laboratory specifically for manufactured housing. Currently available analysis tools are, for the most part, based on methods developed for “stick-built” structures and are inappropriate for manufactured homes. The special materials used in manufactured homes, such as the rigid adhesives connecting the sheathing materials to the studs, significantly alter the behavior of manufactured homes under lateral loads. Previous full-scale tests of laterally loaded manufactured homes confirm that conventional analysis methods are not applicable. System behavior dominates the structural action of manufactured homes, and its prediction requires a three-dimensional analysis of the complete unit, including tiedowns. This project was sponsored by the US Department of Energy, the US Department of Housing and Urban Development, and the Manufactured Housing Institute. The results of this research can help reduce annual losses of life and property by providing validated information to support advancements in code requirements and by developing engineering software that can predict and optimize wind resistance.
Dynamic Tests of High Strength Concrete Cylinders
Idaho National Laboratory engineers collaborated
A METHOD FOR SELECTING SOFTWARE FOR DYNAMIC EVENT ANALYSIS I: PROBLEM SELECTION
New nuclear power reactor designs will require resistance to a variety of possible malevolent attacks, as well as traditional dynamic accident scenarios. The design/analysis team may be faced with a broad range of phenomena including air and ground blasts, high-velocity penetrators or shaped charges, and vehicle or aircraft impacts. With a host of software tools available to address these high-energy events, the analysis team must evaluate and select the software most appropriate for their particular set of problems. The accuracy of the selected software should then be validated with respect to the phenomena governing the interaction of the threat and structure. In this paper, we present a method for systematically comparing current high-energy physics codes for specific applications in new reactor design. Several codes are available for the study of blast, impact, and other shock phenomena. Historically, these packages were developed to study specific phenomena such as explosives performance, penetrator/target interaction, or accidental impacts. As developers generalize the capabilities of their software, legacy biases and assumptions can remain that could affect the applicability of the code to other processes and phenomena. R&D institutions generally adopt one or two software packages and use them almost exclusively, performing benchmarks on a single-problem basis. At the Idaho National Laboratory (INL), new comparative information was desired to permit researchers to select the best code for a particular application by matching its characteristics to the physics, materials, and rate scale (or scales) representing the problem at hand. A study was undertaken to investigate the comparative characteristics of a group of shock and high-strain-rate physics codes including ABAQUS, LS-DYNA, CTH, ALEGRA, ALE-3D, and RADIOSS. A series of benchmark problems was identified to exercise the features and capabilities of the subject software. To be useful, benchmark problems require several features: they should be (1) small, requiring reasonable computer resources; (2) designed to engage a small set of physical phenomena; (3) independent of code formulation; (4) verifiable, either by closed-form solution or experimental result; and (5) unlimited in distribution. This paper presents the selection rationale and the problems chosen for the benchmarking suite exhibiting the above features. Detailed discussion of the benchmark study results will be presented in future reports.