Onsite assessment of structural timber members by means of hierarchical models and probabilistic methods
One of the main motivations for hierarchical modelling is to understand how properties, composition and structure at lower scale levels may influence and be used to predict the material properties at macroscopic and structural engineering scales. Structural timber is, in most cases, characterized by three parameters usually designated as reference properties: density, bending modulus of elasticity and bending strength.
The present paper reviews different possibilities for obtaining reliable data about the mechanical behaviour of timber elements by collecting information at different levels and organizing that information into a hierarchy of sequential levels (from lowest to highest). The applicability and limitations of statistical and probabilistic methods for the prediction and inference of timber's reference material properties are discussed and exemplified.
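The probabilistic workflow sketched in the abstract can be illustrated with a minimal simulation: lower-level indicators (density, modulus of elasticity) feed a regression model for bending strength, whose 5th percentile gives the characteristic value. All distribution parameters and regression coefficients below are hypothetical, chosen only to show the shape of the hierarchy, not values from the paper.

```python
import numpy as np

# Hedged sketch: predicting a timber reference property (bending strength)
# from lower-scale indicators. Every number here is an illustrative
# assumption, not data from the reviewed paper.
rng = np.random.default_rng(0)
n = 10_000

# Level 1: indicators collected on site (hypothetical distributions)
density = rng.normal(450.0, 40.0, n)                                   # kg/m^3
moe = 11_000.0 + 12.0 * (density - 450.0) + rng.normal(0.0, 900.0, n)  # MPa

# Level 2: hypothetical regression from indicators to bending strength
strength = 2.0 + 0.02 * density + 0.003 * moe + rng.normal(0.0, 4.0, n)  # MPa

# Characteristic value: the 5th percentile, as conventionally used
# for timber reference properties
f_k = np.percentile(strength, 5)
print(f"mean strength: {strength.mean():.1f} MPa")
print(f"characteristic (5th percentile) strength: {f_k:.1f} MPa")
```

The point of the hierarchy is that uncertainty at each level propagates downward into the characteristic value, which is why probabilistic rather than deterministic prediction is needed.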
Composite structural materials
Technology utilization of fiber-reinforced composite materials is discussed in the areas of physical properties and life prediction. Programs related to the Composite Aircraft Program are described in detail.
A guide to the equipment, methods and procedures for the prevention of risks, emergency response and mitigation of the consequences of accidents: Part I
This report is the first part of a two-part series intended as a compendium for regulators without a specific background in risk and safety assessment. It describes the state of the art of the safety-related equipment, methods, procedures and projects available today for the prevention of risks, emergency response and mitigation of the consequences of accidents.
While the present report addresses the above topics from a generic perspective, the second part, currently in preparation, focuses on the particular challenges of the Nordic Seas.
The review is based on the retrieval and analysis of a large volume of open-source information, along with personal contacts with authorities and HSE representatives of several major oil and gas operators. This helps the reader go into further detail and better appreciate the latest technological advancements in offshore safety made as a consequence of the lessons learnt from the Macondo accident.
A systemic study of mining accident causality: an analysis of 91 mining accidents from a platinum mine in South Africa
The mining industry is a very important sector of the South African national economy. A major factor threatening the sustainability of this industry is the worrying effect of mining accidents. These accidents usually lead to the destruction of property, injury or death of mine workers, and pollution of the environment. Although mining is generally seen as a hazardous operation worldwide, the accident rates in South African mines are still unacceptably high. Another worrying phenomenon is the fact that since 2003 the reduction in fatalities and injuries has fallen 20–25% short of the annual targets set by stakeholders. These factors make the safety of the industry a very important subject. Understanding accident causality is a major step in the quest to reduce accidents. It is only with a good understanding of the accident process that effective remedies can be designed. Accident modelling techniques provide the necessary platform for the interpretation and understanding of accidents at workplaces. The Swiss Cheese Model of accidents has proven to be a very efficient way of analysing industrial accidents. In this model, an accident is seen as a combination of unsafe acts by front-line operators and latent conditions in the organization. The model helps to identify factors in an organizational structure that influence human behaviour and performance at workplaces. This study aims to demonstrate how a systemic approach can be applied to the analysis of the causes of accidents in South African mines. In this study, an accident analysis framework has been developed from the Swiss Cheese Model, combining the Mark III version of the Swiss Cheese Model, the Nertney Wheel and safety management principles. The main section of the framework is made up of three layers of accident causality: proximal causes, workplace factors and systemic factors.
The second section (metadata) of the framework incorporates contextual data pertaining to each accident, such as age, experience, task being performed, and time of accident. These data enhance the understanding of accident causality. The third and final section of the framework incorporates information about accident-causing agencies and the nature of the barriers breached in the accident process.
Use of evidential reasoning for eliciting Bayesian subjective probabilities in human reliability analysis: A maritime case
Modelling the interdependencies among the factors influencing human error (e.g. the common performance conditions (CPCs) in the Cognitive Reliability and Error Analysis Method (CREAM)) stimulates the use of Bayesian Networks (BNs) in Human Reliability Analysis (HRA). However, subjective probability elicitation for a BN is often a daunting and complex task. Creating conditional probability values for each variable in a BN requires a high degree of knowledge and engineering effort, often from a group of domain experts. This paper presents a novel hybrid approach that incorporates the evidential reasoning (ER) approach into BNs to facilitate HRA under incomplete data. The kernel of this approach is to develop the best and worst possible conditional subjective probabilities of the nodes representing the factors influencing HRA when using BNs for human error probability (HEP) estimation. The proposed hybrid approach is demonstrated by using CREAM to estimate HEP in the maritime domain. The findings from the hybrid ER-BN model can effectively facilitate HEP analysis in particular and decision-making under uncertainty in general.
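The interval idea behind the best/worst conditional probabilities can be sketched briefly: when experts cannot commit to exact values, each entry of a node's conditional probability table becomes a [best, worst] interval, and propagating both bounds yields a bounded HEP. The node structure and all numbers below are illustrative placeholders, not values from the paper.

```python
# Hedged sketch of interval propagation through a tiny BN fragment:
# two binary parent factors (CPC-like conditions) influencing an error node.
# All priors and CPT intervals are hypothetical.

p_cpc1 = 0.3   # P(condition 1 degraded) -- illustrative
p_cpc2 = 0.2   # P(condition 2 degraded) -- illustrative

# Interval CPT for P(error | cpc1, cpc2): (lower, upper) per parent state.
cpt = {
    (0, 0): (0.001, 0.005),
    (0, 1): (0.010, 0.030),
    (1, 0): (0.020, 0.050),
    (1, 1): (0.100, 0.200),
}

def hep_bounds():
    """Marginalize over parent states, tracking lower and upper bounds."""
    lo = hi = 0.0
    for c1 in (0, 1):
        for c2 in (0, 1):
            w = (p_cpc1 if c1 else 1 - p_cpc1) * (p_cpc2 if c2 else 1 - p_cpc2)
            l, u = cpt[(c1, c2)]
            lo += w * l
            hi += w * u
    return lo, hi

lo, hi = hep_bounds()
print(f"HEP bounded in [{lo:.4f}, {hi:.4f}]")
```

The width of the resulting interval makes the residual elicitation uncertainty explicit, which is the practical payoff of combining ER with a BN under incomplete data.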
Maximizing the value of information from high-frequency downhole dynamics data
Downhole drilling dynamics are poorly understood. Neither models nor experiments seem capable of fully describing the movements and forces of the drillstring during drilling. Downhole measurements could potentially hold the key to those missing insights; however, the data are not yet used to their full potential. This work addresses the barriers to obtaining value from downhole dynamics data and offers solutions to overcome them.
A novel kinematic model was developed that fully accounts for sensor position and measurement design. It supports the hypothesis that lateral vibrations cause high-frequency fluctuations of tangential accelerations. Hence, against currently prevailing scientific opinion, “high-frequency torsional oscillations” (HFTO) are not actually a torsional phenomenon, but the consequence of a lateral vibration. A downhole measurement tool under off-center rotation captures particular high-frequency data patterns that can be considered a sensor artifact. If ignored, these artifacts can impact the calculations of RPM and other derived measurements from downhole data.
An extensive set of downhole data was analyzed to improve downhole dynamics data collection schemes for detecting drilling dysfunctions. For each prominent type of dysfunction, minimum data collection frequencies are specified. Such guidelines assist in collecting downhole data at sampling rates that are high enough to draw meaningful conclusions, but low enough not to flood the limited available bandwidth and memory capacities. Even though a sensor is set up to measure only a single parameter along a single axis, it captures a variety of downhole events, which may lead to misinterpretations. These events can still be differentiated based on their typical frequency ranges. It is further shown how 'noisy' frequency ranges can be detected and selectively removed by combining multiple downhole measurements.
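The selective removal of a noisy frequency range can be sketched as a band rejection in the FFT domain. The sampling rate, tone frequencies, and band edges below are illustrative assumptions; the band is taken as already identified (e.g. by comparing several sensors), which is the step the dissertation's multi-measurement method addresses.

```python
import numpy as np

# Hedged sketch: zeroing a known 'noisy' band in a downhole acceleration
# channel. All frequencies and the sampling rate are hypothetical.
fs = 400.0                      # Hz, assumed downhole sampling rate
t = np.arange(0, 2.0, 1 / fs)

signal = np.sin(2 * np.pi * 5.0 * t)        # 5 Hz vibration of interest
noise = 0.8 * np.sin(2 * np.pi * 60.0 * t)  # narrow-band 60 Hz artifact
raw = signal + noise

# Reject the 55-65 Hz band in the frequency domain.
spec = np.fft.rfft(raw)
freqs = np.fft.rfftfreq(raw.size, 1 / fs)
spec[(freqs >= 55.0) & (freqs <= 65.0)] = 0.0
cleaned = np.fft.irfft(spec, n=raw.size)

residual = np.abs(cleaned - signal).max()
print(f"max deviation from the true 5 Hz component: {residual:.2e}")
```

Because both tones fall exactly on FFT bins in this synthetic example, the rejection is nearly exact; real downhole data would show spectral leakage and require wider or tapered bands.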
A lack of transparency and inefficient processes around sensor design, data collection, processing, and transfer cause misinterpretation and under-utilization of downhole drilling data. A review of tool and sensor design identifies sources of poor data quality. Ultimately, defined data quality requirements will offer sustainable improvement of sensor data. To work with downhole data generated under current circumstances, data processing techniques are developed and demonstrated. Algorithms that combine data, drilling processes, and physics automatically correct sensor errors. Further, a machine learning approach for automated, pattern-based vibration classification is developed.
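As a simple rule-based stand-in for pattern-based vibration classification, a window of sensor data can be labelled by the band containing its dominant frequency, following the frequency-range differentiation described above. The band boundaries and event names below are hypothetical placeholders, not values from the dissertation.

```python
import numpy as np

# Hedged sketch: classifying a data window by dominant frequency band.
# Band limits are illustrative assumptions only.
BANDS = {
    "stick-slip": (0.0, 1.0),                      # Hz
    "whirl": (5.0, 20.0),
    "high-frequency oscillation": (50.0, 200.0),
}

def classify(samples, fs):
    """Label a window by the band containing its dominant frequency."""
    spec = np.abs(np.fft.rfft(samples - samples.mean()))  # drop DC offset
    freqs = np.fft.rfftfreq(samples.size, 1 / fs)
    dominant = freqs[spec.argmax()]
    for label, (lo, hi) in BANDS.items():
        if lo <= dominant < hi:
            return label
    return "unclassified"

fs = 400.0
t = np.arange(0, 2.0, 1 / fs)
window = np.sin(2 * np.pi * 10.0 * t)   # a synthetic 10 Hz lateral event
print(classify(window, fs))
```

A learned classifier would replace the hand-set band limits with patterns extracted from labelled data, but the banded spectrum is a common feature representation either way.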
A standardized structure to transfer downhole data from the service provider to the end user is suggested. The structure defines not only how the data should be shared, but also what additional data (metadata) is required. Specifying such informational requirements improves the transparency and comparability of measurements. The proposed data format is therefore a prerequisite for automated drilling data analysis.
Bridging Bays, Bridging Borders: Global Justice and Community Organizing in the San Francisco Bay Area
We offer this document as our own effort to build the inclusion and understanding that will help both communities and leaders recognize the grassroots wisdom and issues that could help us realize the positive impacts of globalization and minimize the negative aspects that have concerned us all. Another world is possible, but it is up to us to build it.