132 research outputs found

    Effect of soil temperature changes on geogrid strains

    Abstract: Temperatures were measured along instrumented geogrids to determine thermal strains and their changes with seasonal temperatures. It was observed that applying a temperature correction to strain values measured by electrical wire resistance (EWR) strain gauges, in order to compensate for temperature-induced strains, is not correct. Because of the effects of soil confinement, geogrids confined in soil do not undergo thermal expansion or contraction with temperature change if slippage between the soil and the geogrid cannot occur. Instead of thermal strains, thermal stress or thermal force develops in the geogrids, with a magnitude depending on the elastic properties, the temperature change, and the linear coefficient of thermal expansion.
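    The fully restrained case described above can be sketched with the standard relation sigma = E * alpha * delta_T. This is a minimal illustration with assumed values (modulus, expansion coefficient, and temperature change are hypothetical, not taken from the study):

```python
# Illustrative sketch, not the study's analysis: when soil confinement
# prevents any thermal strain, the free thermal strain alpha * delta_T
# is converted entirely into stress, sigma = E * alpha * delta_T.
def restrained_thermal_stress(E_mod, alpha, delta_T):
    """E_mod: elastic modulus (kPa); alpha: linear coefficient of thermal
    expansion (1/degC); delta_T: temperature change (degC)."""
    return E_mod * alpha * delta_T

# Assumed example values for an HDPE geogrid (hypothetical):
# E = 1.0e6 kPa (1 GPa), alpha = 1.2e-4 /degC, cooling of 20 degC.
sigma = restrained_thermal_stress(E_mod=1.0e6, alpha=1.2e-4, delta_T=-20.0)
# About -2400 kPa for these assumed values; the sign reflects cooling.
```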

    Laboratory investigation on freeze separation of saline mine waste water

    Abstract: The extraction and upgrading process for bitumen from oil sand deposits in Alberta, Canada currently requires large volumes of process water. This demand is met by importing water and by recycling/reuse of clarified process water. Reuse of the clarified water results in a steady increase of organic and inorganic (salt) contaminant concentrations in the recycle water. Using a specially designed flume housed in a cold room, trickle freeze separation was evaluated for contaminant separation of saline solutions used as a surrogate for mine waste water. Experiments were conducted at various ambient temperatures, salt concentrations, and mass flow rates. Melting proved to be more effective at concentrating salts than freezing. The trickle freeze/thaw process developed during the experiments was very effective at separating and concentrating the salts into a smaller volume. For source waters frozen at an ambient temperature of -15 degrees C and with 3000 mg/L (NaCl) or less, 80% removal of salts was possible after melting 9% of the produced ice. For source waters with higher concentrations (20,000 mg/L), 80% removal was possible after melting 27% of the produced ice.
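    One way to read the reported figures is as an enrichment factor: if melting a fraction f of the produced ice recovers a fraction r of the salt, the melt stream is enriched relative to the source by roughly r / f. This is an assumed interpretation for illustration, not a calculation from the paper:

```python
# Sketch (assumed interpretation of the reported figures): enrichment of
# the first melt fraction relative to the source water.
def enrichment_factor(salt_fraction_removed, melt_fraction):
    """salt_fraction_removed: fraction of total salt recovered in the melt;
    melt_fraction: fraction of produced ice that was melted."""
    return salt_fraction_removed / melt_fraction

low = enrichment_factor(0.80, 0.09)   # 3000 mg/L case: ~8.9x enrichment
high = enrichment_factor(0.80, 0.27)  # 20,000 mg/L case: ~3.0x enrichment
```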

    Calculating Confidence, Uncertainty, and Numbers of Samples When Using Statistical Sampling Approaches to Characterize and Clear Contaminated Areas

    This report discusses the methodology, formulas, and inputs needed to make characterization and clearance decisions for Bacillus anthracis-contaminated and uncontaminated (or decontaminated) areas using a statistical sampling approach. Specifically, for two common statistically based environmental sampling approaches, the report includes the methods and formulas for calculating the
    • number of samples required to achieve a specified confidence in characterization and clearance decisions
    • confidence in making characterization and clearance decisions for a specified number of samples.
    In particular, the report addresses an issue raised by the Government Accountability Office by providing methods and formulas to calculate the confidence that a decision area is uncontaminated (or successfully decontaminated) if all samples collected according to a statistical sampling approach have negative results. Key to addressing this topic is the probability that an individual sample result is a false negative, commonly referred to as the false negative rate (FNR). The two statistical sampling approaches currently discussed in this report are 1) hotspot sampling to detect small isolated contaminated locations during the characterization phase, and 2) combined judgment and random (CJR) sampling during the clearance phase. Typically, if contamination is widely distributed in a decision area, it will be detectable via judgment sampling during the characterization phase. Hotspot sampling is appropriate for characterization situations where contamination is not widely distributed and may not be detected by judgment sampling. CJR sampling is appropriate during the clearance phase when it is desired to augment judgment samples with statistical (random) samples. The hotspot and CJR statistical sampling approaches are discussed in the report for four situations:
    1. qualitative data (detect and non-detect) when the FNR = 0 or when using statistical sampling methods that account for FNR > 0
    2. qualitative data when the FNR > 0 but statistical sampling methods are used that assume the FNR = 0
    3. quantitative data (e.g., contaminant concentrations expressed as CFU/cm2) when the FNR = 0 or when using statistical sampling methods that account for FNR > 0
    4. quantitative data when the FNR > 0 but statistical sampling methods are used that assume the FNR = 0.
    For Situation 2, the hotspot sampling approach provides for stating with Z% confidence that a hotspot of specified shape and size with detectable contamination will be found. Also for Situation 2, the CJR approach provides for stating with X% confidence that at least Y% of the decision area does not contain detectable contamination. Forms of these statements for the other three situations are discussed in Section 2.2. Statistical methods that account for FNR > 0 currently exist only for the hotspot sampling approach with qualitative data (or quantitative data converted to qualitative data). This report documents the current status of methods and formulas for the hotspot and CJR sampling approaches. Limitations of these methods are identified. Extensions of the methods applicable when FNR = 0 to account for FNR > 0, or to address other limitations, will be documented in future revisions of this report if future funding supports their development. For quantitative data, this report also presents statistical methods and formulas for
    1. quantifying the uncertainty in measured sample results
    2. estimating the true surface concentration corresponding to a surface sample
    3. quantifying the uncertainty of the estimate of the true surface concentration.
    All of the methods and formulas discussed in the report were applied to example situations to illustrate application of the methods and interpretation of the results.
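    The "X% confidence that at least Y% of the decision area is free of detectable contamination" statement has a standard simple-random-sampling form when FNR = 0 and all results are negative. This sketch uses that textbook calculation, not the report's CJR method (which additionally credits judgment samples):

```python
import math

# Sketch of the standard X%/Y% calculation, assuming FNR = 0, simple
# random sampling, all results negative, and no judgment-sample credit.
def n_samples(conf_X, coverage_Y):
    """Samples needed to state with confidence conf_X (fraction) that at
    least coverage_Y (fraction) of the area has no detectable contamination."""
    return math.ceil(math.log(1.0 - conf_X) / math.log(coverage_Y))

def achieved_confidence(n, coverage_Y):
    """Confidence achieved by n all-negative random samples."""
    return 1.0 - coverage_Y ** n

n = n_samples(0.95, 0.99)  # 95% confidence, at least 99% clean -> 299
```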

    Validation of Statistical Sampling Algorithms in Visual Sample Plan (VSP): Summary Report

    The U.S. Department of Homeland Security, Office of Technology Development (OTD) contracted with a set of U.S. Department of Energy national laboratories, including the Pacific Northwest National Laboratory (PNNL), to write a Remediation Guidance for Major Airports After a Chemical Attack. The report identifies key activities and issues that should be considered by a typical major airport following an incident involving release of a toxic chemical agent. Four experimental tasks were identified that would require further research in order to supplement the Remediation Guidance. One of the tasks, Task 4, OTD Chemical Remediation Statistical Sampling Design Validation, dealt with statistical sampling algorithm validation. This report documents the results of the sampling design validation conducted for Task 4. In 2005, the Government Accountability Office (GAO) performed a review of the past U.S. responses to Anthrax terrorist cases. Part of the motivation for this PNNL report was a major GAO finding that there was a lack of validated sampling strategies in the U.S. response to Anthrax cases. The report (GAO 2005) recommended that probability-based methods be used for sampling design in order to address confidence in the results, particularly when all sample results showed no remaining contamination. The GAO also expressed a desire that the methods be validated, which is the main purpose of this PNNL report. The objective of this study was to validate probability-based statistical sampling designs and the algorithms pertinent to within-building sampling that allow the user to prescribe or evaluate confidence levels of conclusions based on data collected as guided by the statistical sampling designs. Specifically, the designs found in the Visual Sample Plan (VSP) software were evaluated. VSP was used to calculate the number of samples and the sample location for a variety of sampling plans applied to an actual release site. 
    Most of the sampling designs validated are probability based, meaning samples are located randomly (or on a randomly placed grid), so no bias enters into the placement of samples, and the number of samples is calculated such that IF the amount and spatial extent of contamination exceeds levels of concern, at least one of the samples would be taken from a contaminated area at least X% of the time. Hence, "validation" of the statistical sampling algorithms is defined herein to mean ensuring that the "X%" (confidence) is actually met.
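    The kind of check described above can be illustrated by simulation: estimate how often a randomly placed square grid of samples hits a circular hotspot, and compare the empirical rate to the nominal probability. This is a hypothetical sketch of that idea, not the VSP algorithm or the validation procedure itself:

```python
import math
import random

# Hypothetical validation sketch: fraction of trials in which a square
# sampling grid with spacing g detects a circular hotspot of radius r
# whose centre is placed uniformly at random. By translational symmetry
# it suffices to place the centre within a single grid cell.
def hit_probability(grid_spacing, hotspot_radius, trials=20000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        cx = rng.uniform(0.0, grid_spacing)
        cy = rng.uniform(0.0, grid_spacing)
        # Distance from the hotspot centre to the nearest grid node
        # (grid nodes sit at the corners of the cell).
        dx = min(cx, grid_spacing - cx)
        dy = min(cy, grid_spacing - cy)
        if math.hypot(dx, dy) <= hotspot_radius:
            hits += 1
    return hits / trials

p = hit_probability(grid_spacing=10.0, hotspot_radius=5.0)
# With r = g/2 the analytic hit probability is pi * r**2 / g**2 = pi/4,
# so p should be close to 0.785.
```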

    Change Point Estimation in Monitoring Survival Time

    Precise identification of the time when a change in a hospital outcome has occurred enables clinical experts to search for a potential special cause more effectively. In this paper, we develop change point estimation methods for the survival time of a clinical procedure in the presence of patient mix, in a Bayesian framework. We apply Bayesian hierarchical models to formulate the change point where there exists a step change in the mean survival time of patients who underwent cardiac surgery. The data are right censored since the monitoring is conducted over a limited follow-up period. We capture the effect of risk factors prior to the surgery using a Weibull accelerated failure time regression model. Markov Chain Monte Carlo is used to obtain posterior distributions of the change point parameters, including the location and magnitude of changes, as well as corresponding probabilistic intervals and inferences. The performance of the Bayesian estimator is investigated through simulations, and the results show that precise estimates can be obtained when it is used in conjunction with risk-adjusted survival time CUSUM control charts for different magnitude scenarios. The proposed estimator performs better when a longer follow-up period (censoring time) is applied. Compared with the alternative built-in CUSUM estimator, the Bayesian estimator yields more accurate and precise estimates. These advantages are enhanced when the probability quantification, flexibility, and generalizability of the Bayesian change point detection model are also considered.
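    The step-change idea can be sketched in a greatly simplified form: a grid posterior over the change point location for exponential survival times with known before/after means, a flat prior, and no censoring or patient-mix adjustment (all strong simplifications relative to the paper's Weibull AFT model with MCMC):

```python
import math
import random

# Greatly simplified sketch, not the paper's model: posterior over the
# location tau of a step change in mean survival time, assuming
# exponential survival, known means, a flat prior, and no censoring.
def changepoint_posterior(times, mean_before, mean_after):
    """Return P(change occurs after observation k) for k = 1..n-1,
    normalised over the grid of candidate change points."""
    n = len(times)
    log_post = []
    for k in range(1, n):
        ll = sum(-math.log(mean_before) - t / mean_before for t in times[:k])
        ll += sum(-math.log(mean_after) - t / mean_after for t in times[k:])
        log_post.append(ll)
    m = max(log_post)                      # stabilise the exponentiation
    w = [math.exp(l - m) for l in log_post]
    z = sum(w)
    return [x / z for x in w]

rng = random.Random(7)
data = ([rng.expovariate(1 / 10.0) for _ in range(30)] +
        [rng.expovariate(1 / 3.0) for _ in range(30)])
post = changepoint_posterior(data, mean_before=10.0, mean_after=3.0)
k_hat = post.index(max(post)) + 1  # most probable change location
```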

    An Approach for Assessing the Signature Quality of Various Chemical Assays when Predicting the Culture Media Used to Grow Microorganisms

    We demonstrate an approach for assessing the quality of a signature system designed to predict the culture medium used to grow a microorganism. The system comprised four chemical assays designed to identify various ingredients that could be used to produce the culture medium. The analytical measurements resulting from any combination of these four assays can be used in a Bayesian network to predict the probabilities that the microorganism was grown in one of eleven culture media. We evaluated combinations of the signature system by removing one or more of the assays from the Bayes network. We measured and compared the quality of the various Bayes nets in terms of fidelity, cost, risk, and utility, a method we refer to as the Signature Quality Metric.
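    The "remove an assay and compare" idea can be sketched with a toy posterior update over candidate media; the media names, prior, and likelihood numbers below are hypothetical and assume conditional independence of assays (a naive-Bayes simplification, not the paper's network):

```python
# Toy sketch with hypothetical numbers: posterior over candidate culture
# media given independent assay results, via Bayes' rule. Dropping an
# assay from the signature system corresponds to dropping its likelihood.
def posterior(prior, likelihoods):
    """prior: {medium: P(medium)}; likelihoods: one dict per assay used,
    each mapping medium -> P(observed assay result | medium)."""
    unnorm = {}
    for medium, p in prior.items():
        for lik in likelihoods:
            p *= lik[medium]
        unnorm[medium] = p
    z = sum(unnorm.values())
    return {m: p / z for m, p in unnorm.items()}

# Hypothetical two-medium example (names and values assumed):
prior = {"LB": 0.5, "TSB": 0.5}
assay1 = {"LB": 0.9, "TSB": 0.2}
assay2 = {"LB": 0.7, "TSB": 0.4}
full = posterior(prior, [assay1, assay2])     # both assays in the system
reduced = posterior(prior, [assay1])          # one assay removed
```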

    Economic evaluation of three populational screening strategies for cervical cancer in the county of Valles Occidental: CRICERVA clinical trial

    Copyright @ 2011 Acera et al; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. A high percentage of cervical cancer cases have not undergone cytological tests within the 10 years prior to diagnosis. Different population interventions could improve coverage in the public system, although costs will also increase. The aim of this study was to compare the effectiveness and costs of three types of population interventions to increase the number of female participants in the screening programmes for cancer of the cervix carried out by Primary Care in four basic health care areas. Funding: Fondo de Investigación Sanitaria del Instituto Carlos III de Madrid.

    The Canlex Project: summary and conclusions

    The Canadian geotechnical engineering community has completed a major collaborative five-year research project entitled the Canadian Liquefaction Experiment (CANLEX). The main objective of the project was to study the phenomenon of soil liquefaction, which can occur in saturated sandy soils and is characterized by a large loss of strength or stiffness resulting in substantial deformations. The intent of this paper is to compare, interpret, and summarize the large amount of field and laboratory data obtained for six sites in Western Canada as part of the CANLEX project. The sites are compared in terms of both flow-liquefaction and cyclic-softening considerations. The paper presents a number of conclusions drawn from the project as a whole, in terms of both fundamental and practical significance.