
    Methods of testing for giardia in water : a thesis presented in partial fulfilment of the requirements for the degree Master of Science in Microbiology at Massey University

    Since the 1960s, when the first waterborne outbreaks of Giardia were reported in America, it has been recognised as a disease-causing organism. Following these outbreaks the US Environmental Protection Agency (EPA) developed a method for testing large volumes of water for Giardia cysts, which was adopted into the 16th edition of Standard Methods. To test the method, cultured cysts were required for spiked trials. A published method of encystation by Schupp et al. (1988) was investigated as a potential source of cysts. Morphologically correct cysts were obtained in the greatest numbers at 37°C over 72 hours at a bile concentration of 5 g/l. Using cultured cysts and cysts from animals and water, viability and the least number of cysts needed to initiate a culture were assessed. When 10 of the cysts produced in vitro were excysted it was possible to obtain a culture; for cysts of animal and water origin, cultures could not be obtained even at levels up to 10,000 cysts. Variations of the Standard Method of water testing for Giardia had been reported by different laboratories. We investigated the sensitivity of this method using some of the reported variations, such as staining on a membrane filter, the use of monoclonal antibody stains and methods of washing cysts free of the sampling core. We found the method could detect down to 5 × 10² cysts per 500 l of water, a recovery of 10%; recoveries obtained over the range of spiked cyst levels were between 10% and 40%. Tangential filtration was investigated as an alternative method of sampling and processing. Four tangential filtration units were compared with the concentration techniques of centrifugation and sedimentation used in the Standard Method. The tangential filtration units were found not to be as sensitive as centrifugation and sedimentation, and they also presented difficulties with particulate matter or sediment. When compared with the sampling method, the unit was unable to concentrate the 500 l of tap water because of the sediment levels. Staining methods were also evaluated. Slide staining was compared with staining on a filter; the filter method was found to give better recovery. A comparison of a commercially available monoclonal antibody stain, a polyclonal antibody stain and Lugol's iodine stain found that the monoclonal and polyclonal antibody stains led to easier identification than the iodine stain by illuminating the cyst (internal morphology still had to be checked). The monoclonal antibody stains were found to be more specific than the polyclonal stain. Because the antigens recognised by the monoclonal antibody stain persist, methods of inactivating them were investigated to prevent cross-contamination between samples. Hypochlorite concentrations of 4% and higher applied for 20 minutes were found to inactivate the recognised antigen; other chemicals were compared but none inactivated the antigen. A study of a family infected with Giardia was undertaken to test the methods used in the laboratory and to study modes of transmission. Giardia cysts were found in the river that supplied the farm tank but not in the tank itself, and the house tank also tested negative for Giardia. Because the family had young children attending school and playgroup, person-to-person transmission may also have been involved. Animals on the farm tested positive for Giardia.

    Compositions of Hot Super-Earth Atmospheres: exploring Kepler Candidates

    This paper outlines a simple approach to evaluating the atmospheric composition of hot rocky planets by assuming different types of planetary composition and using corresponding model calculations. To explore hot atmospheres above 1000 K, we model the vaporization of silicate magma and estimate the range of atmospheric compositions according to the planet's radius and semi-major axis for the Kepler February 2011 data release. Our results show 5 atmospheric types for hot, rocky super-Earth atmospheres, strongly dependent on the initial composition and the planet's distance to the star. We provide a simple set of parameters that can be used to evaluate atmospheric compositions for current and future candidates provided by the Kepler mission and other searches.
    Comment: 5 pages, accepted for publication in ApJ Letters
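    The regimes in this abstract are set by how close a candidate orbits its star; as a rough, hedged illustration of that dependence, the Python sketch below computes a blackbody equilibrium temperature from stellar parameters and semi-major axis and checks whether a candidate sits in the >1000 K regime discussed above. The stellar values, the zero-albedo default and the function name are assumptions for illustration, not inputs taken from the paper's models.

        import math

        R_SUN_M = 6.957e8   # solar radius in metres
        AU_M = 1.496e11     # astronomical unit in metres

        def equilibrium_temperature(t_star_k, r_star_rsun, a_au, albedo=0.0):
            """Blackbody equilibrium temperature (K) of a planet on a circular orbit,
            with full heat redistribution: T_eq = T_star * sqrt(R_star / (2 a)) * (1 - A)^(1/4)."""
            r_star_m = r_star_rsun * R_SUN_M
            a_m = a_au * AU_M
            return t_star_k * math.sqrt(r_star_m / (2.0 * a_m)) * (1.0 - albedo) ** 0.25

        # Hypothetical example: a Sun-like star with a close-in candidate at 0.04 AU.
        t_eq = equilibrium_temperature(t_star_k=5778.0, r_star_rsun=1.0, a_au=0.04)
        print(f"T_eq ~ {t_eq:.0f} K; in the >1000 K vaporized-rock regime: {t_eq > 1000.0}")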

    The Graphical Lasso: New Insights and Alternatives

    The graphical lasso (Friedman, Hastie and Tibshirani, 2007) is an algorithm for learning the structure of an undirected Gaussian graphical model, using $\ell_1$ regularization to control the number of zeros in the precision matrix $\Theta = \Sigma^{-1}$ (Banerjee et al., 2008; Yuan and Lin, 2007). The R package glasso (Friedman, Hastie and Tibshirani, 2007) is popular, fast, and allows one to efficiently build a path of models for different values of the tuning parameter. Convergence of GLASSO can be tricky; the converged precision matrix might not be the inverse of the estimated covariance, and occasionally it fails to converge with warm starts. In this paper we explain this behavior and propose new algorithms that appear to outperform GLASSO. By studying the "normal equations" we see that GLASSO is solving the dual of the graphical lasso penalized likelihood by block coordinate ascent, a result which can also be found in Banerjee et al. (2008). In this dual, the target of estimation is $\Sigma$, the covariance matrix, rather than the precision matrix $\Theta$. We propose similar primal algorithms P-GLASSO and DP-GLASSO, which also operate by block-coordinate descent but with $\Theta$ as the optimization target. We study all of these algorithms, and in particular different approaches to solving their coordinate sub-problems. We conclude that DP-GLASSO is superior from several points of view.
    Comment: This is a revised version of our previous manuscript with the same name. ArXiv id: http://arxiv.org/abs/1111.547
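    As a hedged illustration of the estimation problem being solved (not of the GLASSO, P-GLASSO or DP-GLASSO algorithms themselves), the Python sketch below fits an $\ell_1$-penalized Gaussian graphical model with scikit-learn's GraphicalLasso; the simulated data and the penalty value are arbitrary choices for demonstration.

        import numpy as np
        from sklearn.covariance import GraphicalLasso

        # Toy data: independent Gaussians with one induced partial correlation.
        rng = np.random.default_rng(0)
        n, p = 200, 10
        X = rng.standard_normal((n, p))
        X[:, 1] += 0.8 * X[:, 0]

        # l1-penalized maximum likelihood for the precision matrix Theta = Sigma^{-1};
        # alpha controls how many off-diagonal entries of Theta are set to zero.
        model = GraphicalLasso(alpha=0.2).fit(X)
        theta = model.precision_      # sparse precision estimate
        sigma = model.covariance_     # corresponding covariance estimate

        nonzero = int((np.abs(theta[np.triu_indices(p, k=1)]) > 1e-8).sum())
        print("nonzero off-diagonal entries in Theta:", nonzero)

    The dual/primal distinction in the abstract concerns how this optimum is computed (with $\Sigma$ or $\Theta$ as the optimization target); the estimand illustrated here is the same sparse precision matrix in either case.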

    Pathwise Least Angle Regression and a Significance Test for the Elastic Net

    Least angle regression (LARS) by Efron et al. (2004) is a novel method for constructing the piecewise-linear path of Lasso solutions. For several years it was also the de facto method for computing the Lasso solution, before more sophisticated optimization algorithms superseded it. The LARS method has recently regained popularity due to its ability to find the values of the penalty parameter, called knots, at which a new parameter enters the active set of non-zero coefficients. The significance test for the Lasso by Lockhart et al. (2014), for example, requires solving for the knots via the LARS algorithm. The elastic net (EN), on the other hand, is a highly popular extension of the Lasso that uses a linear combination of the Lasso and ridge regression penalties. In this paper, we propose a novel algorithm, called pathwise (PW-)LARS-EN, that is able to compute the EN knots over a grid of EN tuning parameter $\alpha$ values. The developed PW-LARS-EN algorithm decreases the EN tuning parameter and exploits the previously found knot values and the original LARS algorithm. A covariance test statistic for the Lasso is then generalized to the EN for testing the significance of the predictors. Our simulation studies validate that the test statistic has an asymptotic Exp(1) distribution.
    Comment: 5 pages, 25th European Signal Processing Conference (EUSIPCO 2017)
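    A minimal, hedged sketch of the knot computation this work builds on (not of the proposed PW-LARS-EN algorithm itself): scikit-learn's lars_path returns the breakpoints of the Lasso path, and since the naive elastic net can be rewritten as a Lasso on data augmented with a scaled identity block (Zou and Hastie, 2005), the same routine yields EN knots for one fixed ridge weight. The data, the ridge weight lam2 and all variable names are assumptions made for illustration.

        import numpy as np
        from sklearn.linear_model import lars_path

        rng = np.random.default_rng(1)
        n, p = 100, 8
        X = rng.standard_normal((n, p))
        y = X[:, 0] - 0.5 * X[:, 2] + 0.1 * rng.standard_normal(n)

        # Lasso knots: penalty values at the breakpoints of the piecewise-linear path,
        # i.e. where the active set of non-zero coefficients changes.
        alphas, active, _ = lars_path(X, y, method="lasso")
        print("Lasso knots:", np.round(alphas, 4))
        print("active set at the end of the path:", active)

        # A naive elastic net with ridge weight lam2 equals a Lasso on augmented data,
        # so its knots (on scikit-learn's penalty scale for the augmented problem)
        # come from the same routine.
        lam2 = 0.5
        X_aug = np.vstack([X, np.sqrt(lam2) * np.eye(p)])
        y_aug = np.concatenate([y, np.zeros(p)])
        en_alphas, en_active, _ = lars_path(X_aug, y_aug, method="lasso")
        print("EN knots at lam2 = 0.5:", np.round(en_alphas, 4))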

    Local case-control sampling: Efficient subsampling in imbalanced data sets

    For classification problems with significant class imbalance, subsampling can reduce computational costs at the price of inflated variance in estimating model parameters. We propose a method for subsampling efficiently for logistic regression by adjusting the class balance locally in feature space via an accept-reject scheme. Our method generalizes standard case-control sampling, using a pilot estimate to preferentially select examples whose responses are conditionally rare given their features. The biased subsampling is corrected by a post-hoc analytic adjustment to the parameters. The method is simple and requires one parallelizable scan over the full data set. Standard case-control sampling is inconsistent under model misspecification for the population risk-minimizing coefficients $\theta^*$. By contrast, our estimator is consistent for $\theta^*$ provided that the pilot estimate is. Moreover, under correct specification and with a consistent, independent pilot estimate, our estimator has exactly twice the asymptotic variance of the full-sample MLE, even if the selected subsample comprises a minuscule fraction of the full data set, as happens when the original data are severely imbalanced. The factor of two improves to $1+\frac{1}{c}$ if we multiply the baseline acceptance probabilities by $c>1$ (and weight points with acceptance probability greater than 1), taking roughly $\frac{1+c}{2}$ times as many data points into the subsample. Experiments on simulated and real data show that our method can substantially outperform standard case-control subsampling.
    Comment: Published at http://dx.doi.org/10.1214/14-AOS1220 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
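    The accept-reject scheme can be sketched in a few lines of Python. The acceptance rule |y - p(x)|, with p(x) the pilot's predicted probability, and the add-back-the-pilot correction used below are one natural reading of the abstract (accept points whose responses are surprising given their features, then apply a post-hoc analytic adjustment); they are illustrative rather than a verbatim transcription of the paper's algorithm, and the simulated data, pilot-sample size and constants are made up.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)

        # Imbalanced synthetic data: intercept chosen so that positives are rare.
        n, p = 200_000, 5
        theta_true = np.array([1.0, -1.0, 0.5, 0.0, 0.0])
        X = rng.standard_normal((n, p))
        prob = 1.0 / (1.0 + np.exp(-(X @ theta_true - 4.0)))
        y = rng.binomial(1, prob)

        # Pilot estimate from a small uniform subsample (large C ~ unpenalized MLE).
        pilot_idx = rng.choice(n, size=5_000, replace=False)
        pilot = LogisticRegression(C=1e6, max_iter=1000).fit(X[pilot_idx], y[pilot_idx])
        p_tilde = pilot.predict_proba(X)[:, 1]

        # Accept-reject: keep a point with probability |y - p_tilde(x)|, so responses
        # that are surprising under the pilot are preferentially selected.
        accept = rng.uniform(size=n) < np.abs(y - p_tilde)
        X_s, y_s = X[accept], y[accept]
        print("subsample size:", int(accept.sum()), "of", n)

        # Under this acceptance rule the subsample log-odds are shifted by the pilot's
        # linear predictor, so adding the pilot back in recovers an estimate of theta
        # (our reading of the abstract's post-hoc analytic adjustment).
        sub = LogisticRegression(C=1e6, max_iter=1000).fit(X_s, y_s)
        theta_hat = sub.coef_.ravel() + pilot.coef_.ravel()
        intercept_hat = sub.intercept_[0] + pilot.intercept_[0]
        print("corrected coefficients:", np.round(theta_hat, 2))
        print("corrected intercept:", round(float(intercept_hat), 2))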