743 research outputs found

    Rapid virus titration using flow cytometry

    Rapid, high-throughput virus titration methods are essential for facilitating continuous process monitoring and rapid decision making in viral bioprocess development. In spite of repeated efforts to address this need, the industry continues to rely on well-understood and trusted plaque assays and end-point dilution assays, or variations thereof. Together with the University of Waterloo, we have developed a flow cytometry-based assay that gives infectious virus titers in a fraction of the time required by conventional virus titration assays. The developed method utilizes the phenomenon of increased granularity in cells after virus infection, with the increase in granularity proportional to the multiplicity of infection of the virus. The assay has been adapted to a 96-well plate format which, in combination with a flow cytometer equipped with an automated sampler, results in a high-throughput assay requiring much less operator effort than traditional assays. Two different cell and virus systems have been examined using this assay. Assay variation in both systems was measured to be ~20%, and assay accuracy was highly comparable to traditional gold-standard assays such as the plaque assay. Assay analysis was found to be simple and amenable to automation through the use of R scripts. Operator effort per sample was reduced by approximately half, and assay time was reduced by 75%, compared with traditional assays. In addition, the simplicity of the assay greatly reduces operator training time. Studies by other groups provide confidence that the phenomenon of increased cell granularity with virus infection is present in several virus-cell systems. Therefore, the developed method has great potential to be used as a routine high-throughput screening technique for a wide range of viruses.
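
    The abstract notes that analysis is simple and amenable to automation (the authors used R scripts). As a rough illustration of that kind of automation only, the Python sketch below reduces per-well side-scatter ("granularity") events to a titer estimate by fitting a standard curve of median SSC against wells infected at known multiplicities of infection; every function name, parameter, and number here is a hypothetical assumption, not part of the published assay.

# Illustrative sketch only: estimating infectious titer from per-well flow
# cytometry side-scatter (SSC, "granularity") data, assuming a linear
# relationship between median SSC and log10(MOI) as described in the
# abstract. Names and numbers are hypothetical.
import numpy as np

def median_ssc(events_ssc):
    """Median side-scatter of all events recorded in one well."""
    return float(np.median(events_ssc))

def fit_standard_curve(known_log_moi, median_sscs):
    """Fit log10(MOI) against median SSC for wells infected at known MOI."""
    slope, intercept = np.polyfit(median_sscs, known_log_moi, 1)
    return slope, intercept

def estimate_titer(sample_ssc_events, slope, intercept,
                   cells_per_well, sample_volume_ml):
    """Convert a sample well's median SSC into infectious units per mL."""
    log_moi = slope * median_ssc(sample_ssc_events) + intercept
    infectious_units = (10 ** log_moi) * cells_per_well
    return infectious_units / sample_volume_ml

# Example with synthetic numbers (not real data):
rng = np.random.default_rng(0)
standards = {0.1: 480.0, 1.0: 620.0, 10.0: 760.0}   # known MOI -> median SSC
slope, intercept = fit_standard_curve(
    np.log10(list(standards.keys())), list(standards.values()))
sample = rng.normal(loc=655.0, scale=30.0, size=5000)  # fake SSC events
print(estimate_titer(sample, slope, intercept,
                     cells_per_well=2e5, sample_volume_ml=0.1))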

    Investigating attributes affecting the performance of WBI users

    This is the post-print version of the final paper published in Computers and Education. The published article is available from the link below. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. Copyright @ 2013 Elsevier B.V.

    Numerous research studies have explored the effect of hypermedia on learners' performance using Web-Based Instruction (WBI). A learner's performance is determined by their varying skills and abilities as well as individual differences such as gender, cognitive style, and prior knowledge. In this paper, we investigate how such individual differences influenced learners' performance when using a hypermedia system designed to accommodate individual preferences. Learning performance is examined by exploring relationships between measurement attributes, including gain scores (post-test minus pre-test), the number of pages visited in a WBI program, and the time spent on those pages. A data mining approach was used to analyze the results by comparing two clustering algorithms (K-Means and hierarchical) with two different numbers of clusters. Individual differences had a significant impact on learner behavior in our WBI program. Additionally, we found that the relationships between the attributes that measure performance played an influential role in explaining performance level; these relationships induced rules for measuring a learner's level of performance.
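
    As a minimal sketch of the data mining approach described above, the Python example below compares K-Means and hierarchical (agglomerative) clustering on the three named attributes (gain score, pages visited, time spent) for two different numbers of clusters using scikit-learn; the synthetic data, the cluster counts, and the adjusted Rand comparison are illustrative assumptions rather than details taken from the study.

# Illustrative sketch only: comparing K-Means and hierarchical clustering on
# the three performance attributes named in the abstract. Data and parameter
# choices here are hypothetical.
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.metrics import adjusted_rand_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Synthetic learners: [gain_score, pages_visited, minutes_on_pages]
X = np.column_stack([
    rng.normal(15, 8, 200),     # post-test minus pre-test
    rng.poisson(30, 200),       # pages visited in the WBI program
    rng.gamma(4.0, 10.0, 200),  # time spent on those pages
])
X_scaled = StandardScaler().fit_transform(X)

for k in (2, 3):  # two different numbers of clusters, as in the study
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X_scaled)
    hc = AgglomerativeClustering(n_clusters=k).fit(X_scaled)
    ari = adjusted_rand_score(km.labels_, hc.labels_)  # agreement of the two partitions
    print(f"k={k}: adjusted Rand index between K-Means and hierarchical = {ari:.2f}")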

    The Sydney-AAO Multi-object Integral field spectrograph (SAMI)

    We demonstrate a novel technology that combines the power of the multi-object spectrograph with the spatial multiplex advantage of an integral field spectrograph (IFS). The Sydney-AAO Multi-object IFS (SAMI) is a prototype wide-field system at the Anglo-Australian Telescope (AAT) that allows 13 imaging fibre bundles ("hexabundles") to be deployed over a 1-degree diameter field of view. Each hexabundle comprises 61 lightly-fused multimode fibres with reduced cladding and yields a 75 percent filling factor. Each fibre core diameter subtends 1.6 arcseconds on the sky and each hexabundle has a field of view of 15 arcseconds diameter. The fibres are fed to the flexible AAOmega double-beam spectrograph, which can be used at a range of spectral resolutions (R = lambda/delta-lambda ~ 1700-13000) over the optical spectrum (3700-9500 A). We present the first spectroscopic results obtained with SAMI for a sample of galaxies at z ~ 0.05. We discuss the prospects of implementing hexabundles at a much higher multiplex over wider fields of view in order to carry out spatially resolved spectroscopic surveys of 10^4 to 10^5 galaxies.

    Comment: 24 pages, 16 figures. Accepted by MNRAS.
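
    For readers unfamiliar with the resolving power quoted above (R = lambda/delta-lambda), the short sketch below converts it into a wavelength resolution element; the example wavelength (H-alpha at 6563 A) is an assumption chosen for illustration and is not taken from the paper.

# Illustrative only: converting resolving power R = lambda/dlambda into a
# wavelength resolution element. The example wavelength is an assumption.
def resolution_element(wavelength_angstrom, resolving_power):
    return wavelength_angstrom / resolving_power

for R in (1700, 13000):  # low- and high-resolution ends quoted for AAOmega
    print(f"R={R}: dlambda ~ {resolution_element(6563.0, R):.2f} A at 6563 A")

    At that wavelength this gives roughly 3.9 A at R = 1700 and 0.5 A at R = 13000.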

    Graph Neural Network for Object Reconstruction in Liquid Argon Time Projection Chambers

    This paper presents a graph neural network (GNN) technique for low-level reconstruction of neutrino interactions in a Liquid Argon Time Projection Chamber (LArTPC). GNNs are still a relatively novel technique, and have shown great promise for similar reconstruction tasks at the LHC. In this paper, a multihead attention message passing network is used to classify the relationship between detector hits by labelling graph edges, determining whether hits were produced by the same underlying particle and, if so, the particle type. The trained model is 84% accurate overall, and performs best on the EM shower and muon track classes. The model's strengths and weaknesses are discussed, and plans for developing this technique further are summarised.

    Comment: 7 pages, 3 figures; submitted to the 25th International Conference on Computing in High-Energy and Nuclear Physics.
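
    As a hedged sketch of what an edge-classifying, multihead-attention message passing network can look like (not the authors' architecture), the example below uses PyTorch Geometric's TransformerConv to embed detector-hit nodes and a small MLP to classify each candidate edge; layer sizes, the three-class output, and the toy graph are assumptions.

# Illustrative sketch only: a multihead-attention message passing network that
# classifies graph edges between detector hits. Architecture details here are
# assumptions, not the authors' model.
import torch
import torch.nn as nn
from torch_geometric.nn import TransformerConv

class EdgeClassifier(nn.Module):
    def __init__(self, node_feats=4, hidden=64, heads=4, edge_classes=3):
        super().__init__()
        # Multi-head attention message passing over detector-hit nodes.
        self.conv1 = TransformerConv(node_feats, hidden, heads=heads)
        self.conv2 = TransformerConv(hidden * heads, hidden, heads=heads)
        # Edge scores from the concatenated embeddings of each edge's endpoints.
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * hidden * heads, hidden), nn.ReLU(),
            nn.Linear(hidden, edge_classes))

    def forward(self, x, edge_index):
        h = torch.relu(self.conv1(x, edge_index))
        h = torch.relu(self.conv2(h, edge_index))
        src, dst = edge_index            # edge_index has shape [2, num_edges]
        return self.edge_mlp(torch.cat([h[src], h[dst]], dim=-1))

# Toy usage: 5 hits with 4 features each, 6 candidate edges.
x = torch.randn(5, 4)
edge_index = torch.tensor([[0, 1, 2, 3, 0, 2],
                           [1, 2, 3, 4, 2, 4]])
logits = EdgeClassifier()(x, edge_index)   # shape [6, edge_classes]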

    Ethanol Distribution, Dispensing, and Use: Analysis of a Portion of the Biomass-to-Biofuels Supply Chain Using System Dynamics

    The Energy Independence and Security Act of 2007 targets use of 36 billion gallons of biofuels per year by 2022. Achieving this may require substantial changes to current transportation fuel systems for distribution, dispensing, and use in vehicles. The U.S. Department of Energy and the National Renewable Energy Laboratory designed a system dynamics approach to help focus government action by determining which supply chain changes would have the greatest potential to accelerate biofuels deployment. The National Renewable Energy Laboratory developed the Biomass Scenario Model, a system dynamics model that represents the primary system effects and dependencies in the biomass-to-biofuels supply chain. The model provides a framework for developing scenarios and conducting biofuels policy analysis. This paper focuses on the downstream portion of the supply chain, represented by the distribution logistics, dispensing station, and fuel utilization and vehicle modules of the Biomass Scenario Model. The model initially focused on ethanol but has since been expanded to include other biofuels. Some portions of this system are represented dynamically with major interactions and feedbacks, especially those related to a dispensing station owner's decision whether to offer ethanol fuel and a consumer's choice whether to purchase that fuel. Other portions of the system are modeled with little or no dynamics; the vehicle choices of consumers are represented as discrete scenarios. This paper explores the conditions needed to sustain an ethanol fuel market and identifies the implications of these findings for program and policy goals. A large, economically sustainable ethanol fuel market (or other biofuel market) requires a low end-user fuel price relative to gasoline and sufficient producer payment, which are difficult to achieve simultaneously. Other requirements (different for ethanol versus other biofuel markets) include infrastructure for distribution and dispensing and widespread use of high ethanol blends in flexible-fuel vehicles.
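
    To make the feedback loop described above concrete, the toy Python sketch below couples a station owner's decision to offer ethanol with consumers' willingness to buy it at a given price relative to gasoline; it is a minimal stock-and-flow illustration with invented parameters, not the Biomass Scenario Model or any of its modules.

# Illustrative sketch only: a toy stock-and-flow loop in which station owners
# add ethanol pumps when demand per station looks profitable, and consumer
# purchases grow with availability and a favourable price relative to
# gasoline. All parameters are invented.
def simulate(years=15, dt=0.25,
             price_ratio=0.85,        # ethanol price / gasoline price
             stations=0.01,           # fraction of stations offering ethanol
             demand=0.005):           # fraction of fuel demand met by ethanol
    history = []
    for step in range(int(years / dt)):
        # Owners add pumps faster when demand per station is high.
        station_growth = 0.4 * max(demand - stations * 0.5, 0.0)
        # Consumers buy more when fuel is widely available and cheap.
        demand_growth = 0.6 * stations * max(1.0 - price_ratio, 0.0) - 0.05 * demand
        stations = min(stations + station_growth * dt, 1.0)
        demand = max(demand + demand_growth * dt, 0.0)
        history.append((step * dt, stations, demand))
    return history

for t, s, d in simulate()[::8]:  # print every two simulated years
    print(f"year {t:4.1f}: stations {s:.2%}, demand share {d:.2%}")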

    VAST: An ASKAP Survey for Variables and Slow Transients

    The Australian Square Kilometre Array Pathfinder (ASKAP) will give us an unprecedented opportunity to investigate the transient sky at radio wavelengths. In this paper we present VAST, an ASKAP survey for Variables and Slow Transients. VAST will exploit the wide-field survey capabilities of ASKAP to enable the discovery and investigation of variable and transient phenomena from the local to the cosmological, including flare stars, intermittent pulsars, X-ray binaries, magnetars, extreme scattering events, interstellar scintillation, radio supernovae, and orphan afterglows of gamma-ray bursts. In addition, it will allow us to probe unexplored regions of parameter space where new classes of transient sources may be detected. In this paper we review the known radio transient and variable populations and the current results from blind radio surveys. We outline a comprehensive program based on a multi-tiered survey strategy to characterise the radio transient sky through detection and monitoring of transient and variable sources on ASKAP imaging timescales of five seconds and longer. We also present an analysis of the expected source populations that we will be able to detect with VAST.

    Comment: 29 pages, 8 figures. Submitted for publication in Pub. Astron. Soc. Australia.
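
    As background on what detecting "variable and transient sources" across repeated radio images typically involves, the sketch below computes two generic light-curve statistics, the modulation index and the reduced chi-square against a constant-flux model; this is an illustration of standard practice assumed for context, not the VAST pipeline.

# Illustrative only: two generic variability statistics for a multi-epoch
# radio light curve. Synthetic numbers; not the VAST pipeline.
import numpy as np

def modulation_index(flux):
    """Fractional variability: standard deviation over the mean flux."""
    return np.std(flux) / np.mean(flux)

def reduced_chi2_constant(flux, flux_err):
    """Reduced chi-square of the light curve against a constant-flux model."""
    weights = 1.0 / np.asarray(flux_err) ** 2
    weighted_mean = np.sum(weights * flux) / np.sum(weights)
    chi2 = np.sum(((flux - weighted_mean) / flux_err) ** 2)
    return chi2 / (len(flux) - 1)

flux = np.array([1.2, 1.5, 0.9, 2.1, 1.3])       # mJy, synthetic epochs
flux_err = np.full_like(flux, 0.1)
print(modulation_index(flux), reduced_chi2_constant(flux, flux_err))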