
    HiTRACE-Web: an online tool for robust analysis of high-throughput capillary electrophoresis

    To facilitate the analysis of large-scale high-throughput capillary electrophoresis data, we previously proposed a suite of efficient analysis software named HiTRACE (High Throughput Robust Analysis of Capillary Electrophoresis). HiTRACE has been used extensively for quantitating data from RNA and DNA structure mapping experiments, including mutate-and-map contact inference, chromatin footprinting, the EteRNA RNA design project, and other high-throughput applications. However, HiTRACE is based on a suite of command-line MATLAB scripts that requires nontrivial effort to learn, use, and extend. Here we present HiTRACE-Web, an online version of HiTRACE that includes the standard features of the command-line version as well as additional features, such as automated band annotation and flexible adjustment of annotations, all in a user-friendly environment. Through parallelization, the online workflow is also faster than the software implementations available to most users on their local computers. Free access: http://hitrace.org

    %QLS SAS Macro: A SAS Macro for Analysis of Longitudinal Data Using Quasi-Least Squares

    Quasi-least squares (QLS) is an alternative computational approach for estimating the correlation parameter in the framework of generalized estimating equations (GEE). QLS overcomes some limitations of GEE that were discussed in Crowder (Biometrika 82 (1995) 407-410), and it allows for easier implementation of some correlation structures that are not available for GEE. We describe a user-written SAS macro called %QLS and demonstrate its application using a clinical trial example comparing two treatments for a common toenail infection. %QLS also computes the lower and upper boundaries of the correlation parameter for analysis of longitudinal binary data that were described by Prentice (Biometrics 44 (1988) 1033-1048). Furthermore, it displays a warning message if the Prentice constraints are violated; this warning is not provided in existing GEE software packages, nor in the packages recently developed for QLS in Stata, MATLAB, and R. %QLS allows for analysis of normal, binary, or Poisson data with one of the following working correlation structures: first-order autoregressive (AR(1)), equicorrelated, Markov, or tri-diagonal. Keywords: longitudinal data, generalized estimating equations, quasi-least squares, SAS
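
    As a rough illustration of the modeling framework (not the %QLS macro itself, and not from the paper): GEE with an AR(1) working correlation can be fit in Python with statsmodels. Note that QLS estimates the correlation parameter via a different estimating equation than ordinary GEE moment estimation; the data, effect sizes, and variable names below are simulated stand-ins.

    # Minimal sketch: GEE with an AR(1) working correlation in Python/statsmodels.
    # This illustrates the framework the %QLS macro targets; it is NOT the SAS
    # macro itself, and the data below are simulated stand-ins.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n_subjects, n_visits = 50, 4

    # Simulated longitudinal binary outcomes (e.g., infection cleared yes/no)
    df = pd.DataFrame({
        "subject": np.repeat(np.arange(n_subjects), n_visits),
        "visit": np.tile(np.arange(n_visits), n_subjects),
        "treatment": np.repeat(rng.integers(0, 2, n_subjects), n_visits),
    })
    logit = -0.5 + 0.8 * df["treatment"] + 0.2 * df["visit"]
    df["y"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

    exog = sm.add_constant(df[["treatment", "visit"]])
    model = sm.GEE(df["y"], exog, groups=df["subject"],
                   family=sm.families.Binomial(),
                   cov_struct=sm.cov_struct.Autoregressive())  # AR(1) working structure
    result = model.fit()
    print(result.summary())
    print("Estimated AR(1) parameter:", model.cov_struct.dep_params)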

    On the designation of the patterned associations for longitudinal Bernoulli data: weight matrix versus true correlation structure?

    Because the standard constraints on the correlation of binary data can potentially be violated, it has recently been argued that the working correlation matrix should be viewed as a weight matrix that should not be confused with the true correlation structure. We offer two arguments to support the contrary view for the first-order autoregressive (AR(1)) correlation matrix. First, we prove that the standard constraints are not unduly restrictive for the AR(1) structure that is plausible for longitudinal data; furthermore, for the logit link function, the upper boundary value depends only on the regression parameter and the change in covariate values between successive measurements. In addition, for given marginal means and parameter α, we provide a general proof that satisfaction of the standard constraints for consecutive marginal means guarantees the existence of a compatible multivariate distribution with an AR(1) structure. The relative laxity of the standard constraints for the AR(1) structure, coupled with the existence of a simple model that yields data with an AR(1) structure, bolsters our view that, at least for AR(1), it is appropriate to regard this model as a correlation structure rather than a weight matrix.
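
    For reference, the constraints discussed above can be written out explicitly. This is a restatement in standard notation following Prentice (Biometrics 44 (1988) 1033-1048); the symbols are ours, not necessarily the paper's. The AR(1) working structure is

        \mathrm{Corr}(Y_{ij}, Y_{ik}) = \alpha^{|j-k|}, \qquad |\alpha| < 1,

    and for a pair of Bernoulli outcomes with marginal means \mu_j = E(Y_{ij}), the standard constraints bound the pairwise correlation \rho_{jk} by

        \max\!\left(-\sqrt{\frac{\mu_j \mu_k}{(1-\mu_j)(1-\mu_k)}},\ -\sqrt{\frac{(1-\mu_j)(1-\mu_k)}{\mu_j \mu_k}}\right) \le \rho_{jk} \le \min\!\left(\sqrt{\frac{\mu_j (1-\mu_k)}{\mu_k (1-\mu_j)}},\ \sqrt{\frac{\mu_k (1-\mu_j)}{\mu_j (1-\mu_k)}}\right).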

    Analysis of Adverse Events in Drug Safety: A Multivariate Approach Using Stratified Quasi-least Squares

    Safety assessment in drug development involves numerous statistical challenges, and yet statistical methodologies and their applications to safety data have not been fully developed, despite a recent increase of interest in this area. In practice, a conventional univariate approach to the analysis of safety data applies Fisher's exact test to compare the proportion of subjects who experience adverse events (AEs) between treatment groups; this approach ignores several common features of safety data, including the presence of multiple endpoints, longitudinal follow-up, and possible relationships between AEs within body systems. In this article, we propose several regression modeling strategies for multiple longitudinal AEs that are biologically classified into different body systems, via the stratified quasi-least squares (SQLS) method. We then use SQLS to analyze safety data from a clinical drug development program at Wyeth Research that compared an experimental drug with a standard treatment; SQLS could be a superior alternative to application of Fisher's exact test.
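
    For context, here is a minimal sketch of the conventional univariate approach the abstract critiques: one Fisher's exact test per adverse event, using SciPy. The AE names and counts are made up for illustration; this approach ignores the longitudinal follow-up and within-body-system correlation that SQLS is designed to address.

    # One Fisher's exact test per adverse event (hypothetical 2x2 counts).
    from scipy.stats import fisher_exact

    # (AE name, [[treated w/ AE, treated w/o AE], [control w/ AE, control w/o AE]])
    adverse_events = [
        ("nausea",   [[12, 88], [5, 95]]),
        ("headache", [[20, 80], [18, 82]]),
    ]
    for name, table in adverse_events:
        odds_ratio, p_value = fisher_exact(table)
        print(f"{name}: OR={odds_ratio:.2f}, p={p_value:.3f}")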

    Simulations of BEAVRS Benchmark Cycle 2 Depletion with MCS/CTF Coupling System

    A quarter-core simulation of the BEAVRS Cycle 2 depletion benchmark has been conducted using the MCS/CTF coupling system. MCS/CTF is a cycle-wise, Picard-iteration-based inner-coupling code system that embeds the sub-channel thermal-hydraulics (T/H) code CTF as the T/H solver in the Monte Carlo neutron transport code MCS. This coupled system was previously applied to the full-core simulation of BEAVRS benchmark Cycle 1. The Cycle 2 depletion is performed with T/H feedback, starting from the spent fuel material compositions pre-generated by the Cycle 1 depletion simulation using the refueling capability of MCS. The MCS internal one-dimensional T/H solver (MCS/TH1D) is also applied as a reference. This paper presents the detailed critical boron concentrations and the axially integrated assembly-wise detector signals and compares them with measured data under the actual operating conditions. Moreover, the MCS/CTF results for neutronics and T/H parameters are compared against MCS/TH1D to characterize their differences, demonstrating the practical applicability of MCS to two-cycle depletion simulations of the BEAVRS benchmark.
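
    As a hedged, generic sketch (not the actual MCS/CTF interfaces, which the abstract does not specify): Picard-iteration coupling alternates the two solvers, feeding each one's output to the other until the exchanged fields stop changing, often with under-relaxation for stability. The solver functions below are toy placeholders.

    # Generic Picard (fixed-point) coupling between a neutronics solver and a
    # T/H solver, with under-relaxation. Both solve_* functions are toy
    # placeholders standing in for MCS (Monte Carlo transport) and CTF
    # (sub-channel T/H).
    import numpy as np

    def solve_neutronics(fuel_temp):
        # Placeholder: power rises where fuel is cooler (negative Doppler feedback)
        return 1.0e3 * (1.0 + 0.1 * (900.0 - fuel_temp) / 900.0)

    def solve_thermal_hydraulics(power):
        # Placeholder: fuel temperature responds linearly to local power
        return 600.0 + 0.3 * power

    power = np.full(10, 1.0e3)       # node-wise power guess [W]
    fuel_temp = np.full(10, 900.0)   # node-wise fuel temperature guess [K]
    relax = 0.5                      # under-relaxation factor for stability
    for it in range(50):
        power = relax * solve_neutronics(fuel_temp) + (1.0 - relax) * power
        new_temp = solve_thermal_hydraulics(power)
        converged = np.max(np.abs(new_temp - fuel_temp)) < 1e-6
        fuel_temp = new_temp
        if converged:
            break
    print(f"Converged in {it + 1} Picard iterations")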

    A Multi-Physics Adaptive Time Step Coupling Algorithm for Light-Water Reactor Core Transient and Accident Simulation

    A new reactor-core multi-physics system addresses pellet-to-cladding heat transfer modeling to improve the full-core simulation of operational transients and accidents used in the assessment of reactor core nuclear safety. Rigorous modeling of the heat transfer phenomena involves strong interaction between neutron kinetics, thermal-hydraulics, and nuclear fuel performance, as well as consideration of pellet-to-cladding mechanical contact, which leads to a dramatic increase in the gap thermal conductance coefficient. In contrast to core depletion, where parameters depend smoothly on fuel burn-up, a core transient is governed by stiff equations with rapid variation in the solution and is vulnerable to numerical instability at large time step sizes. Therefore, a coupling algorithm dedicated to multi-physics transients must implement adaptive time stepping and restart capability to achieve a prescribed tolerance and to maintain numerical stability. This requirement is met in the MPCORE (Multi-Physics Core) system, which employs an external loose-coupling approach that requires little modification of the constituent modules and keeps the coupling interfaces transparent. The paper investigates the performance of the coupling algorithm and evaluates the pellet-to-cladding heat transfer effect for a rod ejection accident in a light-water reactor core benchmark.
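
    Here is a minimal sketch of the adaptive time step and restart logic described above, assuming a step-doubling error estimate; the physics inside advance() is a toy placeholder, not MPCORE's models, and the control constants are illustrative.

    # Adaptive time stepping with restart: estimate local error by step
    # doubling, reject the step and restart from a saved state when the
    # error exceeds tolerance, and grow the step when the error is small.
    import copy

    def advance(state, dt):
        # Placeholder physics: stiff exponential decay toward equilibrium
        return {"u": state["u"] + dt * (-50.0 * (state["u"] - 1.0)),
                "t": state["t"] + dt}

    def adaptive_march(state, t_end, dt=0.1, tol=1e-4):
        while state["t"] < t_end:
            saved = copy.deepcopy(state)                 # restart point
            coarse = advance(state, dt)                  # one full step
            fine = advance(advance(state, dt / 2), dt / 2)  # two half steps
            err = abs(fine["u"] - coarse["u"])           # step-doubling error estimate
            if err > tol:
                state = saved                            # reject: restart with smaller dt
                dt *= 0.5
            else:
                state = fine                             # accept the more accurate result
                if err < tol / 10.0:
                    dt *= 2.0                            # grow the step when error is small
                dt = min(dt, t_end - state["t"])         # never step past t_end
        return state

    print(adaptive_march({"u": 2.0, "t": 0.0}, t_end=1.0))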

    A Rapid Prototyping Method for Sub-MHz Single-Element Piezoelectric Transducers by Using 3D-Printed Components

    We present a rapid prototyping method for sub-megahertz single-element piezoelectric transducers that uses 3D-printed components. In most early research phases of applying new sonication ideas, prototyping speed is prioritized over final packaging quality, since quick preliminary demonstration is crucial for promptly determining specific aims and feasible research approaches. We aim to develop a rapid prototyping method for functional ultrasonic transducers that overcomes the current long lead time (more than a few weeks). Here, we used 3D-printed external housing parts designed for a single matching layer and either air backing or epoxy-composite backing (acoustic impedance > 5 MRayl). By molding a single matching layer onto the top surface of a piezoceramic in a 3D-printed housing, the overall packaging time was significantly reduced, and the prototypes delivered usable acoustic output at focus with temporal pulse controllability (maximum amplitude reached with a <5-cycle burst). We adopted an air-backing design (Type A) for efficient pressure output, and tested bandwidth improvement with a tungsten-composite-backing design (Type B). Acoustic characterization showed that the Type A prototype provided 3.3 kPa/Vpp far-field transmitting sensitivity with 25.3% fractional bandwidth, whereas the Type B transducer showed 2.1 kPa/Vpp transmitting sensitivity with 43.3% fractional bandwidth. As the method offers notable speed and cost efficiency, this detailed rapid prototyping guideline can be useful for early-phase sonication projects, such as prototyping multi-element therapeutic ultrasound arrays and benchtop devices for micro/nanomedicine testing.
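
    For reference, the fractional bandwidths quoted above follow the usual definition FBW = 100 x (f_hi - f_lo) / f_center, typically measured between the -6 dB band edges. A minimal sketch (the band-edge frequencies below are hypothetical, not from the paper):

    # Fractional bandwidth from (hypothetical) -6 dB band edges, in MHz.
    def fractional_bandwidth(f_lo_mhz, f_hi_mhz):
        f_center = (f_lo_mhz + f_hi_mhz) / 2.0
        return 100.0 * (f_hi_mhz - f_lo_mhz) / f_center

    # e.g., made-up band edges for a sub-MHz transducer
    print(f"{fractional_bandwidth(0.44, 0.56):.1f}% fractional bandwidth")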

    HiTRACE: High-throughput robust analysis for capillary electrophoresis

    Motivation: Capillary electrophoresis (CE) of nucleic acids is a workhorse technology underlying high-throughput genome analysis and large-scale chemical mapping for nucleic acid structural inference. Despite the wide availability of CE-based instruments, there remain challenges in leveraging their full power for quantitative analysis of RNA and DNA structure, thermodynamics, and kinetics. In particular, the slow rate and poor automation of available analysis tools have bottlenecked a new generation of studies involving hundreds of CE profiles per experiment. Results: We propose a computational method called high-throughput robust analysis for capillary electrophoresis (HiTRACE) to automate the key tasks in large-scale nucleic acid CE analysis, including the profile alignment that has heretofore been a rate-limiting step in the highest-throughput experiments. We illustrate the application of HiTRACE on thirteen data sets representing four different RNAs, three chemical modification strategies, and up to 480 single mutant variants; the largest data sets each include 87,360 bands. By applying a series of robust dynamic programming algorithms, HiTRACE outperforms prior tools in alignment and fitting quality, as assessed by measures including the correlation between quantified band intensities across replicate data sets. Furthermore, while the smallest of these data sets required 7 to 10 hours of manual intervention using prior approaches, HiTRACE quantitation of even the largest data sets herein was achieved in 3 to 12 minutes. The HiTRACE method therefore resolves a critical barrier to the efficient and accurate analysis of nucleic acid structure in experiments involving tens of thousands of electrophoretic bands. Availability: HiTRACE is freely available for download at http://hitrace.stanford.edu
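
    As a conceptual illustration of dynamic-programming trace alignment, here is classic dynamic time warping applied to two 1-D traces. This is NOT the HiTRACE algorithm itself, only a minimal example of the general DP-alignment technique family; the simulated traces are stand-ins for electrophoresis profiles.

    # Dynamic time warping (DTW): align two 1-D traces by dynamic programming.
    import numpy as np

    def dtw_distance(trace_a, trace_b):
        n, m = len(trace_a), len(trace_b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = (trace_a[i - 1] - trace_b[j - 1]) ** 2
                # extend the alignment by a match, insertion, or deletion
                cost[i, j] = d + min(cost[i - 1, j - 1],
                                     cost[i - 1, j],
                                     cost[i, j - 1])
        return cost[n, m]

    t = np.linspace(0, 10, 200)
    trace1 = np.exp(-(t - 4.0) ** 2)   # a Gaussian "band" at position 4
    trace2 = np.exp(-(t - 4.5) ** 2)   # the same band, shifted
    print("DTW alignment cost:", dtw_distance(trace1, trace2))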