
    Sensitive Detection and Early Prognostic Significance of p24 Antigen in Heat-Denatured Plasma of Human Immunodeficiency Virus Type 1-Infected Infants

    Immune complex formation causes underdetection of p24 antigen in human immunodeficiency virus (HIV) infection. Briefly boiling diluted plasma releases all complexed antigen, which can then be measured by some commercial assays. In a retrospective pediatric cohort study, the specificity of this procedure in 390 uninfected samples was 96.9% after initial testing and 100% after neutralization. Sensitivity among 125 postnatal infected samples was, at a detection limit of 2 pg/mL, 96.0% (97% neutralizable), compared with 47.7% for regular antigen (76% neutralizable), 96% for polymerase chain reaction, and 77% for viral culture. The high sensitivity and specificity of heat-denatured antigen were confirmed by prospectively testing 113 additional samples. Quantitative analysis of samples from infected infants showed low levels of p24 antigen in 29% of cord blood sera, a postnatal increase to levels that were inversely associated with survival during the first 6 months of life, and persistence of antigenemia thereafter independent of clinical status. Prevalence and antigen levels were significantly lower in mothers. The persistent antigenemia in children indicates that their immune systems cannot restrict HIV expression as efficiently as those of adults.

    A new uvs mutant

    A new uvs mutant

    Degree of explanation

    Partial explanations are everywhere. That is, explanations citing causes that explain some but not all of an effect are ubiquitous across science, and these in turn rely on the notion of degree of explanation. I argue that current accounts are seriously deficient. In particular, they do not adequately incorporate the way in which a cause’s explanatory importance varies with the choice of explanandum. Using influential recent contrastive theories, I develop quantitative definitions that remedy this lacuna, and relate it to existing measures of degree of causation. Among other things, this reveals the precise role here of chance, as well as bearing on the relation between causal explanation and causation itself.

    The Explication Defence of Arguments from Reference

    In a number of influential papers, Machery, Mallon, Nichols and Stich have presented a powerful critique of so-called arguments from reference, arguments that assume that a particular theory of reference is correct in order to establish a substantive conclusion. The critique is that, because cross-cultural variation in semantic intuitions supposedly undermines the standard methodology for theorising about reference, the assumption that a theory of reference is correct is unjustified. I argue that the many extant responses to Machery et al.’s critique do little for the proponent of an argument from reference, as they do not show how to justify the problematic assumption. I then argue that it can in principle be justified by an appeal to Carnapian explication. I show how to apply the explication defence to arguments from reference given by Andreasen (for the biological reality of race) and by Churchland (against the existence of beliefs and desires).

    GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores

    Background: Gene-gene interaction analysis in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) have hundreds of cores and have recently been used to implement faster scientific software. However, there are currently no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis of binary traits. Findings: Here we present a novel software package, GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes in parallel: 1) the interactions of SNPs within it, and 2) the interactions between the SNPs of the current fragment and those of other fragments. We tested GENIE on a large-scale candidate gene study of high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode. Conclusions: GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
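    The fragment-based partitioning described above can be sketched in a few lines of Python. This is an illustrative model of the scheme, not GENIE's actual GPU implementation; the fragment size and rsID names are made up, and each enumerated pair stands for one interaction task that GENIE would dispatch to a core.

```python
from itertools import combinations

def make_fragments(snps, fragment_size):
    """Partition the SNP list into non-overlapping fragments."""
    return [snps[i:i + fragment_size] for i in range(0, len(snps), fragment_size)]

def interaction_tasks(fragments):
    """Enumerate the two task types the abstract describes:
    within-fragment SNP pairs and cross-fragment SNP pairs."""
    tasks = []
    for i, frag in enumerate(fragments):
        # 1) interactions of SNPs within the current fragment
        tasks.extend(combinations(frag, 2))
        # 2) interactions between this fragment and later fragments
        for other in fragments[i + 1:]:
            tasks.extend((a, b) for a in frag for b in other)
    return tasks

snps = [f"rs{i}" for i in range(6)]
tasks = interaction_tasks(make_fragments(snps, 3))
# every unordered SNP pair is covered exactly once: C(6, 2) = 15 tasks
```

    Because the fragments are non-overlapping and cross-fragment pairs are only generated against later fragments, each SNP pair appears in exactly one task, so the tasks can be processed independently in parallel.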

    Percutaneous Forefoot Decompression in a Foot Compartment Syndrome Model

    ABSTRACT: Background: Acute compartment syndrome of the foot is a controversial topic. Release of the foot has been seen as complicated because of large incisions and postoperative morbidity, and there has been debate over whether this procedure is actually effective for releasing all areas of increased pressure. New sensor technology affords the opportunity to advance our understanding of acute compartment syndrome of the foot and its treatment. The purpose of the present study was to determine whether percutaneous decompression could be performed for the treatment of compartment syndrome in a forefoot model. Methods: The present study utilized a validated continuous pressure sensor to model compartment syndrome in human cadaveric feet. We utilized a pressure-controlled saline solution infusion system to induce increased pressure. A novel percutaneous release of the forefoot was investigated to assess its efficacy in achieving decompression. Results: For all cadaveric specimens, continuous pressure monitoring was accomplished with use of a continuous pressure sensor. There were 4 discrete compartment areas that could be reliably pressurized in all feet. The average baseline, pressurized, and post-release pressures (and standard deviations) were 4.5 ± 2.9, 43.8 ± 7.7, and 9.5 ± 3.6 mm Hg, respectively. Percutaneous decompression produced a significant decrease in pressure in all 4 compartments (p < 0.05). Conclusions: With use of continuous compartment pressure monitoring, 4 consistent areas were established as discrete compartments in the foot. All 4 compartments were pressurized with a standard pump system. With use of 2 small dorsal incisions, all 4 compartments were successfully released, with no injuries identified in the cutaneous nerve branches, extensor tendons, or arteries. These results have strong implications for the future of modeling compartment syndrome as well as for guiding clinical studies. 
Clinical Relevance: A reproducible and accurate method of continuous pressure monitoring of foot compartments after trauma is needed (1) to reliably identify patients who are likely to benefit from compartment release and (2) to help avoid missed or evolving cases of acute compartment syndrome. In addition, a reproducible method for percutaneous compartment release that minimizes collateral structural damage and the need for secondary surgical procedures is needed.

    ML-based Real-Time Control at the Edge: An Approach Using hls4ml

    This study focuses on implementing a real-time control system for a particle accelerator facility that performs high-energy physics experiments. A critical operating parameter in this facility is beam loss, the fraction of particles deviating from the accelerated proton beam into a cascade of secondary particles. Accelerators employ a large number of sensors to monitor beam loss. The data from these sensors are monitored by human operators, who predict the relative contribution of different sub-systems to the beam loss and use this information to engage control interventions. In this paper, we present a controller that tracks this phenomenon in real time using edge Machine Learning (ML) and supports control with low latency and high accuracy. We implemented this system on an Intel Arria 10 SoC. Optimizations at the algorithm, high-level synthesis, and interface levels to improve latency and resource usage are presented. Our design implements a neural network that can predict the main source of beam loss (between two possible causes) at speeds of up to 575 frames per second (fps), with an average latency of 1.74 ms. The deployed system is required to operate at 320 fps with a 3 ms latency requirement, which our design meets successfully.
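    As a quick sanity check (not code from the paper), the reported figures are self-consistent: a throughput of 575 fps corresponds to a frame period of 1000/575 ≈ 1.74 ms, which matches the reported average latency, and both figures satisfy the 320 fps / 3 ms deployment requirement.

```python
# Back-of-envelope check of the numbers quoted in the abstract.
achieved_fps = 575
achieved_latency_ms = 1.74
required_fps = 320
required_latency_ms = 3.0

# At 575 fps, one frame slot lasts 1000/575 ms ~= 1.74 ms, consistent
# with the reported average inference latency.
frame_period_ms = 1000 / achieved_fps

meets_requirement = (achieved_fps >= required_fps
                     and achieved_latency_ms <= required_latency_ms)
```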

    Failure of A Novel, Rapid Antigen and Antibody Combination Test to Detect Antigen-Positive HIV Infection in African Adults with Early HIV Infection

    BACKGROUND: Acute HIV infection (prior to antibody seroconversion) represents a high-risk window for HIV transmission. Development of a test to detect acute infection at the point of care is urgent. METHODS: Volunteers enrolled in a prospective study of HIV incidence in four African cities (Kigali in Rwanda and Ndola, Kitwe, and Lusaka in Zambia) were tested regularly for HIV by rapid antibody test and p24 antigen ELISA. Five subgroups of samples were also tested by the Determine Ag/Ab Combo test: 1) antigen positive, antibody negative (acute infection); 2) antigen positive, antibody positive; 3) antigen negative, antibody positive; 4) antigen negative, antibody negative; and 5) antigen false positive, antibody negative (HIV uninfected). A sixth group included serial dilutions from a p24 antigen-positive control sample. Combo test results were reported as antigen positive, antibody positive, or both. RESULTS: Of 34 group 1 samples with viral loads between 5×10^5 and >1.5×10^7 copies/mL (median 3.5×10^6), 1 (2.9%) was detected by the Combo antigen component; 7 others (20.6%) were positive by the Combo antibody component. No group 2 samples were antigen positive by the Combo test (0/18). Sensitivity of the Combo antigen test was therefore 1.9% (1/52; 95% CI 0.0, 9.9). One false-positive Combo antibody result (1/30, 3.3%) was observed in group 4. No false-positive Combo antigen results were observed. The Combo antigen test was positive in group 6 at a concentration of 80 pg/mL, faintly positive at 40 and 20 pg/mL, and negative thereafter. The p24 ELISA antigen test remained positive at 5 pg/mL. CONCLUSIONS: Although the antibody component of the Combo test detected antibodies to HIV earlier than the comparison antibody tests used, less than 2% of the cases of antigen-positive HIV infection were detected by the Combo antigen component. The development of a rapid point-of-care test to diagnose acute HIV infection remains an urgent goal.
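    The quoted antigen sensitivity follows directly from the group counts in the abstract: groups 1 and 2 together contain the antigen-positive samples, of which the Combo antigen component detected only one. The short sketch below (illustrative only; the variable names are ours) recomputes it.

```python
# Counts taken from the abstract above.
group1_detected, group1_n = 1, 34   # acute infection (antigen+/antibody-)
group2_detected, group2_n = 0, 18   # antigen+/antibody+

detected = group1_detected + group2_detected
total = group1_n + group2_n          # 52 antigen-positive samples overall

sensitivity_pct = 100 * detected / total   # 1/52 ~= 1.9%
```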

    Combination antiretroviral therapy and the risk of myocardial infarction
