
    Variability in dengue titer estimates from plaque reduction neutralization tests poses a challenge to epidemiological studies and vaccine development.

    BACKGROUND: Accurate determination of neutralizing antibody titers supports epidemiological studies of dengue virus transmission and vaccine trials. Neutralization titers measured using the plaque reduction neutralization test (PRNT) are believed to provide a key measure of immunity to dengue viruses; however, the assay's variability is poorly understood, making it difficult to interpret the significance of any assay reading. In addition, there is limited standardization of the neutralization evaluation point or statistical model used to estimate titers across laboratories, with little understanding of the optimal approach. METHODOLOGY/PRINCIPAL FINDINGS: We used repeated assays on the same two pools of serum with five different viruses (2,319 assays) to characterize the variability in the technique under identical experimental conditions. We also assessed the performance of multiple statistical models for interpolating continuous neutralization titers from discrete measurements of serial dilutions. We found that the variance in plaque reductions for individual dilutions was 0.016, equivalent to a 95% confidence interval of 0.45-0.95 for an observed plaque reduction of 0.7. We identified PRNT75 as the optimal evaluation point, with a variance of 0.025 (log10 scale), indicating that a titer reading of 1:500 had a 95% confidence interval of 1:240-1:1000 (2.70±0.31 on a log10 scale). The choice of statistical model was not important for the calculation of relative titers; however, cloglog regression out-performed the alternatives where absolute titers are of interest. Finally, we estimated that only 0.7% of assays would falsely detect a four-fold difference in titers between acute and convalescent sera where no true difference exists. CONCLUSIONS: Estimating and reporting assay uncertainty will aid the interpretation of individual titers. Laboratories should perform a small number of repeat assays to generate their own variability estimates. These could be used to calculate confidence intervals for all reported titers and allow benchmarking of assay performance.
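The reported variance translates directly into a confidence interval for any titer. A minimal sketch (the function name and the normal-approximation z-value are illustrative, not from the paper) that reproduces the quoted 1:240-1:1000 interval from the PRNT75 variance of 0.025 on the log10 scale:

```python
import math

def titer_ci(log10_titer, variance, z=1.96):
    """95% confidence interval for a log10 neutralization titer,
    using the assay variance estimated for PRNT75 (0.025, log10 scale)."""
    half_width = z * math.sqrt(variance)  # 1.96 * 0.158 ≈ 0.31 log10 units
    lo = 10 ** (log10_titer - half_width)
    hi = 10 ** (log10_titer + half_width)
    return lo, hi

lo, hi = titer_ci(math.log10(500), 0.025)
print(f"1:{lo:.0f} - 1:{hi:.0f}")  # ≈ 1:245 - 1:1021
```

Rounding to assay-reportable dilutions gives the paper's 1:240-1:1000.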

    A Dynamic Landscape for Antibody Binding Modulates Antibody-Mediated Neutralization of West Nile Virus

    Neutralizing antibodies are a significant component of the host's protective response against flavivirus infection. Neutralization of flaviviruses occurs when individual virions are engaged by antibodies with a stoichiometry that exceeds a required threshold. From this “multiple-hit” perspective, the neutralizing activity of an antibody is governed by the affinity with which it binds its epitope and the number of times this determinant is displayed on the surface of the virion. In this study, we investigated time-dependent changes in the fate of West Nile virus (WNV) decorated with antibody in solution. Experiments with the well-characterized neutralizing monoclonal antibody (MAb) E16 revealed a significant increase in neutralization activity over time that could not be explained by the kinetics of antibody binding, virion aggregation, or the action of complement. Additional kinetic experiments using the fusion-loop-specific MAb E53, which has limited neutralizing activity because it recognizes a relatively inaccessible epitope on mature virions, identified a role for virus “breathing” in regulating neutralization activity. Remarkably, MAb E53 neutralized mature WNV in a time- and temperature-dependent manner. This phenomenon was confirmed in studies with a large panel of MAbs specific for epitopes in each domain of the WNV envelope protein, with sera from recipients of a live attenuated WNV vaccine, and in experiments with dengue virus. Given enough time, significant inhibition of infection was observed even for antibodies with very limited or no neutralizing activity in standard neutralization assays. Together, our data suggest that the structural dynamics of flaviviruses impact antibody-mediated neutralization via exposure of otherwise inaccessible epitopes, allowing antibodies to dock on the virion with a stoichiometry sufficient for neutralization.
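The “multiple-hit” threshold idea lends itself to a simple binomial sketch. Assuming independent binding at each accessible epitope (the site counts, per-site occupancy, and threshold below are illustrative values, not the paper's estimates), breathing that exposes additional sites sharply increases the neutralized fraction at a fixed occupancy:

```python
from math import comb

def p_neutralized(n_sites, occupancy, threshold):
    """Multiple-hit model: a virion is neutralized when the number of
    antibody-occupied sites reaches a fixed stoichiometric threshold.
    Binding at each accessible epitope is treated as independent, so the
    occupied-site count is Binomial(n_sites, occupancy)."""
    return sum(comb(n_sites, k) * occupancy**k * (1 - occupancy)**(n_sites - k)
               for k in range(threshold, n_sites + 1))

# "Breathing" transiently exposes otherwise hidden epitopes, so the number
# of accessible sites grows with time; at the same per-site occupancy this
# pushes many more virions over the neutralization threshold.
few_sites = p_neutralized(n_sites=30, occupancy=0.3, threshold=25)    # ~0
many_sites = p_neutralized(n_sites=120, occupancy=0.3, threshold=25)  # near 1
```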

    Cell-cell adhesion regulates Merlin/NF2 interaction with the PAF complex

    The PAF complex (PAFC) coordinates transcription elongation and mRNA processing, and its CDC73/parafibromin subunit functions as a tumour suppressor. The NF2/Merlin tumour suppressor functions both at the cell cortex and in the nucleus and is a key mediator of contact inhibition, but the molecular mechanisms remain unclear. In this study, we have used affinity proteomics to identify novel Merlin-interacting proteins and show that Merlin forms a complex with multiple proteins involved in RNA processing, including the PAFC and the CHD1 chromatin remodeller. Tumour-derived inactivating mutations in both Merlin and the CDC73 PAFC subunit mutually disrupt their interaction, and growth suppression by Merlin requires CDC73. Merlin interacts with the PAFC in a cell density-dependent manner, and we identify a role for FAT cadherins in regulating the Merlin-PAFC interaction. Our results suggest that, in addition to its function within the Hippo pathway, Merlin is part of a tumour-suppressor network regulated by cell-cell adhesion that coordinates post-initiation steps of the transcription cycle of genes mediating contact inhibition.

    Incompletely Specified Probabilistic Networks

    Heinz College paper

    Probabilistic and defeasible reasoning using extended path analysis

    A number of quantitative and qualitative approaches to defeasible reasoning have appeared in the artificial intelligence literature in the past decade. However, probability-based methods have often languished because of the supposed complexity of representation in terms of joint distributions. More recently, probabilistic representations based on networks have been shown to be quite compact. In expert systems especially, the use of probability theory for defeasible reasoning is now becoming more widespread. This dissertation describes how the statistical causal modeling technique called path analysis can be used as a probabilistically correct method for defeasible reasoning. Path analysis also makes use of networks, but the quantification of arcs is done by regression rather than by the conditional probabilities typical of other probabilistic methods. It is shown that the regression coefficients of path analysis in fact form a compact and computationally efficient representation for probabilistic reasoning. In addition to deriving the relationship between an extended version of path analysis and probability propagation on networks, the new method is illustrated by treating a number of problems which have appeared in the defeasible reasoning literature. Finally, one of these so-called benchmark problems is analyzed in detail and compared with a number of other well-known methods. It is shown that extended path analysis makes explicit the assumptions that all defeasible reasoning strategies must make but seldom announce.
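The claim that regression coefficients compactly encode probabilistic structure can be illustrated with Wright's tracing rule for standardized linear-Gaussian variables: along a chain X → Y → Z, the implied correlation of X and Z is simply the product of the two path coefficients. A small simulation sketch (the coefficients 0.8 and 0.6 are arbitrary illustrative values):

```python
import math
import random

random.seed(1)
a, b = 0.8, 0.6        # standardized path coefficients for X -> Y -> Z
n = 100_000

xs, zs = [], []
for _ in range(n):
    x = random.gauss(0, 1)
    # residual variances keep Y and Z standardized (unit variance)
    y = a * x + random.gauss(0, math.sqrt(1 - a * a))
    z = b * y + random.gauss(0, math.sqrt(1 - b * b))
    xs.append(x)
    zs.append(z)

# Sample correlation of X and Z; Wright's rule predicts a * b = 0.48.
mean = lambda v: sum(v) / len(v)
mx, mz = mean(xs), mean(zs)
cov = mean([(x - mx) * (z - mz) for x, z in zip(xs, zs)])
sd = lambda v, m: math.sqrt(mean([(u - m) ** 2 for u in v]))
corr = cov / (sd(xs, mx) * sd(zs, mz))   # ≈ 0.48
```

The two numbers on the arcs thus determine the full joint correlation structure, which is the compactness the dissertation exploits.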

    Book review: Probabilistic Similarity Networks By David E. Heckerman (The MIT Press, 1991)


    Service Models, Operational Decisions and Architecture of Digital Libraries

    A digital library simplifies the interface between information users and producers; it economizes the costs of storage, search, and telecommunications in providing access to information. Externalities in information consumption and the cost structure of information access suggest five service models of digital libraries. Optimizing models are proposed to formalize operational decisions in these service models. A managing server implementing these decision models would automate the operations of digital libraries.
    KEYWORDS: service models, economic models, digital libraries
    INTRODUCTION: A networked digital environment makes it possible for any individual to become a publisher and for any user to access each author's publication. However, the sheer numbers of publishers and users impose heavy burdens of navigation on users and marketing on publishers. Providing access to information incurs costs of telecommunication, query processing and storage. Neither publishers nor users are at th..

    THE MICROWAVE SPECTRUM AND STRUCTURE FOR THE HCCH-CO COMPLEX

    Author Institution: Department of Chemistry, The University of Arizona
    Rotational transitions were measured for 5 isotopomers of the HCCH-CO van der Waals complex using a Flygare-Balle type microwave spectrometer. The observed spectrum is consistent with a linear, hydrogen-bound HCCH-CO structure. The measured rotational constants, in MHz, are B = 1397.370(1) for HCCH-CO, 1394.963(1) for HCCD-CO, 1385.142(1) for HCCH-13CO, 1331.23(1) for DCCH-CO, and 1329.684(2) for DCCD-CO. The measured rotational constants were fit to obtain a center-of-mass separation for HCCH-CO of R_cm = 5.018(7) Å and a vibrationally averaged angle of ≈ 21°, which corresponds to the motion of the HCCH about R_cm in the complex. Distortion constants were obtained and analysed to obtain force constants and estimate the dissociation energies for the isotopomers.
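As a rough consistency check on the reported constants, a pseudo-diatomic treatment recovers the center-of-mass separation from B via I = h / (8π²B) ≈ μR². This crude model ignores the monomers' own moments of inertia, so it is expected to land slightly above the fitted 5.018(7) Å (the monomer masses below are standard approximate values, not from the abstract):

```python
import math

# Pseudo-diatomic estimate of the HCCH-CO center-of-mass separation
# from the measured rotational constant B.
h = 6.62607015e-34            # Planck constant, J s
amu = 1.66053907e-27          # atomic mass unit, kg
m_hcch, m_co = 26.0157, 28.0101   # approximate monomer masses, amu

B = 1397.370e6                # Hz (1397.370 MHz, HCCH-CO)
mu = m_hcch * m_co / (m_hcch + m_co) * amu   # reduced mass, kg
I = h / (8 * math.pi**2 * B)                 # moment of inertia, kg m^2
R = math.sqrt(I / mu) * 1e10                 # separation, Å
# R comes out near 5.2 Å, a little above the fitted 5.018(7) Å because
# the monomers' internal moments of inertia are neglected.
```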

    Mediating the Tension Between Information Privacy and Information Access: The Role of Digital Government

    Government agencies collect and disseminate data that bear on the most important issues of public interest. Advances in information technology, particularly the Internet, have multiplied the tension between demands for ever more comprehensive databases and demands for the shelter of privacy. In mediating between these two conflicting demands, agencies must address a host of difficult problems. These include providing access to information while protecting confidentiality, coping with health information databases, and ensuring consistency with international standards. The policies of agencies are determined by what is right for them to do, what works for them, and what they are required to do by law. They must interpret and respect the ethical imperatives of democratic accountability, constitutional empowerment, and individual autonomy. They must keep pace with technological developments by developing effective measures for making information available to a broad range of users. They must both abide by the mandates of legislation and participate in the process of developing new legislation that is responsive to changes that affect their domain. In managing confidentiality and data access functions, agencies have two basic tools: techniques for disclosure limitation through restricted data and administrative procedures through restricted access. The technical procedures for disclosure limitation involve a range of mathematical and statistical tools. The administrative procedures can be implemented through a variety of institutional mechanisms, ranging from privacy advocates, through internal privacy review boards, to a data and access protection commission.
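One of the simplest of the statistical disclosure-limitation techniques alluded to here is primary suppression of small table cells before release. A minimal sketch (the threshold of 5 is a common agency convention, not a requirement, and the function name is illustrative):

```python
def suppress_small_cells(table, threshold=5):
    """Primary cell suppression: counts below the threshold could
    identify individual respondents, so they are withheld (None)
    before the table is published."""
    return [[c if c >= threshold else None for c in row] for row in table]

released = suppress_small_cells([[120, 3], [47, 18]])
# [[120, None], [47, 18]]
```

Real releases also need complementary suppression, since marginal totals can otherwise reveal a suppressed cell by subtraction.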