20 research outputs found

    A false sense of security? Can a tiered approach be trusted to accurately classify immunogenicity samples?

    Detection and characterization of anti-drug antibodies (ADA) against a protein therapeutic are crucial for monitoring unwanted immune responses. Patient samples are usually tested for ADA activity with a multi-tiered approach that first rapidly screens for positive samples, which are subsequently confirmed in a separate assay. In this manuscript we evaluate the ability of different methods to classify subjects with screening and competition-based confirmatory assays. We find that the confirmation method matters most for the overall performance of the multi-stage process, with a t-test performing best when differences are moderate to large. Moreover, when differences between positive and negative samples are not sufficiently large, a competition-based confirmation step yields poor classification of positive samples.
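    The two-tier flow described in this abstract can be sketched as follows. This is an illustrative outline, not the authors' implementation; the function name, string labels, and `confirm` interface are all hypothetical, and any decision rule (such as a t-test) can be plugged in as the confirmatory step:

```python
def tiered_classify(screen_signal, screen_cut, uninhibited, inhibited, confirm):
    """Two-tier ADA classification (illustrative sketch, hypothetical names).

    Tier 1 screens the raw assay signal against a cut point; only screen
    positives proceed to the competition-based confirmatory tier.
    `confirm` is any callable that decides specificity from uninhibited
    vs. drug-inhibited replicate measurements, e.g. a t-test.
    """
    if screen_signal <= screen_cut:
        return "negative"  # screened out; no confirmation performed
    # Tier 2: confirm specificity by antigen competition
    return "positive" if confirm(uninhibited, inhibited) else "negative"
```

    The design point made by the abstract is that the choice of `confirm` dominates overall classification performance, while the screening tier mainly controls how many samples reach it.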

    Polyethylene glycol 20 kDa-induced vacuolation does not impair phagocytic function of human monocyte-derived macrophages

    Conjugation to polyethylene glycol (PEG) is commonly used to enhance drug delivery and efficacy by extending the half-life of the drug molecule. This has important implications for reducing treatment burden in diseases that require chronic prophylaxis, such as hemophilia. Clearance of PEG molecules with high molecular weights (≥ 40 kDa) has been reported to cause cellular vacuolation in mammals. Rurioctocog alfa pegol (PEGylated recombinant coagulation factor VIII) contains a 20-kDa PEG. This study investigated the effects of exposure to 20-kDa PEG (10 μg/ml to 10 mg/ml) on the morphology and function of human monocyte-derived macrophages (MDMs) in vitro. Exposure to PEG for 24 hours was associated with significant vacuolation only at concentrations of 1 mg/ml or more, which far exceed the levels associated with clinically relevant doses of rurioctocog alfa pegol. Immunofluorescence staining of PEG was detected in the cytoplasm of MDMs, indicating uptake into the cells. No impairment of MDM phagocytic activity (ability to ingest fluorescently labeled Escherichia coli) was observed with 24-hour exposure to PEG, even at concentrations associated with significant vacuolation. Furthermore, PEG exposure did not have significant effects on cytokine secretion in resting or lipopolysaccharide-stimulated MDMs, or on the expression of cell surface markers in stimulated MDMs. Cell viability was not affected by 24-hour exposure to PEG. In conclusion, vacuolation of human MDMs after exposure to 20-kDa PEG only occurred with PEG concentrations far in excess of those equivalent to clinically relevant doses of rurioctocog alfa pegol and did not affect MDM viability or functionality. Together, these results support the concept that PEG-mediated vacuolation is an adaptive cellular response rather than a toxic effect.

    The peroxiredoxin gene family in plants: genetic regulation and function

    Horling F. The peroxiredoxin gene family in plants: genetic regulation and function. Bielefeld; 2002.

    A comparison of methods for classifying samples as truly specific with confirmatory immunoassays

    Biotechnology-derived therapeutics may induce an unwanted immune response leading to the formation of anti-drug antibodies (ADAs), which can result in altered efficacy and safety of the therapeutic protein. Anti-drug antibodies may, for example, affect pharmacokinetics of the therapeutic protein or induce autoimmunity. It is therefore crucial to have assays available for the detection and characterization of ADAs. Commonly, a screening assay is initially used to classify samples as either ADA positive or negative. A confirmatory assay, typically based on antigen competition, is subsequently employed to separate false positive samples from truly positive samples. In this manuscript we investigate, through simulation and analysis of real data, the performance of different statistical methods for classifying samples in competition assays. In our evaluations we do not find a uniformly best method, although a simple t-test does provide good results throughout. More crucially, we find that very large differences between uninhibited and inhibited measurements, relative to the assay variability, are required to obtain useful classification results, which calls into question the usefulness of competition assays with high variability.
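    As a rough sketch of the competition-based confirmation step compared here, a pooled-variance two-sample t-test can contrast uninhibited and drug-inhibited replicates. This is a generic illustration, not the authors' code; the default critical value is a hypothetical placeholder that would in practice be chosen for the assay's degrees of freedom and alpha level:

```python
import math
import statistics

def confirm_by_t_test(uninhibited, inhibited, t_crit=2.132):
    """Declare a sample confirmed positive when the uninhibited signal is
    significantly higher than the drug-inhibited signal (one-sided,
    pooled-variance two-sample t-test). t_crit is a placeholder value.
    """
    n1, n2 = len(uninhibited), len(inhibited)
    m1, m2 = statistics.mean(uninhibited), statistics.mean(inhibited)
    v1, v2 = statistics.variance(uninhibited), statistics.variance(inhibited)
    # Pooled variance estimate across the two replicate groups
    sp2 = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    t = (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return t > t_crit
```

    Consistent with the abstract's caveat, when assay variability is large relative to the inhibition effect, the t statistic stays small and truly positive samples fail confirmation.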

    BIG: An agent for resource-bounded information gathering and decision making

    The World Wide Web has become an invaluable information resource, but the explosion of available information has made Web search a time-consuming and complex process. The large number of information sources and their different levels of accessibility, reliability, and associated costs present a complex information gathering control problem. This paper describes the rationale, architecture, and implementation of a next-generation information gathering system: a system that integrates several areas of Artificial Intelligence research under a single umbrella. Our solution to the information explosion is an information gathering agent, BIG, that plans to gather information to support a decision process, reasons about the resource trade-offs of different possible gathering approaches, extracts information from both unstructured and structured documents, and uses the extracted information to refine its search and processing activities.

    BIG: A Resource-Bounded Information Gathering Agent

    Effective information gathering on the WWW is a complex task requiring planning, scheduling, text processing, and interpretation-style reasoning about extracted data to resolve inconsistencies and to refine hypotheses about the data. This paper describes the rationale, architecture, and implementation of a next-generation information gathering system: a system that integrates several areas of AI research under a single research umbrella. The goal of this system is to exploit the vast number of information sources available today on the NII, including a growing number of digital libraries, independent news agencies, and government agencies, as well as human experts providing a variety of services. The large number of information sources and their different levels of accessibility, reliability, and associated costs present a complex information gathering coordination problem. Our solution is an information gathering agent, BIG, that plans to gather information to support a decision process, reasons about the resource tradeoffs of different possible gathering approaches, extracts information from both unstructured and structured documents, and uses the extracted information to refine its search and processing activities.

    A formal comparison of different methods for establishing cut points to distinguish positive and negative samples in immunoassays.

    Biotechnology-derived therapeutics may induce an unwanted immune response leading to the formation of anti-drug antibodies (ADA). As a result, the efficacy and safety of the therapeutic protein could be impaired. Neutralizing antibodies may, for example, affect pharmacokinetics of the therapeutic protein or induce autoimmunity. A drug-induced immune response is therefore a major concern and needs to be assessed during drug development, so it is crucial to have assays available for the detection and characterization of ADAs. These assays classify samples as positive or negative based on a cut point. In this manuscript we investigate, through simulation and analysis of real data, the performance of established and newly developed methods for determining a cut point in immunoassays such as ELISA. The different methods are found to have different advantages and disadvantages. A robust parametric approach generally produced very good results and can be recommended for many situations. The newly introduced method based on mixture models yields similar results to the robust parametric approach but offers some additional flexibility at the expense of higher complexity.
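    A robust parametric cut point of the general kind this abstract recommends can be illustrated as below. This is a generic sketch under stated assumptions, not the authors' exact procedure: the median and scaled median absolute deviation (MAD) stand in for the mean and SD, and z = 1.645 targets roughly a 5% false-positive rate under approximate normality of the negative-control signals:

```python
import statistics

def robust_cut_point(negative_controls, z=1.645):
    """Screening cut point = median + z * robust SD of ADA-negative samples.

    The MAD scaled by 1.4826 is consistent with the standard deviation
    for normal data, so outliers in the negative-control measurements
    barely move the resulting cut point.
    """
    med = statistics.median(negative_controls)
    mad = statistics.median(abs(x - med) for x in negative_controls)
    return med + z * 1.4826 * mad
```

    Unlike a mean + z * SD rule, a single extreme negative-control value leaves this cut point essentially unchanged, which is the practical advantage of the robust variant.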