
    Cloning, Sequencing, Expression and Characterization of an Alzheimer’s-Specific Monoclonal Antibody

    Alzheimer’s disease (AD) is characterized by chronic inflammation and neurodegeneration, which lead to loss of cognitive function. Dr. Nichols’ research laboratory studies the neurological effects associated with AD. Amyloid precursor protein (APP) is a membrane-spanning protein whose primary function is unknown, but it is associated with many tissue types and is found clustered at neuronal synapses. APP can be cleaved by secretases into 40- or 42-amino-acid fragments called amyloid beta protein (Aβ). These cleaved amyloid-β proteins can accumulate (aggregate) and form extracellular plaques in AD brains. Antibodies are normally produced in an adaptive immune response and are high-affinity binding proteins that recognize a specific molecule, whether a protein or a foreign cellular component. Antibodies are commonly used in the laboratory to quantify levels of aggregated proteins such as Aβ, and they are often used in immunotherapy clinical trials to target plaques in AD patients’ brains. Aβ aggregates assemble into multiple conformations, some soluble and others insoluble, but the most toxic and active form is the soluble protofibril. Antibodies selective for the protofibril form of Aβ are in constant demand for studying its effects. A combination of biological and biochemical techniques was used to obtain the DNA sequence of a monoclonal antibody (mAbSL). The antibody mAbSL was purified with an affinity column and shown to be selective and specific for Aβ protofibrils by a series of immunological techniques, such as enzyme-linked immunosorbent assays (ELISA) and dot blots. With the sequence and a means of expression, we can create a stock of the antibody, which will be applied across this laboratory’s work to help characterize Aβ protofibrils and their effects.

    Frame Permutation Quantization

    Frame permutation quantization (FPQ) is a new vector quantization technique using finite frames. In FPQ, a vector is encoded using a permutation source code to quantize its frame expansion. This means that the encoding is a partial ordering of the frame expansion coefficients. Compared to ordinary permutation source coding, FPQ produces a greater number of possible quantization rates and a higher maximum rate. Various representations for the partitions induced by FPQ are presented, and reconstruction algorithms based on linear programming, quadratic programming, and recursive orthogonal projection are derived. Implementations of the linear and quadratic programming algorithms for uniform and Gaussian sources show performance improvements over entropy-constrained scalar quantization for certain combinations of vector dimension and coding rate. Monte Carlo evaluation of the recursive algorithm shows that mean-squared error (MSE) decays as 1/M^4 for an M-element frame, which is consistent with previous results on the optimal decay of MSE. Reconstruction using the canonical dual frame is also studied, and several results relate properties of the analysis frame to whether linear reconstruction techniques provide consistent reconstructions. Comment: 29 pages, 5 figures; details added to the proof of Theorem 4.3 and a few minor corrections.
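    As a toy illustration of the encoding described above, the sketch below quantizes a vector by the full ordering of its frame-expansion coefficients and reconstructs by least squares (equivalent to using the canonical dual frame). The random frame, the representative levels `mu`, and the use of a full rather than partial ordering are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 3, 7                      # signal dimension, frame size (M > N)
F = rng.standard_normal((M, N))  # analysis frame (rows are frame vectors)

# Representative levels for a full-ordering permutation code (assumed,
# not the paper's optimized values): a fixed sorted vector reused per rank.
mu = np.sort(rng.standard_normal(M))

def encode(x):
    """Encode x as the ordering (permutation) of its frame coefficients."""
    y = F @ x
    return np.argsort(y)          # indices of coefficients, smallest first

def decode(perm):
    """Place the fixed levels according to the permutation, then
    reconstruct by least squares (canonical dual frame)."""
    y_hat = np.empty(M)
    y_hat[perm] = mu              # smallest coefficient gets smallest level, etc.
    x_hat, *_ = np.linalg.lstsq(F, y_hat, rcond=None)
    return x_hat

x = rng.standard_normal(N)
x_hat = decode(encode(x))
```

    As the abstract notes, reconstructions obtained this way need not be consistent: the frame coefficients of `x_hat` may not reproduce the encoded ordering, which is what the paper's linear- and quadratic-programming decoders address.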

    Exploring Millions of 6-State FSSP Solutions: the Formal Notion of Local CA Simulation

    In this paper, we revisit the notion of local simulation, which allows a cellular automaton to be transformed into a closely related one with a different local encoding of information. This notion is used to explore solutions of the Firing Squad Synchronization Problem that are minimal both in time (2n - 2 for n cells) and, to current knowledge, also in states (6 states). While only one such solution had been proposed, by Mazoyer in 1987, 718 new solutions were generated by Clergue, Verel and Formenti in 2018 using a cluster of machines. We show here that, starting from existing solutions, it is possible to generate millions of such solutions via local simulations on a single ordinary personal computer.
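    The idea of recoding a cellular automaton while preserving its behaviour can be sketched in its simplest form, a state-by-state bijection (the paper's notion of local simulation is more general, allowing encodings over neighbourhoods). The rule, recoding, and configuration below are arbitrary examples, not taken from the paper.

```python
# Toy 2-state, radius-1 CA: elementary rule 110 as a lookup table.
def rule110(a, b, c):
    return {(1,1,1):0,(1,1,0):1,(1,0,1):1,(1,0,0):0,
            (0,1,1):1,(0,1,0):1,(0,0,1):1,(0,0,0):0}[(a, b, c)]

sigma = {0: 1, 1: 0}                       # state recoding (a bijection)
sigma_inv = {v: k for k, v in sigma.items()}

def conjugate(f):
    """Local rule of the CA that simulates f under the recoding sigma."""
    def g(a, b, c):
        return sigma[f(sigma_inv[a], sigma_inv[b], sigma_inv[c])]
    return g

def step(f, config):
    """One synchronous update on a periodic (ring) configuration."""
    n = len(config)
    return [f(config[i - 1], config[i], config[(i + 1) % n]) for i in range(n)]

g = conjugate(rule110)
cfg = [0, 1, 1, 0, 1, 0, 0, 1]
# Simulation property: recoding then stepping g equals stepping f then recoding.
lhs = step(g, [sigma[s] for s in cfg])
rhs = [sigma[s] for s in step(rule110, cfg)]
assert lhs == rhs
```

    Exploring the space of FSSP solutions amounts to searching for such behaviour-preserving transformations with richer local encodings than a plain state bijection.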

    ULTRA-TRACE DETERMINATION OF IRIDIUM BY ETV/ICP-MS USING CHEMICAL MODIFIERS

    Joint Research on Environmental Science and Technology for the Earth

    Adaptive sampling method to monitor low-risk pathways with limited surveillance resources

    The rise of globalisation has led to a sharp increase in international trade, with high volumes of containers, goods and items moving across the world. Unfortunately, these trade pathways also facilitate the movement of unwanted pests, weeds, diseases, and pathogens. Each item could contain biosecurity risk material, but it is impractical to inspect every item. Instead, inspection efforts typically focus on high-risk items. However, low risk does not imply no risk, so it is crucial to monitor low-risk pathways to ensure that they are, and remain, low risk. Many approaches would seek to estimate the risk to some precision, but the lower the risk, the more samples are needed to estimate it. On a low-risk pathway that is afforded only limited inspection resources, it makes more sense to assign fewer samples to the lower-risk activities. We approach the problem by introducing two thresholds: rather than estimating the risk precisely, our method determines whether the risk lies below each threshold, and it also allows us to detect a significant change in risk. Our approach typically requires less sampling than previous methods, while still providing evidence to regulators to help them efficiently and effectively allocate inspection effort. Comment: 12 + 2 pages, 8 figures, 2 tables.
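    The two-threshold idea can be sketched with exact binomial tail tests: given k detections in n inspections, decide whether the leakage rate is credibly below an upper threshold, credibly above a lower one, or neither. The threshold values, the significance level, and the tests themselves are illustrative assumptions, not the authors' procedure.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def classify(k, n, t_low=0.005, t_high=0.02, alpha=0.05):
    """Two-threshold decision on a pathway's contamination rate p.

    Returns "below upper threshold" if k detections in n inspections are
    implausibly few under p = t_high, "above lower threshold" if they are
    implausibly many under p = t_low, else "inconclusive".
    Threshold and alpha values are illustrative only.
    """
    if binom_cdf(k, n, t_high) < alpha:            # too few hits for p >= t_high
        return "below upper threshold"
    if 1 - binom_cdf(k - 1, n, t_low) < alpha:     # too many hits for p <= t_low
        return "above lower threshold"
    return "inconclusive"

print(classify(0, 500))    # no detections in 500 items: credibly low risk
print(classify(20, 500))   # many detections: risk has credibly risen
print(classify(1, 100))    # too little evidence either way
```

    Note how the asymmetry works in the method's favour: confirming that a pathway sits below the upper threshold can require far fewer samples than estimating its rate precisely.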

    Generalised Apery numbers modulo 9

    Whilst locoregional control of head and neck cancers (HNCs) has improved over the last four decades, long-term survival has remained largely unchanged. A possible reason is that the rate of distant metastasis has not changed. Such disseminated disease is reflected in measurable levels of cancer cells in the blood of HNC patients, referred to as circulating tumour cells (CTCs). Numerous marker-independent techniques have been developed for CTC isolation and detection. Recently, microfluidics-based platforms have come to the fore to avoid molecular bias. In this pilot, proof-of-concept study, we evaluated the use of the spiral microfluidic chip for CTC enrichment and subsequent detection in HNC patients. CTCs were detected in 13/24 (54%) HNC patients, representing early to late stages of disease. Importantly, in 7/13 CTC-positive patients, CTC clusters were observed. This is the first study to use spiral microfluidics technology for CTC enrichment in HNC.

    Navigator channel adaptation to reconstruct three dimensional heart volumes from two dimensional radiotherapy planning data

    BACKGROUND: Biologically-based models that utilize 3D radiation dosimetry data to estimate the risk of late cardiac effects could have significant utility for planning radiotherapy in young patients. A major challenge arises from having only 2D treatment planning data for patients with long-term follow-up. In this study, we evaluate the accuracy of an advanced deformable image registration (DIR) and navigator channel (NC) adaptation technique to reconstruct 3D heart volumes from 2D radiotherapy planning images for Hodgkin's Lymphoma (HL) patients. METHODS: Planning CT images were obtained for 50 HL patients who underwent mediastinal radiotherapy. Twelve image sets (6 male, 6 female) were used to construct a male and a female population heart model, which was registered to 23 HL "Reference" patients' CT images using a DIR algorithm, MORFEUS. This generated a series of population-to-Reference patient-specific 3D deformation maps. The technique was independently tested on 15 additional "Test" patients by reconstructing their 3D heart volumes using 2D digitally reconstructed radiographs (DRRs). The technique involved: 1) identifying a matching Reference patient for each Test patient using thorax measurements; 2) placing six NCs on the matching Reference and Test patients' DRRs to capture differences in significant heart curvatures; 3) adapting the population-to-Reference patient-specific deformation maps to generate population-to-Test patient-specific deformation maps using linear and bilinear interpolation; and 4) applying the population-to-Test patient-specific deformation to the population model to reconstruct Test patient-specific 3D heart models. The percentage volume overlap between the NC-adapted reconstruction and the Test patient's true heart volume was calculated using the Dice coefficient. RESULTS: The average Dice coefficient between the NC-adapted and actual Test models was 89.4 ± 2.8%. The modified NC adaptation technique significantly improved the population deformation heart models (p = 0.01). As a standard of evaluation, the residual Dice error after adaptation was comparable to the volumetric differences observed in free-breathing heart volumes (p = 0.62). CONCLUSIONS: The reconstruction technique described generates accurate 3D heart models from limited 2D planning data. This development could potentially be used to retrospectively calculate delivered dose to the heart for historically treated patients and thereby provide a better understanding of late radiation-related cardiac effects.
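    The Dice coefficient used to score the reconstructions can be computed directly from two binary volumes. The sketch below uses synthetic spherical masks as stand-ins for real heart segmentations; the geometry is an arbitrary assumption for illustration.

```python
import numpy as np

def dice(a, b):
    """Dice coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy 3D "heart" volumes: a sphere and a slightly shifted copy of it,
# standing in for the true and NC-adapted reconstructed heart masks.
z, y, x = np.ogrid[-16:16, -16:16, -16:16]
truth = (x**2 + y**2 + z**2) < 12**2
recon = ((x - 2)**2 + y**2 + z**2) < 12**2
print(round(dice(truth, recon), 3))
```

    A perfect reconstruction gives a Dice coefficient of 1.0; the paper's average of 89.4% corresponds to a value of about 0.894 on this scale.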