
    GPrimer: a fast GPU-based pipeline for primer design for qPCR experiments

    Background: Design of valid high-quality primers is essential for qPCR experiments. MRPrimer is a powerful pipeline based on MapReduce that combines both primer design for target sequences and homology tests on off-target sequences. It takes an entire sequence DB as input and returns all feasible and valid primer pairs existing in the DB. Due to the effectiveness of primers designed by MRPrimer in qPCR analysis, it has been widely used for developing many online design tools and building primer databases. However, the computational speed of MRPrimer is too slow to keep up with the exponentially growing sizes of sequence DBs and thus must be improved. Results: We develop a fast GPU-based pipeline for primer design (GPrimer) that takes the same input and returns the same output as MRPrimer. MRPrimer consists of a total of seven MapReduce steps, among which two steps are very time-consuming. GPrimer significantly improves the speed of those two steps by exploiting the computational power of GPUs. In particular, it uses data structures designed for coalesced memory access in GPU and for workload balancing among GPU threads, and copies the data structures between main memory and GPU memory in a streaming fashion. For the human RefSeq DB, GPrimer achieves a speedup of 57 times for the entire pipeline and a speedup of 557 times for the most time-consuming step using a single machine with 4 GPUs, compared with MRPrimer running on a cluster of six machines. Conclusions: We propose a GPU-based pipeline for primer design that takes an entire sequence DB as input and returns all feasible and valid primer pairs existing in the DB at once, without an additional step using BLAST-like tools. The software is available at https://github.com/qhtjrmin/GPrimer.git. © 2021, The Author(s).
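The two GPU techniques the abstract highlights — workload balancing among threads and streaming copies between main memory and GPU memory — can be illustrated with a minimal CPU-side sketch. Everything here (function names, the chunk size, the length-sorting heuristic) is hypothetical and only mimics the idea; it is not GPrimer's actual implementation:

```python
from itertools import islice

def balance_workload(candidates):
    """Sort candidate primers by length so items processed together
    have similar cost (on a real GPU this reduces thread divergence)."""
    return sorted(candidates, key=len)

def stream_chunks(candidates, chunk_size):
    """Yield fixed-size chunks, mimicking streaming copies of data
    structures between main memory and GPU memory."""
    it = iter(candidates)
    while True:
        chunk = list(islice(it, chunk_size))
        if not chunk:
            return
        yield chunk

candidates = ["ACGT" * n for n in (5, 1, 3, 2, 4, 6)]  # toy sequences
batches = list(stream_chunks(balance_workload(candidates), chunk_size=2))
# each batch now holds two similar-length candidates
```

On a real GPU, each batch would be copied to device memory while the previous batch is being processed, overlapping transfer and compute.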

    Fault and Defect Tolerant Computer Architectures: Reliable Computing With Unreliable Devices

    This research addresses the design of a reliable computer from unreliable device technologies. A system architecture is developed for a fault and defect tolerant (FDT) computer. Trade-offs between different techniques are studied, and yield and hardware cost models are developed. Fault and defect tolerant designs are created for the processor and the cache memory. Simulation results for the content-addressable memory (CAM)-based cache show 90% yield with device failure probabilities of 3 × 10^-6, three orders of magnitude better than non-fault-tolerant caches of the same size. The entire processor achieves 70% yield with device failure probabilities exceeding 10^-6. The required hardware redundancy is approximately 15 times that of a non-fault-tolerant design. While larger than current FT designs, this architecture allows the use of devices much more likely to fail than silicon CMOS. As part of model development, an improved model is derived for NAND multiplexing. The model is the first accurate model for small and medium amounts of redundancy. Previous models are extended to account for dependence between the inputs and produce more accurate results.
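The yield figures above come from redundancy-versus-failure-probability trade-offs. A minimal sketch of the kind of binomial yield model involved (the sparing scheme and all numeric parameters below are illustrative assumptions, not the dissertation's actual models):

```python
from math import comb

def yield_no_redundancy(n_devices, p_fail):
    """Chip works only if every device works."""
    return (1.0 - p_fail) ** n_devices

def yield_with_spares(n_units, unit_devices, spares, p_fail):
    """n_units identical units plus spare units; the block works
    if at most `spares` of the (n_units + spares) units are bad."""
    p_unit_ok = (1.0 - p_fail) ** unit_devices
    total = n_units + spares
    return sum(
        comb(total, k) * (1.0 - p_unit_ok) ** k * p_unit_ok ** (total - k)
        for k in range(spares + 1)
    )

# Illustrative numbers: 100 units of 1000 devices each, p_fail = 3e-6
base = yield_no_redundancy(100 * 1000, 3e-6)
spared = yield_with_spares(100, 1000, spares=4, p_fail=3e-6)
```

Even a few spare units moves yield from roughly three-quarters toward one, which is the qualitative effect the abstract reports at much larger scale.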

    Annual Report of the University, 2001-2002, Volumes 1-4

    VITAL ACADEMIC CLIMATE* by Brian Foster, Provost/Vice President of Academic Affairs A great university engages students and faculty fully in important ideas and issues ... not just to learn about them, but to take them apart and put them back together, to debate, deconstruct, resist, reconstruct and build upon them. Engagement of this sort takes concentration and commitment, and it produces the kind of discipline and passion that leads to student and faculty success and satisfaction in their studies, research, performance, artistic activity and service. It is also the kind of activity that creates a solid, nurturing spirit of community. This is what we mean when we talk about a vital academic climate. We are striving for an environment that will enrich the social, cultural and intellectual lives of all who come in contact with the University. Many things interconnect to make this happen: curriculum, co-curricular activities, conferences, symposia, cultural events, community service, research and social activity. Our goal is to create the highest possible level of academic commitment and excitement at UNM. This is what characterizes a truly great university. *Strategic Direction 2
    New Mexico native Andres C. Salazar, who holds a Ph.D. in electrical engineering from Michigan State University, has been named the PNM Chair in Microsystems, Commercialization and Technology. Carrying the title of professor, the PNM Chair is a joint appointment between the School of Engineering and the Anderson Schools of Management. Spring 2002 graduate John Probasco was selected as a 2002 Rhodes Scholar, the second UNM student to be so honored in the past four years. The biochemistry major from Alamogordo previously had been awarded the Goldwater Scholarship and the Truman Scholarship. Biology student Sophie Peterson of Albuquerque was one of 30 students nationwide to receive a 2002-2003 Award of Excellence from Phi Kappa Phi, the oldest and largest national honor society. Regents' Professor of Communication and Journalism Everett M. Rogers was selected as the University's 47th Annual Research Lecturer, the highest honor UNM bestows upon members of its faculty. New Mexico resident, author and poet Simon J. Ortiz received an Honorary Doctorate of Letters at Spring Commencement ceremonies. Child advocate Angela "Angie" Vachio, founder and executive director of Peanut Butter and Jelly Family Services, Inc., was awarded an Honorary Doctorate of Humane Letters. American Studies Assistant Professor Amanda J. Cobb won the 22nd annual American Book Award for Listening to Our Grandmothers' Stories: The Bloomfield Academy for Chickasaw Females, 1852-1949.

    Effects of distribution planning systems on the cost of delivery in unique make-to-order manufacturing

    This thesis investigates the effects of simulation through the use of a distribution planning system (DPS) on distribution costs in the setting of unique make-to-order (UMTO) manufacturers. The German kitchen furniture industry (GKFI) serves as the example case and the source of primary data. On the basis of a detailed market analysis, this thesis demonstrates that this industry, which mostly delivers with its own vehicles, is in urgent need of innovative logistics strategies. An investigation into the current practical and theoretical use of DPS shows that most known DPS apply given or preset delivery tour constraints. In both practice and theory, those constraints are rarely questioned, let alone removed; they are simply accepted in day-to-day operation. This thesis takes a different approach. In the context of this research, a practically applied DPS is used to support the removal of time window constraints (TWC) in UMTO delivery. The same DPS is used under ceteris paribus conditions for the re-routing of deliveries and thereby supports the findings regarding the costliness of TWC. From this experiment emerges an overall cost saving of 50.9% and a 43.5% reduction in kilometres travelled. The applied experimental research methodology and the significance of the resulting savings open the opportunity to analyse the removal of delivery time window restrictions as one of many constraints in distribution logistics. The economic results of this thesis may become the basis of discussion for further research based on the applied methodology. From a practical point of view, the contributions to new knowledge are the cost savings versus the change of demand for the setting of TWC between the receiver of goods and the UMTO supplier. On the theoretical side, this thesis contributes to filling the gap on the production–distribution problem from a UMTO perspective. Further contributions to knowledge are delivered through the experimental methodology, with the application of a DPS for research in logistics simulation.
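The core experiment — re-routing deliveries once time window constraints are removed — can be sketched with a toy routing comparison. The coordinates, the nearest-neighbour heuristic, and the fixed time-window sequence below are all hypothetical; a real DPS solves a far richer vehicle routing problem:

```python
from math import dist

def tour_length(depot, stops, order):
    """Total length of the route depot -> stops in `order` -> depot."""
    route = [depot] + [stops[i] for i in order] + [depot]
    return sum(dist(a, b) for a, b in zip(route, route[1:]))

def free_routing_order(depot, stops):
    """No time windows: greedy nearest-neighbour sequencing."""
    remaining = list(range(len(stops)))
    order, pos = [], depot
    while remaining:
        nxt = min(remaining, key=lambda i: dist(pos, stops[i]))
        remaining.remove(nxt)
        order.append(nxt)
        pos = stops[nxt]
    return order

depot = (0.0, 0.0)
stops = [(0, 10), (10, 0), (1, 9), (9, 1)]  # hypothetical customers
tw_order = [0, 1, 2, 3]                     # sequence forced by time windows
constrained = tour_length(depot, stops, tw_order)
free = tour_length(depot, stops, free_routing_order(depot, stops))
saving = 1.0 - free / constrained
```

Even in this four-stop toy, letting the planner choose the sequence shortens the tour substantially; the thesis measures the same effect with a production DPS on real GKFI delivery data.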

    The assessment report of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services on pollinators, pollination and food production

    The thematic assessment of pollinators, pollination and food production carried out under the auspices of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services aims to assess animal pollination as a regulating ecosystem service underpinning food production, in the context of its contribution to nature's gifts to people and supporting a good quality of life. To achieve this, it focuses on the role of native and managed pollinators; the status and trends of pollinators, pollinator-plant networks and pollination; drivers of change; impacts on human well-being and food production in response to pollination declines and deficits; and the effectiveness of responses.

    Incorporating technology diffusion estimates in health economic methods - Application in a preterm birth screening case study

    Low implementation of cost-effective health technologies results in inefficient use of resources in a health system. Despite this, estimates of implementation or diffusion are not routine components of analyses performed within health technology assessments (HTA), potentially due to a lack of (a) methods to obtain diffusion estimates and (b) understanding of the impact of diffusion estimates on health economic outcomes. This thesis contributes (a) a method to estimate health technology diffusion prior to HTA and (b) a modelling framework that assesses the potential impact of diffusion estimates on cost-effectiveness and expected value of information and implementation (EVII) analysis, using modelling, qualitative and elicitation methods. These were illustrated in a preterm birth (PTB) screening case study. The modelling framework included extensions to an existing EVII model to make it dynamic and allow research to affect implementation, and the development of a dynamic cost-effectiveness analysis (DCEA) model that reflects price changes precipitated by diffusion and, hence, the reimbursement decision. Drivers of diffusion were identified for the case study technology, aiding the design of implementation strategies. The developed method for predicting diffusion requires transformation of elicited expert beliefs to inform an existing diffusion model. Application in the PTB screening model showed that the dynamic EVII method can (1) help more accurately assess the losses the health care payer incurs when there is decision uncertainty and low implementation and (2) provide more realistic assessments of implementation strategies and evidence generation schemes. The applied DCEA model showed that changes in price triggered by technology diffusion significantly affect cost-effectiveness results. The method for predicting health technology diffusion and the EVII and DCEA frameworks are foreseen to be relevant in the context of HTAs of medical devices, diagnostics and drugs, particularly when there is low implementation or there is potential for future price changes conditional on diffusion.
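The thesis transforms elicited expert beliefs into parameters of an existing diffusion model. As an illustration only — assuming a Bass-type model, which the abstract does not name — a discretised adoption path could be generated like this:

```python
def bass_adoption(p, q, m, periods):
    """Cumulative adoption path under a discretised Bass model:
    new adopters per period = (p + q * N/m) * (m - N),
    where N is the cumulative adoption so far."""
    n, path = 0.0, []
    for _ in range(periods):
        n += (p + q * n / m) * (m - n)
        path.append(n)
    return path

# Hypothetical parameters, e.g. transformed from elicited expert beliefs:
# p = innovation coefficient, q = imitation coefficient, m = market potential
path = bass_adoption(p=0.03, q=0.4, m=1.0, periods=20)
```

Such a path could then feed a dynamic analysis in which the implementation level, and any diffusion-conditional price, changes over the decision horizon rather than being held fixed.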

    Capacity Analysis of DNA Memory Based on Mathematical Model

    We have developed a DNA memory called NPMM (Nested Primer Molecular Memory) with over 10 million (16.8 M) addresses. In this DNA memory, data embedded at a unique address can be correctly extracted only through an addressing process based on nested PCR, i.e., only when the correct address is specified. In this paper, the addressing process of NPMM is modelled mathematically, and capacity maximisation is formulated as a combinatorial optimization problem in order to discuss the limits of scaling up NPMM.
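The capacity question can be illustrated with a toy version of the optimization: with a fixed pool of distinct primers spread across nested layers, the address space is the product of the layer sizes, and an even split maximises that product. The numbers below are illustrative only, not NPMM's actual design, which must also satisfy hybridisation constraints among primers:

```python
def capacity(layer_sizes):
    """Address space = product of the primers available per nested layer."""
    total = 1
    for size in layer_sizes:
        total *= size
    return total

def best_split(total_primers, layers):
    """Spread a fixed primer pool as evenly as possible across layers;
    for a product under a sum constraint, an even split is optimal."""
    base, extra = divmod(total_primers, layers)
    return [base + (1 if i < extra else 0) for i in range(layers)]

sizes = best_split(total_primers=24, layers=3)  # even split: [8, 8, 8]
cap = capacity(sizes)                           # 8 * 8 * 8 = 512 addresses
```

An uneven split of the same pool, e.g. [12, 8, 4], yields only 384 addresses, which is the sense in which the address-space capacity is an optimization target.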