    Screening donors for xenotransplantation: The potential for xenozoonoses

    Xenotransplantation is a potential solution to the current donor shortage for solid organ transplantation. The transmission of infectious agents from donor organs or bone marrow to the recipient is a well-recognized phenomenon following allotransplantation. Thus the prospect of xenotransplantation raises the issue of xenozoonoses, i.e., the transmission of animal infections to the human host. Anticipating an increasing number of baboon-to-human transplants, 31 adult male baboons (Papio cynocephalus) from a single colony in the United States were screened for the presence of antibody to microbial agents (principally viral) that may pose a significant risk of infection. Antibody to simian cytomegalovirus, simian agent 8, and Epstein-Barr virus was found in 97% of animals tested. Antibody to simian retroviruses and Toxoplasma gondii was found in 30% and 32%, respectively. Discordant results were found when paired samples were examined by two primate laboratories, particularly when methodologies were based on cross-reaction with human viral antigens. These results highlight the need to develop antibody tests specific to the species used for xenotransplantation. © 1994 Williams & Wilkins

    Synchrotron imaging assessment of bone quality

    Bone is a complex hierarchical structure whose principal function is to resist mechanical forces and fracture. Bone strength depends not only on the quantity of bone tissue but also on its shape and hierarchical structure. The hierarchical levels are interrelated, especially the micro-architecture and the collagen and mineral components; hence analysis of their specific roles in bone strength and stiffness is difficult. Synchrotron imaging technologies, including micro-CT and small/wide angle X-ray scattering/diffraction, are becoming increasingly popular for studying bone because the images can resolve deformations in the micro-architecture and collagen-mineral matrix under in situ mechanical loading. Synchrotron imaging cannot be applied directly in vivo due to the high radiation dose, but it allows researchers to carry out systematic, multifaceted studies of bone ex vivo. Identifying characteristics of aging and disease will underpin future efforts to generate novel devices and interventional therapies for assessing and promoting healthy aging. With our own research work as examples, this paper introduces how synchrotron imaging technology can be used with in situ testing in bone research.

    Mapping the Curricular Structure and Contents of Network Science Courses

    As network science has matured into an established field of research, a number of courses on the topic have been developed and offered at various higher education institutions, often at the postgraduate level. In those courses, instructors adopted different approaches with different focus areas and curricular designs. We collected information about 30 existing network science courses from various online sources and analyzed the contents of their syllabi or course schedules. The topics and their curricular sequences were extracted from the course syllabi/schedules and represented as a directed weighted graph, which we call the topic network. Community detection in the topic network revealed seven topic clusters, which matched reasonably well with the concept list previously generated by students and educators through the Network Literacy initiative. The minimum spanning tree of the topic network revealed typical flows of curricular contents, starting with examples of networks, moving on to random networks and small-world networks, and then branching off to various subtopics. These results illustrate the current state of consensus formation (including variations and disagreements) among the network science community on what should be taught about networks and how, which may also be informative for K--12 education and informal education. Comment: 17 pages, 11 figures, 2 tables; to appear in Cramer, C. et al. (eds.), Network Science in Education -- Tools and Techniques for Transforming Teaching and Learning (Springer, 2017, in press)
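    The edge-counting step behind the topic network can be sketched in a few lines. The syllabi below are hypothetical placeholders, not data from the study, and the "typical flow" here is a crude per-topic heaviest-edge heuristic standing in for the paper's minimum spanning tree analysis:

    ```python
    from collections import Counter

    # Hypothetical syllabi: each is an ordered list of topics covered.
    syllabi = [
        ["examples of networks", "random networks", "small-world networks", "centrality"],
        ["examples of networks", "random networks", "scale-free networks"],
        ["examples of networks", "random networks", "small-world networks", "dynamics"],
    ]

    # Directed edge (a, b) weighted by how many syllabi teach b directly after a.
    edges = Counter()
    for topics in syllabi:
        for a, b in zip(topics, topics[1:]):
            edges[(a, b)] += 1

    # Heaviest outgoing edge per topic approximates the "typical flow".
    flow = {}
    for (a, b), w in edges.items():
        if w > flow.get(a, (None, 0))[1]:
            flow[a] = (b, w)

    print(flow["examples of networks"])  # ('random networks', 3)
    ```

    On real data one would run community detection and a minimum spanning tree/arborescence on this weighted graph, as the abstract describes.
    
    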

    Optimizing egalitarian performance in the side-effects model of colocation for data center resource management

    In data centers, up to dozens of tasks are colocated on a single physical machine. Machines are used more efficiently, but tasks' performance deteriorates, as colocated tasks compete for shared resources. As tasks are heterogeneous, the resulting performance dependencies are complex. In our previous work [18], we proposed a new combinatorial optimization model that uses two parameters of a task - its size and its type - to characterize how a task influences the performance of other tasks allocated to the same machine. In this paper, we study the egalitarian optimization goal: maximizing the worst-off performance. This problem generalizes the classic makespan minimization on multiple processors (P||Cmax). We prove that polynomially-solvable variants of multiprocessor scheduling become NP-hard and hard to approximate when the number of types is not constant. For a constant number of types, we propose a PTAS, a fast approximation algorithm, and a series of heuristics. We simulate the algorithms on instances derived from a trace of one of Google's clusters. Algorithms aware of jobs' types lead to better performance compared with algorithms solving P||Cmax. The notion of type enables us to model the degradation of performance caused by using standard combinatorial optimization methods. Types add a layer of additional complexity; however, our results - approximation algorithms and good average-case performance - show that types can be handled efficiently. Comment: Author's version of a paper published in Euro-Par 2017 Proceedings; extends the published paper with additional results and proofs
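    For context, the P||Cmax baseline that the paper generalizes is commonly approximated with the classic Longest Processing Time (LPT) greedy rule, which assigns each task, largest first, to the currently least-loaded machine and guarantees a makespan within 4/3 of optimal. This is only the type-oblivious baseline, not the authors' type-aware algorithms:

    ```python
    import heapq

    def lpt_makespan(sizes, machines):
        """LPT greedy for P||Cmax: place tasks, largest first, on the
        least-loaded machine; returns the resulting makespan."""
        loads = [0] * machines  # list of zeros is already a valid min-heap
        for size in sorted(sizes, reverse=True):
            least = heapq.heappop(loads)       # least-loaded machine
            heapq.heappush(loads, least + size)
        return max(loads)

    # Five tasks on two machines: LPT yields 10, while the optimum is 9
    # ({5, 4} vs {3, 3, 3}) - within the 4/3 approximation guarantee.
    print(lpt_makespan([5, 4, 3, 3, 3], 2))  # 10
    ```
    
    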

    Accurate Pulmonary Nodule Detection in Computed Tomography Images Using Deep Convolutional Neural Networks

    Early detection of pulmonary cancer is the most promising way to enhance a patient's chance of survival. Accurate pulmonary nodule detection in computed tomography (CT) images is a crucial step in diagnosing pulmonary cancer. In this paper, inspired by the successful use of deep convolutional neural networks (DCNNs) in natural image recognition, we propose a novel pulmonary nodule detection approach based on DCNNs. We first introduce a deconvolutional structure to the Faster Region-based Convolutional Neural Network (Faster R-CNN) for candidate detection on axial slices. Then, a three-dimensional DCNN is presented for the subsequent false positive reduction. Experimental results on the LUng Nodule Analysis 2016 (LUNA16) Challenge demonstrate the superior detection performance of the proposed approach (average FROC score of 0.891, ranking first among all submitted results). Comment: MICCAI 2017 accepted paper
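    The average FROC score cited above follows the LUNA16 convention of averaging detection sensitivity at seven false-positive-per-scan operating points; the sensitivities below are hypothetical, chosen only to illustrate the computation:

    ```python
    # LUNA16 evaluates detectors at these false-positive rates per scan.
    FP_RATES = (0.125, 0.25, 0.5, 1, 2, 4, 8)

    def average_froc(sensitivities):
        """Mean sensitivity over the seven LUNA16 FP-per-scan operating points."""
        if len(sensitivities) != len(FP_RATES):
            raise ValueError("expected one sensitivity per operating point")
        return sum(sensitivities) / len(sensitivities)

    # Hypothetical sensitivities at the seven operating points:
    score = average_froc([0.75, 0.82, 0.87, 0.90, 0.93, 0.95, 0.96])
    print(round(score, 3))  # 0.883
    ```
    
    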

    Calculating the random guess scores of multiple-response and matching test items

    For achievement tests, the guess score is often used as a baseline for the lowest possible grade in score-to-grade transformations and for setting cut scores. For test item types such as multiple-response, matching, and drag-and-drop, determining the guess score requires more elaborate calculations than the straightforward calculation of the guess score for True-False and multiple-choice item formats. For various variants of multiple-response and matching types, with respect to both dichotomous and polytomous scoring, methods for determining the guess score are presented and illustrated with practical applications. The implications for theory and practice are discussed.
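    A minimal sketch of such guess-score calculations under dichotomous (all-or-nothing) scoring, assuming a guesser who marks exactly the keyed number of options at random; these simple closed forms are illustrative stand-ins for the paper's fuller treatment:

    ```python
    from math import comb, factorial

    def guess_score_multiple_choice(n_options):
        """Random-guess score for a 1-of-n multiple-choice item."""
        return 1 / n_options

    def guess_score_multiple_response(n_options, n_keys):
        """Dichotomous multiple-response item: the guesser marks exactly
        n_keys of the n_options at random, so only one of the C(n, k)
        possible subsets earns the point."""
        return 1 / comb(n_options, n_keys)

    def guess_score_matching(n_pairs):
        """Dichotomous matching item: one correct assignment among the
        n! equally likely one-to-one matchings."""
        return 1 / factorial(n_pairs)

    print(guess_score_multiple_choice(4))       # 0.25
    print(guess_score_multiple_response(5, 2))  # 0.1
    print(guess_score_matching(4))              # ~0.0417
    ```

    Polytomous (partial-credit) scoring requires summing over partially correct response patterns instead of a single correct one, which is where the more elaborate calculations arise.
    
    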

    A Suborbital Payload for Soft X-ray Spectroscopy of Extended Sources

    We present a suborbital rocket payload capable of performing soft X-ray spectroscopy on extended sources. The payload can reach resolutions of ~100 (lambda/dlambda) over sources as large as 3.25 degrees in diameter in the 17-107 angstrom bandpass. This permits analysis of the overall energy balance of nearby supernova remnants and the detailed nature of the diffuse soft X-ray background. The main components of the instrument are wire grid collimators, off-plane grating arrays, and gaseous electron multiplier detectors. This payload is adaptable to longer-duration orbital rockets given its comparatively simple pointing and telemetry requirements and an abundance of potential science targets. Comment: Accepted to Experimental Astronomy; 12 pages plus 1 table and 17 figures

    Oblivion: Mitigating Privacy Leaks by Controlling the Discoverability of Online Information

    Search engines are the most prevalent tools for collecting information about individuals on the Internet. Search results typically comprise a variety of sources that contain personal information -- either intentionally released by the person herself, or unintentionally leaked or published by third parties, often with detrimental effects on the individual's privacy. To grant individuals the ability to regain control over their disseminated personal information, the European Court of Justice recently ruled that EU citizens have a right to be forgotten, in the sense that indexing systems must offer them technical means to request removal of links from search results that point to sources violating their data protection rights. As of now, these technical means consist of a web form that requires a user to manually identify all relevant links upfront and insert them into the web form, followed by a manual evaluation by employees of the indexing system to assess whether the request is eligible and lawful. We propose Oblivion, a universal framework to support the automation of the right to be forgotten in a scalable, provable and privacy-preserving manner. First, Oblivion enables a user to automatically find and tag her disseminated personal information using natural language processing and image recognition techniques, and to file a request in a privacy-preserving manner. Second, Oblivion provides indexing systems with an automated and provable eligibility mechanism, asserting that the author of a request is indeed affected by an online resource. The automated eligibility proof ensures censorship-resistance, so that only legitimately affected individuals can request the removal of corresponding links from search results. We have conducted comprehensive evaluations, showing that Oblivion is capable of handling 278 removal requests per second and is hence suitable for large-scale deployment.
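    The tagging step can be caricatured as follows. Oblivion itself uses natural language processing and image recognition; this regex-based name match and the example URLs are purely illustrative stand-ins for that pipeline:

    ```python
    import re

    def find_affected_links(search_results, full_name):
        """Crude stand-in for Oblivion's tagging step: flag search results
        whose text mentions the requester's name, yielding the candidate
        links she would submit for removal."""
        pattern = re.compile(re.escape(full_name), re.IGNORECASE)
        return [url for url, text in search_results.items()
                if pattern.search(text)]

    results = {
        "https://example.org/a": "An article mentioning Jane Doe by name.",
        "https://example.org/b": "Unrelated content.",
    }
    print(find_affected_links(results, "Jane Doe"))  # ['https://example.org/a']
    ```

    The eligibility proof then has to bind such a match to the requester's verified identity, which is where Oblivion's cryptographic mechanism comes in.
    
    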