Screening donors for xenotransplantation: The potential for xenozoonoses
Xenotransplantation is a potential solution to the current donor shortage for solid organ transplantation. The transmission of infectious agents from donor organs or bone marrow to the recipient is a well-recognized phenomenon following allotransplantation. The prospect of xenotransplantation thus raises the issue of xenozoonoses, i.e., the transmission of animal infections to the human host. Anticipating an increasing number of baboon-to-human transplants, 31 adult male baboons (Papio cynocephalus) from a single colony in the United States were screened for the presence of antibody to microbial agents (principally viral) that may pose a significant risk of infection. Antibody to simian cytomegalovirus, simian agent 8, and Epstein-Barr virus was found in 97% of animals tested. Antibody to simian retroviruses and Toxoplasma gondii was found in 30% and 32% of animals, respectively. Discordant results were found when paired samples were examined by two primate laboratories, particularly when methodologies were based on cross-reaction with human viral antigens. These results highlight the need to develop antibody tests specific to the species used for xenotransplantation. © 1994 Williams & Wilkins
Mapping the Curricular Structure and Contents of Network Science Courses
As network science has matured as an established field of research, a number of courses on this topic have already been developed and offered at various higher education institutions, often at the postgraduate level. In those courses, instructors adopted different approaches with different focus areas and curricular designs. We collected information about 30 existing network science courses from various online sources and analyzed the contents of their syllabi or course schedules. The topics and their curricular sequences were extracted from the course syllabi/schedules and represented as a directed weighted graph, which we call the topic network. Community detection in the topic network revealed seven topic clusters, which matched reasonably well with the concept list previously generated by students and educators through the Network Literacy initiative. The minimum spanning tree of the topic network revealed typical flows of curricular contents, starting with examples of networks, moving on to random networks and small-world networks, then branching off to various subtopics from there. These results illustrate the current state of consensus formation (including variations and disagreements) among the network science community on what should be taught about networks and how, which may also be informative for K--12 education and informal education.
Comment: 17 pages, 11 figures, 2 tables; to appear in Cramer, C. et al. (eds.), Network Science in Education -- Tools and Techniques for Transforming Teaching and Learning (Springer, 2017, in press)
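The topic-network construction described above can be sketched with networkx. The syllabus sequences and topic names below are invented for illustration, and the specific choices of greedy modularity maximization for community detection and inverse-frequency edge distances for the minimum spanning tree are assumptions, not necessarily the authors' exact pipeline.

```python
import networkx as nx
from networkx.algorithms import community

# Hypothetical topic sequences extracted from three course syllabi.
syllabi = [
    ["examples of networks", "random networks", "small-world networks", "scale-free networks"],
    ["examples of networks", "random networks", "centrality", "community detection"],
    ["examples of networks", "small-world networks", "scale-free networks", "dynamics on networks"],
]

# Build the directed weighted topic network: edge u -> v gains weight
# each time topic v directly follows topic u in some syllabus.
G = nx.DiGraph()
for seq in syllabi:
    for u, v in zip(seq, seq[1:]):
        if G.has_edge(u, v):
            G[u][v]["weight"] += 1
        else:
            G.add_edge(u, v, weight=1)

# Community detection on the undirected projection (modularity-based).
U = G.to_undirected()
clusters = community.greedy_modularity_communities(U, weight="weight")

# Typical curricular flow: minimum spanning tree over inverse-frequency
# distances, so the most frequent transitions are retained.
for u, v, d in U.edges(data=True):
    d["distance"] = 1.0 / d["weight"]
T = nx.minimum_spanning_tree(U, weight="distance")

print(G.number_of_nodes(), len(clusters), T.number_of_edges())
```

With real data, the root of the tree rooted at "examples of networks" would reproduce the kind of curricular flow the abstract reports.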
Mediating boundaries between knowledge and knowing: ICT and R4D praxis
Research for development (R4D) praxis (theory-informed practical action) can be underpinned by the use of Information and Communication Technologies (ICTs) which, it is claimed, provide opportunities for knowledge working and sharing. Such a framing implicitly or explicitly constructs a boundary around knowledge as reified, or commodified – or at least able to be stabilized for a period of time (first-order knowledge). In contrast, ‘third-generation knowledge’ emphasizes the social nature of learning and knowledge-making; this reframes knowledge as a negotiated social practice, thus constructing a different system boundary. This paper offers critical reflections on the use of a wiki as a data repository and mediating technical platform as part of innovating in R4D praxis. A sustainable social learning process was sought that fostered an emergent community of practice among biophysical and social researchers acting for the first time as R4D co-researchers. Over time, the technologically mediated element of the learning system was judged to have failed. This inquiry asks: how can learning system design cultivate learning opportunities and respond to learning challenges in an online environment to support R4D practice? Confining critical reflection to the online learning experience alone ignores the wider context in which knowledge work took place; therefore the institutional setting is also considered.
Accurate Pulmonary Nodule Detection in Computed Tomography Images Using Deep Convolutional Neural Networks
Early detection of pulmonary cancer is the most promising way to enhance a patient's chance of survival. Accurate pulmonary nodule detection in computed tomography (CT) images is a crucial step in diagnosing pulmonary cancer. In this paper, inspired by the successful use of deep convolutional neural networks (DCNNs) in natural image recognition, we propose a novel pulmonary nodule detection approach based on DCNNs. We first introduce a deconvolutional structure to the Faster Region-based Convolutional Neural Network (Faster R-CNN) for candidate detection on axial slices. Then, a three-dimensional DCNN is presented for the subsequent false positive reduction. Experimental results on the LUng Nodule Analysis 2016 (LUNA16) Challenge demonstrate the superior performance of the proposed approach on nodule detection (average FROC score of 0.891, ranking first place over all submitted results).
Comment: MICCAI 2017 accepted
Oblivion: Mitigating Privacy Leaks by Controlling the Discoverability of Online Information
Search engines are the prevalently used tools to collect information about individuals on the Internet. Search results typically comprise a variety of sources that contain personal information, either intentionally released by the person herself or unintentionally leaked or published by third parties, often with detrimental effects on the individual's privacy. To grant individuals the ability to regain control over their disseminated personal information, the European Court of Justice recently ruled that EU citizens have a right to be forgotten, in the sense that indexing systems must offer them technical means to request removal of links from search results that point to sources violating their data protection rights. As of now, these technical means consist of a web form that requires a user to manually identify all relevant links upfront and insert them into the web form, followed by a manual evaluation by employees of the indexing system to assess whether the request is eligible and lawful.
We propose Oblivion, a universal framework to support the automation of the right to be forgotten in a scalable, provable and privacy-preserving manner. First, Oblivion enables a user to automatically find and tag her disseminated personal information using natural language processing and image recognition techniques and to file a request in a privacy-preserving manner. Second, Oblivion provides indexing systems with an automated and provable eligibility mechanism, asserting that the author of a request is indeed affected by an online resource. The automated eligibility proof ensures censorship resistance, so that only legitimately affected individuals can request the removal of the corresponding links from search results. We have conducted comprehensive evaluations, showing that Oblivion is capable of handling 278 removal requests per second and is hence suitable for large-scale deployment.
Overview of building information modelling in healthcare projects
In this paper, we explore how BIM functionalities, together with novel management concepts and methods, have been utilized in thirteen hospital projects in the United States and the United Kingdom. Secondary data collection and analysis were used as the method. Initial findings indicate that the utilization of BIM enables a holistic view of project delivery and helps to integrate project parties into a collaborative process. The initiative to implement BIM must come from the top down to enable early involvement of all key stakeholders. It seems that resistance from people to adapting to the new way of working and thinking, rather than the immaturity of the technology, is what hinders the utilization of BIM.
In vivo and in vitro models of demyelinating diseases. V. Comparison of the assembly of mouse hepatitis virus, strain JHM, in two murine cell lines.
The developmental sequence of a neurotropic strain (JHM) of mouse hepatitis virus was examined by transmission electron microscopy and immunocytology. The nucleoprotein core of this coronavirus, which contains RNA of positive polarity and is helical in configuration, becomes incorporated into enveloped particles in the same manner as the nucleocapsids of the orthomyxo- and paramyxoviruses. However, JHM virus is assembled intracellularly by budding at the surfaces of smooth membranous vacuoles. A comparison of JHM virus replication in the L2 and 17Cl-1 cell lines revealed that L2 cells undergo more rapid cytopathology and cease virus production much sooner than 17Cl-1 cells. In L2 cells, the accumulation of core material appears to continue after the abrupt cessation of virus assembly. This is evidenced by the massive cytoplasmic accumulation of structures resembling nucleocapsids, which react with hybridoma antibody to the nucleocapsid antigen as demonstrated by the immunoperoxidase procedure. The current findings are consistent with our previously published demonstration, using cells of neural and other derivation, of the fundamental role of the host cell type in regulating the replication and expression of coronaviruses.
Quantifying disease activity in fatty-infiltrated skeletal muscle by IDEAL-CPMG in Duchenne muscular dystrophy
The purpose of this study was to explore the use of iterative decomposition of water and fat with echo asymmetry and least-squares estimation Carr-Purcell-Meiboom-Gill (IDEAL-CPMG) to simultaneously measure skeletal muscle apparent fat fraction and water T2 (T2,w) in patients with Duchenne muscular dystrophy (DMD). In twenty healthy volunteer boys and thirteen subjects with DMD, thigh muscle apparent fat fraction was measured by Dixon and IDEAL-CPMG, with the IDEAL-CPMG also providing T2,w as a measure of muscle inflammatory activity. A subset of subjects with DMD was followed up during a 48-week clinical study. The study was in compliance with the Patient Privacy Act and approved by the Institutional Review Board. Apparent fat fraction in the thigh muscles of subjects with DMD was significantly increased compared to healthy volunteer boys (p < 0.001). There was a strong correlation between Dixon and IDEAL-CPMG apparent fat fraction. Muscle T2,w measured by IDEAL-CPMG was independent of changes in apparent fat fraction. Muscle T2,w was higher in the biceps femoris and vastus lateralis muscles of subjects with DMD (p < 0.05). There was a strong correlation (p < 0.004) between apparent fat fraction in all thigh muscles and six-minute walk distance (6MWD) in subjects with DMD. IDEAL-CPMG allowed independent and simultaneous quantification of skeletal muscle fatty degeneration and disease activity in DMD. IDEAL-CPMG apparent fat fraction and T2,w may be useful as biomarkers in clinical trials of DMD as the technique disentangles two competing biological processes.
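The two quantities measured above can be illustrated with a simplified voxelwise calculation. The signal values below are toy numbers, and the mono-exponential T2 decay fit is a textbook simplification; the actual IDEAL-CPMG method jointly estimates water and fat from asymmetric multi-echo data, which this sketch does not attempt.

```python
import numpy as np

# Dixon-style apparent fat fraction: FF = F / (W + F), computed per voxel.
water = np.array([80.0, 60.0, 20.0])  # hypothetical water signal per voxel
fat = np.array([20.0, 40.0, 80.0])    # hypothetical fat signal per voxel
fat_fraction = fat / (water + fat)

# Water T2 from a CPMG echo train: S(TE) = S0 * exp(-TE / T2),
# recovered with a log-linear least-squares fit over echo times.
te = np.array([10.0, 20.0, 40.0, 80.0])  # echo times in ms (hypothetical)
s0, t2_true = 100.0, 35.0
signal = s0 * np.exp(-te / t2_true)      # noiseless synthetic echo amplitudes
slope, intercept = np.polyfit(te, np.log(signal), 1)
t2_fit = -1.0 / slope                    # recovers t2_true on clean data

print(fat_fraction, t2_fit)
```

Separating the fat fraction (degeneration) from T2,w (inflammatory activity) in this way is what lets the two biological processes be tracked independently.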
Optimizing egalitarian performance in the side-effects model of colocation for data center resource management
In data centers, up to dozens of tasks are colocated on a single physical machine. Machines are used more efficiently, but tasks' performance deteriorates as colocated tasks compete for shared resources. As tasks are heterogeneous, the resulting performance dependencies are complex. In our previous work [18], we proposed a new combinatorial optimization model that uses two parameters of a task, its size and its type, to characterize how a task influences the performance of other tasks allocated to the same machine.
In this paper, we study the egalitarian optimization goal: maximizing the worst-off performance. This problem generalizes the classic makespan minimization on multiple processors (P||Cmax). We prove that polynomially-solvable variants of multiprocessor scheduling become NP-hard and hard to approximate when the number of types is not constant. For a constant number of types, we propose a PTAS, a fast approximation algorithm, and a series of heuristics. We simulate the algorithms on instances derived from a trace of one of Google's clusters. Algorithms aware of jobs' types lead to better performance compared with algorithms solving P||Cmax.
The notion of type enables us to model the degradation of performance caused by using standard combinatorial optimization methods. Types add a layer of additional complexity. However, our results (approximation algorithms and good average-case performance) show that types can be handled efficiently.
Comment: Author's version of a paper published in the Euro-Par 2017 Proceedings; extends the published paper with additional results and proofs
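For reference, the P||Cmax baseline that the abstract compares against can be sketched with the classic LPT (Longest Processing Time) greedy heuristic: assign each task, largest first, to the currently least-loaded machine. This is the type-unaware baseline, not the paper's type-aware algorithms; the instance below is invented.

```python
import heapq

def lpt_makespan(sizes, m):
    """LPT greedy for P||Cmax: place each task, in decreasing order of
    size, on the machine with the smallest current load; return the
    resulting makespan (load of the most loaded machine)."""
    loads = [0.0] * m          # a min-heap of machine loads
    heapq.heapify(loads)
    for size in sorted(sizes, reverse=True):
        lightest = heapq.heappop(loads)
        heapq.heappush(loads, lightest + size)
    return max(loads)

# Hypothetical instance: 7 tasks on 3 machines.
print(lpt_makespan([7, 6, 5, 4, 3, 2, 2], 3))  # LPT yields makespan 11
```

LPT is a 4/3-approximation for P||Cmax; on this instance it returns 11 while the optimum is 10 ({7,3}, {6,4}, {5,2,2}), which illustrates why the paper pursues a PTAS and type-aware heuristics rather than a plain greedy.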
Calculating the random guess scores of multiple-response and matching test items
For achievement tests, the guess score is often used as a baseline for the lowest possible grade in score-to-grade transformations and in setting cut scores. For test item types such as multiple-response, matching, and drag-and-drop, determining the guess score requires more elaborate calculations than the more straightforward calculation for True-False and multiple-choice item formats. For various variants of multiple-response and matching types with respect to dichotomous and polytomous scoring, methods for determining the guess score are presented and illustrated with practical applications. The implications for theory and practice are discussed.
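Two of the guess-score calculations alluded to above can be sketched as follows. The scoring rules here (all-or-nothing credit for multiple-response, per-pair credit for one-to-one matching) are common conventions chosen for illustration, not necessarily the exact variants treated in the paper.

```python
from math import comb

def guess_score_dichotomous(n_options, n_correct, known_count=False):
    """Probability of a fully correct response by random guessing on a
    multiple-response item scored dichotomously (all-or-nothing).
    If known_count is True, the guesser picks exactly n_correct options
    uniformly at random (probability 1 / C(n, k)); otherwise each option
    is marked independently with probability 1/2 (probability 1 / 2^n)."""
    if known_count:
        return 1 / comb(n_options, n_correct)
    return 1 / 2 ** n_options

def guess_score_matching(n_pairs):
    """Expected fraction of correct pairs when n premises are matched to
    n responses by a uniformly random one-to-one assignment: the expected
    number of fixed points of a random permutation is 1, so the expected
    fraction is 1/n."""
    return 1 / n_pairs

# A 5-option multiple-response item with 2 keyed answers:
print(guess_score_dichotomous(5, 2))                   # 1/32
print(guess_score_dichotomous(5, 2, known_count=True)) # 1/10
print(guess_score_matching(4))                         # 0.25
```

Contrast this with a 4-option multiple-choice item, whose guess score is simply 1/4; the multiple-response and matching formats require modeling the guessing strategy itself, which is why the calculations are more elaborate.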