Screening donors for xenotransplantation: The potential for xenozoonoses
Xenotransplantation is a potential solution to the current donor shortage for solid organ transplantation. The transmission of infectious agents from donor organs or bone marrow to the recipient is a well-recognized phenomenon following allotransplantation. Thus the prospect of xenotransplantation raises the issue of xenozoonoses, i.e., the transmission of animal infections to the human host. Anticipating an increasing number of baboon-to-human transplants, 31 adult male baboons (Papio cynocephalus) from a single colony in the United States were screened for the presence of antibody to microbial agents (principally viral) that may pose a significant risk of infection. Antibody to simian cytomegalovirus, simian agent 8 and Epstein-Barr virus was found in 97% of animals tested. Antibody to simian retroviruses and Toxoplasma gondii was found in 30% and 32%, respectively. Discordant results were found when paired samples were examined by two primate laboratories. This was particularly noted when methodologies were based on cross-reaction with human viral antigens. These results highlight the need to develop antibody tests specific to the species used for xenotransplantation. © 1994 Williams & Wilkins
Mediating boundaries between knowledge and knowing: ICT and R4D praxis
Research for development (R4D) praxis (theory-informed practical action) can be underpinned by the use of Information and Communication Technologies (ICTs) which, it is claimed, provide opportunities for knowledge working and sharing. Such a framing implicitly or explicitly constructs a boundary around knowledge as reified or commodified, or at least able to be stabilized for a period of time (first-order knowledge). In contrast, "third-generation knowledge" emphasizes the social nature of learning and knowledge-making; this reframes knowledge as a negotiated social practice, thus constructing a different system boundary. This paper offers critical reflections on the use of a wiki as a data repository and mediating technical platform as part of innovating in R4D praxis. A sustainable social learning process was sought that fostered an emergent community of practice among biophysical and social researchers acting for the first time as R4D co-researchers. Over time the technologically mediated element of the learning system was judged to have failed. This inquiry asks: how can learning system design cultivate learning opportunities and respond to learning challenges in an online environment to support R4D practice? Confining critical reflection to the online learning experience alone ignores the wider context in which knowledge work took place; therefore the institutional setting is also considered.
Oblivion: Mitigating Privacy Leaks by Controlling the Discoverability of Online Information
Search engines are the most widely used tools for collecting information about individuals on the Internet. Search results typically comprise a variety of sources that contain personal information -- either intentionally released by the person herself, or unintentionally leaked or published by third parties, often with detrimental effects on the individual's privacy. To grant individuals the ability to regain control over their disseminated personal information, the European Court of Justice recently ruled that EU citizens have a right to be forgotten, in the sense that indexing systems must offer them technical means to request removal of links from search results that point to sources violating their data protection rights. As of now, these technical means consist of a web form that requires a user to manually identify all relevant links upfront and to insert them into the web form, followed by a manual evaluation by employees of the indexing system to assess whether the request is eligible and lawful.
We propose Oblivion, a universal framework to support the automation of the right to be forgotten in a scalable, provable and privacy-preserving manner. First, Oblivion enables a user to automatically find and tag her disseminated personal information using natural language processing and image recognition techniques, and to file a request in a privacy-preserving manner. Second, Oblivion provides indexing systems with an automated and provable eligibility mechanism, asserting that the author of a request is indeed affected by an online resource. The automated eligibility proof ensures censorship-resistance, so that only legitimately affected individuals can request the removal of corresponding links from search results. We have conducted comprehensive evaluations, showing that Oblivion is capable of handling 278 removal requests per second and is hence suitable for large-scale deployment.
In vivo and in vitro models of demyelinating diseases. V. Comparison of the assembly of mouse hepatitis virus, strain JHM, in two murine cell lines.
The developmental sequence of a neurotropic strain (JHM) of mouse hepatitis virus was examined by transmission electron microscopy and immunocytology. The nucleoprotein core of this coronavirus, which contains RNA of positive polarity and is helical in configuration, becomes incorporated into enveloped particles in the same manner as the nucleocapsids of the orthomyxo- and paramyxoviruses. However, JHM virus is assembled intracellularly by budding at surfaces of smooth membranous vacuoles. A comparison of JHM virus replication in L2 and 17Cl-1 cell lines revealed that L2 cells undergo more rapid cytopathology and cease virus production much sooner than 17Cl-1 cells. In L2 cells the accumulation of core material appears to continue after the abrupt cessation of virus assembly. This is evident from the massive cytoplasmic accumulation of structures resembling nucleocapsids, which react with hybridoma antibody to the nucleocapsid antigen as demonstrated by the immunoperoxidase procedure. The current findings are consistent with our previously published demonstration, using cells of neural and other derivation, of the fundamental role of the host cell type in regulating the replication and expression of coronaviruses.
Quantifying disease activity in fatty-infiltrated skeletal muscle by IDEAL-CPMG in Duchenne muscular dystrophy
The purpose of this study was to explore the use of iterative decomposition of water and fat with echo asymmetry and least-squares estimation Carr-Purcell-Meiboom-Gill (IDEAL-CPMG) to simultaneously measure skeletal muscle apparent fat fraction and water T2 (T2,w) in patients with Duchenne muscular dystrophy (DMD). In twenty healthy volunteer boys and thirteen subjects with DMD, thigh muscle apparent fat fraction was measured by Dixon and IDEAL-CPMG, with the IDEAL-CPMG also providing T2,w as a measure of muscle inflammatory activity. A subset of subjects with DMD was followed up during a 48-week clinical study. The study was in compliance with the Patient Privacy Act and approved by the Institutional Review Board. Apparent fat fraction in the thigh muscles of subjects with DMD was significantly increased compared to healthy volunteer boys (p < 0.001). There was a strong correlation between Dixon and IDEAL-CPMG apparent fat fraction. Muscle T2,w measured by IDEAL-CPMG was independent of changes in apparent fat fraction. Muscle T2,w was higher in the biceps femoris and vastus lateralis muscles of subjects with DMD (p < 0.05). There was a strong correlation (p < 0.004) between apparent fat fraction in all thigh muscles and six-minute walk distance (6MWD) in subjects with DMD. IDEAL-CPMG allowed independent and simultaneous quantification of skeletal muscle fatty degeneration and disease activity in DMD. IDEAL-CPMG apparent fat fraction and T2,w may be useful as biomarkers in clinical trials of DMD as the technique disentangles two competing biological processes.
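At its simplest, the apparent fat fraction underlying both the Dixon and IDEAL-CPMG analyses reduces to the voxel-wise ratio FF = F / (W + F) of the separated fat and water signals. The sketch below shows that basic magnitude-based calculation; the function name and NumPy implementation are illustrative, and the study's actual IDEAL-CPMG fitting (which also yields T2,w) is considerably more elaborate:

```python
import numpy as np

def apparent_fat_fraction(water: np.ndarray, fat: np.ndarray) -> np.ndarray:
    """Voxel-wise apparent fat fraction FF = F / (W + F) from separated
    water and fat images. Voxels with zero total signal are set to 0."""
    denom = water + fat
    with np.errstate(invalid="ignore", divide="ignore"):
        ff = np.where(denom > 0, fat / denom, 0.0)
    return ff
```

A region-of-interest mean of this map over a thigh muscle would give the per-muscle apparent fat fraction compared between groups in the study.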
Optimizing egalitarian performance in the side-effects model of colocation for data center resource management
In data centers, up to dozens of tasks are colocated on a single physical machine. Machines are used more efficiently, but tasks' performance deteriorates, as colocated tasks compete for shared resources. As tasks are heterogeneous, the resulting performance dependencies are complex. In our previous work [18] we proposed a new combinatorial optimization model that uses two parameters of a task - its size and its type - to characterize how a task influences the performance of other tasks allocated to the same machine.
In this paper, we study the egalitarian optimization goal: maximizing the worst-off performance. This problem generalizes the classic makespan minimization on multiple processors (P||Cmax). We prove that polynomially-solvable variants of multiprocessor scheduling are NP-hard and hard to approximate when the number of types is not constant. For a constant number of types, we propose a PTAS, a fast approximation algorithm, and a series of heuristics. We simulate the algorithms on instances derived from a trace of one of Google's clusters. Algorithms aware of jobs' types lead to better performance compared with algorithms solving P||Cmax.
The notion of type enables us to model degradation of performance caused by colocation while still using standard combinatorial optimization methods. Types add a layer of additional complexity. However, our results - approximation algorithms and good average-case performance - show that types can be handled efficiently.
Comment: Author's version of a paper published in Euro-Par 2017 Proceedings; extends the published paper with additional results and proofs.
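The type-oblivious P||Cmax baseline the abstract compares against is classically approximated by longest-processing-time (LPT) list scheduling: sort tasks by decreasing size and repeatedly place the next task on the least-loaded machine. A minimal sketch of that baseline follows; it uses only task sizes and deliberately ignores the paper's type parameter, so it is the point of comparison, not the type-aware algorithms the paper proposes:

```python
import heapq

def lpt_schedule(sizes, n_machines):
    """Greedy LPT list scheduling for P||Cmax: assign each task, in
    decreasing size order, to the currently least-loaded machine.
    Returns (assignment, makespan) where assignment[j] is the machine
    index of task j."""
    loads = [(0.0, i) for i in range(n_machines)]  # (current load, machine)
    heapq.heapify(loads)
    assignment = [None] * len(sizes)
    for j, size in sorted(enumerate(sizes), key=lambda t: -t[1]):
        load, machine = heapq.heappop(loads)
        assignment[j] = machine
        heapq.heappush(loads, (load + size, machine))
    return assignment, max(load for load, _ in loads)
```

For example, `lpt_schedule([4, 3, 3, 2], 2)` balances the two machines at load 6. In the paper's side-effects model, a task's effective performance would additionally depend on the types of its colocated tasks, which this size-only baseline cannot capture.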
Calculating the random guess scores of multiple-response and matching test items
For achievement tests, the guess score is often used as a baseline for the lowest possible grade for score-to-grade transformations and for setting cut scores. For test item types such as multiple-response, matching and drag-and-drop, determining the guess score requires more elaborate calculations than the more straightforward calculation of the guess score for True-False and multiple-choice test item formats. For various variants of multiple-response and matching types with respect to dichotomous and polytomous scoring, methods for determining the guess score are presented and illustrated with practical applications. The implications for theory and practice are discussed.
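For the simplest cases the abstract alludes to, the random guess score follows from elementary counting: 1/k for a k-option multiple-choice item, 1/C(n, r) for all-or-nothing (dichotomous) scoring of a multiple-response item with r keyed options out of n, and an expected one correct pair for a randomly completed one-to-one matching item of any size. A sketch of these baseline calculations (the polytomous variants treated in the paper require more elaborate expressions):

```python
from math import comb

def mc_guess_score(n_options: int) -> float:
    """Expected proportion correct when guessing on a multiple-choice item."""
    return 1 / n_options

def mr_dichotomous_guess_score(n_options: int, n_keyed: int) -> float:
    """Chance of selecting exactly the keyed subset under all-or-nothing
    scoring, assuming the examinee knows how many options to mark."""
    return 1 / comb(n_options, n_keyed)

def matching_expected_correct(n_items: int) -> float:
    """Expected number of correct pairs in a random one-to-one matching:
    each of the n premises is paired correctly with probability 1/n,
    so the expectation is n * (1/n) = 1 regardless of n."""
    return n_items * (1 / n_items)
```

For instance, a five-option multiple-response item with two keyed options has a dichotomous guess score of 1/10, already well below the 1/5 of a comparable multiple-choice item.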
Preface: recent developments in financial modelling and risk management
In the last decade, a wide range of innovative financial instruments has taken the financial markets by storm. In 2015, for instance, the European Commission (EC) introduced the definition of "innovative financial instruments" as instruments that are complementary to grants or subsidies and part of a move towards a smarter "funding mix". Loans, equity and quasi-equity instruments and guarantees are considered a particularly effective way to increase and enhance the impact of EU funding when compared to the traditional grant-based system (EC, 2015); therefore, they represent a way to further promote a more responsible, result-oriented use of European funds by the corporate world.
A Suborbital Payload for Soft X-ray Spectroscopy of Extended Sources
We present a suborbital rocket payload capable of performing soft X-ray spectroscopy on extended sources. The payload can reach resolutions of ~100 (lambda/dlambda) over sources as large as 3.25 degrees in diameter in the 17-107 angstrom bandpass. This permits analysis of the overall energy balance of nearby supernova remnants and the detailed nature of the diffuse soft X-ray background. The main components of the instrument are: wire grid collimators, off-plane grating arrays and gaseous electron multiplier detectors. This payload is adaptable to longer duration orbital rockets given its comparatively simple pointing and telemetry requirements and an abundance of potential science targets.
Comment: Accepted to Experimental Astronomy, 12 pages plus 1 table and 17 figures.
Validation of chronic obstructive pulmonary disease recording in the Clinical Practice Research Datalink (CPRD-GOLD)
Objectives: The optimal method of identifying people with chronic obstructive pulmonary disease (COPD) from electronic primary care records is not known. We assessed the accuracy of different approaches using the Clinical Practice Research Datalink, a UK electronic health record database. Setting: 951 participants registered with a CPRD practice in the UK between 1 January 2004 and 31 December 2012. Individuals were selected for ≥1 of 8 algorithms to identify people with COPD. General practitioners were sent a brief questionnaire, and additional evidence to support a COPD diagnosis was requested. All information received was reviewed independently by two respiratory physicians, whose opinion was taken as the gold standard. Primary outcome measure: The primary measure of accuracy was the positive predictive value (PPV), the proportion of people identified by each algorithm for whom COPD was confirmed. Results: 951 questionnaires were sent and 738 (78%) returned. After quality control, 696 (73.2%) patients were included in the final analysis. All four algorithms including a specific COPD diagnostic code performed well. Using a diagnostic code alone, the PPV was 86.5% (77.5-92.3%), while requiring a diagnosis plus spirometry plus specific medication gave a slightly higher PPV of 89.4% (80.7-94.5%) but reduced case numbers by 10%. Conclusions: Patients with COPD can be accurately identified from UK primary care records using specific diagnostic codes. Requiring spirometry or COPD medications only marginally improved accuracy. The high accuracy applies since the introduction of an incentivised disease register for COPD as part of the Quality and Outcomes Framework in 2004.
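The headline accuracy measure here is the positive predictive value, TP / (TP + FP): of the patients an algorithm flags, the fraction confirmed by the gold-standard physician review, reported with a confidence interval. A minimal sketch of that computation follows; the Wilson score interval shown is one common choice for proportions, and the paper does not state which interval method it used:

```python
import math

def ppv(true_pos: int, false_pos: int) -> float:
    """Positive predictive value: confirmed cases among all flagged cases."""
    return true_pos / (true_pos + false_pos)

def wilson_ci(successes: int, n: int, z: float = 1.96):
    """Wilson score confidence interval for a proportion (95% by default),
    which behaves better than the normal approximation near 0 or 1."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half
```

For example, 86 confirmed cases among 100 flagged gives a PPV of 0.86, with a Wilson 95% interval of roughly 0.78 to 0.91.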