CADUCEUS, SCIPIO, ALCADIA: Cell therapy trials using cardiac-derived cells for patients with post-myocardial-infarction LV dysfunction, still evolving.
The early results of the CArdiosphere-Derived aUtologous stem CElls to reverse ventricUlar dySfunction (CADUCEUS) study were recently published in the Lancet [1]. This was a phase 1, prospective, randomised study performed at two centres. It was designed to test the hypothesis that intracoronary infusion of autologous cardiac-derived cells following myocardial infarction can reduce infarct size and increase the amount of viable myocardium. Eligible patients were randomised in a 2:1 ratio to receive cardiosphere-derived cells (CDCs) or standard care; in all, 17 patients were randomised to cell therapy and 8 to standard care. The cell therapy consisted of an infusion of 25 million cells into the infarct-related artery, 1.5–3 months after successful primary angioplasty, in patients who had developed LV dysfunction (EF less than 37 per cent). The cells were derived from RV endomyocardial biopsies performed within the previous 37 days, and the cell dose was determined from previous experimental studies of the maximum number of cells that can be injected without inducing infarction. The study was not blinded, because of ethical concerns about performing right ventricular biopsy on the controls. Exclusion criteria included evidence of right ventricular infarction and inability to undergo MRI examination because of claustrophobia or prior device implantation. No deaths, myocardial infarctions or serious arrhythmias were reported in either group during follow-up (6–12 months). Serious adverse events were observed in 24 per cent of the intervention group versus 12 per cent of the controls (difference not statistically significant).
Privacy Preservation by Disassociation
In this work, we focus on protection against identity disclosure in the
publication of sparse multidimensional data. Existing multidimensional
anonymization techniques (a) protect the privacy of users either by altering the
set of quasi-identifiers of the original data (e.g., by generalization or
suppression) or by adding noise (e.g., using differential privacy) and/or (b)
assume a clear distinction between sensitive and non-sensitive information and
sever the possible linkage. In many real-world applications the above
techniques are not applicable. For instance, consider web search query logs.
Suppressing or generalizing anonymization methods would remove the most
valuable information in the dataset: the original query terms. Additionally,
web search query logs contain millions of query terms which cannot be
categorized as sensitive or non-sensitive, since a term may be sensitive for one
user and non-sensitive for another. Motivated by this observation, we propose
an anonymization technique termed disassociation that preserves the original
terms but hides the fact that two or more different terms appear in the same
record. We protect the users' privacy by disassociating record terms that
participate in identifying combinations. This way the adversary cannot
associate with high probability a record with a rare combination of terms. To
the best of our knowledge, our proposal is the first to employ such a technique
to provide protection against identity disclosure. We propose an anonymization
algorithm based on our approach and evaluate its performance on real and
synthetic datasets, comparing it against other state-of-the-art methods based
on generalization and differential privacy.
Comment: VLDB201
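The disassociation idea above can be illustrated with a minimal sketch (hypothetical helper names, not the paper's implementation; the published algorithm also clusters records horizontally and publishes residual rare terms in a separate term chunk, which this sketch omits): the term domain is greedily split into chunks so that, within each published chunk, every combination of up to m terms that occurs at all occurs in at least k records. Original terms are kept verbatim; only their co-occurrence across chunks is hidden.

```python
from itertools import combinations

def is_km_anonymous(records, terms, k, m):
    """Check that every combination of up to m terms (from `terms`)
    appearing in some record appears in at least k records."""
    for size in range(1, m + 1):
        for combo in combinations(sorted(terms), size):
            support = sum(1 for r in records if set(combo) <= r)
            if 0 < support < k:
                return False
    return True

def vertical_partition(records, k, m):
    """Greedily group terms into chunks so that each chunk (the records
    projected on its terms) satisfies the k^m guarantee. A terminal chunk
    of leftover rare terms may still violate it; the full method would
    publish such terms detached from their records."""
    domain = sorted(set().union(*records))
    chunks, current = [], []
    for term in domain:
        candidate = current + [term]
        projected = [r & set(candidate) for r in records]
        if is_km_anonymous(projected, candidate, k, m):
            current = candidate
        else:
            if current:
                chunks.append(current)
            current = [term]
    if current:
        chunks.append(current)
    return chunks
```

For example, with k = 2 and m = 2, three records {flu, aspirin}, {flu, aspirin}, {flu, insulin} yield the chunks [aspirin, flu] and [insulin]: the rare pairing of insulin with flu is disassociated while every query term survives unaltered.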
Local and global recoding methods for anonymizing set-valued data
In this paper, we study the problem of protecting privacy in the publication of set-valued data. Consider a collection of supermarket transactions that contains detailed information about items bought together by individuals. Even after removing all personal characteristics of the buyer, which can serve as links to their identity, the publication of such data is still subject to privacy attacks from adversaries who have partial knowledge about the set. Unlike most previous works, we do not distinguish data as sensitive and non-sensitive, but consider both as potential quasi-identifiers and potential sensitive data, depending on the knowledge of the adversary. We define a new version of the k-anonymity guarantee, k^m-anonymity, to limit the effects of data dimensionality, and we propose efficient algorithms to transform the database. Our anonymization model relies on generalization instead of suppression, which is the most common practice in related works on such data. We develop an algorithm that finds the optimal solution, however at a high cost that makes it inapplicable to large, realistic problems. We therefore propose a greedy heuristic, which performs generalizations in an Apriori-like, level-wise fashion. The heuristic scales much better and in most cases finds a solution close to the optimal. Finally, we investigate the application of techniques that partition the database and perform anonymization locally, aiming at reduced memory consumption and further scalability. A thorough experimental evaluation with real datasets shows that a vertical partitioning approach achieves excellent results in practice. © 2010 Springer-Verlag.
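The level-wise generalization idea can be sketched as follows (a minimal illustration with hypothetical names, using full-subtree generalization over a one-level hierarchy; the paper's optimal and heuristic algorithms are considerably more refined): whenever some combination of up to m items has support below k, the offending item and its siblings are replaced by their parent in the generalization hierarchy, and the check is repeated.

```python
from itertools import combinations

def violates_km(records, k, m):
    """Return one combination of up to m items whose support is positive
    but below k, or None if the dataset is k^m-anonymous."""
    domain = sorted(set().union(*records))
    for size in range(1, m + 1):
        for combo in combinations(domain, size):
            support = sum(1 for r in records if set(combo) <= r)
            if 0 < support < k:
                return combo
    return None

def generalize(records, hierarchy, k, m):
    """Full-subtree generalization: while some item combination violates
    the k^m guarantee, replace a violating item and all its siblings by
    their common parent. Assumes every violating combination contains at
    least one item that appears as a key in `hierarchy`."""
    records = [set(r) for r in records]
    while (combo := violates_km(records, k, m)) is not None:
        item = next(i for i in combo if i in hierarchy)
        parent = hierarchy[item]
        siblings = {c for c, p in hierarchy.items() if p == parent}
        records = [{parent if x in siblings else x for x in r} for r in records]
    return records
```

With k = 2, m = 2 and transactions {a1, b1}, {a2, b1}, {a1, b2} under the hierarchy a1, a2 → A and b1, b2 → B, the rare items a2 and b2 trigger two generalization steps and all records become {A, B}, which trivially satisfies the guarantee.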
Local Suppression and Splitting Techniques for Privacy Preserving Publication of Trajectories
Functional impairment of human resident cardiac stem cells by the cardiotoxic antineoplastic agent trastuzumab
Trastuzumab (TZM), a monoclonal antibody against the ERBB2 protein, increases survival in ERBB2-positive breast cancer patients. Its clinical use, however, is limited by cardiotoxicity. We sought to evaluate whether TZM cardiotoxicity involves inhibition of human adult cardiac-derived stem cells, in addition to previously reported direct adverse effects on cardiomyocytes. To test this idea, we exposed human cardiosphere-derived cells (hCDCs), a natural mixture of cardiac stem cells and supporting cells that has been shown to exert potent regenerative effects, to TZM and tested the effects in vitro and in vivo. We found that ERBB2 mRNA and protein are expressed in hCDCs at levels comparable to those in human myocardium. Although clinically relevant concentrations of TZM had no effect on proliferation, apoptosis, or size of the c-kit-positive hCDC subpopulation, in vitro assays demonstrated diminished potential for cardiogenic differentiation and impaired ability to form microvascular networks in TZM-treated cells. The functional benefit of hCDCs injected into the border zone of acutely infarcted mouse hearts was abrogated by TZM: infarcted animals treated with TZM + hCDCs had a lower ejection fraction, thinner infarct scar, and reduced capillary density in the infarct border zone compared with animals that received hCDCs alone (n = 12 per group). Collectively, these results indicate that TZM inhibits the cardiomyogenic and angiogenic capacities of hCDCs in vitro and abrogates the morphological and functional benefits of hCDC transplantation in vivo. Thus, TZM impairs the function of human resident cardiac stem cells, potentially contributing to TZM cardiotoxicity.
Modeling and simulation of speed selection on left ventricular assist devices
The control problem for LVADs is to set pump speed such that cardiac output and perfusion pressure are within acceptable physiological ranges. However, current LVAD technology cannot provide a closed-loop control scheme that makes adjustments based on the patient's level of activity. In this context, the SensorART Speed Selection Module (SSM) integrates various hardware and software components in order to improve the quality of patients' treatment and the workflow of the specialists. It enables specialists to better understand patient-device interactions and improve their knowledge. The SensorART SSM includes two tools of the Specialist Decision Support System (SDSS), namely the Suction Detection Tool and the Speed Selection Tool. A VAD Heart Simulation Platform (VHSP) is also part of the system. The VHSP enables specialists to simulate the behavior of a patient's circulatory system, using different LVAD types and functional parameters. The SDSS is a web-based application that offers specialists a plethora of tools for monitoring, designing the best therapy plan, analyzing data, extracting new knowledge and making informed decisions. In this paper, two of these tools, the Suction Detection Tool and the Speed Selection Tool, are presented. The former allows the analysis of simulation sessions from the VHSP and the identification of issues related to the suction phenomenon with high accuracy (93%). The latter provides specialists with powerful support in their attempt to effectively plan the treatment strategy, allowing them to draw conclusions about the most appropriate pump speed settings. Preliminary assessments connecting the Suction Detection Tool to the VHSP are also presented.
Parallel In-Memory Evaluation of Spatial Joins
The spatial join is a popular operation in spatial database systems and its
evaluation is a well-studied problem. As main memories become bigger and faster
and commodity hardware supports parallel processing, there is a need to revamp
classic join algorithms which have been designed for I/O-bound processing. In
view of this, we study the in-memory and parallel evaluation of spatial joins,
by re-designing a classic partitioning-based algorithm to consider alternative
approaches for space partitioning. Our study shows that, compared to a
straightforward implementation of the algorithm, our tuning can improve
performance significantly. We also show how to select appropriate partitioning
parameters based on data statistics, in order to tune the algorithm for the
given join inputs. Our parallel implementation scales gracefully with the
number of threads reducing the cost of the join to at most one second even for
join inputs with tens of millions of rectangles.
Comment: Extended version of the SIGSPATIAL'19 paper under the same title
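The partitioning-based evaluation described above can be sketched as follows (hypothetical function names; the paper's tuned algorithm differs in its partitioning and per-partition details): both inputs are replicated onto the tiles of a uniform grid that each rectangle overlaps, every tile is then joined independently — the natural unit of parallel work — and duplicate results from replicated rectangles are suppressed with the standard reference-point test.

```python
from collections import defaultdict

def grid_partition(rects, nx, ny, extent):
    """Assign each rectangle (id, x1, y1, x2, y2) to every tile of a
    uniform nx-by-ny grid that it overlaps."""
    X1, Y1, X2, Y2 = extent
    tw, th = (X2 - X1) / nx, (Y2 - Y1) / ny
    tiles = defaultdict(list)
    for r in rects:
        _, x1, y1, x2, y2 = r
        for i in range(int((x1 - X1) // tw), min(int((x2 - X1) // tw), nx - 1) + 1):
            for j in range(int((y1 - Y1) // th), min(int((y2 - Y1) // th), ny - 1) + 1):
                tiles[(i, j)].append(r)
    return tiles, tw, th

def join_tile(key, ra, rb, extent, tw, th, nx, ny):
    """Join one tile; the reference-point test reports a pair only in the
    tile containing the bottom-left corner of the intersection, so
    replicated rectangles never yield duplicate results."""
    X1, Y1 = extent[0], extent[1]
    i, j = key
    out = []
    for a in ra:
        for b in rb:
            ix1, iy1 = max(a[1], b[1]), max(a[2], b[2])
            if ix1 <= min(a[3], b[3]) and iy1 <= min(a[4], b[4]):
                ti = min(int((ix1 - X1) // tw), nx - 1)
                tj = min(int((iy1 - Y1) // th), ny - 1)
                if (ti, tj) == (i, j):
                    out.append((a[0], b[0]))
    return out

def spatial_join(R, S, nx=4, ny=4):
    """Partition both inputs on a shared grid, then join tile by tile.
    Tiles are independent, so the loop below is trivially parallelizable."""
    xs = [v for r in R + S for v in (r[1], r[3])]
    ys = [v for r in R + S for v in (r[2], r[4])]
    extent = (min(xs), min(ys), max(xs), max(ys))
    tilesR, tw, th = grid_partition(R, nx, ny, extent)
    tilesS, _, _ = grid_partition(S, nx, ny, extent)
    result = []
    for key in tilesR.keys() & tilesS.keys():  # parallelizable loop
        result.extend(join_tile(key, tilesR[key], tilesS[key],
                                extent, tw, th, nx, ny))
    return result
```

Choosing nx and ny is exactly the tuning question the paper studies: too few tiles limits parallelism, too many inflates replication, so the grid granularity should be derived from input statistics rather than fixed a priori.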
