91 research outputs found

    Human-based approaches to pharmacology and cardiology: an interdisciplinary and intersectorial workshop.

    Both biomedical research and clinical practice rely on complex datasets for the physiological and genetic characterization of human hearts in health and disease. Given the complexity and variety of approaches and recordings, there is now growing recognition of the need to embed computational methods in cardiovascular medicine and science for analysis, integration and prediction. This paper describes a Workshop on Computational Cardiovascular Science that created an international, interdisciplinary and inter-sectorial forum to define the next steps for a human-based approach to disease supported by computational methodologies. The main ideas highlighted were: (i) a shift towards human-based methodologies, spurred by advances in new in silico, in vivo, in vitro, and ex vivo techniques and the increasing acknowledgement of the limitations of animal models; (ii) computational approaches complement, expand, bridge, and integrate in vitro, in vivo, and ex vivo experimental and clinical data and methods, and as such are an integral part of human-based methodologies in pharmacology and medicine; (iii) the effective implementation of multi- and interdisciplinary approaches, teams, and training that combine computational methods with experimental and clinical approaches across academia, industry, and healthcare settings is a priority; (iv) the human-based cross-disciplinary approach requires experts in specific methodologies and domains who can also communicate and collaborate across disciplines and sectors; and (v) this new translational domain for human-based cardiology and pharmacology requires new partnerships supported financially and institutionally across sectors. Institutional, organizational, and social barriers must be identified, understood and overcome in each specific setting.

    Functional Interactions between Retinoblastoma and c-MYC in a Mouse Model of Hepatocellular Carcinoma

    Inactivation of the RB tumor suppressor and activation of the MYC family of oncogenes are frequent events in a large spectrum of human cancers. Loss of RB function and MYC activation are thought to control both overlapping and distinct cellular processes during cell cycle progression. However, how these two major cancer genes functionally interact during tumorigenesis is still unclear. Here, we sought to test whether loss of RB function would affect cancer development in a mouse model of c-MYC-induced hepatocellular carcinoma (HCC), a deadly cancer type in which RB is frequently inactivated and c-MYC often activated. We found that RB inactivation has minimal effects on the cell cycle, cell death, and differentiation features of liver tumors driven by increased levels of c-MYC. However, combined loss of RB and activation of c-MYC led to an increase in polyploidy in mature hepatocytes before the development of tumors. There was a trend toward decreased survival in double-mutant animals compared to mice developing c-MYC-induced tumors. Thus, loss of RB function does not provide a proliferative advantage to c-MYC-expressing HCC cells, but the RB and c-MYC pathways may cooperate to control the polyploidy of mature hepatocytes.

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that covers a variety of research fields, such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium, consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) article(s). The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performances. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for the downloading of annotation data and the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new powerful techniques for title and title/abstract-based search engines for relevant articles in biomedical research.
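    The TF-IDF baseline named above can be illustrated with a minimal sketch. The tokenization, weighting scheme, and toy documents below are illustrative assumptions, not the evaluation code used for the benchmark:

    ```python
    import math
    from collections import Counter

    def tfidf_vectors(docs):
        """Compute sparse TF-IDF vectors (dicts) for a list of tokenized documents."""
        n = len(docs)
        # document frequency: in how many documents each term appears
        df = Counter(term for doc in docs for term in set(doc))
        vectors = []
        for doc in docs:
            tf = Counter(doc)
            vec = {t: (c / len(doc)) * math.log(n / df[t]) for t, c in tf.items()}
            vectors.append(vec)
        return vectors

    def cosine(u, v):
        """Cosine similarity between two sparse vectors represented as dicts."""
        dot = sum(w * v.get(t, 0.0) for t, w in u.items())
        nu = math.sqrt(sum(w * w for w in u.values()))
        nv = math.sqrt(sum(w * w for w in v.values()))
        return dot / (nu * nv) if nu and nv else 0.0

    # toy "abstracts": a seed document and two candidates
    docs = [["ecg", "mri", "noise"], ["ecg", "noise", "filter"], ["liver", "tumor"]]
    vecs = tfidf_vectors(docs)
    ```

    Ranking candidates by `cosine(vecs[0], vecs[i])` is the essence of a TF-IDF recommender; production systems add stemming, stop-word removal and smoothed IDF on top of this.
    
    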

    Real-Time Processing of Electrophysiological Signals acquired in a Magnetic Resonance Imaging Environment

    Electrocardiogram (ECG) acquisition is required during Magnetic Resonance Imaging (MRI), both for patient monitoring and for synchronizing MRI acquisitions with cardiac activity. The MRI environment, through its three characteristic physical components, strongly disturbs ECG signals; the magnetic field gradients in particular greatly complicate ECG analysis. Dedicated signal-processing tools are therefore required, as existing QRS detection and denoising methods do not handle these signals satisfactorily. A database of ECG recordings acquired in MRI was built, enabling the development of new processing techniques and their evaluation against two criteria: the quality of cardiac beat detection and a signal-to-noise ratio estimated specifically for these recordings. A QRS detector capable of processing these heavily corrupted signals is proposed, based on the detection and characterization of singularities from wavelet modulus maxima lines. This detector provides information on the cardiac rhythm that is essential for the development of new statistical approaches. A denoising method based on independent component analysis, relying solely on the ECG signals, is presented. Two Bayesian denoising methods, unifying a model of the ECG and a model of the gradient artifacts in a single state-space formulation, are proposed. Finally, Bayesian filtering is also applied to predicting the cardiac rhythm, in order to improve the synchronization strategy.
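    The wavelet-modulus-maxima idea behind the QRS detector can be illustrated with a toy single-scale sketch: a Haar-like derivative wavelet followed by a thresholded local-maxima search. The scale, threshold, and synthetic spike signal are assumptions for illustration, not the detector developed in the thesis:

    ```python
    def dwt_modulus(signal, scale):
        """Wavelet response at one dyadic scale using a derivative-of-box
        (Haar-like) wavelet: essentially a smoothed local slope estimate."""
        half = scale // 2
        out = [0.0] * len(signal)
        for i in range(half, len(signal) - half):
            right = sum(signal[i + 1:i + 1 + half])
            left = sum(signal[i - half:i])
            out[i] = (right - left) / half
        return out

    def modulus_maxima(w, threshold):
        """Indices where |w| has a local maximum above the threshold:
        candidate singularities, e.g. QRS complexes in an ECG."""
        peaks = []
        for i in range(1, len(w) - 1):
            a = abs(w[i])
            if a >= threshold and a >= abs(w[i - 1]) and a > abs(w[i + 1]):
                peaks.append(i)
        return peaks

    # synthetic "ECG": two isolated spikes standing in for QRS complexes
    sig = [0.0] * 200
    sig[50] = 1.0
    sig[150] = 1.0
    peaks = modulus_maxima(dwt_modulus(sig, 4), 0.3)
    ```

    A real detector tracks maxima lines across several scales to separate true singularities from gradient-induced noise; this single-scale version only conveys the principle.
    
    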

    Analysis of Electrocardiogram Signals: From Models to Machine Learning

    The focus of my research is the use of modeling and signal-processing methods to extract novel clinical parameters from large databases of physiological signals, and the use of machine-learning techniques to provide predictive, actionable information to clinicians. I have been applying these techniques to electrocardiographic (ECG) and other physiological data for more than 10 years. My research project consists in developing novel machine-learning techniques, combined with and/or inspired by modeling, in order to create interpretable automatic decision-making systems in healthcare, with a particular focus on cardiovascular health data. Clinical assessment of cardiovascular health requires the acquisition of multimodal data, two of the most important modalities being Magnetic Resonance Imaging (MRI) and the ECG. I therefore aim to develop tools for the joint analysis of electrophysiological and imaging data. These techniques and data representations will offer solutions to concrete clinical problems such as better risk stratification for patients, whether for predicting the outcome of catheter ablation for atrial fibrillation or for stratifying the risk of ventricular tachycardia.

    Real-Time Processing of Electrophysiological Signals Acquired in a Magnetic Resonance Imaging Environment

    Electrocardiogram (ECG) acquisition is required during Magnetic Resonance Imaging (MRI), both for patient monitoring and for synchronizing MRI acquisitions with cardiac activity. The MRI environment, through its three characteristic physical components, strongly disturbs ECG signals; the magnetic field gradients in particular greatly complicate ECG analysis. Dedicated signal-processing tools are therefore required, as existing QRS detection and denoising methods do not handle these signals satisfactorily. A database of ECG recordings acquired in MRI was built, enabling the development of new processing techniques and their evaluation against two criteria: the quality of cardiac beat detection and a signal-to-noise ratio estimated specifically for these recordings. A QRS detector capable of processing these heavily corrupted signals is proposed, based on the detection and characterization of singularities from wavelet modulus maxima lines. This detector provides information on the cardiac rhythm that is essential for the development of new statistical approaches. A denoising method based on independent component analysis, relying solely on the ECG signals, is presented. Two Bayesian denoising methods, unifying a model of the ECG and a model of the gradient artifacts in a single state-space formulation, are proposed. Finally, Bayesian filtering is also applied to predicting the cardiac rhythm, in order to improve the synchronization strategy.
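    The Bayesian state-space denoising idea can be conveyed with the simplest possible instance: a scalar random-walk Kalman filter. This is a generic sketch, not the thesis's unified ECG/gradient-artifact model, and the process and measurement noise values are arbitrary assumptions:

    ```python
    def kalman_smooth(observations, q=1e-4, r=0.1):
        """Scalar Kalman filter for the random-walk model
        x_k = x_{k-1} + w_k (variance q), z_k = x_k + v_k (variance r)."""
        x = observations[0]   # initial state estimate
        p = 1.0               # initial state variance
        estimates = [x]
        for z in observations[1:]:
            # predict: the random walk only inflates the variance
            p = p + q
            # update: blend prediction and measurement via the Kalman gain
            k = p / (p + r)
            x = x + k * (z - x)
            p = (1 - k) * p
            estimates.append(x)
        return estimates
    ```

    In the state-space view, denoising and rhythm prediction are the same machinery: the filter's predicted state is the forecast, and its updated state is the denoised signal. The thesis's models replace the scalar random walk with coupled ECG and gradient-artifact dynamics.
    
    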

    Adaptive RR Prediction for Cardiac MRI

    Cardiac Magnetic Resonance Imaging (MRI) is very challenging due to perpetual heart movement. This movement is pseudo-periodic and raises several issues for image acquisition: building a single image requires several shots, taken at a specific timing within the cardiac cycle. In current practice, the heart rate is estimated before imaging and assumed not to evolve during acquisition. Additionally, in order to remove motion artifacts, patients are asked to perform breath-holds. Unfortunately, during a breath-hold the heart rate changes significantly. To address this problem in the framework of clinical applications, we propose a simple method to predict the upcoming RR interval, in order to compute adapted MR parameters. The RR interval is the time separating two consecutive R waves and corresponds to one cardiac cycle. Due to its simplicity, this method is clinically applicable. The prediction is performed by modelling the heart rate variation as a linear combination of different sensors (here, a respiratory sensor and the amplitude of the R wave). We evaluated this method on five healthy subjects in a clinical setup.
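    Predicting the RR interval as a linear combination of sensor signals can be sketched with an ordinary least-squares fit on a single hypothetical respiratory sensor. The paper combines several sensors and adapts the model over time; the synthetic data and single-sensor model below are illustrative assumptions:

    ```python
    def fit_rr_predictor(sensor, rr):
        """Least-squares fit of rr ≈ a * sensor + b using the closed-form
        slope/intercept formulas for simple linear regression."""
        n = len(sensor)
        mx = sum(sensor) / n
        my = sum(rr) / n
        sxx = sum((x - mx) ** 2 for x in sensor)
        sxy = sum((x - mx) * (y - my) for x, y in zip(sensor, rr))
        a = sxy / sxx
        b = my - a * mx
        return a, b

    def predict_rr(model, sensor_value):
        """Predict the upcoming RR interval (ms) from the current sensor value."""
        a, b = model
        return a * sensor_value + b

    # synthetic example: RR interval lengthening linearly with a breathing signal
    sensor = [0.0, 1.0, 2.0, 3.0]
    rr = [800.0, 820.0, 840.0, 860.0]
    model = fit_rr_predictor(sensor, rr)
    ```

    Extending this to two sensors (respiration and R-wave amplitude) means solving the corresponding two-variable normal equations; refitting on a sliding window makes the predictor adaptive, in the spirit of the method described above.
    
    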