
    Grid Databases for Shared Image Analysis in the MammoGrid Project

    The MammoGrid project aims to prove that Grid infrastructures can be used for collaborative clinical analysis of database-resident but geographically distributed medical images. This requires: a) the provision of a clinician-facing front-end workstation and b) the ability to service real-world clinician queries across a distributed and federated database. The MammoGrid project will prove the viability of the Grid by harnessing its power to enable radiologists from geographically dispersed hospitals to share standardized mammograms, to compare diagnoses (with and without computer-aided detection of tumours) and to perform sophisticated epidemiological studies across national boundaries. This paper outlines the approach taken in MammoGrid to seamlessly connect radiologist workstations across a Grid using an "information infrastructure" and a DICOM-compliant object model residing in multiple distributed data stores in Italy and the UK. Comment: 10 pages, 5 figures.
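    As a rough illustration of the federated-query idea described above, the sketch below fans the same clinician query out to several data stores and merges the answers. The node URLs and the query_node helper are placeholders invented for this example; the actual MammoGrid middleware and its interfaces are not shown here.

        import concurrent.futures

        # Hypothetical endpoints standing in for the distributed data stores
        # (e.g. the Italian and UK sites); these are illustrative, not real hosts.
        NODES = ["https://site-it.example.org/query", "https://site-uk.example.org/query"]

        def query_node(node_url, criteria):
            """Placeholder for a per-site query against a DICOM-compliant store.

            A real implementation would issue a DICOM C-FIND or a service call here;
            this stub just returns an empty result set so the sketch is runnable.
            """
            return []

        def fan_out_query(criteria):
            """Send the same clinician query to every site and merge the answers."""
            results = []
            with concurrent.futures.ThreadPoolExecutor() as pool:
                futures = [pool.submit(query_node, node, criteria) for node in NODES]
                for future in concurrent.futures.as_completed(futures):
                    results.extend(future.result())
            return results

        if __name__ == "__main__":
            # Example: find standardized mammograms for a given patient age range.
            print(fan_out_query({"Modality": "MG", "PatientAge": "050Y-060Y"}))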

    PadChest: A large chest x-ray image dataset with multi-label annotated reports

    We present a large-scale, high-resolution labeled chest x-ray dataset for the automated exploration of medical images along with their associated reports. This dataset includes more than 160,000 images obtained from 67,000 patients that were interpreted and reported by radiologists at San Juan Hospital (Spain) from 2009 to 2017, covering six different position views and additional information on image acquisition and patient demographics. The reports were labeled with 174 different radiographic findings, 19 differential diagnoses and 104 anatomic locations organized as a hierarchical taxonomy and mapped onto standard Unified Medical Language System (UMLS) terminology. Of these reports, 27% were manually annotated by trained physicians and the remaining set was labeled using a supervised method based on a recurrent neural network with attention mechanisms. The generated labels were then validated on an independent test set, achieving a 0.93 Micro-F1 score. To the best of our knowledge, this is one of the largest public chest x-ray databases suitable for training supervised models on radiographs, and the first to contain radiographic reports in Spanish. The PadChest dataset can be downloaded from http://bimcv.cipf.es/bimcv-projects/padchest/
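    A minimal sketch of how the multi-label annotations might be explored once the dataset is downloaded, assuming a CSV file in which each row stores its findings as a stringified Python list; the file and column names used here are assumptions to be checked against the actual PadChest release.

        import ast
        from collections import Counter

        import pandas as pd

        # Hypothetical file/column names; check the downloaded PadChest CSV for
        # the actual ones before using this sketch.
        CSV_PATH = "padchest_labels.csv"

        df = pd.read_csv(CSV_PATH)

        # Assume each row stores its findings as a stringified list,
        # e.g. "['cardiomegaly', 'pleural effusion']"; parse it back into a list.
        df["label_list"] = df["Labels"].apply(
            lambda s: ast.literal_eval(s) if isinstance(s, str) else []
        )

        # Count how often each radiographic finding appears across the reports.
        finding_counts = Counter(
            label.strip() for labels in df["label_list"] for label in labels
        )
        print(finding_counts.most_common(20))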

    Shanoir: Software as a Service Environment to Manage Population Imaging Research Repositories

    Some of the major concerns of researchers and clinicians involved in population imaging experiments are, on the one hand, to manage the huge quantity and diversity of produced data and, on the other hand, to be able to compare their experiments and the programs they develop with those of their peers. In this context, we introduce Shanoir, a “Software as a Service” (SaaS) environment that offers cloud services for managing the information related to population imaging data production in the context of clinical neurosciences. We show how the produced images are accessible through the Shanoir Data Management System, and we describe some of the data repositories that are hosted and managed by the Shanoir environment in different contexts.
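    As a hedged sketch of what programmatic access to such a SaaS repository could look like, the snippet below lists the datasets of one study over a REST-style interface. The base URL, endpoint path, token handling and response fields are illustrative assumptions, not the documented Shanoir API.

        import json
        import urllib.request

        # Illustrative base URL and endpoint; a real deployment and its REST
        # paths may differ, so treat these values as placeholders.
        BASE_URL = "https://shanoir.example.org/api"
        TOKEN = "..."  # a bearer token obtained from the service

        def list_datasets(study_id):
            """Fetch the dataset listing for one study from the hosted repository."""
            request = urllib.request.Request(
                f"{BASE_URL}/studies/{study_id}/datasets",
                headers={"Authorization": f"Bearer {TOKEN}"},
            )
            with urllib.request.urlopen(request) as response:
                return json.loads(response.read())

        if __name__ == "__main__":
            for dataset in list_datasets(study_id=42):
                print(dataset.get("name"), dataset.get("modality"))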

    Analysis of radiation dose using DICOM metadata

    Data is the world's most valuable resource [1], and it is possible to find data everywhere. In medical imaging, data covers not only gigapixel images but also metadata and quantitative measurements [2]. DICOM (Digital Imaging and Communications in Medicine) is a clear source of medical data, since it is the current standard for storing and transmitting medical images [2] and related information [3]; this means it contains raw image data and all the metadata related to the procedures of image acquisition and curation [2]. Some of the most relevant information found in DICOM files concerns radiation dose parameters. At present, neither the European Directives nor the Spanish Regulations set limits on the radiation dose for patients undergoing diagnostic or treatment procedures. There is proof that ionizing radiation has direct implications for human health [4], which is why measures need to be taken as soon as possible. These actions start with being able to quantify the radiation received by a patient across studies over time, which can be done by means of a table of DICOM metadata. The main focus of this project is to analyze the metadata generated by Computed Tomography (CT) scans in order to check the quality of the data and determine whether it is possible to estimate the dosimetric quantity that takes into account the biological sensitivity of the irradiated tissue and reflects the risk of a non-uniform whole-body exposure: the Effective Dose [5]. Finally, with this evaluation, it is possible to define the future steps for the development of a digital tool to analyze radiation-related data and control the risk of ionizing radiation for all types of medical examinations.
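    To make the dose estimation concrete, the sketch below applies the widely used approximation Effective Dose ≈ DLP × k, where DLP is the dose-length product and k is a body-region conversion coefficient. The use of pydicom, the assumption that CTDIvol is present in the image header, and the sample k values are illustrative choices, not the project's actual implementation.

        import pydicom

        # Approximate DLP-to-effective-dose conversion coefficients (mSv per mGy*cm),
        # based on commonly cited adult values; a real tool should use the
        # coefficients mandated by the applicable protocol.
        K_FACTORS = {"head": 0.0021, "chest": 0.014, "abdomen": 0.015, "pelvis": 0.015}

        def estimate_effective_dose(dicom_path, scan_length_cm, body_region):
            """Estimate effective dose (mSv) as DLP * k, with DLP = CTDIvol * length.

            Assumes the CT image header carries the CTDIvol attribute; many studies
            instead store dose data in a Radiation Dose Structured Report, which
            would need separate handling.
            """
            ds = pydicom.dcmread(dicom_path)
            ctdi_vol = ds.get("CTDIvol")  # volume CT dose index, in mGy
            if ctdi_vol is None:
                raise ValueError("CTDIvol not present in this file's metadata")
            dlp = float(ctdi_vol) * scan_length_cm  # dose-length product, mGy*cm
            return dlp * K_FACTORS[body_region]     # effective dose, mSv

        # Example (hypothetical file and scan length):
        # print(estimate_effective_dose("ct_slice.dcm", scan_length_cm=35.0, body_region="chest"))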

    ASTRO Journals' Data Sharing Policy and Recommended Best Practices.

    Transparency, openness, and reproducibility are important characteristics in scientific publishing. Although many researchers embrace these characteristics, data sharing has yet to become common practice. Nevertheless, data sharing is becoming an increasingly important topic among societies, publishers, researchers, patient advocates, and funders, especially as it pertains to data from clinical trials. In response, ASTRO developed a data policy and guide to best practices for authors submitting to its journals. ASTRO's data sharing policy is that authors should indicate, in data availability statements, whether the data are being shared and, if so, how the data may be accessed.

    Neuroimaging study designs, computational analyses and data provenance using the LONI pipeline.

    Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges: management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include a distributed grid-enabled infrastructure, a virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large-scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu
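    The workflow-with-provenance idea can be illustrated independently of the LONI Pipeline itself: the toy sketch below chains processing steps and records which step ran, with which parameters and when. It is a conceptual illustration only, not the LONI Pipeline's actual workflow format or execution engine.

        import datetime
        import json

        class Step:
            """One node in a simple linear workflow: a named function plus its inputs."""
            def __init__(self, name, func, **params):
                self.name, self.func, self.params = name, func, params

        def run_workflow(steps, data, provenance_path="provenance.json"):
            """Run steps in order and record what ran, with which parameters, when."""
            provenance = []
            for step in steps:
                started = datetime.datetime.utcnow().isoformat()
                data = step.func(data, **step.params)
                provenance.append({"step": step.name, "params": step.params, "started": started})
            with open(provenance_path, "w") as fh:
                json.dump(provenance, fh, indent=2)
            return data

        # Example with toy processing functions standing in for image-analysis tools.
        if __name__ == "__main__":
            steps = [
                Step("normalize", lambda xs, scale: [x / scale for x in xs], scale=255.0),
                Step("threshold", lambda xs, cutoff: [x > cutoff for x in xs], cutoff=0.5),
            ]
            print(run_workflow(steps, [12, 200, 90]))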