519 research outputs found

    Implementation of a 3D CNN for COPD classification

    According to predictions by the World Health Organization (WHO), by around 2030 Chronic Obstructive Pulmonary Disease (COPD) will become the third leading cause of death worldwide. COPD is a condition that affects the respiratory tract and lungs. It is currently considered chronic and incurable, but it is treatable and preventable. Until now, diagnostic tests for COPD have been based on spirometry; although spirometry indicates the degree of airflow obstruction in the lungs, it is often not very reliable. For this reason, techniques based on deep learning algorithms are increasingly being used to classify this pathology more accurately from tomographic images of COPD patients. Three-dimensional convolutional neural networks (3D-CNNs) are one example. Using the data and images from the ECLIPSE observational study, provided by the BRGE research team at ISGlobal, a 3D-CNN is implemented to classify patients at risk of COPD. This work surveys the current research in this field and proposes improvements to optimize a 3D-CNN and reduce its computational cost for this specific case study.
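    A minimal sketch of the 3D convolution at the heart of such a network, in NumPy; the volume, kernel size, and single-channel setup are illustrative assumptions, not the architecture used in the thesis:

```python
import numpy as np

def conv3d(volume, kernel):
    """Valid (no-padding) 3D convolution of a single-channel volume with one kernel."""
    kd, kh, kw = kernel.shape
    d, h, w = volume.shape
    out = np.zeros((d - kd + 1, h - kh + 1, w - kw + 1))
    for z in range(out.shape[0]):
        for y in range(out.shape[1]):
            for x in range(out.shape[2]):
                out[z, y, x] = np.sum(volume[z:z + kd, y:y + kh, x:x + kw] * kernel)
    return out

# Toy CT-like volume and learned filter (values are synthetic, not real HU data)
rng = np.random.default_rng(0)
volume = rng.normal(size=(8, 8, 8))
kernel = rng.normal(size=(3, 3, 3))
features = np.maximum(conv3d(volume, kernel), 0.0)  # ReLU activation
print(features.shape)  # (6, 6, 6)
```

    A real 3D-CNN stacks many such filtered volumes with pooling and dense layers, which is exactly where the computational cost the thesis targets comes from: the triple loop above grows with the cube of the input size.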

    Decision Support Systems

    Decision support systems (DSS) have evolved over the past four decades from theoretical concepts into real-world computerized applications. A DSS architecture contains three key components: a knowledge base, a computerized model, and a user interface. DSS simulate the cognitive decision-making functions of humans using artificial intelligence methodologies (including expert systems, data mining, machine learning, connectionism, logical reasoning, etc.) to perform decision-support functions. DSS applications span many domains, from aviation monitoring, transportation safety, clinical diagnosis, weather forecasting, and business management to internet search strategy. By combining knowledge bases with inference rules, DSS can offer end users suggestions that improve decisions and outcomes. This book is written as a textbook for formal courses on decision support systems. It may be used by both undergraduate and graduate students from diverse computer-related fields, and will also be of value to established professionals as a text for self-study or reference.
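    The knowledge-base-plus-inference-rules mechanism described above can be sketched with a toy forward-chaining loop; the rules and facts below are invented for illustration and do not come from the book:

```python
# Knowledge base: if-then rules mapping a set of required facts to a conclusion.
# All rule and fact names here are hypothetical examples.
rules = [
    ({"fever", "cough"}, "suspect_respiratory_infection"),
    ({"suspect_respiratory_infection", "low_spo2"}, "recommend_chest_imaging"),
]

def forward_chain(facts, rules):
    """Repeatedly fire any rule whose conditions are satisfied until nothing new is derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"fever", "cough", "low_spo2"}, rules)
print("recommend_chest_imaging" in derived)  # True
```

    The second rule fires only after the first has added its conclusion, which is the chaining behavior that lets a DSS turn raw observations into actionable suggestions.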

    UWOMJ Volume 64, Number 1, Winter 1994

    Schulich School of Medicine & Dentistry

    Optimization with artificial intelligence in additive manufacturing: a systematic review

    In situations requiring high levels of customization and limited production volumes, additive manufacturing (AM) is a frequently used technique with several benefits. Producing final goods of the highest quality requires qualified designers and experienced operators to configure all the necessary parameters. This research demonstrates how, in this scenario, artificial intelligence (AI) could significantly help designers and operators enhance additive manufacturing. To assess what AI may bring to AM, 48 papers were selected from the research literature through a systematic review. The review aims to clarify the current state of AI methodologies for optimizing AM technologies, as well as the potential future developments and applications of AI algorithms in AM. A detailed discussion shows that AI might increase the efficiency of AM-related procedures, from simulation optimization to in-process monitoring.

    Simulating Land Use Land Cover Change Using Data Mining and Machine Learning Algorithms

    The objectives of this dissertation are to: (1) review the breadth and depth of land use land cover change (LUCC) issues being addressed by the land change science community, discussing how an existing model, Purdue's Land Transformation Model (LTM), has been used to better understand these issues; (2) summarize the current state of the art in LUCC modeling to provide context for the advances presented here; (3) use a variety of statistical, data mining, and machine learning algorithms to model single LUCC transitions in diverse regions of the world (e.g., the United States and Africa) to determine which tools are most effective at modeling common nonlinear LUCC patterns; (4) develop new techniques for modeling multiple class (MC) transitions at the same time using existing LUCC models, as such models are rare and in great demand; (5) reconfigure the existing LTM for urban growth boundary (UGB) simulation, because UGB modeling has been ignored by the LUCC modeling community; and (6) compare two rule-based models for UGB simulation for use in UGB land use planning. The review of LTM applications over the last decade indicates that a model like the LTM has addressed a majority of land change science issues, although it has not explicitly been used to study terrestrial biodiversity. The review of existing LUCC models indicates that there is no unique typology to differentiate between LUCC model structures and that no models exist for UGBs. Simulations comparing multiple models show that ANN-based LTM results are similar to Multivariate Adaptive Regression Spline (MARS)-based models, and both outperform Classification and Regression Tree (CART)-based models for modeling single LULC transitions; for modeling MC transitions, however, an ANN-based LTM-MC is similar in goodness of fit to CART, and both outperform MARS in different regions of the world.
    In simulations across three regions (two in the United States and one in Africa), the LTM had better goodness-of-fit measures, while the outcomes of CART and MARS were more interpretable and understandable than the ANN-based LTM. Modeling MC LUCC requires examining several class separation rules and is thus more complicated than single LULC transition modeling; more research is clearly needed in this area. One of the greatest challenges identified with MC modeling is evaluating error distributions and map accuracies for multiple classes. A modified ANN-based LTM and a simple rule-based UGB model (UGBM) outperformed a null model in all cardinal directions. For the UGBM to be useful for planning, other factors need to be considered, including a separate routine that would determine urban quantity over time.
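    As a hedged, toy-scale illustration of the single-transition model comparison (not the actual LTM, CART, or MARS implementations), one can pit a single logistic unit, standing in for the neural approach, against a one-split decision stump, standing in for a tree, on synthetic driver data:

```python
import numpy as np

# Synthetic cells with two invented drivers (say, distance to road and slope)
# and a nonlinear transition rule; all values are made up for illustration.
rng = np.random.default_rng(42)
X = rng.uniform(0, 1, size=(500, 2))
y = ((X[:, 0] < 0.4) & (X[:, 1] < 0.5)).astype(float)  # cell transitioned to urban

# Train a single logistic unit by full-batch gradient descent
w = np.zeros(2); b = 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    g = p - y
    w -= 0.5 * X.T @ g / len(y); b -= 0.5 * g.mean()
acc_nn = np.mean(((X @ w + b) > 0.0) == y)

# Decision stump: best single threshold on either driver
best_acc = 0.0
for j in range(2):
    for thr in np.linspace(0, 1, 21):
        left = X[:, j] < thr
        best_acc = max(best_acc, np.mean(left == y), np.mean(~left == y))

print(acc_nn > 0.6, best_acc > 0.6)  # True True
```

    Goodness-of-fit comparisons like the ones in the dissertation follow the same pattern at scale: fit each model family on the same transition data, then compare accuracy or related measures on held-out cells.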

    ENDOMET database – A means to identify novel diagnostic and prognostic tools for endometriosis

    Endometriosis is a common benign, hormone-dependent inflammatory gynecological disease that affects women of fertile age and has a considerable economic impact on healthcare systems. Symptoms include intense menstrual pain, persistent pelvic pain, and infertility. It is defined by the presence of endometrium-like tissue growing in ectopic locations outside the uterine cavity, together with inflammation in the peritoneal cavity. Endometriosis has a multifactorial etiology that, despite extensive research, is still poorly understood. The diagnostic delay from disease onset to a conclusive diagnosis is between 7 and 12 years. There is no known cure, although symptoms can be improved with hormonal medications (which often have multiple side effects and prevent pregnancy) or through surgery, which carries its own risks. Current non-invasive diagnostic tools are not sufficiently dependable, and a definite diagnosis is achieved through laparoscopy or laparotomy. This study was based on two prospective cohorts: the ENDOMET study, including 137 endometriosis patients scheduled for surgery and 62 healthy women, and PROENDO, which included 138 endometriosis patients and 33 healthy women. Our long-term goal was to support the discovery of innovative tools for efficient diagnosis of endometriosis, as well as tools for further understanding the etiology and pathogenesis of the disease. We pursued this goal by creating a database, EndometDB, based on a relational data model and implemented in PostgreSQL. The database allows, for example, exploration of global genome-wide expression patterns in the peritoneum, endometrium, and endometriosis lesions of patients, as well as in the peritoneum and endometrium of healthy control women of reproductive age.
    The data collected in EndometDB were also used to develop and validate a symptom- and biomarker-based predictive model for risk evaluation and early prediction of endometriosis without invasive diagnostic methods. Using the data in EndometDB, we discovered that, compared with the eutopic endometrium, the WNT signaling pathway is one of the molecular pathways that undergo strong changes in endometriosis. We then evaluated the potential role of secreted frizzled-related protein 2 (SFRP-2, a WNT signaling pathway modulator) in improving the detection of endometriosis lesion borders. SFRP-2 expression visualizes the lesion better than previously used markers and can be used to better define lesion size and to confirm that the surgical excision of the lesions is complete.
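    To make the relational-model idea concrete, here is a minimal sketch of how patient, sample, and expression data could be laid out and joined. The real EndometDB runs on PostgreSQL; SQLite merely stands in here, and every table and column name below is an invented assumption, not the actual schema:

```python
import sqlite3

# In-memory stand-in database with three linked tables (hypothetical schema)
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE patient (
    patient_id INTEGER PRIMARY KEY,
    cohort     TEXT NOT NULL,   -- 'ENDOMET' or 'PROENDO'
    status     TEXT NOT NULL    -- 'endometriosis' or 'control'
);
CREATE TABLE sample (
    sample_id  INTEGER PRIMARY KEY,
    patient_id INTEGER REFERENCES patient(patient_id),
    tissue     TEXT NOT NULL    -- 'endometrium', 'peritoneum', or 'lesion'
);
CREATE TABLE expression (
    sample_id  INTEGER REFERENCES sample(sample_id),
    gene       TEXT NOT NULL,   -- e.g. 'SFRP2'
    value      REAL NOT NULL    -- normalized expression level (made-up number)
);
""")
con.execute("INSERT INTO patient VALUES (1, 'ENDOMET', 'endometriosis')")
con.execute("INSERT INTO sample VALUES (10, 1, 'lesion')")
con.execute("INSERT INTO expression VALUES (10, 'SFRP2', 8.7)")

# Join across the three tables to ask: where is this gene expressed, and in whom?
row = con.execute("""
    SELECT p.status, s.tissue, e.value
    FROM expression e
    JOIN sample s ON s.sample_id = e.sample_id
    JOIN patient p ON p.patient_id = s.patient_id
    WHERE e.gene = 'SFRP2'
""").fetchone()
print(row)  # ('endometriosis', 'lesion', 8.7)
```

    Queries of this join-across-tissues shape are what let a relational design compare expression between lesions, eutopic endometrium, and healthy controls.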

    Modeling network traffic on a global network-centric system with artificial neural networks

    This dissertation proposes a new methodology for modeling and predicting network traffic. It features an adaptive architecture based on artificial neural networks and is especially suited for large-scale, global, network-centric systems. Accurate characterization and prediction of network traffic is essential for network resource sizing and real-time network traffic management. As networks continue to increase in size and complexity, the task has become increasingly difficult, and current methodology is not sufficiently adaptable or scalable. Current methods model network traffic with explicit mathematical equations that are not easily maintained or adjusted. The accuracy of these models depends on a detailed characterization of the traffic stream, measured at points along the network where the data is often subject to constant variation and rapid evolution. The main contribution of this dissertation is a methodology that allows artificial neural networks to be used with increased capability for adaptation and scalability. Application on an operating global broadband network, the Connexion by Boeing network, was evaluated to establish feasibility. A simulation model was constructed, and testing was conducted with operational scenarios to demonstrate applicability on the case study network and to evaluate improvements in accuracy over existing methods --Abstract, page iii
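    As a hedged illustration of window-based neural traffic prediction (not the dissertation's actual architecture, and not Connexion by Boeing data), a tiny one-hidden-layer network can be trained to predict the next traffic sample from the previous ten:

```python
import numpy as np

# Synthetic periodic traffic series with noise (stand-in for real measurements)
rng = np.random.default_rng(7)
t = np.arange(300)
traffic = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)

# Sliding windows: predict sample i from the 10 samples before it
window = 10
X = np.stack([traffic[i:i + window] for i in range(len(traffic) - window)])
y = traffic[window:]

# Tiny MLP (10 -> 8 tanh -> 1) trained by full-batch gradient descent on MSE
W1 = 0.3 * rng.normal(size=(window, 8)); b1 = np.zeros(8)
W2 = 0.3 * rng.normal(size=8); b2 = 0.0
lr = 0.1
for _ in range(1000):
    h = np.tanh(X @ W1 + b1)
    err = h @ W2 + b2 - y
    gh = np.outer(err, W2) * (1 - h ** 2)   # backprop through tanh
    W2 -= lr * h.T @ err / len(y); b2 -= lr * err.mean()
    W1 -= lr * X.T @ gh / len(y);  b1 -= lr * gh.mean(axis=0)

mse = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)
print(mse < np.var(y))  # the fitted model should beat a constant-mean predictor
```

    The adaptability argument in the abstract corresponds to retraining or fine-tuning such weights as the traffic stream evolves, rather than re-deriving closed-form equations.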

    Data- and expert-driven variable selection for predictive models in healthcare: towards increased interpretability in underdetermined machine learning problems

    Modern data acquisition techniques in healthcare generate large collections of data from multiple sources, such as novel diagnosis and treatment methodologies. Some concrete examples are electronic healthcare record systems, genomics, and medical images. This leads to situations with often unstructured, high-dimensional heterogeneous patient cohort data where classical statistical methods may not be sufficient for optimal utilization of the data and informed decision-making. Instead, investigating such data structures with modern machine learning techniques promises to improve the understanding of patient health issues and may provide a better platform for informed decision-making by clinicians. Key requirements for this purpose include (a) sufficiently accurate predictions and (b) model interpretability. Achieving both aspects in parallel is difficult, particularly for datasets with few patients, which are common in the healthcare domain. In such cases, machine learning models encounter mathematically underdetermined systems and may overfit easily on the training data. An important approach to overcome this issue is feature selection, i.e., determining a subset of informative features from the original set of features with respect to the target variable. While potentially raising the predictive performance, feature selection fosters model interpretability by identifying a low number of relevant model parameters to better understand the underlying biological processes that lead to health issues. Interpretability requires that feature selection is stable, i.e., small changes in the dataset do not lead to changes in the selected feature set. A concept to address instability is ensemble feature selection, i.e. the process of repeating the feature selection multiple times on subsets of samples of the original dataset and aggregating results in a meta-model. 
This thesis presents two approaches for ensemble feature selection, which are tailored towards high-dimensional data in healthcare: the Repeated Elastic Net Technique for feature selection (RENT) and the User-Guided Bayesian Framework for feature selection (UBayFS). While RENT is purely data-driven and builds upon elastic net regularized models, UBayFS is a general framework for ensembles with the capabilities to include expert knowledge in the feature selection process via prior weights and side constraints. A case study modeling the overall survival of cancer patients compares these novel feature selectors and demonstrates their potential in clinical practice. Beyond the selection of single features, UBayFS also allows for selecting whole feature groups (feature blocks) that were acquired from multiple data sources, as those mentioned above. Importance quantification of such feature blocks plays a key role in tracing information about the target variable back to the acquisition modalities. Such information on feature block importance may lead to positive effects on the use of human, technical, and financial resources if systematically integrated into the planning of patient treatment by excluding the acquisition of non-informative features. Since a generalization of feature importance measures to block importance is not trivial, this thesis also investigates and compares approaches for feature block importance rankings. This thesis demonstrates that high-dimensional datasets from multiple data sources in the medical domain can be successfully tackled by the presented approaches for feature selection. 
    Experimental evaluations demonstrate favorable predictive performance, stability, and interpretability of results, which carries high potential for better data-driven decision support in clinical practice.

    Digital Pathology: The Time Is Now to Bridge the Gap between Medicine and Technological Singularity

    Digitalization of imaging in radiology is a reality in several healthcare institutions worldwide. The challenges of filing, confidentiality, and manipulation have been brilliantly solved in radiology. However, digitalization of hematoxylin- and eosin-stained routine histological slides has moved slowly. Although external quality assurance applications are a reality for pathologists, with most continuing medical education programs utilizing virtual microscopy, abandoning traditional glass slides for routine diagnostics remains a distant prospect for many departments of laboratory medicine and pathology. Digital pathology images are captured by scanning, and whole-slide imaging/virtual microscopy can be obtained by robotic microscopy of an entire histological glass slide. Since 1986, services using telepathology to transfer anatomic pathology images between distant locations have benefited countless patients globally, including at the University of Alberta. Specialist recertification and re-validation for the Royal College of Pathologists of Canada, belonging to the Royal College of Physicians and Surgeons of Canada, and for the College of American Pathologists is a milestone in virtual reality. Challenges such as high bandwidth requirements, electronic platforms, and the stability of operating systems have been targeted and are improving enormously. The encryption of digital images may become a requirement for the accreditation of laboratory services. Quantum computing exploits quantum-mechanical phenomena such as superposition and entanglement: unlike binary digital electronic computers based on transistors, where data are encoded into binary digits (bits) with two states (0 and 1), quantum computing uses quantum bits (qubits), which can exist in superpositions of states.
    The use of quantum computing protocols on encrypted data is crucial for the permanent implementation of virtual pathology in hospitals and universities. Quantum computing may well represent the technological singularity needed to create new classifications and taxonomic rules in medicine.
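    The bit-versus-qubit contrast can be made concrete with a few lines of linear algebra; this is a generic textbook illustration, not code from the article:

```python
import numpy as np

# A qubit state is a complex 2-vector. The Hadamard gate turns the basis
# state |0> into an equal superposition, so a measurement yields 0 or 1
# with probability 1/2 each -- something no classical bit can represent.
ket0 = np.array([1.0, 0.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
psi = H @ ket0                 # (|0> + |1>) / sqrt(2)
probs = np.abs(psi) ** 2       # Born rule: measurement probabilities
print(probs)  # [0.5 0.5]
```

    A classical bit occupies exactly one of the two states; the qubit above occupies a weighted combination of both until measured, which is the property quantum encryption and computing protocols exploit.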