
    Review on electrical impedance tomography: Artificial intelligence methods and its applications

    Electrical impedance tomography (EIT) has been an active research topic for the last 30 years. It is a comparatively new imaging method that has evolved over the past few decades. A small current is injected into the tissue, the resulting voltages are measured, and the electrical properties of the tissue are determined from these measurements. A reconstruction algorithm then transforms the voltages into a tomographic image. EIT poses no identified health risks and is also cheaper than imaging techniques such as magnetic resonance imaging (MRI) and computed tomography (CT). This paper presents a comprehensive review of recent efforts and advancements to improve the technology, with emphasis on the role of artificial intelligence in solving this non-linear, ill-posed problem. A review of clinical EIT applications is also presented.
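    The reconstruction step described above is commonly posed as a regularized least-squares problem. As a minimal sketch (not any specific method from the reviewed literature), assuming a precomputed sensitivity (Jacobian) matrix J and a vector of measured voltage differences dv, a one-step linearized Tikhonov reconstruction can be written as:

```python
import numpy as np

def reconstruct_conductivity(J, dv, lam=1e-1):
    """One-step linearized EIT reconstruction with Tikhonov regularization.

    J   : (n_measurements, n_pixels) sensitivity (Jacobian) matrix
    dv  : (n_measurements,) measured-minus-simulated voltage differences
    lam : regularization weight; larger values yield smoother images
    """
    # Solve min ||J ds - dv||^2 + lam ||ds||^2 for the conductivity update ds
    A = J.T @ J + lam * np.eye(J.shape[1])
    return np.linalg.solve(A, J.T @ dv)

# Toy usage with random data: 208 measurements, 576 image pixels
rng = np.random.default_rng(0)
J = rng.standard_normal((208, 576))
dv = rng.standard_normal(208)
print(reconstruct_conductivity(J, dv).shape)  # (576,)
```

    The regularization term is what tames the ill-posedness the abstract mentions: without it, the normal-equations matrix is severely ill-conditioned and the reconstructed image is dominated by measurement noise.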

    Optimizing Image Reconstruction in Electrical Impedance Tomography

    The thesis presents, analyzes, and discusses the optimization of algorithms that reconstruct images of unknown specific conductivity from data acquired via electrical impedance tomography. In this context, the author provides a brief mathematical description of the forward and inverse tasks solved by using diverse approaches, characterizes relevant measurement techniques and data acquisition procedures, and discusses available numerical tools. Procedurally, the initial working stages involved analyzing the methods for optimizing those parameters of the model that influence the reconstruction accuracy; demonstrating approaches to the parallel processing of the algorithms; and outlining a survey of available instruments to acquire the tomographic data. The obtained knowledge then yielded a process for optimizing the parameters of the mathematical model, thus allowing the model to be designed precisely, based on the measured data; such a precondition eventually reduced the uncertainty in reconstructing the specific conductivity distribution. To streamline data acquisition, a low-cost automated tomography unit was designed with emphasis on reducing measurement uncertainty. When forming the numerical model, the author investigated the possibilities and overall impact of combining the open and closed domains with various regularization methods and mesh element scales, considering both the character of the conductivity reconstruction error and the computational intensity. A complementary task resolved within the broader scheme outlined above lay in parallelizing the reconstruction subalgorithms by using a multi-core graphics card. The results of the thesis are directly reflected in a reduced reconstruction uncertainty (through an optimization of the initial conductivity value, placement of the electrodes, shape deformation of the domains, regularization methods, and domain types) and accelerated computation via parallelized algorithms. The actual research benefited from the in-house designed automated tomography unit.
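    The coupling of the forward and inverse tasks mentioned above is typically realized as a Gauss-Newton iteration. The sketch below is an illustrative outline, not the thesis's implementation; the forward and jacobian callables are hypothetical stand-ins for a FEM forward solver and its sensitivity computation.

```python
import numpy as np

def gauss_newton_eit(v_meas, forward, jacobian, sigma0, lam=1e-2, n_iter=5):
    """Gauss-Newton sketch for the EIT inverse task (illustrative only).

    v_meas   : measured electrode voltages
    forward  : hypothetical callable, sigma -> simulated voltages
    jacobian : hypothetical callable, sigma -> sensitivity matrix at sigma
    sigma0   : initial conductivity estimate; per the thesis, its choice
               (with electrode placement and domain shape) drives uncertainty
    """
    sigma = np.asarray(sigma0, dtype=float).copy()
    for _ in range(n_iter):
        r = v_meas - forward(sigma)                # data residual
        J = jacobian(sigma)
        # Tikhonov-regularized normal equations for the update step
        H = J.T @ J + lam * np.eye(sigma.size)
        sigma += np.linalg.solve(H, J.T @ r)
    return sigma

# Toy linear "forward model" just to exercise the loop
A = np.random.default_rng(1).standard_normal((32, 16))
true_sigma = np.ones(16)
true_sigma[4:8] = 2.0
est = gauss_newton_eit(A @ true_sigma, lambda s: A @ s, lambda s: A, np.ones(16))
```

    Each subalgorithm in this loop (Jacobian assembly, the regularized linear solve) is an independent candidate for the GPU parallelization the thesis explores.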

    Roadmap on signal processing for next generation measurement systems

    Signal processing is a fundamental component of almost any sensor-enabled system, with a wide range of applications across different scientific disciplines. Time series data, images, and video sequences comprise representative forms of signals that can be enhanced and analysed for information extraction and quantification. The recent advances in artificial intelligence and machine learning are shifting the research attention towards intelligent, data-driven signal processing. This roadmap presents a critical overview of state-of-the-art methods and applications, aiming to highlight future challenges and research opportunities for next generation measurement systems. It covers a broad spectrum of topics ranging from basic to industrial research, organized in concise thematic sections that reflect the trends and impacts of current and future developments per research field. Furthermore, it offers guidance to researchers and funding agencies in identifying new prospects.

    The use of artificial neural networks in classifying lung scintigrams

    An introduction to nuclear medical imaging and artificial neural networks (ANNs) is given first. Lung scintigrams are then classified using ANNs. Initial experiments using raw data did not produce suitable outputs, so a data compression method was employed to present an orthogonal input set retaining as much information as possible. This gave some encouraging results but was neither sensitive nor accurate enough for clinical use. A further set of experiments extracted local information from small windows of the scintigram images; by this method, areas of abnormality could be fed into a subsequent classification network to diagnose the cause of the defect. This automatic method of detecting potential defects did not work, though the networks explored were found to act as smoothing filters and edge detectors. Network design was investigated using genetic algorithms (GAs). The evolved networks had low connectivity yet lower error and faster convergence than fully connected networks. Subsequent simulations showed that randomly partially connected networks performed as well as the GA-designed ones. Dynamic parameter tuning was explored in an attempt to produce faster convergence, but the good results previously reported by other workers could not be replicated. Finally, manually delineated regions of interest were explored as ANN inputs, both in raw form and as principal components (PCs); neither representation proved effective on test data.
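    The compression stage described above, an orthogonal input set retaining maximal information, is principal-component analysis, which the abstract later names explicitly. A minimal modern sketch of that PCA-then-ANN pipeline, with synthetic arrays standing in for scintigram windows (the scikit-learn components, shapes, and labels are illustrative assumptions, not the study's setup):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for scintigram windows: 200 samples of 16x16 pixels
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 256))
y = rng.integers(0, 2, size=200)  # 0 = normal, 1 = defect (placeholder labels)

# PCA supplies the orthogonal, variance-ordered inputs; the MLP plays the
# role of the classification network applied to the compressed representation.
model = make_pipeline(
    PCA(n_components=20),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0),
)
model.fit(X, y)
print(model.score(X, y))  # training accuracy on the synthetic data
```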

    27th Annual Computational Neuroscience Meeting (CNS*2018): Part One
