Biomedical Image Processing and Classification
Biomedical image processing is an interdisciplinary field involving a variety of disciplines, e.g., electronics, computer science, physics, mathematics, physiology, and medicine. Several imaging techniques have been developed, providing many approaches to the study of the human body. Biomedical image processing is finding an increasing number of important applications in, for example, the study of the internal structure or function of an organ and the diagnosis or treatment of a disease. When combined with classification methods, it can support the development of computer-aided diagnosis (CAD) systems, which could help medical doctors refine their clinical picture.
Digital Pathology: The Time Is Now to Bridge the Gap between Medicine and Technological Singularity
Digitalization of imaging in radiology is a reality in several healthcare institutions worldwide. The challenges of filing, confidentiality, and manipulation have been brilliantly solved in radiology. However, digitalization of hematoxylin- and eosin-stained routine histological slides has progressed slowly. Although external quality assurance is already a reality for pathologists, with most continuing medical education programs utilizing virtual microscopy, abandoning traditional glass slides for routine diagnostics remains a distant prospect for many departments of laboratory medicine and pathology. Digital pathology images are captured by scanning, and whole slide imaging (virtual microscopy) can be obtained by robotic microscopy of an entire histological glass slide. Since 1986, telepathology services for transferring anatomic pathology images between distant locations have benefited countless patients globally, including at the University of Alberta. The use of virtual microscopy for specialist recertification or re-validation by the Royal College of Pathologists of Canada, belonging to the Royal College of Physicians and Surgeons of Canada, and the College of American Pathologists is a milestone. Challenges such as high bandwidth requirements, electronic platforms, and the stability of operating systems have been targeted and are improving enormously. The encryption of digital images may become a requirement for the accreditation of laboratory services. Quantum computing exploits quantum-mechanical phenomena such as superposition and entanglement. Unlike binary digital electronic computers based on transistors, in which data are encoded into binary digits (bits) with two distinct states (0 and 1), quantum computing uses quantum bits (qubits), which can exist in superpositions of states.
The use of quantum computing protocols on encrypted data is crucial for the permanent implementation of virtual pathology in hospitals and universities. Quantum computing may well represent the technological singularity needed to create new classifications and taxonomic rules in medicine.
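The bit-versus-qubit distinction drawn above can be made concrete with a minimal numerical sketch. This is not from the abstract; it is a standard textbook illustration, assuming only that a qubit is represented as a unit vector in C^2 and that the Hadamard gate creates an equal superposition.

```python
import numpy as np

# A classical bit is either 0 or 1; a qubit is a unit vector in C^2.
ket0 = np.array([1.0, 0.0], dtype=complex)  # |0>
ket1 = np.array([0.0, 1.0], dtype=complex)  # |1>

# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0  # (|0> + |1>) / sqrt(2)

# Born rule: measurement probabilities are the squared amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]
```

Unlike a bit, the state `psi` is neither 0 nor 1 until measured; each outcome occurs with probability 0.5.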
PET/MR imaging of hypoxic atherosclerotic plaque using 64Cu-ATSM
ABSTRACT OF THE DISSERTATION
PET/MR Imaging of Hypoxic Atherosclerotic Plaque Using 64Cu-ATSM
by
Xingyu Nie
Doctor of Philosophy in Biomedical Engineering
Washington University in St. Louis, 2017
Professor Pamela K. Woodard, Chair
Professor Suzanne Lapi, Co-Chair
It is important to accurately identify the factors involved in the progression of atherosclerosis because advanced atherosclerotic lesions are prone to rupture, leading to disability or death. Hypoxic areas have been known to be present in human atherosclerotic lesions, and lesion progression is associated with the formation of lipid-loaded macrophages and increased local inflammation which are potential major factors in the formation of vulnerable plaque. This dissertation work represents a comprehensive investigation of non-invasive identification of hypoxic atherosclerotic plaque in animal models and human subjects using the PET hypoxia imaging agent 64Cu-ATSM.
We first demonstrated the feasibility of 64Cu-ATSM for the identification of hypoxic atherosclerotic plaque and evaluated the relative effects of diet and genetics on hypoxia progression in atherosclerotic plaque in a genetically-altered mouse model. We then fully validated the feasibility of using 64Cu-ATSM to image the extent of hypoxia in a rabbit model with atherosclerotic-like plaque using a simultaneous PET-MR system. We also proceeded with a pilot clinical trial to determine whether 64Cu-ATSM MR/PET scanning is capable of detecting hypoxic carotid atherosclerosis in human subjects.
In order to improve the 64Cu-ATSM PET image quality, we investigated the Siemens HD (high-definition) PET software and four partial volume correction methods to correct for partial volume effects. In addition, we incorporated the attenuation effect of the carotid surface coil into the MR attenuation correction μ-map to correct for photon attenuation.
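The dissertation does not specify the four partial volume correction methods here; as a generic illustration of the underlying idea, a minimal sketch of a recovery-coefficient correction is shown below. It assumes a simple 1-D model (a uniform slab of activity convolved with a Gaussian point spread function) and hypothetical numbers for the structure width and scanner resolution.

```python
from math import erf, sqrt

def recovery_coefficient(object_width_mm: float, fwhm_mm: float) -> float:
    # Fraction of true activity recovered at the centre of a uniform
    # slab of the given width, imaged with a Gaussian PSF of the given
    # FWHM (1-D model: rect convolved with Gaussian, evaluated at 0).
    sigma = fwhm_mm / 2.355  # FWHM -> standard deviation
    return erf(object_width_mm / (2 * sqrt(2) * sigma))

# Hypothetical values: a ~4 mm vessel wall, a scanner with ~5 mm FWHM.
measured_uptake = 1.8  # hypothetical measured value
rc = recovery_coefficient(4.0, 5.0)
corrected_uptake = measured_uptake / rc  # corrected value is larger
```

Because small structures such as vessel walls recover only part of their true activity, dividing the measured uptake by the recovery coefficient inflates it back toward the true value.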
In the long term, this imaging strategy has the potential to help identify patients at risk for cardiovascular events, guide therapy, and add to the understanding of plaque biology in human patients.
Brain-Inspired Computing
This open access book constitutes revised selected papers from the 4th International Workshop on Brain-Inspired Computing, BrainComp 2019, held in Cetraro, Italy, in July 2019. The 11 papers presented in this volume were carefully reviewed and selected for inclusion in this book. They deal with research on brain atlasing, multi-scale models and simulation, HPC and data infrastructures for neuroscience, as well as artificial and natural neural architectures.
Development and applications of high speed and hyperspectral nonlinear microscopy
Nonlinear microscopy refers to a range of laser scanning microscopy techniques that are based on nonlinear optical processes such as two-photon excited fluorescence and second harmonic generation. Nonlinear microscopy techniques are powerful because they enable the visualization of highly scattering biological samples with subcellular resolution. This capability is especially valuable for in vivo and live tissue imaging since it can provide both structural and functional information about tissues in their native environment. With the use of a range of exogenous dyes and intrinsic contrast, in vivo nonlinear microscopy can be used to characterize and measure dynamic processes of tissues in their normal environment. These advances have been particularly relevant in neuroscience, where truly understanding the function of the brain requires that its neural and vascular networks be observed while undisturbed. Despite these advantages, in vivo nonlinear microscopy still faces several major challenges.
First, observing dynamics that occur in large areas over short time scales, such as neuronal signaling and blood flow, is challenging because nonlinear microscopy generally requires scanning to create an image. This limits the study of dynamic behavior to either a single plane or to a small subset of regions within a volume. Second, applications that rely on the use of exogenous dyes can be limited by the need to stain tissues before imaging, the availability of dyes, and specificity that can be achieved.
Usually considered a nuisance, endogenous tissue contrast from autofluorescence or structures exhibiting second harmonic generation can produce stunning images for visualizing subcellular morphology. Imaging endogenous contrast can also provide valuable information about the chemical makeup and metabolic state of the tissue. Few methods have been developed to carefully and quantitatively examine endogenous fluorescence in living tissues. In this thesis, these two challenges in nonlinear microscopy are addressed. The development of a novel hyperspectral two-photon microscopy method to acquire spectroscopic data from tissues and increase the information available from endogenous contrast is presented. This system was applied to visualize and identify sources of endogenous contrast in gastrointestinal tissues, providing robust references for the assessment of normal and diseased tissues.
Secondly, three methods for high-speed volumetric imaging using laser scanning nonlinear microscopy were developed to address the need for improved high-speed imaging in living tissues. A spectrally-encoded high-speed imaging method that can provide simultaneous imaging of multiple regions of the living brain in parallel is presented and used to study spontaneous changes in vascular tone in the brain. This technique is then extended for use with second harmonic generation microscopy, which has the potential to greatly increase the degree of multiplexing. Finally, a complete system design capable of volumetric scan rates >1 Hz is shown, offering improved performance and versatility to image brain activity.
Linking quantitative radiology to molecular mechanism for improved vascular disease therapy selection and follow-up
Objective: Therapeutic advancements in atherosclerotic cardiovascular disease have improved the prevention of ischemic stroke and myocardial infarction. However, diagnostic methods for atherosclerotic plaque phenotyping to aid individualized therapy are lacking. In this thesis, we aimed to elucidate plaque biology through the analysis of computed tomography angiography (CTA) with sufficient sensitivity and specificity to capture the differentiated drivers of the disease. We then aimed to use such data to calibrate a systems biology model of atherosclerosis with adequate granularity to be clinically relevant. Such development may be possible with computational modeling, but given the multifactorial biology of atherosclerosis, modeling must be based on complete biological networks that capture the protein-protein interactions estimated to drive disease progression.
Approach and Results: We employed machine intelligence using CTA paired with a molecular assay to determine cohort-level associations and individual patient predictions. Examples of predicted transcripts included ion transporters, cytokine receptors, and a number of microRNAs. Pathway analyses elucidated enrichment of several biological processes relevant to atherosclerosis and plaque pathophysiology. The ability of the models to predict plaque gene expression from CTAs was demonstrated using sequestered patients with transcriptomes of corresponding lesions. We further performed a case study exploring the relationship between biomechanical quantities and plaque morphology, indicating the ability to determine stress and strain from tissue characteristics. Further, we used a uniquely constituted plaque proteomic dataset to create a comprehensive systems biology disease model, which was finally used to simulate responses to different drug categories in individual patients. Individual patient response was simulated for intensive lipid-lowering, anti-inflammatory drugs, anti-diabetic, and combination therapy. Plaque tissue was collected from 18 patients with 6735 proteins at two locations per patient. 113 pathways were identified and included in the systems biology model of endothelial cells, vascular smooth muscle cells, macrophages, lymphocytes, and the integrated intima, altogether spanning 4411 proteins, demonstrating a range of 39-96% plaque instability. Simulations of drug responses varied in patients with initially unstable lesions from high (20%, on combination therapy) to marginal improvement, whereas patients with initially stable plaques showed generally less improvement, but importantly, variation across patients.
Conclusion: The results of this thesis show that atherosclerotic plaque phenotyping by multi-scale image analysis of conventional CTA can elucidate the molecular signatures that reflect atherosclerosis. We further showed that calibrated systems biology models may be used to simulate drug response in terms of atherosclerotic plaque instability at the individual level, providing a potential strategy for improved personalized management of patients with cardiovascular disease. These results hold promise for optimized and personalized therapy in the prevention of myocardial infarction and ischemic stroke, which warrants further investigation in larger cohorts.
On Improving Generalization of CNN-Based Image Classification with Delineation Maps Using the CORF Push-Pull Inhibition Operator
Deployed image classification pipelines typically depend on images captured in real-world environments. This means that images might be affected by different sources of perturbation (e.g., sensor noise in low-light environments). The main challenge arises from the fact that image quality directly impacts the reliability and consistency of classification tasks. This challenge has, hence, attracted wide interest within the computer vision community. We propose a transformation step that attempts to enhance the generalization ability of CNN models in the presence of unseen noise in the test set. Concretely, the delineation maps of given images are determined using the CORF push-pull inhibition operator. Such an operation transforms an input image into a space that is more robust to noise before it is processed by a CNN. We evaluated our approach on the Fashion MNIST data set with an AlexNet model. The proposed CORF-augmented pipeline achieved results comparable to those of a conventional AlexNet classification model without CORF delineation maps on noise-free images, but it consistently achieved significantly superior performance on test images perturbed with different levels of Gaussian and uniform noise.
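The actual CORF operator models simple-cell receptive fields with explicit push-pull inhibition and is considerably more elaborate than can be shown here. As a much-simplified stand-in for the preprocessing idea (compute a noise-suppressed delineation map, then feed it to the CNN), the sketch below uses a difference-of-Gaussians edge response whose preferred-contrast "push" is inhibited by a rectified opposite-contrast "pull". All parameter values are hypothetical.

```python
import numpy as np

def gaussian_blur(img, sigma):
    # Separable Gaussian blur built from 1-D convolutions (numpy only).
    radius = int(3 * sigma) + 1
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    out = np.apply_along_axis(np.convolve, 0, img, k, mode="same")
    return np.apply_along_axis(np.convolve, 1, out, k, mode="same")

def delineation_map(img, sigma=1.0, inhibition=0.8):
    # Simplified stand-in for the CORF push-pull operator: a
    # difference-of-Gaussians edge response whose "push" (preferred
    # contrast) is inhibited by a rectified "pull" (opposite contrast),
    # suppressing responses driven by high-frequency noise.
    img = img.astype(float)
    dog = gaussian_blur(img, sigma) - gaussian_blur(img, 2 * sigma)
    push = np.maximum(dog, 0)
    pull = np.maximum(-dog, 0)
    return np.maximum(push - inhibition * pull, 0)

# Noisy test image: a bright square on a dark background.
rng = np.random.default_rng(0)
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
edges = delineation_map(img + rng.normal(0.0, 0.1, img.shape))
```

In the pipeline described above, `edges` (rather than the raw noisy image) would be passed to the AlexNet classifier, so that train-time and test-time noise are attenuated before classification.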