
    Advancements and Breakthroughs in Ultrasound Imaging

    Ultrasonic imaging is a powerful diagnostic tool available to medical practitioners, engineers, and researchers today. Owing to its relative safety and non-invasive nature, ultrasonic imaging has become one of the most rapidly advancing technologies. These rapid advances are directly related to parallel advancements in electronics, computing, and transducer technology, together with sophisticated signal processing techniques. This book focuses on state-of-the-art developments in ultrasonic imaging applications and underlying technologies, presented by leading practitioners and researchers from many parts of the world.

    A Study of clutter reduction techniques in wide bandwidth HF/VHF deep ground penetrating radar

    Reducing clutter is one of the most daunting problems a radar processing engineer faces. Clutter causes a significant problem when attempting to detect sub-surface targets, as any significant change in the ground dielectric will produce a return at the receiver. The difficulty in reducing the clutter is compounded by the fact that the spectral characteristics of the clutter are similar to those of the target. While many methods exist to reduce clutter, few operate without a priori information about either the target or the clutter. There are applications, of interest to the electromagnetic community, that are restricted in the amount of a priori information available to them. Estimation-subtraction filters calculate an estimate of the clutter from the statistics of the collected data and subtract that estimate from the original data. The Wiener filter has long been used to suppress noise signals when a target reference is known; using it to reduce clutter is a relatively new area of research. This research proposes estimation-subtraction filters and an application of the Wiener filter, neither of which requires a priori information, to reduce the clutter of a bistatic, synthetic-aperture-based, wideband deep ground penetrating radar system. The results of applying these filters to data collected in this way, at these depths, are illustrated here for the first time.
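    The estimation-subtraction idea described above can be illustrated with the simplest member of that family, mean-trace subtraction, in which the clutter estimate is the average trace across the aperture. This is a minimal sketch under stated assumptions, not the thesis's actual filters; the function name and the array layout (rows = time samples, columns = traces) are illustrative choices.

```python
import numpy as np

def mean_trace_subtraction(bscan):
    """Estimation-subtraction clutter filter: estimate the clutter as the
    mean trace across the aperture (axis 1 = traces), then subtract it.
    Assumes clutter is nearly identical from trace to trace, while a
    target contributes returns in only a few traces."""
    clutter_estimate = bscan.mean(axis=1, keepdims=True)
    return bscan - clutter_estimate
```

    Because the target also contributes to the mean, a small fraction (1/N for N traces) of its energy is removed along with the clutter; more sophisticated estimators, such as the Wiener formulation the thesis proposes, are designed to avoid this kind of loss.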

    Fast Objective Coupled Planar Illumination Microscopy

    Among optical imaging techniques, light sheet fluorescence microscopy stands out as one of the most attractive for capturing high-speed biological dynamics unfolding in three dimensions. The technique is potentially millions of times faster than point-scanning techniques such as two-photon microscopy. This potential is especially pertinent for neuroscience applications, because interactions between neurons transpire over mere milliseconds within tissue volumes spanning hundreds of cubic microns. However, current-generation light sheet microscopes are limited by volume scanning rate and/or camera frame rate. We begin by reviewing the optical principles underlying light sheet fluorescence microscopy and the origin of these rate bottlenecks. We present an analysis leading us to the conclusion that Objective Coupled Planar Illumination (OCPI) microscopy is a particularly promising technique for recording the activity of large populations of neurons at high sampling rates. We then present speed-optimized OCPI microscopy, the first fast light sheet technique to avoid compromising image quality or photon efficiency. We pursue two strategies in developing the fast OCPI microscope. First, we devise a set of optimizations that increase the rate of the volume scanning system to 40 Hz for volumes up to 700 microns thick. Second, we introduce Multi-Camera Image Sharing (MCIS), a technique that scales imaging rate by incorporating additional cameras. MCIS can be applied not only to OCPI but to any widefield imaging technique, circumventing the limitations imposed by a single camera. Detailed design drawings are included to aid dissemination to other research groups. We also demonstrate fast calcium imaging of the larval zebrafish brain and find a heartbeat-induced motion artifact. We recommend a new preprocessing step that removes the artifact through filtering. This step requires a minimum sampling rate of 15 Hz, and we expect it to become a standard procedure in zebrafish imaging pipelines. In the last chapter, we describe essential computational considerations for controlling a fast OCPI microscope and processing the data it generates. We introduce a new image processing pipeline developed to maximize computational efficiency when analyzing these multi-terabyte datasets, including a novel calcium imaging deconvolution algorithm. Finally, we provide a demonstration of how combined innovations in microscope hardware and software enable inference of predictive relationships between neurons, a promising complement to more conventional correlation-based analyses.
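    The heartbeat-removal step is not specified in detail in this abstract; one plausible minimal sketch is a spectral notch around the heartbeat frequency, which is only possible when the sampling rate exceeds twice that frequency (consistent with the 15 Hz minimum mentioned above, given a zebrafish heart rate of a few Hz). The function name and the FFT-notch formulation are illustrative assumptions, not the thesis's actual method.

```python
import numpy as np

def notch_heartbeat(trace, fs, f_heart, half_width=0.5):
    """Suppress a narrow band around f_heart (Hz) in a 1-D fluorescence
    time series sampled at fs (Hz), by zeroing the corresponding bins of
    the real FFT. Requires fs > 2 * f_heart so the heartbeat frequency
    is not aliased (cf. the 15 Hz minimum sampling rate)."""
    spectrum = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
    spectrum[np.abs(freqs - f_heart) <= half_width] = 0.0
    return np.fft.irfft(spectrum, n=trace.size)
```

    Zeroing FFT bins is the bluntest possible notch; a practical pipeline might instead use an IIR notch or regress out a measured cardiac reference, but the sampling-rate requirement is the same.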

    Recent Advances in Signal Processing

    Signal processing is a critical component of most new technological inventions and challenges in a variety of applications across both science and engineering. Classical signal processing techniques have largely worked with mathematical models that are linear, local, stationary, and Gaussian, and they have always favored closed-form tractability over real-world accuracy. These constraints were imposed by the lack of powerful computing tools. During the last few decades, signal processing theories, developments, and applications have matured rapidly and now include tools from many areas of mathematics, computer science, physics, and engineering. This book is targeted primarily toward students and researchers who want exposure to a wide variety of signal processing techniques and algorithms. It includes 27 chapters that can be grouped into five areas depending on the application at hand: image processing, speech processing, communication systems, time-series analysis, and educational packages, in that order. The book has the advantage of providing a collection of applications that are completely independent and self-contained; thus, the interested reader can choose any chapter and skip to another without losing continuity.

    Computational Imaging Approach to Recovery of Target Coordinates Using Orbital Sensor Data

    This dissertation addresses the components necessary for simulation of an image-based recovery of the position of a target using orbital image sensors. Each component is considered in detail, focusing on the effect that design choices and system parameters have on the accuracy of the position estimate. Changes in sensor resolution, varying amounts of blur, differences in image noise level, selection of algorithms for each component, and lag introduced by excessive processing time all contribute to the accuracy of the recovered target coordinates. Using physical targets and sensors in this scenario would be cost-prohibitive in the exploratory setting posed; therefore, a simulated target path is generated using Bezier curves, which approximate representative paths followed by the targets of interest. Orbital trajectories for the sensors are designed on an elliptical model representative of the motion of physical orbital sensors. Images from each sensor are simulated based on the position and orientation of the sensor, the position of the target, and the imaging parameters selected for the experiment (resolution, noise level, blur level, etc.). Post-processing of the simulated imagery seeks to reduce noise and blur and to increase resolution. The only information available to a fully implemented system for calculating the target position is the set of sensor position and orientation vectors and the images from each sensor. From these data we develop a reliable method of recovering the target position and analyze the impact on near-real-time processing. We also discuss the influence of adjustments to system components on overall capabilities and address the potential system size, weight, and power requirements of realistic implementation approaches.
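    The Bezier-based path generation mentioned above can be sketched as follows. The control points, dimensionality, and function name here are hypothetical illustrations, not those used in the dissertation; only the standard cubic Bezier formula itself is fixed.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve
        B(t) = (1-t)^3 p0 + 3(1-t)^2 t p1 + 3(1-t) t^2 p2 + t^3 p3
    at parameter values t in [0, 1]. Control points are arrays of shape
    (dim,); t is a 1-D array, so the result has shape (len(t), dim)."""
    t = np.asarray(t, dtype=float)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)
```

    A smooth multi-segment target path would chain several such curves end to end, matching positions and tangents at the joins.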

    Advancing fluorescent contrast agent recovery methods for surgical guidance applications

    Fluorescence-guided surgery (FGS) utilizes fluorescent contrast agents and specialized optical instruments to assist surgeons in intraoperatively identifying tissue-specific characteristics, such as perfusion, malignancy, and molecular function. In doing so, FGS represents a powerful surgical navigation tool for solving clinical challenges not easily addressed by other conventional imaging methods. With growing translational efforts, major hurdles within the FGS field include: insufficient tools for understanding contrast agent uptake behaviors, the inability to image tissue beyond a couple of millimeters, and, lastly, performance limitations of currently approved contrast agents in accurately and rapidly labeling disease. The developments presented within this thesis aim to address these shortcomings. Current preclinical fluorescence imaging tools often sacrifice either 3D scale or spatial resolution. To address this gap in available high-resolution, whole-body preclinical imaging tools, the crux of this work lies in the development of a hyperspectral cryo-imaging system and image-processing techniques that accurately recapitulate high-resolution, 3D biodistributions in whole-animal experiments. Specifically, the goal is to correct each cryo-imaging dataset such that it becomes a useful reporter of whole-body biodistributions in relevant disease models. To explore the potential benefits of seeing deeper during FGS, we investigated short-wave infrared (SWIR) imaging for recovering fluorescence beyond the conventional top few millimeters. Through phantom, preclinical, and clinical SWIR imaging, we were able to 1) validate the capability of SWIR imaging with conventional NIR-I fluorophores, 2) demonstrate the translational benefits of SWIR-ICG angiography in a large animal model, and 3) detect micro-dose levels of an EGFR-targeted NIR-I probe during a Phase 0 clinical trial. Lastly, we evaluated contrast agent performance for FGS glioma resection and breast cancer margin assessment. To evaluate the glioma-labeling performance of untargeted contrast agents, 3D agent biodistributions were compared voxel-by-voxel to gold-standard Gd-MRI and pathology slides. Finally, building on expertise in dual-probe ratiometric imaging at Dartmouth, a 10-patient clinical pilot study was carried out to assess the technique’s efficacy for rapid margin assessment. In summary, this thesis serves to advance FGS by introducing novel fluorescence imaging devices, techniques, and agents which overcome challenges in understanding whole-body agent biodistributions, recovering agent distributions at greater depths, and verifying agents’ performance for specific FGS applications.
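    The dual-probe ratiometric idea referenced above can be sketched at its simplest: under the paired-agent assumption that a targeted and an untargeted probe share nonspecific uptake kinetics, the pixelwise ratio minus one approximates specific binding. This is a hedged illustration only; the normalization steps and the actual model used in the pilot study are not specified in this abstract, and the function name is hypothetical.

```python
import numpy as np

def ratiometric_binding_map(targeted, untargeted, eps=1e-9):
    """Paired-agent ratiometric sketch: pixelwise binding estimate
    BP ~ targeted / untargeted - 1, assuming both images have already
    been normalized so that nonspecific uptake cancels in the ratio.
    eps guards against division by zero in background pixels."""
    return targeted / np.maximum(untargeted, eps) - 1.0
```

    In tissue with no specific binding the two normalized images match and the map is zero; elevated values flag regions where the targeted probe is retained beyond what perfusion alone explains.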

    Computer-aided diagnosis of complications of total hip replacement X-ray images

    Hip replacement surgery has experienced a dramatic evolution in recent years, supported by the latest developments in many areas of technology and surgical procedures. Unfortunately, the complications that follow hip replacement surgery remain the most challenging dilemma faced by both patients and medical experts. This thesis presents a novel approach to segmenting the prosthesis in a total hip replacement (THR) X-ray image using an Active Contour Model (ACM) that is initiated via an automatically detected seed point within the enarthrosis region of the prosthesis. The circular area is detected via a Fast, Randomized Circle Detection Algorithm. Experimental results are provided to compare the performance of the proposed ACM-based approach to popular thresholding-based approaches. Further, an approach to automatically detect the obturator foramen using an ACM is also presented. Based on analysis of how medical experts detect loosening and subsidence of a prosthesis and the presence of infections around the prosthesis area, this thesis presents novel computational analysis concepts to identify the key feature points of the prosthesis that are required to detect all three types of complications. Initially, key points along the prosthesis boundary are determined by measuring the curvature of the prosthesis surface. By traversing the edge pixels, starting from one end of the boundary of a detected prosthesis, curvature values are determined and used to locate key points of the prosthesis surface and their relative positioning. After the key points are detected, pixel value gradients across the boundary of the prosthesis are computed along its length to determine the presence of subsidence, loosening, and infection. Experimental results and analysis show that subsidence is indicated by dark pixels around the convex bend closest to the stem area of the prosthesis and away from it. Loosening is indicated by the additional presence of dark regions just outside the two straight-line edges of the stem area of the prosthesis. Infection is indicated by dark areas around the tip of the stem of the prosthesis. All three complications are thus determined by a single process in which only the detailed analysis differs. The experimental results presented show the effectiveness of all proposed approaches, which are also compared and validated against ground truth recorded manually with expert user input.
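    The curvature measurement along the prosthesis boundary can be sketched with the standard planar-curve formula and finite differences. This is a generic illustration, not the thesis's exact estimator; the function name and the assumption of uniformly spaced boundary samples are mine.

```python
import numpy as np

def boundary_curvature(x, y):
    """Curvature of a planar curve sampled at uniform steps:
        kappa = |x' y'' - y' x''| / (x'^2 + y'^2)^(3/2),
    with derivatives approximated by np.gradient (central differences
    in the interior, one-sided at the ends). Peaks in kappa mark
    candidate key points such as convex bends on a prosthesis outline."""
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    denom = (dx ** 2 + dy ** 2) ** 1.5
    return np.abs(dx * ddy - dy * ddx) / np.maximum(denom, 1e-12)
```

    Thresholding or peak-picking on the returned curvature profile then yields the ordered key points from which the subsidence, loosening, and infection checks proceed.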