156 research outputs found
COMPUTER-AIDED QUANTITATIVE EARLY DIAGNOSIS OF DIABETIC FOOT
Diabetes is an incurable metabolic disease characterized by high blood sugar levels. The feet of people with diabetes are at risk of a variety of pathological consequences, including peripheral vascular disease, deformity, ulceration, and ultimately amputation. The key to managing the diabetic foot is prevention and early detection. Unfortunately, current hospital-centered, reactive diabetes care and the inadequacy of available qualitative diagnostic screening procedures cause physicians to miss the diagnosis in 61% of patients. We have developed a computer-aided diagnostic system for early detection of the diabetic foot. The key idea is that the diabetic foot exhibits significant neuropathic and vascular damage: when a diabetic foot is placed under cold stress, its thermal recovery is much slower, and this recovery speed can serve as a quantitative measure for diagnosing the condition. In our research, the thermal recovery of the feet following cold stress is captured with an infrared camera. The captured infrared video is then filtered, segmented, and registered. The temperature recovery at each point on the foot is extracted and analyzed using a thermal regulation model, and problematic regions are identified. In this thesis, we present our research on the following aspects of the developed computer-aided diagnostic system: subject measurement protocols, a trustworthy numerical model of the camera noise and its parameter estimation, infrared video segmentation, new models of thermal regulation, thermal pattern classification, and our preliminary findings from a small-scale clinical study of about 40 subjects, which demonstrated the potential of the new diagnostic system.
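The per-pixel recovery analysis described in this abstract can be illustrated with a minimal sketch. The first-order exponential recovery model, the parameter names, and the numbers below are illustrative assumptions for demonstration only, not the thesis's actual thermal regulation model:

```python
import numpy as np
from scipy.optimize import curve_fit

def recovery_model(t, t_eq, t0, tau):
    """First-order thermal recovery: exponential approach from t0 to equilibrium t_eq."""
    return t_eq - (t_eq - t0) * np.exp(-t / tau)

def fit_recovery(times, temps):
    """Fit one pixel's recovery curve; tau is the recovery time constant in seconds."""
    p0 = (temps[-1], temps[0], (times[-1] - times[0]) / 3.0)
    popt, _ = curve_fit(recovery_model, times, temps, p0=p0)
    return popt  # (t_eq, t0, tau)

# Synthetic pixel: skin recovering from 20 degC toward 33 degC after cold stress
t = np.linspace(0.0, 600.0, 120)  # seconds
rng = np.random.default_rng(0)
y = recovery_model(t, 33.0, 20.0, 150.0) + 0.05 * rng.normal(size=t.size)
t_eq, t0, tau = fit_recovery(t, y)
# A larger tau (slower recovery) would flag a potentially problematic region.
```

A per-pixel map of the fitted time constant would then be the quantitative input to the classification stage the abstract describes.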
Optical Force Measurements In Concentrated Colloidal Suspensions
This work concerns the construction and testing of an optical tweezers-based
force transducer, and its application to a hard-sphere colloidal system. A
particle in an optical trap forward-scatters a fraction of the trapping light,
which is collected in order to give high-resolution information on the trapped
particle’s position relative to the trap centre. The system is then calibrated
to convert particle displacements to forces. The colloid used in this study is
a density- and refractive-index-matched suspension of PMMA particles, radius
860 ± 70 nm, with volume fractions in the range φ = 40–62%. Passive
microrheological measurements have yielded information about rearrangements
in a tracer’s cage of nearest neighbours, as well as highly localised measurements
of the high-frequency viscosity, where the presence of the colloidal host causes
around a tenfold increase compared to the bare solvent case. Measurements have
also demonstrated the effect of sample history on local short-time self-diffusion
coefficient, with perturbations caused by translating a particle within the sample
taking up to an hour to relax in a φ = 58% sample. The high resolution particle
tracking offered by this technique has also allowed for the first measurement of
structure at a shorter lengthscale than the ‘dynamic cage size’ observed using
other experimental techniques. In addition, active measurements have shown the
emergence of a yield stress on the order of 5 Pa as the volume fraction approaches
the glass transition at φ ≈ 58%.
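The calibration step mentioned above, converting particle displacements into forces, is commonly performed via the equipartition theorem. The following is a generic sketch of that standard method under assumed numbers (the position trace and temperature are synthetic), not the calibration procedure actually used in this thesis:

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def trap_stiffness(positions_m, temperature_k=298.0):
    """Equipartition calibration: k = k_B * T / Var(x)."""
    return K_B * temperature_k / np.var(positions_m)

def displacement_to_force(positions_m, stiffness):
    """Within the linear (Hookean) region of the trap, F = -k * x."""
    return -stiffness * (positions_m - np.mean(positions_m))

# Synthetic thermal trace: ~10 nm RMS fluctuations of a trapped particle
rng = np.random.default_rng(1)
x = rng.normal(0.0, 10e-9, 100_000)  # positions in metres
k = trap_stiffness(x)                # ~4.1e-5 N/m, i.e. ~41 pN/um
```

With the stiffness known, the high-resolution position signal from the forward-scattered light maps directly onto sub-piconewton forces.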
Colour depth-from-defocus incorporating experimental point spread function measurements
Depth-From-Defocus (DFD) is a monocular computer vision technique for creating
depth maps from two images taken on the same optical axis with different intrinsic camera
parameters. A pre-processing stage for optimally converting colour images to monochrome
using a linear combination of the colour planes has been shown to improve the
accuracy of the depth map. It was found that the first component formed using Principal
Component Analysis (PCA) and a technique to maximise the signal-to-noise ratio (SNR)
performed better than using an equal weighting of the colour planes with an additive noise
model. When the noise is non-isotropic, maximising the SNR reduced the Mean Square
Error (MSE) of the depth map by a factor of 7.8 compared with equal weighting and
1.9 compared with PCA. The fractal dimension (FD) of a monochrome image gives a
measure of its roughness, and an algorithm was devised to maximise the FD through
colour mixing. However, the formulation using a fractional Brownian motion (fBm)
model reduced the SNR and thus produced depth maps less accurate than those from
PCA or equal weighting. An active DFD algorithm, called Localisation through Colour
Mixing (LCM), has been developed to reduce the image overlap problem; it uses a
projected colour pattern. Simulation results showed that LCM produces an MSE 9.4
times lower than equal weighting and 2.2 times lower than PCA.
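The PCA-based colour-to-monochrome conversion discussed above amounts to projecting each pixel onto the first principal component of the image's colour distribution. This is a generic illustration of that standard projection (the toy image is an assumption), not the thesis's exact pre-processing stage:

```python
import numpy as np

def pca_monochrome(rgb):
    """Project an RGB image onto the first principal component of its colour
    distribution, i.e. the linear combination of colour planes with the
    largest variance (a common proxy for signal content)."""
    pixels = rgb.reshape(-1, 3).astype(float)
    centred = pixels - pixels.mean(axis=0)
    cov = np.cov(centred, rowvar=False)        # 3x3 colour covariance
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
    w = eigvecs[:, np.argmax(eigvals)]         # first principal component
    return (centred @ w).reshape(rgb.shape[:2])

# Toy image whose variance lives mostly in the red channel
rng = np.random.default_rng(2)
img = np.stack([rng.normal(128, 40, (16, 16)),
                rng.normal(128, 5, (16, 16)),
                rng.normal(128, 5, (16, 16))], axis=-1)
mono = pca_monochrome(img)
```

The projection keeps the colour plane combination with maximum variance, which is why it tends to beat an equal weighting when one channel carries most of the texture.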
The Point Spread Function (PSF) of a camera system models how a point source of
light is imaged. For depth maps to be accurately created using DFD a high-precision PSF
must be known. Improvements to a sub-sampled, knife-edge-based technique are
presented that account for non-uniform illumination of the light box, reducing the
MSE by 25%. The Generalised Gaussian is presented as a model of the PSF and shown
to fit up to 16 times better than the conventional Gaussian and pillbox models.
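The Generalised Gaussian PSF family interpolates between the two conventional models: a shape exponent of 2 recovers the ordinary Gaussian, while a large exponent tends toward the pillbox (uniform disc). A minimal sketch, with grid size and parameter values chosen purely for illustration:

```python
import numpy as np

def generalised_gaussian_psf(size, alpha, beta):
    """Radially symmetric generalised Gaussian PSF: exp(-(r/alpha)**beta).

    beta = 2 gives the ordinary Gaussian; beta -> infinity approaches a
    pillbox of radius alpha. Normalised to unit sum.
    """
    coords = np.arange(size) - (size - 1) / 2.0
    xx, yy = np.meshgrid(coords, coords)
    r = np.hypot(xx, yy)
    psf = np.exp(-(r / alpha) ** beta)
    return psf / psf.sum()

gauss  = generalised_gaussian_psf(31, alpha=4.0, beta=2.0)   # ordinary Gaussian
heavy  = generalised_gaussian_psf(31, alpha=4.0, beta=1.0)   # heavier tails
boxish = generalised_gaussian_psf(31, alpha=4.0, beta=12.0)  # near-pillbox
```

Fitting beta per camera, rather than fixing it at either extreme, is what lets a single model outperform both conventional choices.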
Intelligent detectors
This thesis provides a basis for the development of on-board software for astronomical satellites. It serves as a guide and reference work, showing through the Herschel/PACS and SPICA/SAFARI projects how space-qualified flight software is built from first principles. The key elements in the development are an understanding of the scientific purpose (what is to be measured, how, and to what end), knowledge of the physical properties of the detector, a command of the mathematical operations involved in data processing, and consideration of the technical and observational circumstances under which the detector is operated.
Digital Image Processing
Newspapers and the popular scientific press today publish many examples of highly impressive images. These images range, for example, from those showing regions of star birth in the distant Universe to the extent of the stratospheric ozone depletion over Antarctica in springtime, and to those regions of the human brain affected by Alzheimer’s disease. Processed digitally to generate spectacular images, often in false colour, they all make an immediate and deep impact on the viewer’s imagination and understanding.
Professor Jonathan Blackledge’s erudite but very useful new treatise Digital Image Processing: Mathematical and Computational Methods explains both the underlying theory and the techniques used to produce such images in considerable detail. It also provides many valuable example problems - and their solutions - so that the reader can test his/her grasp of the physical, mathematical and numerical aspects of the particular topics and methods discussed. As such, this magnum opus complements the author’s earlier work Digital Signal Processing. Both books are a wonderful resource for students who wish to make their careers in this fascinating and rapidly developing field which has an ever increasing number of areas of application.
The strengths of this large book lie in:
• an excellent explanatory introduction to the subject;
• thorough treatment of the theoretical foundations, dealing with both electromagnetic and acoustic wave scattering and allied techniques;
• comprehensive discussion of all the basic principles, the mathematical transforms (e.g. the Fourier and Radon transforms), their interrelationships and, in particular, Born scattering theory and its application to imaging systems modelling;
• detailed discussion, including the assumptions and limitations, of optical imaging, seismic imaging, medical imaging (using ultrasound), X-ray computer-aided tomography, tomography when the wavelength of the probing radiation is of the same order as the dimensions of the scatterer, Synthetic Aperture Radar (airborne or spaceborne), digital watermarking and holography;
• detail devoted to the methods of implementation of the analytical schemes in various case studies and also as numerical packages (especially in C/C++);
• coverage of deconvolution, de-blurring (or sharpening) an image, maximum entropy techniques, Bayesian estimators, techniques for enhancing the dynamic range of an image, methods of filtering images and techniques for noise reduction;
• discussion of thresholding, techniques for detecting edges in an image and for contrast stretching, stochastic scattering (random walk models) and models for characterizing an image statistically;
• investigation of fractal images, fractal dimension segmentation, image texture, the coding and storing of large quantities of data, and image compression such as JPEG;
• a valuable summary of the important results obtained in each chapter, given at its end;
• suggestions for further reading at the end of each chapter.
I warmly commend this text to all readers, and trust that they will find it to be invaluable.
Professor Michael J Rycroft Visiting Professor at the International Space University, Strasbourg, France, and at Cranfield University, England
Holographic Fourier domain diffuse correlation spectroscopy
Diffuse correlation spectroscopy (DCS) is a non-invasive optical modality which can be used to measure cerebral blood flow (CBF) in real-time. It has important potential applications in clinical monitoring, as well as in neuroscience and the development of a non-invasive brain-computer interface. However, a trade-off exists between the signal-to-noise ratio (SNR) and imaging depth, and thus CBF sensitivity, of this technique. Additionally, as DCS is a diffuse optical technique, it is limited by a lack of inherent depth discrimination within the illuminated region of each source-detector pair, and the CBF signal is therefore also prone to contamination by the extracerebral tissues which the light traverses.
Placing a particular emphasis on scalability, affordability, and robustness to ambient light, in this work I demonstrate a novel approach which fuses the fields of digital holography and DCS: holographic Fourier domain DCS (FD-DCS). The mathematical formalism of FD-DCS is derived and validated, followed by the construction and validation (for both in vitro and in vivo experiments) of a holographic FD-DCS instrument. By undertaking a systematic SNR performance assessment and developing a novel multispeckle denoising algorithm, I demonstrate the highest SNR gain reported in the DCS literature to date, achieved using scalable and low-cost camera-based detection.
With a view to generating a forward model for holographic FD-DCS, in this thesis I propose a novel framework to simulate statistically accurate time-integrated dynamic speckle patterns in biomedical optics. The solution that I propose to this previously unsolved problem is based on the Karhunen-Loève expansion of the electric field, and I validate this technique against novel expressions for speckle contrast for different forms of homogeneous field. I also show that this method can readily be extended to cases with spatially varying sample properties, and that it can also be used to model optical and acoustic parameters
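DCS extracts flow information from the temporal autocorrelation of the detected speckle intensity. As a generic illustration only (not the holographic FD-DCS formalism derived in this thesis), the normalised intensity autocorrelation g2 can be estimated from an intensity trace; for a complex Gaussian field the Siegert relation g2 = 1 + |g1|² makes it decay from about 2 toward 1 on the field decorrelation timescale. The synthetic trace below is an assumed AR(1) speckle model:

```python
import numpy as np

def g2(intensity, max_lag):
    """Normalised intensity autocorrelation g2(lag) = <I(t) I(t+lag)> / <I>^2."""
    i = np.asarray(intensity, dtype=float)
    norm = i.mean() ** 2
    out = np.empty(max_lag)
    for lag in range(max_lag):
        a = i if lag == 0 else i[:-lag]
        b = i if lag == 0 else i[lag:]
        out[lag] = np.mean(a * b) / norm
    return out

# Synthetic speckle: complex Gaussian field with exponential decorrelation
# (AR(1) process, field correlation time tau_c = 20 samples); intensity I = |E|^2.
rng = np.random.default_rng(3)
n, tau_c = 200_000, 20.0
rho = np.exp(-1.0 / tau_c)
drive = (rng.normal(size=n) + 1j * rng.normal(size=n)) * np.sqrt(1.0 - rho**2)
e = np.empty(n, dtype=complex)
e[0] = rng.normal() + 1j * rng.normal()
for k in range(1, n):
    e[k] = rho * e[k - 1] + drive[k]
curve = g2(np.abs(e) ** 2, max_lag=100)
# Faster flow shortens the decorrelation time, steepening the decay of g2.
```

In practice the decay rate of this curve is fitted against a correlation diffusion model to yield the blood flow index.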
Information Extraction and Modeling from Remote Sensing Images: Application to the Enhancement of Digital Elevation Models
To deal with highly complex data such as remote sensing images with metric resolution over large areas, an innovative, fast, and robust image processing system is presented.
The modeling of increasing levels of information is used to extract, represent, and link image features to semantic content.
The potential of the proposed techniques is demonstrated with an application that enhances and regularizes digital elevation models based on information collected from remote sensing images.
Creating the Future: Research and Technology
With its many different technical talents, Marshall Space Flight Center (MSFC) continues to be an important force behind many scientific breakthroughs. MSFC's annual report reviews technology developments, research in space and microgravity sciences, studies in space system concepts, and technology transfer. The technology development programs include work in: (1) space propulsion and fluid management, (2) structures and dynamics, (3) materials and processes, and (4) avionics and optics.
30 Years (1977-2007): Centro de Investigaciones Ópticas (CIOp)
The publication of this book was financed in part by the Comisión de Investigaciones Científicas de la Provincia de Buenos Aires.
Entropy in Image Analysis III
Image analysis can be applied to rich and assorted scenarios; the aim of this recent research field is therefore not only to mimic the human vision system. Image analysis is among the main methods that computers use today, and there is a body of knowledge that they will be able to manage in a totally unsupervised manner in the future, thanks to artificial intelligence. The articles published in this book clearly show such a future.
- …