
    Quantitative lung CT analysis for the study and diagnosis of Chronic Obstructive Pulmonary Disease

    The importance of medical imaging in the research of Chronic Obstructive Pulmonary Disease (COPD) has risen over the last decades. COPD affects the pulmonary system through two competing mechanisms: emphysema and small airways disease. The relative contribution of each component varies widely across patients, and each can also evolve regionally in the lung. Patients can also be susceptible to exacerbations, which can dramatically accelerate lung function decline. Diagnosis of COPD is based on lung function tests, which measure airflow limitation. There is a growing consensus that this is inadequate in view of the complexities of COPD. Computed Tomography (CT) facilitates direct quantification of the pathological changes that lead to airflow limitation and can add to our understanding of the disease progression of COPD. There is a need to better capture lung pathophysiology whilst understanding regional aspects of disease progression. This has motivated the work presented in this thesis. Two novel methods are proposed to quantify the severity of COPD from CT by analysing the global distribution of features sampled locally in the lung. They can be exploited in the classification of lung CT images or to uncover potential trajectories of disease progression. A novel lobe segmentation algorithm is presented that is based on a probabilistic segmentation of the fissures whilst also constructing a groupwise fissure prior. In combination with the local sampling methods, a pipeline of analysis was developed that permits a regional analysis of lung disease. This was applied to study exacerbation-susceptible COPD. Lastly, the applicability of performing disease progression modelling to study COPD has been shown. Two main subgroups of COPD were found, which are consistent with current clinical knowledge of COPD subtypes.
This research may facilitate precise phenotypic characterisation of COPD from CT, which will increase our understanding of its natural history and associated heterogeneities. This will be instrumental in the precision medicine of COPD.
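The idea of summarising locally sampled features by their global distribution can be illustrated with a toy sketch. This is not the thesis's actual method: the local feature here (the fraction of voxels below -950 HU, a common emphysema proxy), the patch size, and the bin count are all illustrative assumptions. The histogram over all patches becomes one global feature vector that a classifier could consume.

```python
# Illustrative sketch: local features -> global distribution descriptor.
def patch_emphysema_fraction(patch, threshold=-950):
    """Fraction of voxels in a patch below the attenuation threshold (HU)."""
    return sum(1 for v in patch if v < threshold) / len(patch)

def global_histogram(patches, bins=4):
    """Histogram of local feature values -> one global feature vector."""
    counts = [0] * bins
    for p in patches:
        f = patch_emphysema_fraction(p)
        counts[min(int(f * bins), bins - 1)] += 1
    return [c / len(patches) for c in counts]

# Toy example: two "patches" of CT attenuation values (HU)
patches = [
    [-980, -960, -900, -850],   # partly emphysematous
    [-700, -650, -600, -500],   # normal parenchyma
]
print(global_histogram(patches))  # [0.5, 0.0, 0.5, 0.0]
```

The same pattern generalises to any local feature: replace `patch_emphysema_fraction` with a texture or airway measure and the global descriptor changes accordingly.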

    Methods for Analysing Endothelial Cell Shape and Behaviour in Relation to the Focal Nature of Atherosclerosis

    The aim of this thesis is to develop automated methods for the analysis of the spatial patterns and functional behaviour of endothelial cells, viewed under microscopy, with applications to the understanding of atherosclerosis. Initially, a radial search approach to segmentation was attempted in order to trace the cell and nuclei boundaries using a maximum likelihood algorithm; it was found inadequate to detect the weak cell boundaries present in the available data. A parametric cell shape model was then introduced to fit an equivalent ellipse to the cell boundary by matching phase-invariant orientation fields of the image and a candidate cell shape. This approach succeeded on good quality images, but failed on images with weak cell boundaries. Finally, a support vector machine-based method, relying on a rich set of visual features and a small but high-quality training dataset, was found to work well on large numbers of cells even in the presence of strong intensity variations and imaging noise. Using the segmentation results, several standard shear-stress-dependent parameters of cell morphology were studied, and evidence of similar behaviour in some shape parameters was obtained for cells in vivo and their nuclei. Nuclear and cell orientations around immature and mature aortas were broadly similar, suggesting that the pattern of flow direction near the wall stays approximately constant with age. The relation was less strong for the cell and nuclear length-to-width ratios. Two novel shape analysis approaches were attempted to find other properties of cell shape that could be used to annotate or characterise patterns, since a wide variability in cell and nuclear shapes was observed which did not appear to fit the standard parameterisations. Although no firm conclusions can yet be drawn, the work lays the foundation for future studies of cell morphology.
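As a hedged sketch of the final classification approach (the thesis relied on a rich visual feature set; the two toy features, labels and hyperparameters below are invented for illustration), a linear SVM can be trained by sub-gradient descent on the regularised hinge loss:

```python
# Minimal linear SVM via sub-gradient descent on the regularised hinge
# loss (Pegasos-style). Toy stand-in: two hypothetical per-pixel visual
# features, labels +1 (cell boundary) / -1 (background).
def train_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w[0]*xi[0] + w[1]*xi[1] + b) < 1:
                # inside the margin: hinge gradient plus L2 shrinkage
                w = [wj + lr * (yi*xj - lam*wj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:
                # correctly classified with margin: shrinkage only
                w = [wj * (1 - lr*lam) for wj in w]
    return w, b

def predict(w, b, x):
    return 1 if w[0]*x[0] + w[1]*x[1] + b >= 0 else -1

X = [(1.0, 0.9), (0.9, 1.0), (-1.0, -0.9), (-0.9, -1.0)]  # toy features
y = [1, 1, -1, -1]
w, b = train_svm(X, y)
```

In practice an off-the-shelf SVM implementation with a non-linear kernel would be used; the sketch only shows why a max-margin classifier tolerates noisy intensities better than a hand-tuned boundary tracer.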
To draw inferences about patterns in the functional response of cells to flow, which may play a role in the progression of disease, single-cell analysis was performed using calcium-sensitive fluorescence probes. Calcium transient rates were found to change with flow, but more importantly, local patterns of synchronisation in multi-cellular groups were discernible and appeared to change with flow. The patterns suggest a new functional mechanism in flow-mediation of cell-cell calcium signalling.
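Synchronisation between the calcium traces of neighbouring cells can be quantified in several ways; one minimal, illustrative choice (not necessarily the measure used in the thesis, and the traces below are hypothetical) is the Pearson correlation between pairs of traces:

```python
# Pairwise Pearson correlation as a toy synchronisation measure
# between single-cell calcium fluorescence traces.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length traces."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Two in-phase traces and one anti-phase trace (invented numbers)
cell_a = [1.0, 2.0, 3.0, 2.0, 1.0]
cell_b = [1.1, 2.1, 2.9, 2.2, 0.9]
cell_c = [3.0, 2.0, 1.0, 2.0, 3.0]
print(round(pearson(cell_a, cell_b), 2))  # 0.99 (synchronised)
print(round(pearson(cell_a, cell_c), 2))  # -1.0 (anti-phase)
```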

    Development, Implementation and Pre-clinical Evaluation of Medical Image Computing Tools in Support of Computer-aided Diagnosis: Respiratory, Orthopedic and Cardiac Applications

    Over the last decade, image processing tools have become crucial components of all clinical and research efforts involving medical imaging and associated applications. The imaging data available to radiologists continue to increase their workload, raising the need for efficient identification and visualization of the image data required for clinical assessment. Computer-aided diagnosis (CAD) in medical imaging has evolved in response to the need for techniques that can assist radiologists to increase throughput while reducing human error and bias, without compromising the outcome of screening, diagnosis or disease assessment. More intelligent, yet simple, consistent and less time-consuming methods will become more widespread, reducing user variability while also revealing information in a clearer, more visual way. Several routine image processing approaches, including localization, segmentation, registration, and fusion, are critical for enabling the development of CAD techniques. However, changes in clinical workflow require significant adjustments and re-training, and, despite the efforts of the academic research community to develop state-of-the-art algorithms and high-performance techniques, their footprint often hampers their clinical use. Currently, the main challenge seems to be not the lack of tools and techniques for medical image processing, analysis, and computing, but rather the lack of clinically feasible solutions that leverage the already developed tools and techniques, as well as the lack of demonstrations of the potential clinical impact of such tools. Recently, more and more effort has been dedicated to devising new algorithms for localization, segmentation or registration, while their intended clinical use and actual utility are dwarfed by scientific, algorithmic and developmental novelty that results only in incremental improvements over existing algorithms.
In this thesis, we propose and demonstrate the implementation and evaluation of several methodological guidelines that guide the development of image processing tools (localization, segmentation and registration) and illustrate their use across several medical imaging modalities (X-ray, computed tomography, ultrasound and magnetic resonance imaging) and several clinical applications: lung CT image registration in support of the assessment of pulmonary nodule growth rate and disease progression from thoracic CT images; automated reconstruction of standing X-ray panoramas from multi-sector X-ray images for assessment of long-limb mechanical axis and knee misalignment; and left and right ventricle localization, segmentation, reconstruction and ejection fraction measurement from cine cardiac MRI or multi-plane trans-esophageal ultrasound images for cardiac function assessment. When devising and evaluating the developed tools, we use clinical patient data to illustrate the inherent challenges associated with highly variable imaging data that need to be addressed before pre-clinical validation and implementation. The proposed methodological guidelines ensure the development of image processing tools that are sufficiently reliable to address the clinical needs and sufficiently streamlined to be translated into eventual clinical tools, provided proper implementation. G1: Reduce the number of degrees of freedom (DOF) of the designed tool, for example by avoiding inefficient non-rigid image registration methods; this reduces complexity and addresses the risk of artificial deformation during registration.
G2: Use shape-based features to represent the image content most efficiently, for example edges instead of, or in addition to, intensities and motion; edges capture the most salient information in the image, so this guideline ensures more robust performance when key image information is missing. G3: Implement the method efficiently, minimising the number of steps required and avoiding the recalculation of terms that only need to be computed once in an iterative process; an efficient implementation reduces computational effort and improves performance. G4: Commence the workflow with an optimized initialization and converge gradually toward the final acceptable result; this yields consistent outcomes, avoids convergence to local minima, and gradually ensures convergence to the global solution. These guidelines lead to interactive, semi-automated or fully-automated approaches that still allow clinicians to perform final refinements, while reducing inter- and intra-observer variability and ambiguity, increasing accuracy and precision, and helping to provide a more consistent diagnosis in a timely fashion.
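Guidelines G1, G2 and G4 can be sketched together in a deliberately simplified 1-D registration (illustrative only; the thesis applications are 2-D/3-D and the signals below are invented): a single-DOF translation (G1) is estimated on gradient, i.e. edge, signals (G2) by an exhaustive coarse search over that one parameter, which cannot get trapped in a local minimum (G4).

```python
# Toy single-DOF registration on edge signals.
def gradient(signal):
    """Simple edge feature: forward differences (G2)."""
    return [b - a for a, b in zip(signal, signal[1:])]

def ssd(a, b):
    """Sum of squared differences between two equal-length signals."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def shifted(sig, s, n):
    """Shift a signal by s samples, zero-padded, truncated to length n."""
    padded = [0.0] * max(s, 0) + sig
    padded = padded[max(-s, 0):]
    return (padded + [0.0] * n)[:n]

def register_shift(fixed, moving, max_shift=4):
    """Exhaustive search over the single translation DOF (G1, G4)."""
    gf, gm = gradient(fixed), gradient(moving)
    n = len(gf)
    return min(range(-max_shift, max_shift + 1),
               key=lambda s: ssd(gf, shifted(gm, s, n)))

fixed  = [0.0] * 5 + [1.0] * 5   # edge at index 5
moving = [0.0] * 3 + [1.0] * 7   # same edge, two samples earlier
print(register_shift(fixed, moving))  # 2
```

With one parameter the search space is tiny, so the global optimum is found directly; in higher dimensions the same principle motivates a coarse initialization followed by local refinement.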

    Finite element and mechanobiological modelling of vascular devices

    There are two main surgical treatments for vascular diseases: (i) percutaneous stent deployment and (ii) replacement of an atherosclerotic artery with a vascular graft or tissue-engineered blood vessel. The aim of this thesis was to develop computational models that could assist in the design of vascular stents and tissue-engineered vascular grafts and scaffolds. In this context, finite element (FE) models of stent expansion in idealised and patient-specific models of atherosclerotic arteries were developed. Different modelling strategies were investigated and an optimal modelling approach was identified which minimised computational cost without compromising accuracy. Numerical models of thin- and thick-strut stents were developed using this modelling approach to replicate the ISAR-STEREO clinical trial, and the models identified arterial stresses as a suitable measure of stent-induced vascular injury. In terms of evaluating vascular graft performance, mechanical characterisation experiments can be conducted in order to develop constitutive models that can be used in FE models of vascular grafts to predict their mechanical behaviour in situ. In this context, bacterial cellulose (BC), a novel biomaterial, was mechanically characterised and a constitutive model was developed to describe its mechanical response. In addition, the interaction of smooth muscle cells with BC was studied using cell culture experiments. The constitutive model developed for BC was used as an input for a novel multi-scale mechanobiological modelling framework. The mechanobiological model was developed by coupling an FE model of a vascular scaffold with a lattice-free agent-based model of cell growth dynamics and remodelling in vascular scaffolds. By comparison with published in vivo and in vitro studies, the model was found to successfully capture the key characteristics of vascular remodelling.
It can therefore be used as a predictive tool for the growth and remodelling of vascular scaffolds and grafts.
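A minimal sketch of the lattice-free agent-based ingredient (a hypothetical rule with invented parameters, not the published framework) is a density-dependent proliferation step: each cell divides only while its local neighbourhood is under-crowded. Rules of this kind are what such models couple to FE-derived mechanical stimuli.

```python
# Toy lattice-free agent-based growth step: cells are points in 2-D,
# and a cell spawns a nearby daughter only if under-crowded.
import random

def step(cells, radius=1.0, max_neighbours=3, seed=0):
    """One growth step over a list of (x, y) cell positions."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    new_cells = list(cells)
    for (x, y) in cells:
        neighbours = sum(
            1 for (u, v) in cells
            if (u, v) != (x, y) and (u - x) ** 2 + (v - y) ** 2 <= radius ** 2
        )
        if neighbours < max_neighbours:
            # division: daughter placed at a small random offset
            new_cells.append((x + rng.uniform(-0.5, 0.5),
                              y + rng.uniform(-0.5, 0.5)))
    return new_cells

print(len(step([(0.0, 0.0)])))  # 2: an isolated cell divides
```

Iterating `step` produces logistic-like colonisation of a scaffold region; in the actual framework the division rule would also depend on the local mechanical environment computed by the FE model.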

    Effect of curing conditions and harvesting stage of maturity on Ethiopian onion bulb drying properties

    The study was conducted to investigate the impact of curing conditions and harvesting stages on the drying quality of onion bulbs. The onion bulbs (Bombay Red cultivar) were harvested at three harvesting stages (early, optimum, and late maturity) and cured at three temperatures (30, 40 and 50 °C) and three relative humidity (RH) levels (30, 50 and 70%). The results revealed that curing temperature, RH, and maturity stage had significant effects on all measured attributes except total soluble solids.
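Significance claims of this kind typically rest on a factorial ANOVA. As a hedged illustration with invented replicate values (not the study's data), the one-way F-statistic for a single factor such as curing temperature can be computed as:

```python
# One-way ANOVA F-statistic: between-group vs within-group variance.
from statistics import mean

def one_way_f(groups):
    """F = (SS_between / df_between) / (SS_within / df_within)."""
    k = len(groups)                      # number of factor levels
    n = sum(len(g) for g in groups)      # total observations
    grand = mean(v for g in groups for v in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Toy moisture-loss (%) replicates at 30, 40 and 50 °C (invented)
f_stat = one_way_f([[10.1, 10.4, 9.9],
                    [12.0, 12.3, 11.8],
                    [14.2, 14.0, 14.5]])
print(round(f_stat, 1))  # a large F indicates a strong temperature effect
```

A full analysis of the three-factor design would extend this to main effects and interactions for temperature, RH and maturity stage.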

    Numerical modelling of additive manufacturing process for stainless steel tension testing samples

    Nowadays, additive manufacturing (AM) technologies, including 3D printing, are growing rapidly and are expected to replace conventional subtractive manufacturing technologies to some extent. During selective laser melting (SLM), one of the most popular AM technologies for metals, a large amount of heat is required to melt metal powders, which leads to distortion and/or shrinkage of additively manufactured parts. It is therefore useful to predict these effects before printing so that unwanted distortion and shrinkage can be controlled. This study develops a two-phase numerical modelling and simulation process of the AM process for 17-4PH stainless steel, and it considers the importance of post-processing and the need for calibration to achieve a high-quality print. Using the proposed AM modelling and simulation process, optimal process parameters, material properties, and topology can be obtained to ensure a part is 3D printed successfully.
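As a first-order illustration of why such predictions matter (the thesis uses a full thermo-mechanical simulation, not this formula; the shrinkage value below is an invented calibration number), a measured linear shrinkage factor can be used to pre-scale a nominal dimension so the cooled part meets specification:

```python
# First-order shrinkage compensation for an SLM-printed dimension.
def compensated_dimension(nominal_mm, shrinkage_fraction):
    """Pre-scale a nominal dimension to offset predicted linear shrinkage."""
    return nominal_mm / (1.0 - shrinkage_fraction)

# Hypothetical calibration: 0.8 % linear shrinkage measured on coupons
print(round(compensated_dimension(50.0, 0.008), 3))  # 50.403 (mm)
```

Real distortion fields are anisotropic and geometry-dependent, which is exactly why a calibrated numerical model is needed rather than a single scalar factor.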

    Functional respiratory imaging : opening the black box

    In respiratory medicine, several quantitative measurement tools exist that assist clinicians in their diagnosis. The main issues with these traditional techniques are that they lack sensitivity to detect changes and that the variation between different measurements is very high. The result is that the development of respiratory drugs is the most expensive of all drug development. This limits innovation, resulting in an unmet need for sensitive, quantifiable outcome parameters in pharmacological development and clinical respiratory practice. In this thesis, functional respiratory imaging (FRI) is proposed as a tool to tackle these issues. FRI is a workflow in which patient-specific medical images are combined with computational fluid dynamics to give patient-specific local information on anatomy and functionality in the respiratory system. A robust, high-throughput automation system is designed in order to obtain a workflow that is of high quality, consistent and fast. This makes it possible to apply the technology to large datasets, as typically seen in clinical trials. FRI was performed on 486 unique geometries of patients with various pathologies such as asthma, chronic obstructive lung disease, sleep apnea and cystic fibrosis. This thesis shows that FRI can add value in multiple research domains. The high sensitivity and specificity of FRI make it well suited as a tool for making decisions early in the development process of a device or drug. Furthermore, FRI also seems to be an interesting technology for gaining better insight into rare diseases and may be useful in personalized medicine.
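The anatomy-to-function link that FRI exploits can be illustrated in a deliberately simplified way (actual FRI solves full computational fluid dynamics on patient-specific geometries) by the Poiseuille resistance of a cylindrical airway segment, where resistance scales with the inverse fourth power of the radius:

```python
# Laminar (Poiseuille) flow resistance of a cylindrical airway:
# R = 8 * mu * L / (pi * r^4), so R is extremely sensitive to radius.
from math import pi

def poiseuille_resistance(radius_m, length_m, viscosity=1.8e-5):
    """Resistance in Pa*s/m^3; default viscosity ~ air at room temperature."""
    return 8 * viscosity * length_m / (pi * radius_m ** 4)

baseline = poiseuille_resistance(0.002, 0.02)   # 2 mm radius airway
narrowed = poiseuille_resistance(0.001, 0.02)   # same airway, radius halved
print(narrowed / baseline)  # ~16-fold increase in resistance
```

This radius-to-the-fourth sensitivity is why image-derived local geometry carries so much functional information, and why whole-lung function tests can miss regional changes that imaging resolves.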

    Brain and Human Body Modeling 2020

    This open access book describes modern applications of computational human modeling in an effort to advance neurology, cancer treatment, and radio-frequency studies, including regulatory, safety, and wireless communication fields. Readers working on any application that may expose human subjects to electromagnetic radiation will benefit from this book's coverage of the latest models and techniques available to assess a given technology's safety and efficacy in a timely and efficient manner. The book describes computational human body phantom construction and application; explains new practices in computational human body modeling for electromagnetic safety and exposure evaluations; and includes a survey of modern applications for which computational human phantoms are critical.