
    EQUALISATION TECHNIQUES FOR MULTI-LEVEL DIGITAL MAGNETIC RECORDING

    A large amount of research has been devoted to signal processing, medium design, head and servo-mechanism design, and coding for conventional longitudinal as well as perpendicular magnetic recording. This work presents further investigation into the signal processing and coding aspects of longitudinal and perpendicular digital magnetic recording. The work presented in this thesis is based upon numerical analysis using various simulation methods, with the simulation models implemented in C/C++. Important results based upon bit error rate calculations are documented in this thesis. This work presents a newly designed Asymmetric Decoder (AD), modified to take jitter noise into account, and shows that it performs better than classical BCJR decoders when used with Error Correction Codes (ECC). A new method of designing a Generalised Partial Response (GPR) target and its equaliser is discussed and implemented, based on maximising the ratio of the minimum squared Euclidean distance of the PR target to the noise penalty introduced by the Partial Response (PR) filter. The results show that the newly designed GPR targets consistently outperform various previously published GPR targets. Two complementary methods of equalisation are discussed: the industry-standard PR equalisation and a novel Soft-Feedback-Equalisation (SFE). The work on SFE, a novel contribution of this thesis, was motivated by the problems of Inter-Symbol Interference (ISI) and noise colouration in PR equalisation. This work also shows that multi-level SFE-based magnetic recording with MAP/BCJR feedback and ECC performs similarly to high-density binary PR-based magnetic recording with ECC, thus documenting the benefits of multi-level magnetic recording. It is shown that 4-level PR-based magnetic recording with ECC at half the density of binary PR-based magnetic recording achieves similar performance with twice the packing density. A novel technique of combining SFE and PR equalisation to achieve the best ISI cancellation in an iterative fashion is discussed. A consistent gain of 0.5 dB or more is achieved when this technique is investigated with the application of Maximum Transition Run (MTR) codes. As the length of the PR target in PR equalisation increases, the gain achieved using this novel technique consistently increases, reaching up to 1.2 dB for the EEPR4 target at a bit error rate of 10⁻⁵.
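
    As a hedged illustration of the equaliser-design step discussed above (a minimal least-squares sketch, not the thesis's distance-to-noise-penalty criterion), the snippet below shapes a toy Lorentzian channel response towards a fixed PR target. The channel model, tap lengths, and PR4-like target coefficients are all illustrative assumptions:

    # Minimal sketch: least-squares design of a PR equaliser that shapes a
    # simple magnetic-recording channel response towards a fixed target.
    import numpy as np

    # Illustrative dibit response of a longitudinal channel (Lorentzian pulses).
    def lorentzian_dibit(pw50, n_taps=21):
        t = np.arange(n_taps) - n_taps // 2
        step = 1.0 / (1.0 + (2.0 * t / pw50) ** 2)      # isolated transition
        return step - np.roll(step, 1)                   # dibit = difference

    h = lorentzian_dibit(pw50=2.5)
    target = np.array([1.0, 2.0, 1.0])                   # PR4-like target (assumed)

    # Least-squares problem: find equaliser f so that (h * f) approximates target.
    n_f = 15                                             # equaliser length (assumed)
    full = len(h) + n_f - 1
    H = np.zeros((full, n_f))
    for i in range(n_f):
        H[i:i + len(h), i] = h                           # convolution matrix
    d = np.zeros(full)
    delay = (full - len(target)) // 2                    # centre the target
    d[delay:delay + len(target)] = target

    f, *_ = np.linalg.lstsq(H, d, rcond=None)
    print("equalised response:", np.round(H @ f, 3))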

    AN AUTOMATED DENTAL CARIES DETECTION AND SCORING SYSTEM FOR OPTIC IMAGES OF TOOTH OCCLUSAL SURFACE

    Dental caries is one of the most prevalent chronic diseases: worldwide, 60 to 90 percent of school children and nearly 100 percent of adults have experienced it. The management of dental caries demands detection of carious lesions at early stages. Research into diagnostic tools for caries has been at its peak for the last decade. This research aims to design an automated system to detect and score dental caries according to the International Caries Detection and Assessment System (ICDAS) guidelines using optical images of the occlusal tooth surface. Numerous works have addressed the problem of caries detection using new imaging technologies or advanced measurements; however, no study has attempted to detect and score caries using optical images of the tooth surface. The aim of this dissertation is to develop image processing and machine learning algorithms that detect and score caries from such optical images.
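
    The dissertation's actual algorithms are not reproduced here, but the general detect-and-score pattern it describes (hand-crafted image features feeding a supervised classifier that predicts an ICDAS score, 0-6) can be sketched as follows, with synthetic placeholder features standing in for real occlusal-surface images:

    # Hedged sketch of feature-based ICDAS scoring; all data are synthetic
    # placeholders, not the dissertation's features or images.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    rng = np.random.default_rng(1)

    # Placeholder feature vectors (e.g. color statistics, texture descriptors).
    n_samples, n_features = 600, 24
    X = rng.normal(size=(n_samples, n_features))
    y = rng.integers(0, 7, size=n_samples)               # ICDAS scores 0..6

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print(classification_report(y_te, clf.predict(X_te), zero_division=0))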

    ABANICCO: A New Color Space for Multi-Label Pixel Classification and Color Analysis

    Classifying pixels according to color, and segmenting the respective areas, are necessary steps in any computer vision task that involves color images. The gap between human color perception, linguistic color terminology, and digital representation is the main challenge for developing methods that properly classify pixels based on color. To address this challenge, we propose a novel method combining geometric analysis, color theory, fuzzy color theory, and multi-label systems for the automatic classification of pixels into 12 conventional color categories, and the subsequent accurate description of each of the detected colors. This method presents a robust, unsupervised, and unbiased strategy for color naming, based on statistics and color theory. The proposed model, "ABANICCO" (AB ANgular Illustrative Classification of COlor), was evaluated through different experiments: its color detection, classification, and naming performance were assessed against the standardized ISCC-NBS color system, and its usefulness for image segmentation was tested against state-of-the-art methods. This empirical evaluation provided evidence of ABANICCO's accuracy in color analysis, showing how the proposed model offers a standardized, reliable, and understandable alternative for color naming that is recognizable by both humans and machines. Hence, ABANICCO can serve as a foundation for addressing a myriad of challenges in various areas of computer vision, such as region characterization, histopathology analysis, fire detection, product quality prediction, object description, and hyperspectral imaging.

    This research was funded by the Ministerio de Ciencia, Innovación y Universidades, Agencia Estatal de Investigación, under grant PID2019-109820RB, MCIN/AEI/10.13039/501100011033, co-financed by the European Regional Development Fund (ERDF) "A way of making Europe", to A.M.-B. and L.N.-S.
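
    A minimal sketch of the "AB angular" idea behind ABANICCO (classifying pixels by their hue angle in the CIELAB a*b* plane) is given below. The twelve equal 30-degree sectors are an illustrative assumption; the paper derives its category boundaries from color theory rather than even spacing:

    # Hedged sketch: angular color classification in the CIELAB a*b* plane.
    import numpy as np
    from skimage.color import rgb2lab

    def ab_angle_classes(rgb_image, n_classes=12):
        """Assign each pixel an angular color class from its a*b* hue angle."""
        lab = rgb2lab(rgb_image)                          # rgb in [0, 1]
        a, b = lab[..., 1], lab[..., 2]
        hue = np.degrees(np.arctan2(b, a)) % 360.0        # angle in the a*b* plane
        return (hue // (360.0 / n_classes)).astype(int)   # equal sectors (assumed)

    img = np.random.default_rng(2).random((64, 64, 3))    # stand-in image
    classes = ab_angle_classes(img)
    print(np.bincount(classes.ravel(), minlength=12))     # pixels per color class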

    Quantifying Pharmaceutical Film Coating with Optical Coherence Tomography and Terahertz Pulsed Imaging: An Evaluation.

    Spectral domain optical coherence tomography (OCT) has recently attracted a lot of interest in the pharmaceutical industry as a fast and non-destructive modality for quantifying thin film coatings that cannot easily be resolved with other techniques. Because of the relative infancy of this technique, much of the research to date has focused on developing the in-line measurement technique for assessing film coating thickness. To better assess OCT for pharmaceutical coating quantification, this paper evaluates tablets with a range of film coating thicknesses measured using OCT and terahertz pulsed imaging (TPI) in an off-line setting. In order to facilitate automated quantification of film coating thickness in the range of 30-200 μm, an algorithm that uses wavelet denoising and a tailored peak finding method is proposed to analyse each of the acquired A-scans. Results obtained from running the algorithm reveal an increasing disparity between the TPI- and OCT-measured intra-tablet variability when film coating thickness exceeds 100 μm. The findings further confirm that OCT is a suitable modality for characterising pharmaceutical dosage forms with thin film coatings, whereas TPI is well suited for thick coatings.

    The authors would like to acknowledge the financial support from UK EPSRC Research Grants EP/L019787/1 and EP/L019922/1. This is the final version of the article. It first appeared from Wiley via http://dx.doi.org/10.1002/jps.2453
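
    The per-A-scan pipeline described above (wavelet denoising followed by peak finding, with coating thickness taken from the spacing of the interface peaks) can be sketched roughly as follows. The synthetic A-scan, wavelet choice, and calibration constants are assumptions, not values from the paper:

    # Hedged sketch: denoise one A-scan, locate the two interface peaks,
    # and convert their spacing to a coating thickness.
    import numpy as np
    import pywt
    from scipy.signal import find_peaks

    rng = np.random.default_rng(3)

    # Synthetic A-scan: two interface reflections plus noise.
    z = np.arange(1024)                                   # depth samples
    ascan = (np.exp(-0.5 * ((z - 300) / 4.0) ** 2)
             + 0.6 * np.exp(-0.5 * ((z - 420) / 4.0) ** 2)
             + 0.08 * rng.normal(size=z.size))

    # Wavelet denoising: soft-threshold detail coefficients (universal threshold).
    coeffs = pywt.wavedec(ascan, "sym8", level=5)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(ascan.size))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    denoised = pywt.waverec(coeffs, "sym8")[: ascan.size]

    # Peak finding: the two strongest peaks mark the coating interfaces.
    peaks, props = find_peaks(denoised, height=0.2, distance=20)
    top2 = np.sort(peaks[np.argsort(props["peak_heights"])[-2:]])
    optical_sep = top2[1] - top2[0]                       # samples
    um_per_sample, n_coating = 1.8, 1.5                   # assumed calibration
    print("thickness ~", optical_sep * um_per_sample / n_coating, "um")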

    Impulsivity and self-control during intertemporal decision making linked to the neural dynamics of reward value representation

    A characteristic marker of impulsive decision making is the discounting of delayed rewards, demonstrated via choice preferences and choice-related brain activity. However, delay discounting may also arise from how subjective reward value is dynamically represented in the brain when anticipating an upcoming chosen reward. In the current study, brain activity was continuously monitored as human participants freely selected an immediate or delayed primary liquid reward and then waited for the specified delay before consuming it. The ventromedial prefrontal cortex (vmPFC) exhibited a characteristic pattern of activity dynamics during the delay period, as well as modulation during choice, that is consistent with time-discounted coding of subjective value. The ventral striatum (VS) exhibited a similar activity pattern, but preferentially in impulsive individuals. A contrasting profile of delay-related and choice activation was observed in the anterior PFC (aPFC), but selectively in patient individuals. Functional connectivity analyses indicated that both vmPFC and aPFC exerted modulatory, but opposite, influences on VS activation. These results link behavioral impulsivity and self-control to dynamically evolving neural representations of future reward value, not just during choice, but also during post-choice delay periods.
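
    For readers unfamiliar with delay discounting, a common formalisation (whether this exact model was fitted in the study is an assumption) is the hyperbolic form V = A / (1 + kD), where a reward of amount A delayed by D is discounted more steeply the larger the individual's k:

    # Hyperbolic delay discounting (a common model, assumed here for
    # illustration): V = A / (1 + k * D).
    def discounted_value(amount, delay, k):
        return amount / (1.0 + k * delay)

    # A steep discounter (k = 0.5) values a 10-unit reward delayed 8 s at:
    print(discounted_value(10.0, 8.0, k=0.5))   # -> 2.0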

    Peak selection in metabolic profiles using functional data analysis

    In this thesis we describe sparse principal component analysis (PCA) methods and apply them to the analysis of short multivariate time series in order to perform both dimensionality reduction and variable selection. We take a functional data analysis (FDA) modelling approach in which each time series is treated as a continuous smooth function of time, or curve. These techniques have been applied to analyse time series data arising in the area of metabonomics, which studies chemical processes involving small molecule metabolites in a cell. We use experimental data obtained from the COnsortium for MEtabonomic Toxicology (COMET) project, formed by six pharmaceutical companies and Imperial College London, UK. In the COMET project, repeated measurements of several metabolites over time were collected from rats subjected to different drug treatments. The aim of our study is to detect important metabolites by analysing the multivariate time series. Multivariate functional PCA is an exploratory technique to describe the observed time series. In its standard form, PCA involves linear combinations of all variables (i.e. metabolite peaks) and does not perform variable selection. In order to select a subset of important metabolites we introduce sparsity into the model. We develop a novel functional Sparse Grouped Principal Component Analysis (SGPCA) algorithm using ideas related to the Least Absolute Shrinkage and Selection Operator (LASSO), a regularized regression technique, with grouped variables. This SGPCA algorithm detects a sparse linear combination of metabolites which explains a large proportion of the variance. Apart from SGPCA, we also propose two alternative approaches for metabolite selection. The first is based on thresholding the multivariate functional PCA solution, while the second computes the variance of each metabolite curve independently and then ranks the curves in decreasing order of importance. To the best of our knowledge, this is the first application of sparse functional PCA methods to the problem of modelling multivariate metabonomic time series data and selecting a subset of metabolite peaks. We present comprehensive experimental results using simulated data and COMET project data for different multivariate and functional PCA variants from the literature and for SGPCA. Simulation results show that the SGPCA algorithm recovers a high proportion of truly important metabolite variables. Furthermore, when SGPCA is applied to the COMET dataset we identify a small number of important metabolites independently for two different treatment conditions. A comparison of the selected metabolites in both treatment conditions reveals an overlap of over 75 percent.
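
    As a rough illustration of the overall pattern (explicitly not the thesis's SGPCA), the sketch below basis-expands synthetic metabolite curves and applies scikit-learn's SparsePCA as a stand-in for the grouped-LASSO formulation, so that components load on only a few metabolites:

    # Hedged sketch: functional smoothing + sparse PCA for variable selection.
    import numpy as np
    from numpy.polynomial import legendre
    from sklearn.decomposition import SparsePCA

    rng = np.random.default_rng(4)
    n_subjects, n_metabolites, n_times, n_basis = 40, 10, 15, 4
    t = np.linspace(-1, 1, n_times)

    # Synthetic curves: only metabolites 0 and 1 carry real structure.
    signal = np.sin(np.pi * t)
    X = 0.1 * rng.normal(size=(n_subjects, n_metabolites, n_times))
    X[:, 0] += signal * rng.normal(1.0, 0.3, (n_subjects, 1))
    X[:, 1] += signal * rng.normal(1.0, 0.3, (n_subjects, 1))

    # FDA step: project each curve onto a small Legendre basis.
    B = legendre.legvander(t, n_basis - 1)                # (n_times, n_basis)
    coef = X @ np.linalg.pinv(B).T                        # (subjects, metab, basis)
    coef = coef.reshape(n_subjects, -1)                   # flatten metab x basis

    spca = SparsePCA(n_components=2, alpha=1.0, random_state=0).fit(coef)
    loads = np.abs(spca.components_).reshape(2, n_metabolites, n_basis).sum(-1)
    print("per-metabolite loading of PC1:", np.round(loads[0], 2))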

    Soft computing applied to optimization, computer vision and medicine

    Artificial intelligence has permeated almost every area of life in modern society, and its significance continues to grow. As a result, in recent years, Soft Computing has emerged as a powerful set of methodologies that propose innovative and robust solutions to a variety of complex problems. Because of their broad range of application, Soft Computing methods have the potential to significantly improve human living conditions. The motivation for the present research emerged from this background. This research aims to accomplish two main objectives: on the one hand, it endeavors to bridge the gap between Soft Computing techniques and their application to intricate problems; on the other hand, it explores the potential benefits of Soft Computing methodologies as novel effective tools for such problems. This thesis synthesizes the results of extensive research on Soft Computing methods and their applications to optimization, Computer Vision, and medicine. This work is composed of several individual projects, which employ classical and new optimization algorithms. The manuscript presented here intends to provide an overview of the different aspects of Soft Computing methods in order to enable the reader to reach a global understanding of the field. Therefore, this document is assembled as a monograph that summarizes the outcomes of these projects across 12 chapters, structured so that they can be read independently. The key focus of this work is the application and design of Soft Computing approaches for solving problems in the following areas: Block Matching, Pattern Detection, Thresholding, Corner Detection, Template Matching, Circle Detection, Color Segmentation, Leukocyte Detection, and Breast Thermogram Analysis. One of the outcomes presented in this thesis is the development of two evolutionary approaches for global optimization. These were tested over complex benchmark datasets and showed promising results, thus opening the debate for future applications. Moreover, the applications to Computer Vision and medicine presented in this work have highlighted the utility of different Soft Computing methodologies for solving problems in those fields. A milestone in this area is the translation of Computer Vision and medical issues into optimization problems. Additionally, this work also strives to provide tools for combating public health issues by extending these concepts to automated detection and diagnosis aids for pathologies such as Leukemia and breast cancer. The application of Soft Computing techniques in this field has attracted great interest worldwide due to the growing burden of these diseases. Lastly, the use of Fuzzy Logic, Artificial Neural Networks, and Expert Systems in many everyday domestic appliances, such as washing machines, cookers, and refrigerators, is now a reality, and many other industrial and commercial applications of Soft Computing have been integrated into everyday use, a trend expected to increase within the next decade. The research conducted here therefore contributes an important piece to these developments, and the applications presented in this work are intended to serve as technological tools for the development of new devices.
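
    As a generic illustration of the kind of benchmark testing mentioned above (using an off-the-shelf evolutionary optimizer, not one of the thesis's two new algorithms), the snippet below minimises the multimodal Rastrigin function:

    # Hedged sketch: evolutionary global optimization on a standard benchmark.
    import numpy as np
    from scipy.optimize import differential_evolution

    def rastrigin(x):
        x = np.asarray(x)
        return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

    bounds = [(-5.12, 5.12)] * 5                           # 5-D search space
    result = differential_evolution(rastrigin, bounds, seed=0, tol=1e-8)
    print(result.x, result.fun)                            # global optimum is 0 at the origin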

    Parameter optimization for local polynomial approximation based intersection confidence interval filter using genetic algorithm: an application for brain MRI image de-noising

    Magnetic resonance imaging (MRI) is extensively exploited for more accurate detection of pathological changes as well as diagnosis. However, MRI suffers from various shortcomings, such as ambient noise from the environment, acquisition noise from the equipment, the presence of background tissue, breathing motion, body fat, etc. Consequently, noise reduction is critical, as the diverse types of generated noise limit the efficiency of medical image diagnosis. The local polynomial approximation based intersection confidence interval (LPA-ICI) filter is one of the effective de-noising filters. This filter requires an adjustment of the ICI parameters for efficient window size selection, and finding the best set of tuned values from the wide range of ICI parametric values is itself an optimization problem. The present study proposes a novel technique for parameter optimization of the LPA-ICI filter using a genetic algorithm (GA) for brain MR image de-noising. The experimental results prove that the proposed method outperforms the LPA-ICI method for de-noising in terms of various performance metrics for different noise variance levels. The obtained results report that the ICI parameter values depend on the noise variance and on the image under test.
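
    The tuning loop described above can be sketched as follows: a small real-coded genetic algorithm searches a denoising parameter for the best PSNR, with a Gaussian filter's sigma standing in for the LPA-ICI filter's ICI parameter, and a synthetic phantom standing in for brain MRI data. The GA settings are illustrative assumptions:

    # Hedged sketch: GA-based tuning of a denoising parameter against PSNR.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(5)
    clean = gaussian_filter(rng.random((64, 64)), 3.0)     # smooth phantom
    noisy = clean + 0.05 * rng.normal(size=clean.shape)

    def psnr(ref, img):
        mse = np.mean((ref - img) ** 2)
        return 10.0 * np.log10(ref.max() ** 2 / mse)

    def fitness(sigma):
        return psnr(clean, gaussian_filter(noisy, sigma))

    # Tiny real-coded GA: tournament selection, blend crossover, mutation.
    pop = rng.uniform(0.1, 5.0, size=20)
    for _ in range(30):
        fit = np.array([fitness(s) for s in pop])
        idx = rng.integers(0, len(pop), size=(len(pop), 2))
        parents = np.where(fit[idx[:, 0]] > fit[idx[:, 1]],
                           pop[idx[:, 0]], pop[idx[:, 1]])
        mates = rng.permutation(parents)
        alpha = rng.random(len(pop))
        children = alpha * parents + (1 - alpha) * mates   # blend crossover
        children += rng.normal(0.0, 0.1, len(pop))         # Gaussian mutation
        pop = np.clip(children, 0.05, 6.0)
    best = max(pop, key=fitness)
    print("best sigma ~", round(best, 2), "PSNR:", round(fitness(best), 2))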