
    Multiresolutional models of uncertainty generation and reduction

    Kolmogorov's axiomatic principles of probability theory are reconsidered with respect to their applicability to the processes of knowledge acquisition and interpretation. The model of uncertainty generation is modified to reflect the reality of engineering problems, particularly in the area of intelligent control. This model implies learning algorithms organized into three groups that reflect the degree of conceptualization of the knowledge the system is dealing with. It is essential that these algorithms are motivated by, and consistent with, the multiresolutional model of knowledge representation, which is reflected in the structure of the models and the learning algorithms.

    Wavelet based segmentation of hyperspectral colon tissue imagery

    Segmentation is an early stage in the automated classification of tissue cells as normal or malignant. We present an algorithm for unsupervised segmentation of hyperspectral images of human colon tissue cells into their constituent parts by exploiting the spatial relationship between these parts. This is done by applying a modification of conventional wavelet-based texture analysis to the projection of the hyperspectral image data onto the first principal component direction. Results show that our algorithm is comparable to more computationally intensive methods that exploit the spectral characteristics of the hyperspectral imagery data.
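
    A minimal sketch of this kind of pipeline, not the authors' exact implementation: project the hyperspectral cube onto its first principal component, derive wavelet texture features, and cluster pixels into constituent parts. The wavelet ('db2'), the decomposition level, and the use of k-means as the clustering step are illustrative assumptions.

    # Sketch only: PCA projection + wavelet texture features + clustering.
    import numpy as np
    import pywt
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    def segment_hyperspectral(cube, n_parts=4, wavelet="db2", level=2):
        """cube: (rows, cols, bands) hyperspectral image."""
        rows, cols, bands = cube.shape
        # Project each pixel's spectrum onto the first principal component.
        pc1 = PCA(n_components=1).fit_transform(cube.reshape(-1, bands))
        pc1 = pc1.reshape(rows, cols)

        # Multilevel 2-D wavelet decomposition of the projected image.
        coeffs = pywt.wavedec2(pc1, wavelet, level=level)
        features = [pc1]
        for detail in coeffs[1:]:                 # (cH, cV, cD) per level
            for band in detail:
                # Use the upsampled magnitude of each detail band as a texture feature.
                energy = np.abs(band)
                rep_r = int(np.ceil(rows / energy.shape[0]))
                rep_c = int(np.ceil(cols / energy.shape[1]))
                up = np.kron(energy, np.ones((rep_r, rep_c)))
                features.append(up[:rows, :cols])

        X = np.stack([f.ravel() for f in features], axis=1)
        labels = KMeans(n_clusters=n_parts, n_init=10).fit_predict(X)
        return labels.reshape(rows, cols)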

    Aspects of multi-resolutional foveal images for robot vision


    Hyperspectral colon tissue cell classification

    A novel algorithm to discriminate between normal and malignant human colon tissue cells is presented. Microscopic-level images of human colon tissue cells were acquired using hyperspectral imaging technology at contiguous wavelength intervals of visible light. While hyperspectral imagery data provides a wealth of information, its large size normally entails high computational processing complexity. Several methods exist to avoid the so-called curse of dimensionality and hence reduce the computational complexity. In this study, we experimented with Principal Component Analysis (PCA) and two modifications of Independent Component Analysis (ICA). In the first stage of the algorithm, the extracted components are used to separate four constituent parts of the colon tissue: nuclei, cytoplasm, lamina propria, and lumen. The segmentation is performed in an unsupervised fashion using the nearest centroid clustering algorithm. The segmented image is then used, in the second stage of the classification algorithm, to exploit the spatial relationship between the labeled constituent parts. Experimental results using supervised Support Vector Machine (SVM) classification based on multiscale morphological features show that normal and malignant tissue cells can be discriminated with a reasonable degree of accuracy.
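
    A hedged sketch of a two-stage pipeline in the spirit of this abstract: PCA to reduce the spectral dimension, a nearest-centroid-style clustering (k-means here) to segment the four tissue parts, and an SVM on simple spatial features for the final normal-versus-malignant decision. The feature choices below are toy assumptions; the paper uses richer multiscale morphological features.

    # Sketch only: PCA + nearest-centroid-style segmentation + SVM classification.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC

    def segment_tissue(cube, n_components=3, n_parts=4):
        rows, cols, bands = cube.shape
        scores = PCA(n_components=n_components).fit_transform(cube.reshape(-1, bands))
        labels = KMeans(n_clusters=n_parts, n_init=10).fit_predict(scores)
        return labels.reshape(rows, cols)

    def spatial_features(label_map, n_parts=4):
        # Toy spatial descriptors: area fraction of each constituent part.
        total = label_map.size
        return np.array([(label_map == k).sum() / total for k in range(n_parts)])

    def train_classifier(cubes, y):
        # cubes: list of hyperspectral images; y: normal/malignant labels.
        X = np.stack([spatial_features(segment_tissue(c)) for c in cubes])
        return SVC(kernel="rbf").fit(X, y)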

    One-point Statistics of the Cosmic Density Field in Real and Redshift Spaces with A Multiresolutional Decomposition

    In this paper, we develop a method for performing the one-point statistics of a perturbed density field with a multiresolutional decomposition based on the discrete wavelet transform (DWT). We establish the algorithm for the one-point variable and its moments, taking into account the effects of Poisson sampling and the selection function. We also establish the mapping between the DWT one-point statistics in redshift space and real space, i.e. the algorithm for recovering the DWT one-point statistics from the redshift distortion of bulk velocity, velocity dispersion, and selection function. Numerical tests on N-body simulation samples show that this algorithm works well on scales from a few hundred Mpc/h down to a few Mpc/h for four popular cold dark matter models. Taking advantage of the fact that the DWT one-point variable depends on both the scale and the shape (configuration) of the decomposition modes, one can design estimators of the redshift distortion parameter (beta) from combinations of DWT modes. When the non-linear redshift distortion is not negligible, the beta estimator from the quadrupole-to-monopole ratio is a function of scale. This estimator would not work without additional information about the scale dependence, such as the power-spectrum index or the real-space correlation function of the random field. The DWT beta estimators, however, do not need such extra information. Numerical tests show that the proposed DWT estimators are able to determine beta robustly with less than 15% uncertainty in the redshift range 0 < z < 3.
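
    For illustration only, a sketch of the basic building block, computing one-point moments of DWT mode coefficients of a 1-D density field, scale by scale. The paper's actual estimators (Poisson-sampling corrections, the redshift-space mapping, and the beta estimators) are more involved; the wavelet choice 'db4' and the moment set are assumptions.

    # Sketch only: scale-by-scale one-point moments of DWT detail coefficients.
    import numpy as np
    import pywt

    def dwt_one_point_moments(delta, wavelet="db4", max_level=5):
        """delta: 1-D sampled density contrast field."""
        coeffs = pywt.wavedec(delta, wavelet, level=max_level)
        moments = {}
        # coeffs[1:] are detail coefficients, ordered coarsest to finest.
        for level, w in enumerate(coeffs[1:], start=1):
            w = w - w.mean()
            var = np.mean(w**2)
            skew = np.mean(w**3) / var**1.5
            kurt = np.mean(w**4) / var**2 - 3.0
            moments[level] = {"variance": var, "skewness": skew, "kurtosis": kurt}
        return moments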

    Solutions to non-stationary problems in wavelet space.


    Multiresolutional Fault-Tolerant Sensor Integration and Object Recognition in Images.

    This dissertation applies multiresolution methods to two important problems in signal analysis. The problem of fault-tolerant sensor integration in distributed sensor networks is addressed, and an efficient multiresolutional algorithm for estimating the sensors' effective output is proposed. The problem of object/shape recognition in images is addressed in a multiresolutional setting using pyramidal decomposition of images with respect to an orthonormal wavelet basis. A new approach to efficient template matching to detect objects using computational geometric methods is put forward. An efficient paradigm for object recognition is described.
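
    A simplified sketch of fault-tolerant sensor fusion in the spirit of classic interval-overlap methods, not the dissertation's multiresolutional algorithm: each sensor reports an interval, and the fused estimate is the region covered by at least n - f sensors, tolerating up to f faulty readings.

    # Sketch only: interval-overlap fusion tolerating up to f faulty sensors.
    def fuse_intervals(intervals, f):
        """intervals: list of (lo, hi) sensor readings; f: max number of faulty sensors."""
        events = []
        for lo, hi in intervals:
            events.append((lo, +1))   # interval opens
            events.append((hi, -1))   # interval closes
        events.sort()

        need = len(intervals) - f     # readings that must agree
        depth, best_lo, best_hi = 0, None, None
        for x, delta in events:
            if delta == +1:
                depth += 1
                if depth >= need and best_lo is None:
                    best_lo = x
            else:
                if depth >= need:
                    best_hi = x       # last point still covered by enough sensors
                depth -= 1
        return (best_lo, best_hi)

    # Example: the third sensor is faulty; the fused range still brackets the true value.
    print(fuse_intervals([(9.8, 10.2), (9.9, 10.3), (15.0, 15.4)], f=1))  # (9.9, 10.2)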

    On-line quality control in polymer processing using hyperspectral imaging

    The use of plastic composite materials has been increasing in recent years in order to reduce the amount of material used and/or use more economical raw materials, without compromising the properties. The impressive adaptability of these composite materials comes from the fact that the manufacturer can choose the raw materials, the proportion in which they are blended, as well as the processing conditions. However, these materials tend to suffer from heterogeneous compositions and structures, which lead to mechanical weaknesses. Product quality is generally measured in the laboratory, using destructive tests that often require extensive sample preparation. On-line quality control would allow near-immediate feedback on the operating conditions and would be directly applicable to quality control in an industrial production context. The proposed research consists of developing an on-line quality control tool adaptable to plastic materials of all types. A number of near-infrared and ultrasound probes currently exist for on-line composition estimation, but they only provide single-point values at each acquisition. These methods are therefore poorly suited for identifying the spatial distribution of a sample's surface characteristics (e.g. homogeneity, orientation, dispersion). In order to achieve this objective, a hyperspectral imaging system is proposed. Using this tool, it is possible to scan the surface of a sample and obtain a hyperspectral image, that is to say an image in which each pixel captures the light intensity at hundreds of wavelengths. Chemometrics methods can then be applied to this image in order to extract the relevant spatial and spectral features. Finally, multivariate regression methods are used to build a model between these features and the properties of the sample. This mathematical model forms the backbone of an on-line quality assessment tool used to predict and optimize the operating conditions under which the samples are processed.
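
    A hedged illustration of the chemometrics and regression step described above: fit a multivariate (PLS) regression between features extracted from hyperspectral images and a measured quality property. Using the mean spectrum per sample and five latent components is a simplification for the sketch; the thesis extracts richer spatial and spectral features.

    # Sketch only: mean-spectrum features + PLS regression to a quality property.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def mean_spectrum(cube):
        """cube: (rows, cols, bands) hyperspectral image of one sample."""
        return cube.reshape(-1, cube.shape[-1]).mean(axis=0)

    def fit_quality_model(cubes, quality, n_components=5):
        """cubes: list of hyperspectral images; quality: measured property per sample."""
        X = np.stack([mean_spectrum(c) for c in cubes])
        y = np.asarray(quality)
        return PLSRegression(n_components=n_components).fit(X, y)

    # On-line use: predict the quality of a freshly scanned sample.
    # model = fit_quality_model(train_cubes, train_quality)
    # predicted = model.predict(mean_spectrum(new_cube)[None, :])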

    Advances in Robotics, Automation and Control

    The book presents an excellent overview of recent developments in the different areas of Robotics, Automation and Control. Through its 24 chapters, the book presents topics related to control and robot design; it also introduces new mathematical tools and techniques devoted to improving system modeling and control. An important point is the use of rational agents and heuristic techniques to cope with the computational complexity required for controlling complex systems. The book also covers navigation and vision algorithms, automatic handwriting comprehension, and speech recognition systems that will be included in the next generation of production systems.