
    3D Object Recognition Using Fast Overlapped Block Processing Technique

    Three-dimensional (3D) image processing and medical image processing, both regarded as forms of big data analysis, have attracted significant attention in recent years, and efficient 3D object recognition techniques would benefit both. To date, however, most proposed 3D object recognition methods face a major challenge of high computational complexity: complexity and execution time grow as the dimensionality of the object grows, which is inherent to the 3D case. Finding a method that achieves high recognition accuracy with low computational complexity is therefore essential, and this paper presents one. Specifically, the proposed method uses a fast overlapped block-processing technique that handles higher-order polynomials and high-dimensional objects and reduces the computational complexity of feature extraction. The paper also exploits Charlier polynomials and their moments along with a support vector machine (SVM) classifier. The presented method is evaluated on a well-known benchmark, the McGill dataset, and compared against existing 3D object recognition methods. The results show that the proposed approach achieves high recognition rates under different noisy environments, mitigates noise distortion, and outperforms existing methods in computation time under both noise-free and noisy conditions.
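
    The pipeline the abstract describes (moment-based features from image blocks, classified by an SVM) can be sketched as follows. This is a minimal illustration, not the paper's fast overlapped algorithm: it uses non-overlapping blocks, an assumed Charlier parameter a = 10, and synthetic two-class data.

```python
import numpy as np
from sklearn.svm import SVC

def charlier_matrix(order, size, a=10.0):
    """Values of the Charlier polynomials C_n(x; a) for n = 0..order-1
    and x = 0..size-1, via the three-term recurrence
    a*C_{n+1}(x) = (n + a - x)*C_n(x) - n*C_{n-1}(x)."""
    x = np.arange(size, dtype=float)
    P = np.zeros((order, size))
    P[0] = 1.0
    if order > 1:
        P[1] = (a - x) / a
    for n in range(1, order - 1):
        P[n + 1] = ((n + a - x) * P[n] - n * P[n - 1]) / a
    return P

def block_moments(image, block=8, order=4, a=10.0):
    """Charlier moments of each block, computed separably as
    M = P @ B @ P.T and concatenated into one feature vector.
    Blocks are non-overlapping here, unlike the paper's fast
    overlapped scheme."""
    P = charlier_matrix(order, block, a)
    h, w = image.shape
    feats = []
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            feats.append((P @ image[i:i + block, j:j + block] @ P.T).ravel())
    return np.concatenate(feats)

# Toy usage: two synthetic classes, moment features, an RBF-kernel SVM.
rng = np.random.default_rng(0)
X = np.stack([block_moments(rng.random((32, 32))) for _ in range(40)])
y = np.repeat([0, 1], 20)
print(SVC(kernel="rbf").fit(X, y).score(X, y))
```

    The separable form M = P @ B @ P.T keeps each block's moment computation to two small matrix products; the paper's overlapped variant would additionally share work between neighbouring blocks, which this sketch does not attempt.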

    On The Potential of Image Moments for Medical Diagnosis

    Medical imaging is widely used for diagnosis and for postoperative or post-therapy monitoring. The ever-increasing number of images produced has encouraged the introduction of automated methods to assist doctors and pathologists. In recent years, especially since the advent of convolutional neural networks, many researchers have focused on that approach, often treating it as the method of choice for diagnosis because it classifies images directly. However, many diagnostic systems still rely on handcrafted features to improve interpretability and limit resource consumption. In this work, we focused our efforts on orthogonal moments, first providing an overview and taxonomy of their macrocategories and then analysing their classification performance on very different medical tasks represented by four public benchmark data sets. The results confirmed that convolutional neural networks achieved excellent performance on all tasks. Despite comprising far fewer features than those extracted by the networks, orthogonal moments proved competitive with them, showing comparable and, in some cases, better performance. In addition, the Cartesian and harmonic categories exhibited very low standard deviation, demonstrating their robustness in medical diagnostic tasks. Given the performance obtained and the low variation of the results, we strongly believe that integrating the studied orthogonal moments can lead to more robust and reliable diagnostic systems. Finally, since they have been shown to be effective on both magnetic resonance and computed tomography images, they can easily be extended to other imaging techniques.
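
    As a concrete example of the handcrafted-feature route, here is a minimal sketch using one Cartesian family from the taxonomy, Legendre moments, as a compact image descriptor for a simple classifier. The moment order, the nearest-neighbour classifier, and the synthetic data are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np
from numpy.polynomial.legendre import legvander
from sklearn.neighbors import KNeighborsClassifier

def legendre_moments(image, order=6):
    """Legendre moments up to `order`, computed separably: pixel
    coordinates are mapped onto [-1, 1], lam = Px.T @ f @ Py, and each
    moment is scaled by (2n+1)(2m+1)/4 times the pixel area 4/(h*w)."""
    h, w = image.shape
    Px = legvander(np.linspace(-1, 1, h), order)   # (h, order+1)
    Py = legvander(np.linspace(-1, 1, w), order)   # (w, order+1)
    lam = Px.T @ image @ Py                        # (order+1, order+1)
    n = np.arange(order + 1)
    norm = np.outer(2 * n + 1, 2 * n + 1) / 4.0
    return (norm * lam * 4.0 / (h * w)).ravel()

# Toy usage: nearest-neighbour classification on the moment features.
rng = np.random.default_rng(1)
X = np.stack([legendre_moments(rng.random((64, 64))) for _ in range(20)])
y = np.repeat([0, 1], 10)
print(KNeighborsClassifier(n_neighbors=3).fit(X, y).score(X, y))
```

    Note how few values the descriptor needs: an order-6 expansion yields 49 features per image, orders of magnitude fewer than a typical convolutional feature map, which is the trade-off the abstract highlights.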

    Real Algebraic Geometry With a View Toward Moment Problems and Optimization

    Continuing the tradition initiated at the MFO workshop held in 2014, the aim of this workshop was to foster interaction between real algebraic geometry, operator theory, optimization, and algorithms for systems control. Particular emphasis was given to moment problems, through a productive dialogue between researchers working on these problems in finite- and infinite-dimensional settings, from which new challenges and interdisciplinary applications emerged.

    A comparison of polynomial chaos and Gaussian process emulation for uncertainty quantification in computer experiments

    Computer simulation of real-world phenomena is now ubiquitous in science, because experimentation in the field can be expensive, time-consuming, or impossible in practice. Examples include climate science, where future climate is examined under global warming scenarios, and cosmology, where the evolution of galaxies is studied from the beginning of the universe to the present day. Combining complex mathematical models with the numerical procedures needed to solve them in a computer program, these simulators are computationally expensive, in that a single run can take months to complete. The practice of using a simulator to understand reality raises some interesting scientific questions, and there are many sources of uncertainty to consider, such as the discrepancy between the simulator and the real-world process. The field of uncertainty quantification is concerned with the characterisation and reduction of all uncertainties present in computational and real-world problems. A key bottleneck in any uncertainty quantification analysis is the cost of evaluating the simulator. The solution is to replace the expensive simulator with a surrogate model, which is computationally faster to run and can be used in subsequent analyses. Polynomial chaos and Gaussian process emulation are surrogate models developed over the last 25 years, independently, in the engineering and statistics communities respectively. Despite tackling similar problems, the two communities have seen little interaction and collaboration. This thesis provides a critical comparison of the two methods across a range of criteria and examples, from simple test functions to simulators used in industry. Particular focus is on the approximation accuracy of the surrogates under changes in the size and type of the experimental design. It is concluded that neither method unanimously outperforms the other; advantages can be gained in particular cases, so the preferred method depends on the modelling goals of the practitioner. This is the first direct comparison of polynomial chaos and Gaussian process emulation in the literature. The thesis also proposes a novel methodology called probabilistic polynomial chaos, a hybrid of polynomial chaos and Gaussian process emulation. The approach draws inspiration from an emerging field in scientific computation known as probabilistic numerics, which treats classical numerical methods as statistical inference problems. In particular, a probabilistic integration technique called Bayesian quadrature, which employs Gaussian process emulators, is applied to a traditional form of polynomial chaos. The result is a probabilistic version of polynomial chaos, providing uncertainty information where the simulator has not yet been run.
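
    A toy version of the comparison is easy to state in code: fit both surrogates to a handful of runs of a cheap stand-in "simulator" and measure approximation accuracy. The sketch below assumes uniform inputs on [-1, 1] (hence a Legendre polynomial chaos basis fitted by least squares, the non-intrusive regression approach) and an RBF-kernel Gaussian process; the test function, design size, and kernel are illustrative choices, not those used in the thesis.

```python
import numpy as np
from numpy.polynomial.legendre import legvander
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def simulator(x):
    """Cheap stand-in for an expensive simulator, for illustration only."""
    return np.sin(3 * x) + 0.5 * x ** 2

rng = np.random.default_rng(2)
x_train = rng.uniform(-1, 1, 12)          # a small experimental design
y_train = simulator(x_train)
x_test = np.linspace(-1, 1, 200)

# Polynomial chaos surrogate: Legendre basis (orthogonal w.r.t. the
# uniform input distribution), coefficients fitted by least squares.
deg = 6
coef, *_ = np.linalg.lstsq(legvander(x_train, deg), y_train, rcond=None)
pce_pred = legvander(x_test, deg) @ coef

# Gaussian process emulator with an RBF kernel; the predictive standard
# deviation quantifies uncertainty away from the design points.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(x_train[:, None], y_train)
gp_pred, gp_sd = gp.predict(x_test[:, None], return_std=True)

for name, pred in [("PCE", pce_pred), ("GP", gp_pred)]:
    rmse = np.sqrt(np.mean((pred - simulator(x_test)) ** 2))
    print(f"{name} RMSE: {rmse:.4f}")
```

    The Gaussian process also returns a predictive standard deviation, which is the kind of uncertainty information that the proposed probabilistic polynomial chaos attaches to a traditional chaos expansion via Bayesian quadrature.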

    Development of Quantum-Crystallographic Methods for Chemical and Biochemical Applications

    The field of crystallography is a key branch of the natural sciences, important not only for physics, geology, biology, and chemistry but also for providing crucial information to the life sciences and materials science; it laid the foundations of our textbook knowledge of matter in general. In this thesis, quantum crystallography, a synergistic combination of crystallography and quantum mechanics, is used as a tool to predict and understand the behavior of molecules and their interactions. New methods are proposed and used that provide deeper insight into the influence of local molecular environments on molecules and allow advanced prediction of the biochemical effect of drugs. Ultimately, this means that interactions between molecules in crystal structures long thought to be fully characterized can now be understood more completely. As part of this work, new software was developed to handle theoretical simulations as well as experimental data, and also both of them together at the same time. The introduction of non-spherical refinements into standard crystallographic software opens the field of quantum crystallography to a wide audience and will hopefully strengthen the common ground between experimentalists and theoreticians. Specifically, we created a new native interface between Olex2 and non-spherical refinement techniques, which we called NoSpherA2. The interface is designed so that it can be used with any kind of non-spherical atom description, allowing refinement of modern diffraction data with modern quantum crystallographic models and leaving behind the century-old Independent Atom Model (IAM). New software was also developed to provide novel models and descriptors for understanding environmental effects on the electron density and electrostatic potential of a molecule. This so-called Quantum Crystallographic Toolbox (QCrT) provides a framework for the fast and easy implementation of various methods and descriptors. File conversion tools allow interfacing with many existing software packages and may provide useful information for future method development, experimental setups, and data evaluation, as well as chemical insight into intra- and intermolecular interactions. It is fully parallelized and portable to graphics processing units (GPUs), which offer extraordinary computational power with moderate resource requirements. Especially in the context of ultra-bright X-ray sources such as X-ray free-electron lasers, and of electron diffraction, these new models become crucial for a better description of experimental findings. Applying this new framework of quantum crystallographic methods, we analyze a type of bonding at the edge of conventional organic chemistry: the push-pull systems of ethylenes. We show how X-ray constrained bonding analysis leads to the unambiguous determination of the behavior and type of bonding present in a series of compounds that contradict the Lewis picture of a double bond. This new understanding has led to the development of a potential new drug, a silicon analogue of ibuprofen, one of the most important drugs known to humankind. We determined its physical properties and investigated its stability and potency as a novel, more soluble alternative to ibuprofen: while retaining the same pharmaceutical activity (making it a bioisostere of ibuprofen), this material shows better applicability in aqueous media.
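
    To make the distinction concrete: in conventional refinement, calculated structure factors are built from spherical atomic form factors (the IAM), whereas non-spherical refinement of the NoSpherA2 kind derives aspherical form factors from a quantum-mechanically computed electron density. The sketch below shows only the IAM side of that equation, with placeholder (non-tabulated) form-factor values and a toy two-atom cell.

```python
import numpy as np

def iam_structure_factor(hkl, positions, form_factors):
    """F(hkl) = sum_j f_j * exp(2*pi*i * hkl . r_j) for fractional atomic
    positions r_j, with one spherical form-factor value per atom (IAM).
    Non-spherical refinement would replace these scalars with aspherical
    form factors derived from a quantum-mechanical electron density."""
    phases = np.exp(2j * np.pi * positions @ np.asarray(hkl, dtype=float))
    return np.sum(np.asarray(form_factors) * phases)

# Toy two-atom cell; the form-factor values are placeholders, not
# tabulated scattering factors (which would also depend on sin(theta)/lambda).
positions = np.array([[0.0, 0.0, 0.0], [0.25, 0.25, 0.25]])
form_factors = [6.0, 8.0]
print(abs(iam_structure_factor((1, 1, 1), positions, form_factors)))
```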

    Abstracts on Radio Direction Finding (1899 - 1995)

    The files on this record represent the various databases that originally composed the CD-ROM issue of the "Abstracts on Radio Direction Finding" database, which is now part of the Dudley Knox Library's Abstracts and Selected Full Text Documents on Radio Direction Finding (1899 - 1995) Collection. (See Calhoun record https://calhoun.nps.edu/handle/10945/57364 for further information on this collection and the bibliography.) Because technological obsolescence prevented current and future audiences from accessing the bibliography, DKL exported the databases contained on the CD-ROM and converted them into the three files on this record:
    1) RDFA_CompleteBibliography_xls.zip, containing RDFA_CompleteBibliography.xls (metadata for the complete bibliography), RDFA_Glossary.xls (glossary of terms), and RDFA_Biographies.xls (biographies of leading figures), all in Excel 97-2003 Workbook format.
    2) RDFA_CompleteBibliography_csv.zip, containing RDFA_CompleteBibliography.TXT (metadata for the complete bibliography), RDFA_Glossary.TXT (glossary of terms), and RDFA_Biographies.TXT (biographies of leading figures), all in CSV format.
    3) RDFA_CompleteBibliography.pdf, a human-readable display of the bibliographic data, as a means of double-checking any possible deviations introduced by conversion.
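
    For readers who want to work with the converted data programmatically, a minimal sketch for loading the CSV export follows. The archive and member names come from the record description above, but the CSV dialect and column layout are assumptions to verify against the actual files.

```python
import zipfile
import pandas as pd

# Archive and member names follow the record description; the CSV
# dialect and column names are assumptions to verify against the files.
with zipfile.ZipFile("RDFA_CompleteBibliography_csv.zip") as zf:
    with zf.open("RDFA_CompleteBibliography.TXT") as fh:
        bibliography = pd.read_csv(fh)

print(bibliography.shape)      # number of records and metadata fields
print(bibliography.columns)    # inspect the exported field names
```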