24 research outputs found

    Banknote Authentication and Medical Image Diagnosis Using Feature Descriptors and Deep Learning Methods

    Banknote recognition and medical image analysis have long been foci of image processing and pattern recognition research. Because counterfeiters have exploited innovations in print media technology to reproduce fake currency, systems are needed that can assure citizens of the authenticity of banknotes in circulation. Similarly, many physicians must interpret medical images, but human image analysis is susceptible to error owing to wide variation across interpreters, fatigue, and subjectivity. Computer-aided diagnosis is therefore vital to improving medical analysis, as it facilitates the identification of findings that need treatment and assists the expert's workflow. Thus, this thesis is organized around three such problems related to banknote authentication and medical image diagnosis. In our first research problem, we proposed a new banknote recognition approach that classifies the principal components of extracted HOG features. We further experimented with computing HOG descriptors from cells created from image patch vertices of SURF points, and designed a feature reduction approach based on a high-correlation and low-variance filter. In our second research problem, we developed a mobile app for banknote identification and counterfeit detection using the Unity 3D software and evaluated its performance with a cascaded ensemble approach. The algorithm was then extended to a client-server architecture using SIFT and SURF features reduced by Bag of Words, together with high-correlation-based HOG vectors. In our third research problem, experiments were conducted on a pre-trained mobile app for medical image diagnosis using three convolutional layers with an ensemble classifier comprising PCA and bagging of five base learners.
Also, we implemented a Bidirectional Generative Adversarial Network to mitigate the effect of the binary cross-entropy loss, using a Deep Convolutional Generative Adversarial Network as the generator and encoder and a Capsule Network as the discriminator, while experimenting on images with random composition and translation inferences. Lastly, we proposed a variant of single-image super-resolution for medical analysis by redesigning the Super-Resolution Generative Adversarial Network to increase the peak signal-to-noise ratio during image reconstruction, incorporating a loss function based on the mean squared error in pixel space and Super-Resolution Convolutional Neural Network layers.
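As a concrete illustration of the first approach above, classifying the principal components of extracted HOG features, the sketch below uses deliberately simplified stand-ins: a single global gradient-orientation histogram instead of a full cell/block HOG descriptor, random arrays instead of banknote images, and a nearest-centroid rule instead of the thesis's classifiers. All names and parameters here are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

def grad_orientation_histogram(img, n_bins=9):
    """Simplified HOG-style descriptor: one global histogram of gradient
    orientations weighted by gradient magnitude (real HOG adds local
    cells and block normalization)."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)  # unsigned orientations
    hist, _ = np.histogram(ang, bins=n_bins, range=(0.0, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-12)

def pca_project(X, k):
    """Project the rows of X onto their first k principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

# Toy stand-in data: two "denominations" with different texture statistics.
rng = np.random.default_rng(0)
a = rng.random((20, 32, 32))                 # rough texture
b = rng.random((20, 32, 32)).cumsum(axis=2)  # smoother horizontal structure
X = np.array([grad_orientation_histogram(im) for im in np.concatenate([a, b])])
y = np.repeat([0, 1], 20)

# Classify the principal components with a nearest-centroid rule.
Z = pca_project(X, k=3)
centroids = np.array([Z[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((Z[:, None, :] - centroids[None]) ** 2).sum(axis=-1), axis=1)
print("training accuracy:", (pred == y).mean())
```

In the thesis's setting, the descriptor and classifier are richer, but the pipeline shape is the same: descriptor extraction, linear dimensionality reduction, then classification in the reduced space.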

    Learning data structure from classes: A case study applied to population genetics

    In most cases, the main goal of machine learning and data mining applications is to obtain good classifiers. However, final users, for instance researchers in other fields, sometimes prefer to infer new knowledge about their domain that may be useful to confirm or reject their hypotheses. This paper presents a learning method that works along these lines, in addition to reporting three interesting applications in the field of population genetics in which the aim is to discover relationships between species or breeds according to their genotypes. The proposed method has two steps: first, it builds a hierarchical clustering of the set of classes; then, a hierarchical classifier is learned. Both models can be analyzed by experts to extract useful information about their domain. In addition, we propose a new method for learning the hierarchical classifier. By means of a voting scheme employing pairwise binary models constrained by the hierarchical structure, the proposed classifier is computationally more efficient than previous approaches while improving on their performance.
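The first step of the proposed method, building a hierarchical clustering over the set of classes rather than over instances, can be sketched as follows. The synthetic "genotype" data, the centroid representation of each class, and the average linkage are assumptions made for illustration only.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, to_tree

# Toy genotype-like data: 4 classes ("breeds"), 20 samples each, 10 features.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=c, size=(20, 10)) for c in range(4)])
y = np.repeat(np.arange(4), 20)

# Step 1: represent each class by its centroid and cluster the centroids,
# yielding a hierarchy over the classes themselves.
centroids = np.array([X[y == c].mean(axis=0) for c in range(4)])
Z = linkage(centroids, method="average")
tree = to_tree(Z)

def leaves(node):
    """Class labels under a node of the class hierarchy."""
    return [node.id] if node.is_leaf() else leaves(node.left) + leaves(node.right)

# Step 2 (not shown): learn a classifier per split of this hierarchy, e.g.
# via the paper's constrained pairwise voting scheme. The root split alone
# already partitions the classes into two groups:
print(leaves(tree.left), leaves(tree.right))
```

Experts can read the resulting dendrogram directly, which is the paper's point: the hierarchy itself is domain knowledge, not just a classifier scaffold.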

    Identifying image sensors using photo-response non-uniformity

    This thesis presents a method to identify a camera source by examining the noise inherent in the camera's imaging process. The noise is caused by the imaging hardware, e.g. the physical properties of the charge-coupled device (CCD), the lens, and the Bayer pattern filter, and is then altered by the algorithms of the imaging pipeline. After the imaging pipeline, the noise can be isolated from the image by calculating the difference between the noisy and the denoised image. The noise can be used to form a camera fingerprint by calculating, pixel by pixel, the mean noise of a number of training images from the same camera. The fingerprint can be used to identify the camera by calculating the correlation coefficient between the fingerprints of the candidate cameras and the noise of a test image; the image is then assigned to the camera with the highest correlation. The key factors affecting recognition accuracy and stability are the denoising algorithm and the number of training images. It was shown that the best results are achieved with 60 training images and a wavelet filter. This thesis evaluates the identification process in four cases: first, between cameras chosen so that each is from a different model; second, between individual cameras of the same model; third, between all individual cameras without considering the camera model; and finally, forming a fingerprint from one camera of each model and then using it to identify the remaining cameras of that model. It was shown that in the first two cases the identification process is feasible, accurate, and reasonably stable. In the latter two cases, the identification process failed to achieve sufficient accuracy to be feasible.
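The fingerprinting scheme described above can be sketched in a few lines. This is a toy simulation, not the thesis's pipeline: a 3x3 mean filter stands in for the wavelet denoiser, and the cameras are simulated as fixed multiplicative noise patterns (PRNU-like) whose amplitude is exaggerated so the demo separates cleanly.

```python
import numpy as np

def denoise(img):
    """Crude 3x3 mean filter standing in for the thesis's wavelet denoiser."""
    H, W = img.shape
    p = np.pad(img, 1, mode="edge")
    return sum(p[i:i + H, j:j + W] for i in range(3) for j in range(3)) / 9.0

def noise_residual(img):
    return img - denoise(img)

def fingerprint(images):
    """Pixel-wise mean of the noise residuals of the training images."""
    return np.mean([noise_residual(im) for im in images], axis=0)

def corr(a, b):
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

# Simulate two cameras as fixed multiplicative sensor-noise patterns.
rng = np.random.default_rng(2)
H = W = 64
prnu = [1 + 0.1 * rng.standard_normal((H, W)) for _ in range(2)]

def shoot(cam, n):
    """Random scenes modulated by the given camera's noise pattern."""
    return [rng.random((H, W)) * prnu[cam] for _ in range(n)]

fps = [fingerprint(shoot(cam, 60)) for cam in (0, 1)]  # 60 training images
test_img = shoot(0, 1)[0]                              # taken with camera 0
scores = [corr(fp, noise_residual(test_img)) for fp in fps]
print("identified camera:", int(np.argmax(scores)))
```

The scene content averages out across the 60 training residuals while the fixed sensor pattern does not, which is why the mean residual works as a fingerprint.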

    Entropy in Image Analysis II

    Image analysis is a fundamental task for any application where extracting information from images is required. The analysis requires highly sophisticated numerical and analytical methods, particularly for applications in medicine, security, and other fields where the results of the processing consist of data of vital importance. This fact is evident from all the articles composing the Special Issue "Entropy in Image Analysis II", in which the authors used widely tested methods to verify their results. In reading the present volume, the reader will appreciate the richness of the methods and applications, in particular for medical imaging and image security, and a remarkable cross-fertilization among the proposed research areas.

    Probabilistic multiple kernel learning

    The integration of multiple, and possibly heterogeneous, information sources for an overall decision-making process has been an open and unresolved research direction in computing science since its very beginning. This thesis attempts to address parts of that direction by proposing probabilistic data integration algorithms for multiclass decisions, where an observation of interest is assigned to one of many categories based on a plurality of information channels.
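The core object in multiple kernel learning, one kernel per information channel combined into a single Gram matrix, can be sketched as below. The base kernels, the fixed uniform weights, and the SVM back-end are illustrative assumptions; the thesis learns the combination probabilistically rather than fixing it.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import linear_kernel, rbf_kernel

# Toy data with a nonlinear decision rule.
rng = np.random.default_rng(3)
X = rng.standard_normal((60, 5))
y = (X[:, 0] + X[:, 1] ** 2 > 1).astype(int)

# Two information "channels": a linear and an RBF base kernel.
kernels = [linear_kernel(X), rbf_kernel(X, gamma=0.5)]
weights = np.array([0.5, 0.5])  # fixed here; MKL learns this combination
K = sum(w * k for w, k in zip(weights, kernels))

# A convex combination of valid kernels is itself a valid kernel, so it
# can be fed to any kernel machine as a precomputed Gram matrix.
clf = SVC(kernel="precomputed").fit(K, y)
print("training accuracy:", clf.score(K, y))
```

Heterogeneous sources (sequences, images, graphs) each get their own kernel, and the learned weights indicate how informative each channel is for the decision.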

    Thematic Conference Proceedings of International Significance, Vol. 3 / International Scientific Conference "Archibald Reiss Days", Belgrade, 1-2 March 2013

    The Thematic Conference Proceedings contains 138 papers written by eminent scholars in the fields of law, security, criminalistics, police studies, forensics, and medicine, as well as by members of national security systems participating in the education of the police, army, and other security services, from Russia, Ukraine, Belarus, China, Poland, Slovakia, the Czech Republic, Hungary, Slovenia, Bosnia and Herzegovina, Montenegro, the Republic of Srpska, and Serbia. Each paper has been reviewed by two competent international reviewers, and the Thematic Conference Proceedings as a whole has been reviewed by five international reviewers. The papers published in the Thematic Conference Proceedings present an overview of contemporary trends in the development of the police educational system, the development of the police and contemporary security, criminalistics, and forensics, as well as an analysis of rule-of-law activities in crime suppression, the situation and trends in the above-mentioned fields, and suggestions on how to deal with these issues systematically. The Thematic Conference Proceedings represents a significant contribution to the existing fund of scientific and expert knowledge in the fields of criminalistics, security, and penal and legal theory and practice. Publication of this Conference Proceedings contributes to improving mutual cooperation between educational, scientific, and expert institutions at the national, regional, and international levels.

    Payment Systems Report - June 2020

    With its annual Payment Systems Report, Banco de la República offers a complete overview of the infrastructure of Colombia’s financial market. Each edition of the report has four objectives: 1) to publicize a consolidated account of how the figures for payment infrastructures have evolved with respect to both financial assets and goods and services; 2) to summarize the issues that are being debated internationally and are of interest to the industry that provides payment clearing and settlement services; 3) to offer the public an explanation of the ideas and concepts behind retail-value payment processes and the trends in retail payments within the circuit of individuals and companies; and 4) to familiarize the public, the industry, and all other financial authorities with the methodological progress that has been achieved through applied research to analyze the stability of payment systems. This edition introduces changes that have been made in the structure of the report, which are intended to make it easier and more enjoyable to read. The initial sections in this edition, which is the eleventh, contain an analysis of the statistics on the evolution and performance of financial market infrastructures. These are understood as multilateral systems wherein the participating entities clear, settle and register payments, securities, derivatives and other financial assets. The large-value payment system (CUD) saw less momentum in 2019 than it did the year before, mainly because of a decline in the amount of secondary market operations for government bonds, both in cash and sell/buy-backs, which was offset by an increase in operations with collective investment funds (CIFs) and Banco de la República’s operations to increase the money supply (repos). Consequently, the Central Securities Depository (DCV) registered less activity, due to fewer negotiations on the secondary market for public debt. 
This trend was also observed in the private debt market, as evidenced by the decline in the average amounts cleared and settled through the Central Securities Depository of Colombia (Deceval) and in the value of operations with financial derivatives cleared and settled through the Central Counterparty of Colombia (CRCC). Section three offers a comprehensive look at the market for retail-value payments; that is, transactions made by individuals and companies. During 2019, electronic transfers increased, and payments made with debit and credit cards continued to trend upward. In contrast, payments by check continued to decline, although the average daily value was almost four times the value of debit and credit card purchases. The same section contains the results of the fourth survey on how the use of retail-value payment instruments (for usual payments) is perceived. Conducted at the end of 2019, the main purpose of the survey was to identify the availability of these payment instruments, the public’s preferences for them, and their acceptance by merchants. It is worth noting that cash continues to be the instrument most used by the population for usual monthly payments (88.1% with respect to the number of payments and 87.4% in value). However, its use in terms of value has declined, having registered 89.6% in the 2017 survey. In turn, the level of acceptance by merchants of payment instruments other than cash is 14.1% for debit cards, 13.4% for credit cards, 8.2% for electronic transfers of funds and 1.8% for checks. The main reason for the use of cash is the absence of point-of-sale terminals at commercial establishments. Considering that the retail-payment market worldwide is influenced by constant innovation in payment services, by the modernization of clearing and settlement systems, and by the efforts of regulators to redefine the payment industry for the future, these trends are addressed in the fourth section of the report. 
There is an account of how innovations in technology-based financial payment services have developed, and it shows that while this topic is not new, it has evolved, particularly in terms of origin and vocation. One of the boxes that accompanies the fourth section deals with certain payment aspects of open banking and international experience in that regard, which has given the customers of a financial entity sovereignty over their data, allowing them, under transparent and secure conditions, to authorize a third party, other than their financial entity, to request information on their accounts with financial entities, thus enabling the third party to offer various financial services or initiate payments. Innovation also has sparked interest among international organizations, central banks, and research groups concerning the creation of digital currencies. Accordingly, the last box deals with the recent international debate on issuance of central bank digital currencies. In terms of the methodological progress that has been made, it is important to underscore the work that has been done on the role of central counterparties (CCPs) in mitigating liquidity and counterparty risk. The fifth section of the report offers an explanation of a document in which the work of CCPs in financial markets is analyzed and corroborated through an exercise that was built around the Central Counterparty of Colombia (CRCC) in the Colombian market for non-delivery peso-dollar forward exchange transactions, using the methodology of network topology. The results provide empirical support for the different theoretical models developed to study the effect of CCPs on financial markets. Finally, the results of research using artificial intelligence with information from the large-value payment system are presented. 
Based on the payments made among financial institutions in the large-value payment system, a methodology is used to compare different payment networks and to determine which ones can be considered abnormal. The methodology shows signs that indicate when a network moves away from its historical trend, so it can be studied and monitored. This comparison uses a methodology similar to the one applied to classify images: the main characteristics of the networks are extracted and used as parameters for comparison. Juan José Echavarría, Governor.

    A comparison of the CAR and DAGAR spatial random effects models with an application to diabetics rate estimation in Belgium

    When hierarchically modelling an epidemiological phenomenon on a finite collection of sites in space, one must always take a latent spatial effect into account in order to capture the correlation structure that links the phenomenon to the territory. In this work, we compare two autoregressive spatial models that can be used for this purpose: the classical CAR model and the more recent DAGAR model. Unlike the former, the latter has a desirable property: its ρ parameter can be naturally interpreted as the average neighbor pair correlation; in addition, this parameter can be directly estimated when the effect is modelled using a DAGAR rather than a CAR structure. As an application, we model the diabetics rate in Belgium in 2014 and show the adequacy of these models in predicting the response variable when no covariates are available.
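As a reference point for the comparison, the two priors can be written as follows. This is a hedged sketch of the standard formulations, with assumed notation: w is the latent spatial effect over the n sites, W the adjacency matrix, D the diagonal matrix of neighbor counts, and tau a precision parameter; the exact forms of the DAGAR coefficients b_i and f_i are given in the original DAGAR paper.

```latex
% Proper CAR prior: a zero-mean Gaussian with sparse precision
w \sim \mathcal{N}\big(0,\ [\tau (D - \rho W)]^{-1}\big), \qquad \rho \in (0, 1)

% DAGAR prior: a sequential model over a directed acyclic ordering of sites,
w_i \mid w_1, \dots, w_{i-1} \ \sim\ \mathcal{N}\Big(b_i \sum_{j < i,\ j \sim i} w_j,\ 1/f_i\Big)

% which yields the joint precision
Q = (I - B)^{\top} F (I - B)

% with B strictly lower triangular and F diagonal; here \rho enters b_i and
% f_i and is interpretable as the average neighbor pair correlation.
```

The contrast the paper exploits is visible here: in the CAR precision, ρ has no direct correlation interpretation, whereas in the DAGAR construction it does.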