
    CLASSIFICATION OF FEATURE SELECTION BASED ON ARTIFICIAL NEURAL NETWORK

    Pattern recognition (PR) is central to a variety of engineering applications. For this reason, it is vital to develop efficient pattern recognition systems that facilitate automatic and reliable decision making. In this study, a PR system based on a computational intelligence approach, namely an artificial neural network (ANN), is implemented after selecting the best feature vectors. A framework has been developed to determine the best eigenvectors, which we named 'eigenpostures', of four main human postures (standing, squatting/sitting, bending and lying), based on the rules of thumb of Principal Component Analysis (PCA). Accordingly, all three PCA rules, namely the KG rule, cumulative variance and the scree test, suggest retaining only 35 main principal components, or 'eigenpostures'. Next, these eigenpostures are statistically analysed via Analysis of Variance (ANOVA) prior to classification, so that the most relevant components of the selected eigenpostures can be determined. Both categories of eigenpostures, before and after ANOVA, served as inputs to the ANN classifier to verify the effectiveness of feature selection based on statistical analysis. The results confirm that the statistical analysis enabled an effective selection of eigenpostures for the classification of the four types of human posture.
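Two of the PCA rules of thumb the abstract mentions (the Kaiser-Guttman rule and the cumulative-variance rule) can be sketched as follows. This is an illustrative sketch, not the paper's code; the conservative `min` combination of the two rules and the 95% threshold are assumptions.

```python
import numpy as np

def retain_components(X, variance_threshold=0.95):
    """Select how many principal components ('eigenpostures') to keep,
    using the KG rule (eigenvalues above the mean eigenvalue) and the
    cumulative-variance rule.  X is (n_samples, n_features)."""
    Xc = X - X.mean(axis=0)                    # center the data
    cov = np.cov(Xc, rowvar=False)             # feature covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)     # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]          # sort descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    kg_keep = int(np.sum(eigvals > eigvals.mean()))              # KG rule
    cum = np.cumsum(eigvals) / eigvals.sum()
    cv_keep = int(np.searchsorted(cum, variance_threshold) + 1)  # cumulative variance

    k = min(kg_keep, cv_keep)                  # conservative choice (assumption)
    return eigvecs[:, :k], eigvals[:k]
```

The retained eigenvectors would then be the candidate 'eigenpostures' passed on to ANOVA and the ANN classifier.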

    Rule groupings in expert systems using nearest neighbour decision rules, and convex hulls

    Expert system shells are lacking in many areas of software engineering. Large rule-based systems are not semantically comprehensible, and are difficult to debug, modify or validate. Partitioning a set of rules found in CLIPS (C Language Integrated Production System) into groups of rules that reflect the underlying semantic subdomains of the problem adequately addresses these concerns. Techniques are introduced to structure a CLIPS rule base into groups of rules that inherently share common semantic information. The concepts involved are imported from the fields of artificial intelligence, pattern recognition and statistical inference. The techniques focus on feature selection, classification, and a criterion, based on Bayesian decision theory, for how 'good' a classification technique is. A variety of distance metrics for measuring the 'closeness' of CLIPS rules are discussed, and various nearest neighbour classification algorithms based on these metrics are described.
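A minimal sketch of the idea, under the assumption that each CLIPS rule is represented as a set of condition/action tokens (the representation and the Jaccard-style metric are illustrative choices, not the paper's):

```python
def rule_distance(rule_a, rule_b):
    """Distance between two rules represented as token sets:
    1 - Jaccard similarity.  One possible 'closeness' metric."""
    union = rule_a | rule_b
    if not union:
        return 0.0
    return 1.0 - len(rule_a & rule_b) / len(union)

def nearest_neighbour_group(rule, groups):
    """Assign a rule to the semantic group containing its nearest
    neighbour (a simple 1-NN decision over existing groups)."""
    best, best_d = None, float("inf")
    for name, members in groups.items():
        d = min(rule_distance(rule, m) for m in members)
        if d < best_d:
            best, best_d = name, d
    return best
```

For example, a new rule mentioning `file` tokens would join a group of other file-handling rules rather than an arithmetic group.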

    The VEX-93 environment as a hybrid tool for developing knowledge systems with different problem solving techniques

    The paper describes VEX-93, a hybrid environment for developing knowledge-based and problem-solver systems. It integrates methods and techniques from artificial intelligence, image and signal processing, and data analysis, which can be mixed. Two hierarchical levels of reasoning contain an intelligent toolbox with one upper strategic inference engine and four lower ones containing specific reasoning models: truth-functional (rule-based), probabilistic (causal networks), fuzzy (rule-based) and case-based (frames). Image/signal processing and analysis capabilities are provided in the form of programming languages with more than one hundred primitive functions. User-made programs can be embedded within knowledge bases, allowing the combination of perception and reasoning. The data analyser toolbox contains a collection of numerical classification, pattern recognition and ordination methods, with neural network tools and a database query language at the inference engines' disposal. VEX-93 is an open system able to communicate with external computer programs relevant to a particular application. Metaknowledge can be used to elaborate conclusions, and man-machine interaction includes, besides windows and graphical interfaces, acceptance of voice commands and production of speech output. The system was conceived for real-world applications in general domains; a concrete medical diagnostic support system, currently being completed as a Cuban-Spanish project, is mentioned as an example. The present version of VEX-93 is a large system of about one and a half million lines of C code and runs on microcomputers under Windows 3.1.

    Machine learning model for clinical named entity recognition

    Named entity recognition (NER) is the most widely used NLP task for extracting important concepts (named entities) from clinical notes, and the literature shows that many researchers have applied machine learning models to clinical NER. Medical named entity recognition and normalization are among the most fundamental medical data mining tasks. Medical NER differs from general NER in several ways. A huge number of alternative spellings and synonyms causes an explosion in vocabulary size, which reduces the efficiency of medical dictionaries. Entities often consist of long sequences of tokens, making it harder to detect their boundaries exactly. Notes written by clinicians are less structured, minimally grammatical, and full of cryptic shorthand, which poses further challenges for named entity recognition. NER systems are generally either rule-based or pattern-based, but such rules and patterns do not generalize across the diverse writing styles of clinicians. Machine-learning-based systems address these issues by focusing on choosing effective features for classifier building. In this work, a machine-learning-based approach is used to extract clinical data in the required manner.
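The "effective features for classifier building" that such systems rely on are typically per-token feature maps. The sketch below shows the general shape; the feature names are common choices in the NER literature, not the paper's actual feature set.

```python
def token_features(tokens, i):
    """Illustrative feature extractor for one token of a clinical note,
    of the kind fed to an ML-based NER classifier (e.g. a CRF or
    maximum-entropy model)."""
    tok = tokens[i]
    return {
        "word.lower": tok.lower(),
        "word.isupper": tok.isupper(),     # e.g. abbreviations like "BP"
        "word.istitle": tok.istitle(),
        "word.isdigit": tok.isdigit(),     # dosages, lab values
        "suffix3": tok[-3:],               # morphological cue ("-itis", "-ine")
        "prev.lower": tokens[i - 1].lower() if i > 0 else "<BOS>",
        "next.lower": tokens[i + 1].lower() if i < len(tokens) - 1 else "<EOS>",
    }
```

A classifier trained on such features can generalize across spelling variants better than a fixed dictionary or rule set.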

    On the optimal decision rule for sequential interactive structured prediction

    This is the author’s version of a work that was accepted for publication in Pattern Recognition Letters. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Pattern Recognition Letters [Volume 33, Issue 16, 1 December 2012, Pages 2226–2231] DOI: 10.1016/j.patrec.2012.07.010. [EN] Interactive structured prediction (ISP) is an emerging framework for structured prediction (SP) in which the user and the system collaborate to produce a high-quality output. Typically, search algorithms applied to ISP problems have been based on the algorithms for fully automatic SP systems. However, the decision rule applied should not be considered optimal, since the goal in ISP is to reduce human effort rather than output errors. In this work, we present some insight into the theory of the sequential ISP search problem. First, it is formulated as a decision theory problem from which a general analytical formulation of the optimal decision rule is derived. Then, it is compared with the standard formulation to establish under what conditions the standard algorithm should perform similarly to the optimal decision rule. Finally, a general and practical implementation is given and evaluated against three classical ISP problems: interactive machine translation, interactive handwritten text recognition, and interactive speech recognition. The research leading to these results has received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under Grant agreement no. 287576 (CasMaCat), and from the Spanish MEC/MICINN under the MIPRCV "Consolider Ingenio 2010" program (CSD2007-00018) and the iTrans2 (TIN2009-14511) project. It is also supported by the Generalitat Valenciana under grants ALMPR (Prometeo/2009/01) and GV/2010/067. The authors thank the anonymous reviewers for their criticisms and suggestions. Alabau, V.; Sanchis Navarro, JA.; Casacuberta Nolla, F. (2012). On the optimal decision rule for sequential interactive structured prediction. Pattern Recognition Letters. 33(16):2226-2231. https://doi.org/10.1016/j.patrec.2012.07.010
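As a sketch (not the paper's exact derivation), the standard fully automatic SP decision rule and its usual prefix-conditioned adaptation to sequential ISP can be written as follows, where $x$ is the input and $y_1^{i}$ the user-validated prefix:

```latex
% Standard SP decision rule: propose the most probable full output.
\hat{y} = \operatorname*{argmax}_{y} \; \Pr(y \mid x)

% Sequential ISP: after the user validates the prefix y_1^i,
% propose the most probable suffix conditioned on that prefix.
\hat{y}_{i+1}^{L} = \operatorname*{argmax}_{y_{i+1}^{L}} \; \Pr\!\left(y_{i+1}^{L} \mid x,\, y_1^{i}\right)
```

The paper's observation is that this second rule inherits the error-minimising objective of the first, whereas a truly optimal ISP rule should instead minimise the expected human effort (number of user interactions).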

    A dynamic trading rule based on filtered flag pattern recognition for stock market price forecasting

    [EN] In this paper we propose and validate a trading rule based on flag pattern recognition, incorporating important innovations with respect to previous research. Firstly, we propose a dynamic window scheme that allows the stop loss and take profit to be updated on a quarterly basis. In addition, since the flag pattern is a trend-following pattern, we have added the EMA indicator to filter trades. This technical analysis indicator is calculated for both 15-min and 1-day timeframes, which enables the short and medium terms to be considered simultaneously. We also filter the flags according to the price range over which they develop, and have limited the maximum loss of each trade to 100 points. The proposed methodology was applied to 91,309 intraday observations of the DJIA index, considerably improving on the results obtained in previous proposals and those obtained by the buy & hold strategy, both in profitability and risk, even after taking transaction costs into account. These results seem to challenge market efficiency, in line with other similar studies, although they apply to the specific analysis carried out on the DJIA index and are limited to the setup considered. The fourth author of this work was partially supported by MINECO, Project MTM2016-75963-P. Arévalo, R.; García, J.; Guijarro, F.; Peris Manguillot, A. (2017). A dynamic trading rule based on filtered flag pattern recognition for stock market price forecasting. Expert Systems with Applications. 81:177-192. https://doi.org/10.1016/j.eswa.2017.03.028
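The EMA filter mentioned above is the standard exponential moving average from technical analysis; a minimal sketch (the seeding convention, starting the average at the first price, is an assumption):

```python
def ema(prices, period):
    """Exponential moving average over a price series.
    The trading rule above would compute this on both the 15-min
    and 1-day series and only take trades aligned with the trend."""
    alpha = 2.0 / (period + 1)        # standard EMA smoothing factor
    out = [prices[0]]                 # seed with the first price (assumption)
    for p in prices[1:]:
        out.append(alpha * p + (1 - alpha) * out[-1])
    return out
```

Because the EMA lags the price, a flag detected against the EMA direction would be discarded by the filter.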

    Computing with Memristor-based Nonlinear Oscillators

    Among recent disruptive technologies, the volatile/non-volatile memory-resistor (memristor) has attracted researchers' attention as a fundamental computation element. It has been experimentally shown that memristive elements can emulate synaptic dynamics and are even capable of supporting spike-timing-dependent plasticity (STDP), an important adaptation rule for neuromorphic computing systems. The overall goal of this work is to provide an unconventional computing platform that exploits memristor-based nonlinear oscillators described by means of phase deviation equations. Experimental results show that the approach significantly outperforms conventional architectures used for pattern recognition tasks.

    A methodology for quality control of component manufacturing at manufacturing plants

    The article proposes a methodology for quality control of the manufacturing of electromechanical system components at manufacturing plants, based on digital signal processing methods and pattern recognition theory. The methodology comprises techniques for classification, identification of class templates, goal formation and recognition, as well as decision-making rules. Results of a numerical study are presented for the proposed techniques.

    A Novel Two-Stage Spectrum-Based Approach for Dimensionality Reduction: A Case Study on the Recognition of Handwritten Numerals

    Dimensionality reduction (feature selection) is an important step in pattern recognition systems. Although there are different conventional approaches to feature selection, such as Principal Component Analysis, Random Projection, and Linear Discriminant Analysis, selecting optimal, effective, and robust features is usually a difficult task. In this paper, a new two-stage approach to dimensionality reduction is proposed. The method is based on one-dimensional and two-dimensional spectrum diagrams of the standard deviation and minimum-to-maximum distributions of the initial feature vector elements. The proposed algorithm is validated in an OCR application using two large standard benchmark handwritten OCR datasets, MNIST and Hoda. Initially, a 133-element feature vector was assembled from the most commonly used features proposed in the literature. The size of the initial feature vector was then reduced from 100% to 59.40% (79 elements) for the MNIST dataset and to 43.61% (58 elements) for the Hoda dataset, respectively. Meanwhile, the accuracy of the OCR systems improved by 2.95% for the MNIST dataset and 4.71% for the Hoda dataset. The results show an improvement in the precision of the system in comparison with the rival approaches, Principal Component Analysis and Random Projection. The proposed technique can also be useful for generating decision rules in a pattern recognition system using rule-based classifiers.
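A simplified, one-dimensional reading of the spectrum-based idea is to rank feature-vector elements by their standard deviation across the training samples and keep only the most variable ones. This sketch is illustrative; the ranking criterion and the keep ratio are assumptions, not the paper's exact two-stage procedure.

```python
def stddev_spectrum_select(vectors, keep_ratio=0.6):
    """Rank feature indices by standard deviation across samples and
    return the indices of the top fraction, sorted ascending.
    `vectors` is a list of equal-length feature vectors."""
    n = len(vectors)
    n_feat = len(vectors[0])
    means = [sum(v[j] for v in vectors) / n for j in range(n_feat)]
    stds = [(sum((v[j] - means[j]) ** 2 for v in vectors) / n) ** 0.5
            for j in range(n_feat)]
    k = max(1, round(keep_ratio * n_feat))        # how many features to keep
    ranked = sorted(range(n_feat), key=lambda j: stds[j], reverse=True)
    return sorted(ranked[:k])
```

Constant or near-constant elements (zero spread) carry no discriminative information and are dropped first.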

    IDENTITY RECOGNITION OPTIMIZATION BASED ON LBP FEATURE EXTRACTION

    Unimodal systems have limited information available for identity recognition. Multimodal systems were created to improve on unimodal systems. The multimodal system used in this study combines the face and the palm at the matching-score level, with scores fused using the Weighted Sum Rule method. Features are extracted from each sample using the Local Binary Pattern (LBP) method, while the large data dimensions are reduced using the Principal Component Analysis (PCA) method. The distance between face and palm data is measured using a nearest-distance criterion, namely the Euclidean distance. The benchmark datasets are ORL, FERET and PolyU. Testing on each database combination yields an accuracy of 98% (ORL and PolyU) and 95% (FERET and PolyU). The test results show that the multimodal biometric system using the hybrid method (PCA and LBP) runs well and optimally. Keywords: artificial intelligence, recognition, LBP, multimodal
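The two building blocks named in the abstract, Euclidean distance matching and Weighted Sum Rule fusion, are simple to state; a minimal sketch (the score normalisation to [0, 1] and the equal default weights are assumptions):

```python
import math

def euclidean(a, b):
    """Euclidean distance between two feature vectors
    (e.g. LBP histograms after PCA projection)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def weighted_sum_fusion(face_score, palm_score, w_face=0.5):
    """Match-score-level fusion via the Weighted Sum Rule.
    Scores are assumed already normalised to [0, 1]."""
    return w_face * face_score + (1 - w_face) * palm_score
```

The fused score would then be thresholded (or the minimum-distance identity chosen) to make the final accept/reject decision.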