
    Modern Approach for Designing and Solving Interval Estimated Linear Fractional Programming Models

    Optimization methods have been widely applied in statistics. In mathematical programming, the coefficients of a model are usually treated as deterministic values, yet uncertainty is present in most realistic problems. Interval-estimated optimization models therefore offer one way to incorporate this uncertainty. In this paper, the lower and upper values of an interval-estimated linear fractional programming model (IELFPM) are obtained using a generalized confidence interval estimation method. An IELFPM is a linear fractional program in which the coefficients of the objective function and of all constraints are given as intervals. The solution of the IELFPM is also analyzed.
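    For reference, a generic interval-estimated linear fractional program has the following form (a sketch of the standard LFP formulation with interval coefficients; the notation below is illustrative and not taken from the paper):

        \[
        \begin{aligned}
        \max_{x \ge 0} \quad & \frac{\sum_{j} c_j x_j + \alpha}{\sum_{j} d_j x_j + \beta} \\
        \text{s.t.} \quad & \sum_{j} a_{ij} x_j \le b_i, \qquad i = 1,\dots,m, \\
        & c_j \in [\underline{c}_j, \overline{c}_j], \quad
          d_j \in [\underline{d}_j, \overline{d}_j], \quad
          a_{ij} \in [\underline{a}_{ij}, \overline{a}_{ij}], \quad
          b_i \in [\underline{b}_i, \overline{b}_i].
        \end{aligned}
        \]

    In the interval setting one typically studies the lowest and highest objective values attainable as the coefficients range over their intervals, which brackets the optimum of the uncertain problem.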

    The Karush-Kuhn-Tucker Optimality Conditions for the Fuzzy Optimization Problems in the Quotient Space of Fuzzy Numbers

    We propose solution concepts for fuzzy optimization problems in the quotient space of fuzzy numbers. The Karush-Kuhn-Tucker (KKT) optimality conditions are derived naturally by introducing Lagrange multipliers. The effectiveness of the conditions is illustrated with examples.
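    As a point of reference, the classical (crisp) KKT conditions that such results generalize read, for minimizing f(x) subject to g_i(x) <= 0 (a standard statement, not the paper's fuzzy-number version):

        \[
        \nabla f(x^\ast) + \sum_{i=1}^{m} \mu_i \nabla g_i(x^\ast) = 0, \qquad
        \mu_i \ge 0, \qquad g_i(x^\ast) \le 0, \qquad \mu_i\, g_i(x^\ast) = 0, \quad i = 1,\dots,m.
        \]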

    Online handwriting recognition by an approach combining support vector machines and hidden Markov models

    Handwriting recognition is one of the leading applications of pattern recognition and machine learning. Despite some limitations, handwriting recognition systems are used as an input method for many electronic devices and help automate manual tasks that require processing of handwriting images. In general, a handwriting recognition system comprises three functional components: preprocessing, recognition and post-processing. Improvements have been made within each component, but to further widen the range of applications, specific improvements are needed in the recognition capability of the system. The Hidden Markov Model (HMM) has been the dominant recognition method for both offline and online handwriting. However, the use of Gaussian observation densities in HMMs and of a purely representational model for word modelling often does not lead to good classification. Hybrids of neural networks (NN) and HMMs later improved word recognition by exploiting the discriminative property of the NN and the representational capability of the HMM. Still, the NN does not fully optimize recognition capability, because training it under the empirical risk minimization (ERM) principle leads to poor generalization. In this thesis, we focus on improving the recognition capability of a cursive online handwritten word recognition system using an emerging machine learning method, the support vector machine (SVM). We first evaluate SVMs on isolated character recognition using the IRONOFF and UNIPEN character databases. Through the principle of structural risk minimization (SRM), SVMs allow simultaneous optimization of the representational and discriminative capability of the character recognizer. We then examine the practical issues of using SVMs in a hybrid setting with HMMs. Finally, we test the hybrid system on the IRONOFF word database and obtain favourable results.

    Our work concerns handwriting recognition, one of the favourite domains of pattern recognition and machine learning algorithms. For online handwriting, the applications cover all input devices that let a user communicate transparently with information systems. In this context, our work contributes a new architecture for recognizing handwritten words without style constraints. It belongs to the family of hybrid local/global approaches, in which the segmentation/recognition paradigm is resolved through the complementarity of a discriminative recognizer acting at the character level and a model-based system supervising the global level. We chose support vector machines (SVM) for the character classifier and dynamic programming algorithms derived from hidden Markov model (HMM) modelling. This SVM/HMM combination is unique in the field of handwriting recognition. Experiments were carried out, first on isolated character recognition and then on the IRONOFF database of cursive words. They showed the superiority of the SVM approaches over the convolutional neural network (Time Delay Neural Network) solutions we had developed previously, and their good behaviour in word recognition.
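    The hybrid scheme can be illustrated with a minimal sketch: an SVM with probability outputs scores each candidate character segment, and a lexicon-driven decoder (standing in here for the HMM/dynamic-programming level) picks the word whose characters best explain the segment sequence. All data shapes, the toy alphabet and the lexicon below are assumptions for illustration; the real system works on online pen-trajectory features.

        # Sketch: SVM character scorer combined with lexicon-driven word decoding.
        import numpy as np
        from sklearn.svm import SVC

        # Train a toy SVM character classifier (one feature vector per character segment).
        X_train = np.random.randn(200, 16)             # 200 segments, 16 features each
        y_train = np.random.choice(list("abc"), 200)   # toy alphabet
        char_svm = SVC(kernel="rbf", probability=True).fit(X_train, y_train)

        def word_log_score(segments, word, classes):
            """Log-score of a word given one feature vector per character segment."""
            if len(segments) != len(word):
                return -np.inf
            probs = char_svm.predict_proba(segments)   # (n_segments, n_classes)
            idx = {c: i for i, c in enumerate(classes)}
            return sum(np.log(probs[t, idx[ch]] + 1e-12) for t, ch in enumerate(word))

        def decode(segments, lexicon):
            """Pick the lexicon word that best explains the segment sequence."""
            classes = list(char_svm.classes_)
            return max(lexicon, key=lambda w: word_log_score(segments, w, classes))

        print(decode(np.random.randn(3, 16), ["abc", "cab", "bca"]))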

    Epileptic seizure detection and prediction based on EEG signal

    Epilepsy is a chronic brain dysfunction that manifests as recurrent seizures caused by sudden, excessive discharges of neurons. Electroencephalogram (EEG) recordings are regarded as the gold standard for the clinical diagnosis of epilepsy, but diagnosis by professional doctors is time-consuming. With the help of artificial intelligence algorithms, automatic epileptic seizure detection and prediction has become a research hotspot. The main contributions of this thesis are a solution to the overfitting problem of deep learning on EEG signals and a method for fusing features across multiple EEG channels; the proposed methods achieve strong performance in both the seizure detection and the seizure prediction task. For seizure detection, the thesis explores how deep learning behaves on small data sets and designs a hybrid CNN-SVM model for epilepsy detection, compared against end-to-end classification by deep learning. A further technique against overfitting is the generation of new EEG signals by decomposition and recombination of EEG in the time-frequency domain. The approach achieves a classification accuracy of 98.8%, a specificity of 98.9% and a sensitivity of 98.4% on the classic Bonn EEG data. For seizure prediction, the thesis proposes a feature fusion method for multi-channel EEG signals: a third-order tensor feature is extracted over the temporal, spectral and spatial domains, and UMLDA, a tensor-to-vector projection method, ensures minimal redundancy between feature dimensions. The final results include an average accuracy of 95%, an F1-measure of 94% and a Kappa index of 90%.
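    The decomposition-and-recombination idea for enlarging the training set can be sketched as follows: same-class EEG epochs are cut into segments, and segments drawn from different epochs are spliced into new surrogate epochs. The sketch below works purely in the time domain for brevity (the thesis operates in the time-frequency domain), and all array shapes are illustrative.

        # Sketch: surrogate EEG generation by segment recombination (time-domain simplification).
        import numpy as np

        def recombine_epochs(epochs, n_segments=4, n_new=10, seed=None):
            """epochs: (n_epochs, n_samples) same-class EEG; returns (n_new, n_samples) surrogates."""
            rng = np.random.default_rng(seed)
            pieces = np.array_split(epochs, n_segments, axis=1)   # list of (n_epochs, seg_len) blocks
            surrogates = []
            for _ in range(n_new):
                # For each time segment, borrow it from a randomly chosen source epoch.
                donors = rng.integers(0, epochs.shape[0], size=n_segments)
                surrogates.append(np.hstack([pieces[s][donors[s]] for s in range(n_segments)]))
            return np.stack(surrogates)

        seizure_epochs = np.random.randn(20, 4096)     # 20 toy epochs of one class
        augmented = recombine_epochs(seizure_epochs, n_segments=8, n_new=50, seed=0)
        print(augmented.shape)                          # (50, 4096)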

    The Third Air Force/NASA Symposium on Recent Advances in Multidisciplinary Analysis and Optimization

    The third Air Force/NASA Symposium on Recent Advances in Multidisciplinary Analysis and Optimization was held on 24-26 September 1990. Sessions covered the following topics: dynamics and controls; multilevel optimization; sensitivity analysis; aerodynamic design software systems; optimization theory; analysis and design; shape optimization; vehicle components; structural optimization; aeroelasticity; artificial intelligence; multidisciplinary optimization; and composites.

    Stochastic and deterministic algorithms for continuous black-box optimization

    Continuous optimization is never easy: an exact solution is always a luxury, and the theory behind it is not always analytical and elegant. In practice, continuous optimization is essentially about efficiency: how can a solution of the same quality be obtained using as few resources (e.g., CPU time or memory) as possible? In this thesis, the number of function evaluations is considered the most important resource to save, and various approaches towards this goal have been implemented and applied successfully. One research stream focuses on the stochastic variation (mutation) operator, which performs a local exploration of the search space. A close investigation of the efficiency of such operators shows that a good stochastic variation operator should generate good coverage of the local neighbourhood around the current search point. This thesis contributes to this issue by formulating a novel stochastic variation operator that yields good space coverage.
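    As a generic illustration of a stochastic variation operator with good local coverage (not the novel operator proposed in the thesis), mirrored Gaussian mutation evaluates a perturbation and its mirror image around the current point, so the two trial points straddle the neighbourhood symmetrically. The fixed step size and the test function below are deliberately minimal assumptions.

        # Sketch: (1+1)-style black-box search with mirrored Gaussian mutation.
        import numpy as np

        def step(f, x, fx, sigma, rng):
            """Try a Gaussian mutation and its mirror image; keep the best of the three points."""
            z = rng.standard_normal(x.shape)
            cands = [(x, fx)] + [(c, f(c)) for c in (x + sigma * z, x - sigma * z)]
            return min(cands, key=lambda p: p[1])

        def optimize(f, x0, sigma=0.5, budget=200, seed=0):
            rng = np.random.default_rng(seed)
            x, fx = np.asarray(x0, dtype=float), f(x0)
            for _ in range(budget // 2):        # each step spends two function evaluations
                x, fx = step(f, x, fx, sigma, rng)
            return x, fx

        sphere = lambda x: float(np.sum(np.square(x)))
        print(optimize(sphere, np.ones(5)))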

    Reasoning with random sets: An agenda for the future

    In this paper, we discuss a potential agenda for future work in the theory of random sets and belief functions, touching upon a number of focal issues: the development of a fully-fledged theory of statistical reasoning with random sets, including the generalisation of logistic regression and of the classical laws of probability; the further development of the geometric approach to uncertainty, to include general random sets, a wider range of uncertainty measures and alternative geometric representations; and the application of this new theory to high-impact areas such as climate change, machine learning and statistical learning theory. Comment: 94 pages, 17 figures.
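    For readers unfamiliar with belief functions, the two basic set functions can be illustrated on a small finite frame with a toy mass function (a textbook Dempster-Shafer example, unrelated to the paper's specific constructions).

        # Sketch: belief and plausibility computed from a mass function over a finite frame.
        def belief(mass, event):
            """Bel(A) = sum of masses of focal sets contained in A."""
            return sum(m for focal, m in mass.items() if focal <= event)

        def plausibility(mass, event):
            """Pl(A) = sum of masses of focal sets intersecting A."""
            return sum(m for focal, m in mass.items() if focal & event)

        # Frame {rain, sun, snow}; masses on focal sets (frozensets) sum to 1.
        mass = {
            frozenset({"rain"}): 0.4,
            frozenset({"rain", "snow"}): 0.3,
            frozenset({"rain", "sun", "snow"}): 0.3,   # the vacuous (ignorance) part
        }
        A = frozenset({"rain", "snow"})
        print(belief(mass, A), plausibility(mass, A))   # 0.7 and 1.0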