86 research outputs found

    Max-min Learning of Approximate Weight Matrices from Fuzzy Data

    Full text link
    In this article, we study the approximate solutions set $\Lambda_b$ of an inconsistent system of max-min fuzzy relational equations $(S): A \Box_{\min}^{\max} x = b$. Using the $L_\infty$ norm, we compute by an explicit analytical formula the Chebyshev distance $\Delta = \inf_{c \in \mathcal{C}} \Vert b - c \Vert$, where $\mathcal{C}$ is the set of second members of the consistent systems defined with the same matrix $A$. We study the set $\mathcal{C}_b$ of Chebyshev approximations of the second member $b$, i.e., vectors $c \in \mathcal{C}$ such that $\Vert b - c \Vert = \Delta$, which is associated to the approximate solutions set $\Lambda_b$ in the following sense: an element of the set $\Lambda_b$ is a solution vector $x^\ast$ of a system $A \Box_{\min}^{\max} x = c$ where $c \in \mathcal{C}_b$. As main results, we describe the structure of both the set $\Lambda_b$ and the set $\mathcal{C}_b$. We then introduce a paradigm for max-min learning of weight matrices that relate input and output data from training data. The learning error is expressed in terms of the $L_\infty$ norm. We compute by an explicit formula the minimal value of the learning error according to the training data, and we give a method to construct weight matrices whose learning error is minimal, which we call approximate weight matrices. Finally, as an application of our results, we show how to approximately learn the rule parameters of a possibilistic rule-based system from multiple training data.
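    The max-min system and its consistency check can be sketched as follows (a minimal NumPy illustration on values in [0, 1]; the greatest-solution candidate is the classical Gödel-implication construction, not the article's explicit Chebyshev-distance formula, and the matrix and vector values are assumptions for the example):

```python
import numpy as np

def maxmin_compose(A, x):
    """Max-min composition: (A box x)_i = max_j min(a_ij, x_j)."""
    return np.max(np.minimum(A, x), axis=1)

def greatest_candidate(A, b):
    """Classical greatest potential solution x_j = min_i (a_ij -> b_i),
    where -> is the Goedel implication: a -> b = 1 if a <= b, else b.
    The system A box x = b is consistent iff this x solves it."""
    impl = np.where(A <= b[:, None], 1.0, b[:, None])
    return impl.min(axis=0)

A = np.array([[0.7, 0.2],
              [0.4, 0.9]])
b = np.array([0.5, 0.6])
x = greatest_candidate(A, b)
# L_inf residual: zero exactly when the system is consistent.
residual = np.max(np.abs(maxmin_compose(A, x) - b))
```

    For this consistent example the residual is zero; for an inconsistent right-hand side $b$, the article's results characterize how far $b$ is from the nearest consistent $c$ in the $L_\infty$ sense.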

    Data Normalization in Decision Making Processes

    Get PDF
    With the fast growth of data-rich systems, dealing with complex decision problems is unavoidable. Normalization is a crucial step in most multi-criteria decision making (MCDM) models, to produce comparable and dimensionless data from heterogeneous data. Further, MCDM requires data to be numerical and comparable so that it can be aggregated into a single score per alternative, thus providing their ranking. Several normalization techniques are available, but their performance depends on a number of characteristics of the problem at hand, i.e., different normalization techniques may provide different rankings for the alternatives. Therefore, it is a challenge to select a suitable normalization technique to represent an appropriate mapping from the source data to a common scale. There are some attempts in the literature to address the subject of normalization in MCDM, but there is still a lack of assessment frameworks for evaluating normalization techniques. Hence, the main contribution and objective of this study is to develop an assessment framework for analysing the effects of normalization techniques on the ranking of alternatives in MCDM methods and to recommend the most appropriate technique for specific decision problems. The proposed assessment framework consists of four steps: (i) determining data types; (ii) choosing potential candidate normalization techniques; (iii) analysing and evaluating the techniques; and (iv) selecting the best normalization technique. To validate the efficiency and robustness of the proposed framework, six normalization techniques (Max, Max-Min, Sum, Vector, Logarithmic, and Fuzzification) are selected from the linear, semi-linear, and non-linear categories, and tested with four well-known MCDM methods (TOPSIS, SAW, AHP, and ELECTRE), drawn from the scoring, comparative, and ranking families of methods.
Designing the proposed assessment framework led to a conceptual model allowing an automatic decision-making process, besides recommending the most appropriate normalization technique for MCDM problems. Furthermore, the role of normalization techniques for dynamic multi-criteria decision making (DMCDM) in collaborative networks is explored, specifically in relation to problems of selection of suppliers, business partners, resources, etc. To validate and test the utility and applicability of the assessment framework, a number of case studies are discussed, and benchmarking and testimonies from experts are used. An evaluation of the developed work by the research community is also presented. The validation process demonstrated that the proposed assessment framework increases the accuracy of results in MCDM decision problems.
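    Four of the six normalization techniques named above can be sketched as follows (a minimal NumPy illustration; the `normalize` helper and the sample scores are assumptions for the example, not the thesis' implementation):

```python
import numpy as np

def normalize(col, method):
    """Common MCDM normalizations for one benefit-criterion column."""
    col = np.asarray(col, dtype=float)
    if method == "max":       # ratio to the best value
        return col / col.max()
    if method == "max-min":   # rescale onto [0, 1]
        return (col - col.min()) / (col.max() - col.min())
    if method == "sum":       # share of the column total
        return col / col.sum()
    if method == "vector":    # divide by the Euclidean norm
        return col / np.sqrt(np.sum(col ** 2))
    raise ValueError(f"unknown method: {method}")

scores = np.array([20.0, 35.0, 50.0])   # one criterion, three alternatives
results = {m: normalize(scores, m) for m in ("max", "max-min", "sum", "vector")}
```

    All four preserve the order of alternatives on a single criterion; ranking differences arise only after weighted aggregation across several criteria, which is exactly what the proposed assessment framework evaluates.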

    Multispace & Multistructure. Neutrosophic Transdisciplinarity (100 Collected Papers of Sciences), Vol. IV

    Get PDF
    The fourth volume in my book series of “Collected Papers” includes 100 published and unpublished articles, notes, (preliminary) drafts containing just ideas to be further investigated, scientific souvenirs, scientific blogs, project proposals, small experiments, solved and unsolved problems and conjectures, updated or alternative versions of previous papers, short or long humanistic essays, and letters to the editors, all collected over the previous three decades (1980-2010), though most of them are from the last decade (2000-2010); some of them were lost and found, while others are extended, diversified, or improved versions. This is an eclectic tome of 800 pages with papers in various fields of science, alphabetically listed, such as: astronomy, biology, calculus, chemistry, computer programming codification, economics and business and politics, education and administration, game theory, geometry, graph theory, information fusion, neutrosophic logic and set, non-Euclidean geometry, number theory, paradoxes, philosophy of science, psychology, quantum physics, scientific research methods, and statistics. It reflects my long-standing preoccupation and collaboration, as author, co-author, translator, or co-translator, and editor, with many scientists from around the world. Many topics in this book are incipient and need to be expanded in future explorations.

    Machine Learning Aided Stochastic Elastoplastic and Damage Analysis of Functionally Graded Structures

    Full text link
    The elastoplastic and damage analyses, which serve as key indicators of the nonlinear performance of engineering structures, have been extensively investigated during the past decades. However, with the development of advanced composite materials, such as functionally graded material (FGM), evaluating the nonlinear behaviour of such advantageous materials remains a tough challenge. Moreover, although structural system parameters are widely adopted as deterministic, it has already been shown that inevitable and mercurial uncertainties in these system properties are inherently associated with the structural models and the nonlinear analysis process concerned. The existence of such fluctuations potentially affects the actual elastoplastic and damage behaviour of FGM structures, which leads to a discrepancy between the approximation results and the actual structural safety conditions. Consequently, it is requisite to establish a robust stochastic nonlinear analysis framework that complies with the requirements of modern composite engineering practice. In this dissertation, a novel uncertain nonlinear analysis framework, namely the machine learning aided stochastic elastoplastic and damage analysis framework, is presented for FGM structures. The proposed approach is a favorable alternative for determining structural reliability when full-scale testing is not achievable, leading to significant savings in the manpower and computational effort spent in practical engineering applications. Within the developed framework, a novel extended support vector regression (X-SVR) with a Dirichlet feature mapping approach is introduced and then incorporated for the subsequent uncertainty quantification.
By successfully establishing the governing relationship between the uncertain system parameters and any concerned structural output, a comprehensive probabilistic profile, including means, standard deviations, probability density functions (PDFs), and cumulative distribution functions (CDFs) of the structural output, can be effectively established through a sampling scheme. Consequently, by adopting the machine learning aided stochastic elastoplastic and damage analysis framework in real-life engineering applications, the advantages of next-generation uncertainty quantification analysis can be highlighted, and appreciable contributions can be delivered to both structural safety evaluation and structural design.
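    The surrogate-plus-sampling workflow described above can be sketched generically as follows (a minimal illustration with a hypothetical one-parameter model; a polynomial least-squares fit stands in for the dissertation's X-SVR regressor, and all names and values are assumptions for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for an expensive nonlinear structural model:
# maps one uncertain system parameter to a scalar structural output.
# The real framework would call an FEM elastoplastic/damage analysis here.
def expensive_model(E):
    return 1.0 / E + 0.1 * E ** 2

# Step 1: a small design of experiments (the only expensive evaluations).
E_train = np.linspace(0.5, 2.0, 8)
y_train = expensive_model(E_train)

# Step 2: fit a cheap surrogate to the training pairs.
surrogate = np.poly1d(np.polyfit(E_train, y_train, deg=4))

# Step 3: Monte Carlo on the surrogate yields the probabilistic profile
# (means, standard deviations, and empirical PDFs/CDFs if desired).
E_samples = rng.normal(loc=1.2, scale=0.15, size=100_000)
y_samples = surrogate(E_samples)
mean, std = y_samples.mean(), y_samples.std()
```

    The expensive model is evaluated only a handful of times; the hundred thousand samples needed for the probabilistic profile hit only the cheap surrogate, which is the source of the computational savings claimed above.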

    Advances in Data Mining Knowledge Discovery and Applications

    Get PDF
    Advances in Data Mining Knowledge Discovery and Applications aims to help data miners, researchers, scholars, and PhD students who wish to apply data mining techniques. The primary contribution of this book is highlighting the frontier fields and implementations of knowledge discovery and data mining. It may seem that the same things are repeated, but in general the same approaches and techniques can help us in different fields and areas of expertise. This book presents knowledge discovery and data mining applications in two sections. As is well known, data mining covers areas of statistics, machine learning, data management and databases, pattern recognition, artificial intelligence, and other fields. In this book, most of these areas are covered by different data mining applications. The eighteen chapters are classified into two parts: Knowledge Discovery and Data Mining Applications.

    Discrete Mathematics and Symmetry

    Get PDF
    Some of the most beautiful studies in Mathematics are related to Symmetry and Geometry. For this reason, we select here some contributions about such aspects and Discrete Geometry. As we know, symmetry in a system means invariance of its elements under transformations. When we consider network structures, symmetry means invariance of the adjacency of nodes under permutations of the node set. Graph isomorphism is an equivalence relation on the set of graphs; therefore, it partitions the class of all graphs into equivalence classes. The underlying idea of isomorphism is that some objects have the same structure if we omit the individual character of their components. A set of graphs isomorphic to each other is called an isomorphism class of graphs. An automorphism of a graph G is an isomorphism from G onto itself, and the family of all automorphisms of a graph G forms a permutation group.
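    The automorphism group described above can be computed by brute force for small graphs (a minimal sketch that checks every node permutation for adjacency invariance, so it is practical only for tiny examples; the path-graph instance is an assumption for illustration):

```python
from itertools import permutations

def automorphisms(adj):
    """All node permutations p such that (i, j) is an edge
    iff (p[i], p[j]) is an edge, i.e. the automorphism group."""
    n = len(adj)
    return [p for p in permutations(range(n))
            if all(adj[i][j] == adj[p[i]][p[j]]
                   for i in range(n) for j in range(n))]

# Path graph 0 - 1 - 2: the identity and the end-swap (2, 1, 0)
# form its automorphism group, of order 2.
path = [[0, 1, 0],
        [1, 0, 1],
        [0, 1, 0]]
auts = automorphisms(path)
```

    The result always contains the identity and is closed under composition and inverses, which is exactly the permutation-group structure mentioned above.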

    Mathematical Fuzzy Logic in the Emerging Fields of Engineering, Finance, and Computer Sciences

    Get PDF
    Mathematical fuzzy logic (MFL) specifically targets many-valued logic and has significantly contributed to the logical foundations of fuzzy set theory (FST). It explores the computational and philosophical rationale behind the uncertainty due to imprecision in the backdrop of traditional mathematical logic. Since uncertainty is present in almost every real-world application, it is essential to develop novel approaches and tools for efficient processing. This book is the collection of the publications in the Special Issue “Mathematical Fuzzy Logic in the Emerging Fields of Engineering, Finance, and Computer Sciences”, which aims to cover theoretical and practical aspects of MFL and FST. Specifically, this book addresses several problems, such as:
    - Industrial optimization problems
    - Multi-criteria decision-making
    - Financial forecasting problems
    - Image processing
    - Educational data mining
    - Explainable artificial intelligence, etc.

    ISIPTA'07: Proceedings of the Fifth International Symposium on Imprecise Probability: Theories and Applications

    Get PDF

    Uncertain Multi-Criteria Optimization Problems

    Get PDF
    Most real-world search and optimization problems naturally involve multiple criteria as objectives. Generally, symmetry, asymmetry, and anti-symmetry are basic characteristics of the binary relationships used when modeling optimization problems. Moreover, the notion of symmetry has appeared in many articles about the uncertainty theories that are employed in multi-criteria problems. Different solutions may produce trade-offs (conflicting scenarios) among different objectives: a better solution with respect to one objective may compromise the others. There are various factors that need to be considered to address the problems in multidisciplinary research, which is critical for the overall sustainability of human development and activity. In this regard, in recent decades, decision-making theory has been the subject of intense research activity due to its wide applications in different areas. The decision-making theory approach has become an important means of providing real-time solutions to uncertainty problems. Theories available in the existing literature, such as probability theory, fuzzy set theory, type-2 fuzzy set theory, rough set theory, and uncertainty theory, deal with such uncertainties. Nevertheless, the uncertain multi-criteria characteristics of such problems have not yet been explored in depth, and there is much left to be achieved in this direction. Hence, different mathematical models of real-life multi-criteria optimization problems can be developed in various uncertain frameworks, with special emphasis on optimization problems.

    Symmetry in Applied Mathematics

    Get PDF
    Applied mathematics and symmetry work together as a powerful tool for problem reduction and solving. We are communicating applications in probability theory and statistics (A Test Detecting the Outliers for Continuous Distributions Based on the Cumulative Distribution Function of the Data Being Tested, The Asymmetric Alpha-Power Skew-t Distribution); fractals, geometry, and the like (Khovanov Homology of Three-Strand Braid Links, Volume Preserving Maps Between p-Balls, Generation of Julia and Mandelbrot Sets via Fixed Points); supersymmetry in physics, nanostructures in chemistry, taxonomy in biology, and the like (A Continuous Coordinate System for the Plane by Triangular Symmetry, One-Dimensional Optimal System for 2D Rotating Ideal Gas, Minimal Energy Configurations of Finite Molecular Arrays, Noether-Like Operators and First Integrals for Generalized Systems of Lane-Emden Equations); algorithms, programs, and software analysis (Algorithm for Neutrosophic Soft Sets in Stochastic Multi-Criteria Group Decision Making Based on Prospect Theory, On a Reduced Cost Higher Order Traub-Steffensen-Like Method for Nonlinear Systems, On a Class of Optimal Fourth Order Multiple Root Solvers without Using Derivatives); and specific subjects (Facility Location Problem Approach for Distributed Drones, Parametric Jensen-Shannon Statistical Complexity and Its Applications on Full-Scale Compartment Fire Data). Diverse topics are thus combined to map out the mathematical core of practical problems.