
    BLADE: Filter Learning for General Purpose Computational Photography

    The Rapid and Accurate Image Super Resolution (RAISR) method of Romano, Isidoro, and Milanfar is a computationally efficient image upscaling method using a trained set of filters. We describe a generalization of RAISR, which we name Best Linear Adaptive Enhancement (BLADE). This approach is a trainable edge-adaptive filtering framework that is general, simple, computationally efficient, and useful for a wide range of problems in computational photography. We show applications to operations that may appear in a camera pipeline, including denoising, demosaicing, and stylization.
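
    The per-pixel selection of a learned linear filter from a bank indexed by local image structure can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the bucketing by gradient orientation, the patch size, and the filter bank are assumptions made here for illustration.

        import numpy as np

        def blade_like_filter(image, filter_bank, n_orient=8, patch=5):
            """Apply a spatially adaptive filter chosen per pixel from a learned bank.

            filter_bank has shape (n_orient, patch, patch) and is assumed to be
            pre-trained (e.g., by least squares against ground-truth targets).
            """
            gy, gx = np.gradient(image.astype(np.float64))
            # Quantize local gradient orientation into one of n_orient buckets
            # (a simplified stand-in for RAISR/BLADE-style structure hashing).
            orient = (np.arctan2(gy, gx) % np.pi) / np.pi          # in [0, 1)
            bucket = np.minimum((orient * n_orient).astype(int), n_orient - 1)

            r = patch // 2
            padded = np.pad(image, r, mode="reflect")
            out = np.empty_like(image, dtype=np.float64)
            for i in range(image.shape[0]):
                for j in range(image.shape[1]):
                    window = padded[i:i + patch, j:j + patch]
                    out[i, j] = np.sum(window * filter_bank[bucket[i, j]])
            return out

        # An identity bank reproduces the input; a trained bank would denoise or sharpen.
        bank = np.zeros((8, 5, 5)); bank[:, 2, 2] = 1.0
        restored = blade_like_filter(np.random.rand(32, 32), bank)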

    A Competitive Learning Approach for Specialized Models: A Solution for Complex Physical Systems with Distinct Functional Regimes

    Complex systems in science and engineering sometimes exhibit behavior that changes across different regimes. Traditional global models struggle to capture the full range of this complex behavior, limiting their ability to accurately represent the system. In response to this challenge, we propose a novel competitive learning approach for obtaining data-driven models of physical systems. The primary idea behind the proposed approach is to employ dynamic loss functions for a set of models that are trained concurrently on the data. Each model competes for each observation during training, allowing for the identification of distinct functional regimes within the dataset. To demonstrate the effectiveness of the learning approach, we coupled it with various regression methods that employ gradient-based optimizers for training. The proposed approach was tested on various problems involving model discovery and function approximation, demonstrating its ability to successfully identify functional regimes, discover true governing equations, and reduce test errors.
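
    The competitive assignment idea can be sketched as follows: several simple models are trained together, and during each update every observation weights the loss of the model that currently fits it best more heavily. The soft-assignment weighting, the temperature, and the linear models below are illustrative assumptions rather than the authors' exact formulation.

        import numpy as np

        def competitive_fit(X, y, n_models=2, steps=2000, lr=0.05, temperature=0.1):
            """Train several linear models that compete for observations.

            Each observation is softly assigned to the model with the lowest
            current error; the assignments act as dynamic per-model loss weights.
            """
            rng = np.random.default_rng(0)
            Xb = np.hstack([X, np.ones((X.shape[0], 1))])      # add a bias column
            W = rng.normal(scale=0.1, size=(n_models, Xb.shape[1]))
            for _ in range(steps):
                preds = Xb @ W.T                               # (n_samples, n_models)
                errs = (preds - y[:, None]) ** 2
                # Soft competition: lower error on a point means more weight.
                assign = np.exp(-errs / temperature)
                assign /= assign.sum(axis=1, keepdims=True)
                for m in range(n_models):                      # weighted gradient step
                    grad = (assign[:, m] * (preds[:, m] - y)) @ Xb / len(y)
                    W[m] -= lr * grad
            return W, assign

        # Toy data with two regimes: y = 2x for x < 0 and y = -x for x >= 0.
        x = np.linspace(-1, 1, 200)[:, None]
        y = np.where(x[:, 0] < 0, 2 * x[:, 0], -x[:, 0])
        W, assign = competitive_fit(x, y)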

    A survey on utilization of data mining approaches for dermatological (skin) diseases prediction

    Due to recent technology advances, large volumes of medical data are obtained. These data contain valuable information. Therefore, data mining techniques can be used to extract useful patterns. This paper introduces data mining and its various techniques and surveys the available literature on medical data mining. We focus mainly on the application of data mining to skin diseases. A categorization has been provided based on the different data mining techniques. The utility of the various data mining methodologies is highlighted. Generally, association mining is suitable for extracting rules; it has been used especially in cancer diagnosis. Classification is a robust method in medical mining. In this paper, we summarize the different uses of classification in dermatology. It is one of the most important methods for the diagnosis of erythemato-squamous diseases. Different methods, such as neural networks, genetic algorithms, and fuzzy classification, have been applied to this problem. Clustering is a useful method in medical image mining. The purpose of clustering techniques is to find a structure for the given data by finding similarities between data according to data characteristics. Clustering has some applications in dermatology. Besides introducing different mining methods, we investigate some challenges that exist in mining skin data.

    Modeling and control of operator functional state in a unified framework of fuzzy inference petri nets

    Background and objective: In human-machine (HM) hybrid control systems, the human operator and the machine cooperate to achieve the control objectives. To enhance overall HM system performance, the discrete manual control task-load of the operator must be dynamically allocated in accordance with the continuous-time fluctuation of the operator's psychophysiological functional status, the so-called operator functional state (OFS). The behavior of the HM system is hybrid in nature due to the co-existence of a discrete task-load (control) variable and a continuous operator performance (system output) variable. Methods: Petri nets are an effective tool for modeling discrete-event systems, but for hybrid systems that also involve continuous dynamics, the Petri net model generally has to be extended. Instead of using different tools to represent the continuous and discrete components of a hybrid system, this paper proposes a fuzzy inference Petri net (FIPN) method to represent the HM hybrid system, comprising a Mamdani-type fuzzy model of OFS and a logical switching controller in a unified framework, in which the task-load level is dynamically reallocated between the operator and the machine based on the model-predicted OFS. Furthermore, this paper uses a multi-model approach to predict operator performance from three electroencephalographic (EEG) input variables (features) via the Wang-Mendel (WM) fuzzy modeling method. The membership function parameters of the fuzzy OFS model for each experimental participant were optimized using the artificial bee colony (ABC) evolutionary algorithm. Three performance indices, RMSE, MRE, and EPR, were computed to evaluate the overall modeling accuracy. Results: Experimental data from six participants were analyzed. The results show that the proposed method (FIPN with adaptive task allocation) yields a lower breakdown rate (reduced from 14.8% to 3.27%) and higher human performance (increased from 90.30% to 91.99%). Conclusion: The simulation results of the FIPN-based adaptive HM (AHM) system on six experimental participants demonstrate that the FIPN framework provides an effective way to model and regulate/optimize the OFS in HM hybrid systems composed of a continuous-time OFS model and a discrete-event switching controller.
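
    The Wang-Mendel step mentioned above builds a rule base by generating one candidate rule per data pair and resolving conflicting rules by rule degree. The sketch below illustrates that step under simplifying assumptions (uniform triangular partitions, three synthetic input features standing in for the EEG features); the ABC optimization of the membership function parameters is omitted.

        import numpy as np

        def tri_memberships(x, centers):
            """Triangular membership degrees of a scalar x in a uniform partition."""
            width = centers[1] - centers[0]
            return np.clip(1.0 - np.abs(x - centers) / width, 0.0, 1.0)

        def wang_mendel_rules(X, y, n_sets=5):
            """Generate a fuzzy rule base from data pairs, Wang-Mendel style.

            Each sample yields one rule (antecedent = max-membership set of each
            input, consequent = max-membership set of the output); rules sharing an
            antecedent are resolved by keeping the one with the highest degree.
            """
            in_centers = [np.linspace(X[:, j].min(), X[:, j].max(), n_sets)
                          for j in range(X.shape[1])]
            out_centers = np.linspace(y.min(), y.max(), n_sets)
            rules = {}                         # antecedent tuple -> (consequent, degree)
            for xi, yi in zip(X, y):
                mus = [tri_memberships(xi[j], in_centers[j]) for j in range(X.shape[1])]
                antecedent = tuple(int(np.argmax(m)) for m in mus)
                mu_out = tri_memberships(yi, out_centers)
                degree = np.prod([m.max() for m in mus]) * mu_out.max()
                if antecedent not in rules or degree > rules[antecedent][1]:
                    rules[antecedent] = (int(np.argmax(mu_out)), degree)
            return rules, in_centers, out_centers

        # Toy example: three EEG-like features predicting a performance score.
        rng = np.random.default_rng(1)
        X = rng.random((100, 3)); y = X @ np.array([0.5, 0.3, 0.2])
        rules, _, _ = wang_mendel_rules(X, y)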

    A Survey of Neural Trees

    Neural networks (NNs) and decision trees (DTs) are both popular models of machine learning, yet they come with mutually exclusive advantages and limitations. To bring together the best of the two worlds, a variety of approaches have been proposed to integrate NNs and DTs explicitly or implicitly. In this survey, these approaches are organized into a school which we term neural trees (NTs). This survey aims to present a comprehensive review of NTs and attempts to identify how they enhance model interpretability. We first propose a thorough taxonomy of NTs that expresses the gradual integration and co-evolution of NNs and DTs. Afterward, we analyze NTs in terms of their interpretability and performance, and suggest possible solutions to the remaining challenges. Finally, this survey concludes with a discussion of other considerations, such as conditional computation, and promising directions for this field. A list of papers reviewed in this survey, along with their corresponding code, is available at: https://github.com/zju-vipa/awesome-neural-trees (35 pages, 7 figures, and 1 table).
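
    As a concrete example of one family of models covered by such a survey, the sketch below implements a tiny soft decision tree: each internal node learns a sigmoid routing function and each leaf holds a class distribution, making the whole tree differentiable like a neural network. It is a generic toy under assumptions made here, not a particular method from the survey.

        import numpy as np

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        def soft_tree_forward(x, node_w, node_b, leaf_logits):
            """Forward pass of a depth-2 soft decision tree (3 internal nodes, 4 leaves).

            node_w: (3, d) routing weights, node_b: (3,) biases,
            leaf_logits: (4, n_classes) per-leaf class scores.
            """
            p_root = sigmoid(x @ node_w[0] + node_b[0])     # P(go right at the root)
            p_left = sigmoid(x @ node_w[1] + node_b[1])     # right-prob at the left child
            p_right = sigmoid(x @ node_w[2] + node_b[2])    # right-prob at the right child
            # Path probabilities to the four leaves: LL, LR, RL, RR.
            path = np.stack([(1 - p_root) * (1 - p_left),
                             (1 - p_root) * p_left,
                             p_root * (1 - p_right),
                             p_root * p_right], axis=-1)    # (n_samples, 4)
            leaf_probs = np.exp(leaf_logits)
            leaf_probs /= leaf_probs.sum(axis=1, keepdims=True)
            return path @ leaf_probs                        # class probabilities per sample

        rng = np.random.default_rng(0)
        x = rng.normal(size=(5, 8))
        out = soft_tree_forward(x, rng.normal(size=(3, 8)), rng.normal(size=3),
                                rng.normal(size=(4, 3)))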

    Low-level interpretability and high-level interpretability: a unified view of data-driven interpretable fuzzy system modelling

    This paper aims at providing an in-depth overview of designing interpretable fuzzy inference models from data within a unified framework. The objective of complex system modelling is to develop reliable and understandable models for human beings to gain insight into complex real-world systems whose first-principle models are unknown. Because system behaviour can be described naturally as a series of linguistic rules, data-driven fuzzy modelling has become an attractive and widely used paradigm for this purpose. However, fuzzy models constructed from data by adaptive learning algorithms usually suffer from a loss of model interpretability. Model accuracy and interpretability are two conflicting objectives, so interpretation preservation during adaptation in data-driven fuzzy system modelling is a challenging task, which has received much attention in the fuzzy system modelling community. In order to clearly discriminate the different roles of fuzzy sets, input variables, and other components in achieving an interpretable fuzzy model, this paper first proposes a taxonomy of fuzzy model interpretability in terms of low-level interpretability and high-level interpretability. The low-level interpretability of fuzzy models refers to interpretability achieved by optimizing the membership functions in terms of semantic criteria at the fuzzy set level, while high-level interpretability refers to interpretability obtained by dealing with the coverage, completeness, and consistency of the rules in terms of criteria at the fuzzy rule level. Criteria for low-level and high-level interpretability are identified. Different data-driven fuzzy modelling techniques in the literature focusing on interpretability issues are reviewed and discussed from the perspective of low-level and high-level interpretability. Furthermore, some open problems concerning interpretable fuzzy models are identified and some potential new research directions on fuzzy model interpretability are suggested.
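
    Low-level criteria of the kind described above can be checked mechanically. The sketch below evaluates two commonly cited semantic properties of a one-dimensional fuzzy partition, coverage (every input value belongs to at least one set above a threshold) and distinguishability (limited overlap between sets); the Gaussian membership form and the thresholds are assumptions chosen for illustration.

        import numpy as np

        def gaussian_mf(x, center, sigma):
            return np.exp(-0.5 * ((x - center) / sigma) ** 2)

        def check_partition(centers, sigmas, domain, cov_eps=0.5, dist_eps=0.7):
            """Check coverage and distinguishability of a 1-D fuzzy partition."""
            grid = np.linspace(*domain, 500)
            mu = np.stack([gaussian_mf(grid, c, s) for c, s in zip(centers, sigmas)])
            coverage_ok = bool(np.all(mu.max(axis=0) >= cov_eps))
            # Pairwise overlap: maximum of the pointwise minimum of two sets.
            overlaps = [np.max(np.minimum(mu[i], mu[j]))
                        for i in range(len(centers)) for j in range(i + 1, len(centers))]
            distinguishable = bool(max(overlaps) <= dist_eps)
            return coverage_ok, distinguishable

        print(check_partition(centers=[0.0, 0.5, 1.0], sigmas=[0.2, 0.2, 0.2],
                              domain=(0.0, 1.0)))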

    Multiobjective evolutionary optimization of quadratic Takagi-Sugeno fuzzy rules for remote bathymetry estimation

    In this work we tackle the problem of bathymetry estimation using: i) a multispectral optical image of the region of interest, and ii) a set of in situ measurements. The idea is to learn the relation between the reflectances and the depth using a supervised learning approach. In particular, quadratic Takagi-Sugeno fuzzy rules are used to model this relation. The rule base is optimized by means of a multiobjective evolutionary algorithm. To the best of our knowledge, this work represents the first use of a quadratic Takagi-Sugeno fuzzy system optimized by a multiobjective evolutionary algorithm with bounded complexity, i.e., able to control the complexity of the consequent part of second-order fuzzy rules. This model has outstanding modeling power without inheriting the complexity drawback of quadratic functions (whose complexity scales quadratically with the number of inputs). This opens the way to the use of the proposed approach even for medium/high-dimensional problems, such as hyperspectral images.
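
    The inference step of such a rule base has a compact form: each rule fires with a degree given by its antecedent membership functions and contributes a consequent that is quadratic in the inputs, and the output is the firing-strength-weighted average of the consequents. The Gaussian antecedents and the two-input toy setting below are assumptions for illustration; the multiobjective evolutionary optimization of the rule base is not shown.

        import numpy as np

        def quadratic_ts_output(x, centers, sigmas, quad, lin, const):
            """Evaluate a quadratic Takagi-Sugeno rule base at input vector x.

            centers, sigmas: (n_rules, n_inputs) Gaussian antecedent parameters.
            quad: (n_rules, n_inputs, n_inputs), lin: (n_rules, n_inputs), const: (n_rules,)
            define each rule's quadratic consequent  x^T Q x + l^T x + c.
            """
            # Rule firing strengths: product of per-input Gaussian memberships.
            mu = np.exp(-0.5 * ((x - centers) / sigmas) ** 2)     # (n_rules, n_inputs)
            w = mu.prod(axis=1)                                   # (n_rules,)
            # Quadratic consequents evaluated at x.
            cons = np.einsum("i,rij,j->r", x, quad, x) + lin @ x + const
            return float(np.sum(w * cons) / (np.sum(w) + 1e-12))  # weighted average

        # Toy rule base with 2 rules over 2 inputs (e.g., two band reflectances -> depth).
        rng = np.random.default_rng(0)
        out = quadratic_ts_output(np.array([0.3, 0.6]),
                                  centers=rng.random((2, 2)), sigmas=np.full((2, 2), 0.5),
                                  quad=rng.normal(size=(2, 2, 2)), lin=rng.normal(size=(2, 2)),
                                  const=rng.normal(size=2))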

    Data Mining in Smart Grids

    Effective smart grid operation requires rapid decisions in a data-rich, but information-limited, environment. In this context, grid sensor data-streaming cannot provide the system operators with the necessary information to act on in the time frames necessary to minimize the impact of disturbances. Even if there are fast models that can convert the data into information, the smart grid operator must deal with the challenge of not having a full understanding of the context of the information, and, therefore, the information content cannot be used with a high degree of confidence. To address this issue, data mining has been recognized as the most promising enabling technology for improving decision-making processes, providing the right information at the right moment to the right decision-maker. This Special Issue is focused on emerging methodologies for data mining in smart grids. In this area, it addresses many relevant topics, ranging from methods for uncertainty management to advanced dispatching. This Special Issue not only focuses on methodological breakthroughs and roadmaps for implementing the methodology, but also presents the much-needed sharing of best practices. Topics include, but are not limited to, the following: fuzziness in smart grid computing; emerging techniques for renewable energy forecasting; robust and proactive solutions for optimal smart grid operation; fuzzy-based smart grid monitoring and control frameworks; granular computing for uncertainty management in smart grids; and self-organizing and decentralized paradigms for information processing.