
    A randomized neural network for data streams

    © 2017 IEEE. The randomized neural network (RNN) is a highly feasible solution in the era of big data because it offers a simple and fast working principle for processing dynamic and evolving data streams. This paper proposes a novel RNN, namely the recurrent type-2 random vector functional link network (RT2McRVFLN), which provides a highly scalable solution for data streams in a strictly online and integrated framework. It is built upon the psychologically inspired concept of metacognitive learning, which covers three basic components of human learning: what-to-learn, how-to-learn, and when-to-learn. The what-to-learn component selects important samples on the fly using an online active learning scenario, which renders our algorithm an online semi-supervised algorithm. The how-to-learn process combines an open, evolving structure with the randomized learning algorithm of the random vector functional link network (RVFLN). The efficacy of the RT2McRVFLN has been numerically validated through two real-world case studies and comparisons with its counterparts, which arrive at the conclusive finding that our algorithm delivers a tradeoff between accuracy and simplicity.
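    The abstract gives no implementation details, but the randomized-weight principle behind the underlying RVFLN is well documented: hidden-layer weights are drawn at random and kept fixed, direct input-output links are retained, and only the output weights are obtained in closed form. The following minimal sketch illustrates that basic (non-recurrent, type-1) RVFL idea in Python; the class name SimpleRVFL and the ridge parameter lam are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal sketch of a basic random vector functional link (RVFL) network:
# hidden weights are drawn at random and kept fixed; only the output weights
# are solved in closed form. Names and the ridge parameter `lam` are
# illustrative, not taken from the paper.
class SimpleRVFL:
    def __init__(self, n_hidden=50, lam=1e-3, seed=0):
        self.n_hidden = n_hidden
        self.lam = lam
        self.rng = np.random.default_rng(seed)

    def _features(self, X):
        H = np.tanh(X @ self.W + self.b)   # random nonlinear features
        return np.hstack([X, H])           # direct links + hidden nodes

    def fit(self, X, y):
        n_in = X.shape[1]
        self.W = self.rng.standard_normal((n_in, self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        D = self._features(X)
        # ridge-regularized least squares for the output weights
        self.beta = np.linalg.solve(D.T @ D + self.lam * np.eye(D.shape[1]), D.T @ y)
        return self

    def predict(self, X):
        return self._features(X) @ self.beta
```

    Because training reduces to a single regularized linear solve, this family of networks stays fast, which is what makes randomized learning attractive for streaming data.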

    An evolving machine learning method for human activity recognition systems

    This paper presents a novel approach for human activity recognition (HAR) from complex data provided by wearable sensors. The approach considers the development of a more realistic system that takes into account the diversity of the population, and it aims to define a general HAR model for any type of individual. To achieve this much-needed processing capacity, the approach makes use of the customizable, self-adaptive, and self-developing capacities of the machine learning technique known as evolving intelligent systems. An online pre-processing model suited to real-time operation has been developed and is explained in detail in this paper. Additionally, the paper provides valuable information on sensor analysis, online feature extraction, and the evolving classifiers used for this purpose.
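    As an illustration of the online feature extraction mentioned above, the sketch below computes simple per-axis statistics over a sliding window of wearable-sensor readings; the window length and the chosen statistics are assumptions made for the example, not values reported in the paper.

```python
import numpy as np
from collections import deque

# Illustrative sketch of online feature extraction over a sliding window of
# wearable-sensor samples; window size and the chosen statistics are
# assumptions, not values from the paper.
class SlidingWindowFeatures:
    def __init__(self, window_size=128):
        self.buffer = deque(maxlen=window_size)

    def update(self, sample):
        """Add one multi-axis sensor reading (e.g. accelerometer x, y, z)."""
        self.buffer.append(np.asarray(sample, dtype=float))
        if len(self.buffer) < self.buffer.maxlen:
            return None                          # window not full yet
        w = np.stack(self.buffer)                # shape: (window, n_axes)
        return np.concatenate([w.mean(axis=0),
                               w.std(axis=0),
                               w.max(axis=0) - w.min(axis=0)])  # per-axis range
```

    Each completed window yields one fixed-length feature vector that can then be passed to an evolving classifier.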

    Fuzzy Systems Design: Direct and Indirect Approaches.

    A systematic classification of the data-driven approaches to the design of fuzzy systems is given in this paper. The possible ways to solve this modelling and identification problem are classified on the basis of the optimisation techniques used for this purpose. One algorithm for each of the two basic categories of design methods is presented, and its advantages and disadvantages are discussed. Both types of algorithms are self-learning and do not require interaction during the process of fuzzy model design. They perform adaptation of both the fuzzy model structure (rule base) and the parameters. The indirect approach exploits the dual nature of Takagi-Sugeno (TS) models and is based on recently introduced recursive clustering combined with a Kalman-filtering-based procedure for recursive estimation of the parameters of the local sub-models. Both algorithms result in compact and transparent fuzzy models. The direct approach solves the optimisation problem directly, while the indirect one decomposes the original problem into on-line clustering and recursive estimation problems and finds a sub-optimal solution in real time. The latter is computationally very efficient and has a range of potential applications in real-time process control, moving image recognition, autonomous systems design, etc. It is extended in this paper to the case of multi-input–multi-output (MIMO) systems. Both approaches have been tested with real data from an engineering process. © Springer
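    The recursive estimation of the local sub-model parameters mentioned here is typically a weighted recursive least-squares update, which is the information form of a Kalman filter for a static parameter vector. The sketch below shows one such update for a single Takagi-Sugeno sub-model; the firing level w is assumed to come from the antecedent (clustering) part, and the initialization constant omega is an assumed value rather than one given in the paper.

```python
import numpy as np

# Hedged sketch of a weighted recursive least-squares update for the linear
# consequent parameters of one Takagi-Sugeno sub-model. The firing degree `w`
# would come from the antecedent (cluster) membership; the covariance scale
# `omega` is an assumed initialization value.
class LocalRLS:
    def __init__(self, n_inputs, omega=1e3):
        self.theta = np.zeros(n_inputs + 1)      # bias + input coefficients
        self.P = omega * np.eye(n_inputs + 1)    # inverse-covariance-style matrix

    def update(self, x, y, w):
        """One weighted RLS step: x = inputs, y = target, w = rule firing level."""
        xe = np.concatenate(([1.0], np.asarray(x, dtype=float)))  # extended regressor
        Px = self.P @ xe
        gain = w * Px / (1.0 + w * xe @ Px)
        self.theta += gain * (y - xe @ self.theta)
        self.P -= np.outer(gain, Px)
        return xe @ self.theta                    # local sub-model output
```

    Keeping one such estimator per rule is part of what makes the indirect approach suitable for real time: each new sample triggers only a few vector-matrix operations per rule.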

    Empirical Approach—Introduction

    In this chapter, we will describe the fundamentals of the proposed new “empirical” approach as a systematic methodology whose nonparametric quantities are derived entirely from the actual data, with no subjective and/or problem-specific assumptions made. It has the potential to be a powerful extension of (and/or alternative to) the traditional probability theory, statistical learning, and computational intelligence methods. The nonparametric quantities of the proposed new empirical approach include: (1) the cumulative proximity; (2) the eccentricity and the standardized eccentricity; (3) the data density; and (4) the typicality. They can be recursively updated on a sample-by-sample basis, and they have unimodal and multimodal, discrete and continuous forms/versions. The nonparametric quantities are based on ensemble properties of the data and are not limited by prior restrictive assumptions. The discrete version of the typicality resembles the unimodal probability density function, but in a discrete form. The discrete multimodal typicality resembles the probability mass function.
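    For orientation, a compact batch (non-recursive) computation of these quantities, using Euclidean distance and the commonly published normalizations, might look as follows; the exact definitions and the recursive, sample-by-sample updates should be taken from the chapter itself.

```python
import numpy as np

# Batch sketch of the nonparametric quantities named above, computed from the
# pairwise squared Euclidean distances; normalizations follow the commonly
# published definitions and should be checked against the chapter.
def empirical_quantities(X):
    """X: (N, d) data matrix. Returns per-sample quantities."""
    sq_dist = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    q = sq_dist.sum(axis=1)            # cumulative proximity of each sample
    xi = 2.0 * q / q.sum()             # eccentricity
    eps = len(X) * xi                  # standardized eccentricity
    density = 1.0 / eps                # data density
    tau = density / density.sum()      # typicality
    return q, xi, eps, density, tau
```

    The returned typicality values sum to one, which is why the discrete form resembles a probability mass function.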