
    Extracting finite structure from infinite language

    This paper presents a novel connectionist memory-rule based model capable of learning the finite-state properties of an input language from a set of positive examples. The model is based upon an unsupervised recurrent self-organizing map [T. McQueen, A. Hopgood, J. Tepper, T. Allen, A recurrent self-organizing map for temporal sequence processing, in: Proceedings of the Fourth International Conference on Recent Advances in Soft Computing (RASC2002), Nottingham, 2002] with laterally interconnected neurons. A derivation of functional-equivalence theory [J. Hopcroft, J. Ullman, Introduction to Automata Theory, Languages and Computation, vol. 1, Addison-Wesley, Reading, MA, 1979] is used that allows the model to exploit similarities between the future context of previously memorized sequences and that of the current input sequence. This bottom-up learning algorithm binds functionally related neurons together to form states. Results show that the model learns the Reber grammar [A. Cleeremans, D. Servan-Schreiber, J. McClelland, Finite state automata and simple recurrent networks, Neural Computation 1 (1989) 372–381] perfectly from a randomly generated training set and generalizes to sequences longer than those found in the training set.
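The Reber grammar used in these experiments is a small finite-state machine; a minimal generator of its positive examples, assuming the standard seven-symbol transition table (which the abstract does not restate), can be sketched as:

```python
import random

# Assumed standard Reber grammar: state -> list of (symbol, next_state).
# Every accepted string starts with B and ends with E.
REBER = {
    0: [("B", 1)],
    1: [("T", 2), ("P", 3)],
    2: [("S", 2), ("X", 4)],
    3: [("T", 3), ("V", 5)],
    4: [("X", 3), ("S", 6)],
    5: [("P", 4), ("V", 6)],
    6: [("E", None)],
}

def generate_reber(rng=random):
    """Generate one string accepted by the Reber grammar."""
    state, out = 0, []
    while state is not None:
        symbol, state = rng.choice(REBER[state])
        out.append(symbol)
    return "".join(out)
```

A positive-example training set of the kind described above is then just repeated calls to `generate_reber()`.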

    Data-driven Soft Sensors in the Process Industry

    In the last two decades Soft Sensors established themselves as a valuable alternative to the traditional means for the acquisition of critical process variables, process monitoring and other tasks related to process control. This paper discusses characteristics of process industry data which are critical for the development of data-driven Soft Sensors. These characteristics are common to a large number of process industry fields, such as the chemical industry, bioprocess industry and steel industry. The focus of this work is put on data-driven Soft Sensors because of their growing popularity, already demonstrated usefulness and huge, though not yet fully realised, potential. A comprehensive selection of case studies covering the three most important Soft Sensor application fields, a general introduction to the most popular Soft Sensor modelling techniques, as well as a discussion of some open issues in Soft Sensor development and maintenance and their possible solutions, are the main contributions of this work.
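As a minimal sketch of the data-driven idea, with synthetic data and regularized least squares standing in for the modelling techniques the paper surveys: a hard-to-measure quality variable is inferred from cheap online measurements.

```python
import numpy as np

# Synthetic stand-in for process data: X holds easy-to-measure variables
# (e.g. temperature, pressure, flow), y a lab-measured quality variable.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=200)

# Ridge regression: the penalty guards against the collinearity that is
# typical of process industry data.
lam = 1e-2
w = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

def soft_sensor(x):
    """Online estimate of the quality variable from process measurements."""
    return x @ w
```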

    Towards a Formal Model of Recursive Self-Reflection

    Self-awareness holds the promise of better decision making based on a comprehensive assessment of a system's own situation. Therefore it has been studied for more than ten years in a range of settings and applications. However, in the literature the term has been used in a variety of meanings and today there is no consensus on what features and properties it should include. In fact, researchers disagree on the relative benefits of a self-aware system compared to one that is very similar but lacks self-awareness. We sketch a formal model, and thus a formal definition, of self-awareness. The model is based on dynamic dataflow semantics and includes self-assessment, a simulation and an abstraction as facilitating techniques, which are modeled by spawning new dataflow actors in the system. Most importantly, it has a method to focus on any of its parts to make it a subject of analysis by applying abstraction, self-assessment and simulation. In particular, it can apply this process to itself, which we call recursive self-reflection. There is no arbitrary limit to this self-scrutiny except resource constraints.

    Potential implementation of Reservoir Computing models based on magnetic skyrmions

    Reservoir Computing is a type of recurrent neural network commonly used for recognizing and predicting spatio-temporal events, relying on a complex hierarchy of nested feedback loops to generate a memory functionality. The Reservoir Computing paradigm does not require any knowledge of the reservoir topology or node weights for training purposes and can therefore utilize naturally existing networks formed by a wide variety of physical processes. Most prior efforts have focused on utilizing memristor techniques to implement recurrent neural networks. This paper examines the potential of skyrmion fabrics formed in magnets with broken inversion symmetry, which may provide an attractive physical instantiation for Reservoir Computing. Comment: 11 pages, 3 figures.
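The training-free property described above can be illustrated with a conventional software echo state network, a common Reservoir Computing variant; the random recurrent reservoir here stands in for the physical skyrmion fabric, and only the linear readout is trained.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1: fading memory
W_in = rng.normal(size=N)

def run_reservoir(u):
    """Drive the fixed random reservoir with input sequence u; collect states."""
    x, states = np.zeros(N), []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)
        states.append(x.copy())
    return np.array(states)

# The reservoir weights are never touched; a linear readout is fit to
# predict the input signal one step ahead.
u = np.sin(np.linspace(0.0, 20.0, 500))
S = run_reservoir(u[:-1])
w_out, *_ = np.linalg.lstsq(S, u[1:], rcond=None)
```

In the proposed physical implementation, `run_reservoir` would be replaced by measuring the response of the skyrmion fabric itself.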

    Forecasting the CATS benchmark with the Double Vector Quantization method

    The Double Vector Quantization method, a long-term forecasting method based on the SOM algorithm, has been used to predict the 100 missing values of the CATS competition data set. An analysis of the proposed time series is provided to estimate the dimension of the auto-regressive part of this nonlinear auto-regressive forecasting method. Based on this analysis, experimental results using the Double Vector Quantization (DVQ) method are presented and discussed. As one of the features of the DVQ method is its ability to predict scalars as well as vectors of values, the number of iterative predictions needed to reach the prediction horizon is further observed. The method's long-term stability allows obtaining reliable values for a rather long forecasting horizon. Comment: Accepted for publication in Neurocomputing, Elsevier.
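The iterative prediction loop mentioned above can be sketched generically; a placeholder one-step model stands in for the DVQ predictor (whose double-codebook construction is described in the paper). Each prediction, scalar or vector, is fed back as input until the horizon is reached.

```python
import numpy as np

def iterate_forecast(model, history, horizon, order):
    """Roll a one-step (scalar- or vector-valued) predictor out to the horizon.

    `order` is the dimension of the auto-regressive part: the model sees the
    last `order` values. A vector-valued model needs fewer iterations.
    """
    window = list(history[-order:])
    out = []
    while len(out) < horizon:
        nxt = np.atleast_1d(model(np.array(window)))  # next value(s)
        out.extend(nxt.tolist())                      # feed predictions back
        window = (window + nxt.tolist())[-order:]
    return np.array(out[:horizon])
```

For the CATS task, `horizon` would be 100 and `model` the trained DVQ predictor.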

    A Review of Bankruptcy Prediction Studies: 1930-Present

    One of the most well-known bankruptcy prediction models was developed by Altman [1968] using multivariate discriminant analysis. Since Altman's model, a multitude of bankruptcy prediction models have flooded the literature. The primary goal of this paper is to summarize and analyze existing research on bankruptcy prediction studies in order to facilitate more productive future research in this area. This paper traces the literature on bankruptcy prediction from the 1930s, when studies focused on the use of simple ratio analysis to predict future bankruptcy, to present. The authors discuss how bankruptcy prediction studies have evolved, highlighting the different methods, number and variety of factors, and specific uses of models. Analysis of 165 bankruptcy prediction studies published from 1965 to present reveals trends in model development. For example, discriminant analysis was the primary method used to develop models in the 1960s and 1970s. Investigation of model type by decade shows that the primary method began to shift to logit analysis and neural networks in the 1980s and 1990s. The number of factors utilized in models is also analyzed by decade, showing that the average has varied over time but remains around 10 overall. Analysis of accuracy of the models suggests that multivariate discriminant analysis and neural networks are the most promising methods for bankruptcy prediction models. The findings also suggest that higher model accuracy is not guaranteed with a greater number of factors. Some models with two factors are just as capable of accurate prediction as models with 21 factors.
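Altman's [1968] discriminant function itself is compact; a sketch using the commonly cited five-factor coefficients (an assumption, since the abstract does not restate them):

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_equity, sales, total_assets, total_liabilities):
    """Altman Z-score; roughly, Z < 1.81 signals distress, Z > 2.99 safety."""
    ta = total_assets
    return (1.2 * working_capital / ta
            + 1.4 * retained_earnings / ta
            + 3.3 * ebit / ta
            + 0.6 * market_equity / total_liabilities
            + 1.0 * sales / ta)
```

The five ratios are exactly the kind of "factors" whose number and variety the survey tracks across decades.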

    Spectrum Leasing as an Incentive towards Uplink Macrocell and Femtocell Cooperation

    The concept of femtocell access points underlaying existing communication infrastructure has recently emerged as a key technology that can significantly improve the coverage and performance of next-generation wireless networks. In this paper, we propose a framework for macrocell-femtocell cooperation under a closed access policy, in which a femtocell user may act as a relay for macrocell users. In return, each cooperative macrocell user grants the femtocell user a fraction of its superframe. We formulate a coalitional game with macrocell and femtocell users as the players, which can take individual and distributed decisions on whether to cooperate or not, while maximizing a utility function that captures the cooperative gains in terms of throughput and delay. We show that the network can self-organize into a partition composed of disjoint coalitions which constitutes the recursive core of the game, a key solution concept for coalition formation games in partition form. Simulation results show that the proposed coalition formation algorithm yields significant gains in terms of average rate per macrocell user, reaching up to 239%, relative to the non-cooperative case. Moreover, the proposed approach shows an improvement in terms of femtocell users' rate of up to 21% when compared to the traditional closed access policy. Comment: 29 pages, 11 figures, accepted at the IEEE JSAC on Femtocell Networks.

    The herd moves? Emergence and self-organization in collective actors?

    The puzzle of collective actors is the focus of this contribution. The first section enters into the question of the adequacy or inadequacy of reductionist explanations for the description of entities. The considerations in this part do not draw on systems, and hence not on principles of self-organisation, because this concept necessitates a systemic view. In other words, the first section discusses reductionism and holism on a very general level. The scope of these arguments goes far beyond self-organising systems. Pragmatically, these arguments will be discussed within the domain of corporative actors. Emergence is a concept embedded in system theory. Therefore, in the second part the previous general considerations about holism are integrated with respect to the concept “emergence”. In order to close the argument by exactly characterising self-organising systems and giving the conceptual link between self-organisation and emergence – which is done in section four – the third section conceptualises systems in general. This conceptualisation is independent of whether these systems are self-organising or not. Feedback loops are specified as an essential component of systems. They establish the essential precondition of system-theoretic models in which causes may also be effects and vice versa. System theory is essential for dynamic models such as ecological models and network thinking. In the fourth part, mathematical chaos theory bridges the gap between the presentation of systems in general and the restricted consideration of self-organising systems. The capability to behave or react chaotically is a necessary precondition of self-organisation. 
Nevertheless, there are striking differences in the answers given by theories of self-organisation in biology, economics and sociology to the question “What makes the whole more than the sum of its parts?” The fracture seems particularly salient at the borderline between formal-mathematical disciplines, such as the natural sciences and economics, and other social sciences such as sociology, for instance in the understanding and conceptualisation of “chaos” or “complexity”. This sometimes creates the impression that originally well-defined concepts from mathematics and natural science are used metaphorically in the social sciences. This is a further reason why this paper concentrates on conceptualisations of self-organisation from the natural sciences. The fifth part integrates the arguments from a system-theoretic point of view given in the three previous sections with respect to collective and corporative actors. Due to his prominence, all five sections occasionally deal with the sociological system theory of Niklas Luhmann, especially in those parts where there are rigorous and important differences between his conception and the view given in this text. Despite Luhmann's undoubted prominence in sociology, the present text strives for a more analytical and formal understanding of social systems and tries to find a base for another methodological approach.
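The well-defined mathematical sense of "chaos" invoked in part four can be illustrated with the logistic map, the textbook example of a simple deterministic rule with sensitive dependence on initial conditions:

```python
def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map x <- r * x * (1 - x) from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two trajectories starting a billionth apart diverge completely at r = 4,
# even though the generating rule stays simple and fully deterministic.
a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-9)
divergence = max(abs(p - q) for p, q in zip(a, b))
```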