
    Adaptive imputation of missing values for incomplete pattern classification

    In the classification of incomplete patterns, the missing values can play a crucial role in class determination or have little influence (or even none) on the classification result, depending on the context. We propose a credal classification method for incomplete patterns with adaptive imputation of missing values based on belief function theory. At first, we try to classify the object (incomplete pattern) using only the available attribute values. The underlying principle is that the missing information is not crucial for the classification if a specific class for the object can be found using only the available information; in this case, the object is committed to that class. However, if the object cannot be classified without ambiguity, the missing values play a decisive role in achieving an accurate classification. In this case, the missing values are imputed using K-nearest neighbor (K-NN) and self-organizing map (SOM) techniques, and the edited pattern with the imputation is then classified. The (original or edited) pattern is classified with respect to each training class, and the classification results, represented by basic belief assignments, are fused with proper combination rules to make the credal classification. The object is allowed to belong, with different masses of belief, to specific classes and to meta-classes (which are particular disjunctions of several single classes). Credal classification captures well the uncertainty and imprecision of classification, and effectively reduces the misclassification rate thanks to the introduction of meta-classes. The effectiveness of the proposed method with respect to classical methods is demonstrated in several experiments on artificial and real data sets.
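    The adaptive strategy can be sketched in a few lines. The following is a simplified illustration, not the authors' method: it substitutes a nearest-centroid rule with a distance-margin test for the belief-function classifier, and a plain K-NN mean for the K-NN/SOM imputation; all names and thresholds are illustrative.

```python
import numpy as np

def classify_adaptive(x, X_train, y_train, k=3, margin=0.2):
    """Classify an incomplete pattern x (missing values as NaN).
    Impute only when the observed attributes leave the class ambiguous."""
    obs = ~np.isnan(x)
    classes = np.unique(y_train)
    # Class centroids, compared on the observed attributes only.
    cent = np.array([X_train[y_train == c].mean(axis=0) for c in classes])
    d = np.linalg.norm(cent[:, obs] - x[obs], axis=1)
    order = np.argsort(d)
    if d[order[1]] - d[order[0]] > margin:
        return classes[order[0]], x          # unambiguous: no imputation
    # Ambiguous: impute missing attributes from the k nearest neighbors.
    dn = np.linalg.norm(X_train[:, obs] - x[obs], axis=1)
    nn = np.argsort(dn)[:k]
    x_full = x.copy()
    x_full[~obs] = X_train[nn][:, ~obs].mean(axis=0)
    d2 = np.linalg.norm(cent - x_full, axis=1)
    return classes[np.argmin(d2)], x_full
```

A pattern that is clearly close to one class on its observed attributes is classified directly; only a pattern sitting between classes triggers the (more expensive) imputation step.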

    Supervised Classification: Quite a Brief Overview

    The original problem of supervised classification considers the task of automatically assigning objects to their respective classes on the basis of numerical measurements derived from these objects. Classifiers are the tools that implement the actual functional mapping from these measurements---also called features or inputs---to the so-called class label---or output. The fields of pattern recognition and machine learning study ways of constructing such classifiers. The main idea behind supervised methods is that of learning from examples: given a number of example input-output relations, to what extent can the general mapping be learned that takes any new and unseen feature vector to its correct class? This chapter provides a basic introduction to the underlying ideas of how to approach a supervised classification problem. In addition, it provides an overview of some specific classification techniques, delves into the issues of object representation and classifier evaluation, and (very) briefly covers some variations on the basic supervised classification task that may also be of interest to the practitioner.
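    A minimal example of the learning-from-examples idea described above (a toy classifier, not taken from the chapter): a nearest-mean rule learns one prototype per class from labeled examples and maps any new feature vector to the class of the closest prototype.

```python
import numpy as np

class NearestMeanClassifier:
    """Toy supervised classifier: one mean vector per class."""
    def fit(self, X, y):
        # Learn the class prototypes (mean feature vectors) from examples.
        self.classes_ = np.unique(y)
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self
    def predict(self, X):
        # Assign each new feature vector to the nearest class prototype.
        d = np.linalg.norm(X[:, None, :] - self.means_[None, :, :], axis=2)
        return self.classes_[np.argmin(d, axis=1)]
```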

    Adaptive Regularization in Neural Network Modeling

    In this paper we address the important problem of optimizing regularization parameters in neural network modeling. The suggested optimization scheme is an extended version of the recently presented algorithm [24]. The idea is to minimize an empirical estimate -- like the cross-validation estimate -- of the generalization error with respect to the regularization parameters. This is done by employing a simple iterative gradient descent scheme with virtually no additional programming overhead compared to standard training. Experiments with feed-forward neural network models for time series prediction and classification tasks showed the viability and robustness of the algorithm. Moreover, we provide some simple theoretical examples to illustrate the potential and limitations of the proposed regularization framework. 1 Introduction. Neural networks are flexible tools for time series processing and pattern recognition. By increasing the number of hidden neurons in a 2-layer architecture...
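    The scheme can be illustrated with a stand-in model: below, ridge regression replaces the neural network, and a finite-difference gradient replaces the paper's analytic gradient of the validation error with respect to the (log) regularization parameter. Function names and step sizes are illustrative, not from the paper.

```python
import numpy as np

def fit_ridge(X, y, lam):
    # Closed-form ridge solution; stands in for training the network.
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def val_error(lam, X_tr, y_tr, X_val, y_val):
    # Empirical estimate of generalization error on held-out data.
    w = fit_ridge(X_tr, y_tr, lam)
    return np.mean((X_val @ w - y_val) ** 2)

def adapt_lambda(X_tr, y_tr, X_val, y_val, lam=1.0, lr=0.1, steps=50, eps=1e-4):
    """Gradient descent on the validation error w.r.t. log(lambda);
    a finite difference replaces the analytic gradient."""
    log_lam = np.log(lam)
    for _ in range(steps):
        e_hi = val_error(np.exp(log_lam + eps), X_tr, y_tr, X_val, y_val)
        e_lo = val_error(np.exp(log_lam - eps), X_tr, y_tr, X_val, y_val)
        log_lam -= lr * (e_hi - e_lo) / (2 * eps)
    return np.exp(log_lam)
```

Optimizing in log-space keeps the regularization parameter positive, which is the usual trick for this kind of hyperparameter descent.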

    The Foundation of Pattern Structures and their Applications

    This thesis is divided into a theoretical part, aimed at developing statements around the newly introduced concept of pattern morphisms, and a practical part, where we present use cases of pattern structures. A first insight of our work clarifies the facts about projections of pattern structures: we discovered that a projection of a pattern structure does not always lead again to a pattern structure. A solution to this problem, and one of the most important points of this thesis, is the introduction of pattern morphisms in Chapter 4. Pattern morphisms make it possible to describe relationships between pattern structures, and thus enable a deeper understanding of pattern structures in general. They also provide the means to describe projections of pattern structures that do lead to pattern structures again. In Chapters 5 and 6, we look at the impact of morphisms between pattern structures on concept lattices and on their representations, and thus clarify the theoretical background of existing research in this field. The application part reveals that random forests can be described through pattern structures, which constitutes another central achievement of our work. To demonstrate the practical relevance of our findings, we include a use case in which this result is used to build an algorithm that solves a real-world classification problem for red wines. The prediction accuracy of the random forest is better, but its high interpretability makes our algorithm valuable. Another approach to the red wine classification problem is presented in Chapter 8, where, starting from an elementary pattern structure, we build a classification model that yields good results.

    Photonic reservoir computing: a new approach to optical information processing

    Despite ever increasing computational power, recognition and classification problems remain challenging to solve. Recently, advances have been made by the introduction of the new concept of reservoir computing. This is a methodology coming from the field of machine learning and neural networks that has been successfully used in several pattern classification problems, such as speech and image recognition. The implementations have so far been in software, limiting their speed and power efficiency. Photonics could be an excellent platform for a hardware implementation of this concept because of its inherent parallelism and unique nonlinear behaviour. We propose using a network of coupled semiconductor optical amplifiers (SOAs) and show in simulation that it can be used as a reservoir by comparing it to conventional software implementations on a benchmark speech recognition task. In spite of several differences, it performs as well as or better than conventional implementations. Moreover, a photonic implementation offers the promise of massively parallel information processing with low power and high speed. We also address the role that phase plays in reservoir performance.
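    For readers unfamiliar with reservoir computing, here is a minimal software sketch (an echo-state-style reservoir, not the photonic SOA network of the paper): a fixed random recurrent network is driven by the input, and only a linear readout is trained, here on a one-step memory task. All sizes and constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_reservoir(u, n_res=50, rho=0.9, leak=0.5):
    """Drive a fixed random recurrent reservoir with input sequence u
    and collect its states; the reservoir itself is never trained."""
    W_in = rng.uniform(-1.0, 1.0, n_res)
    W = rng.uniform(-1.0, 1.0, (n_res, n_res))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # set spectral radius
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in * u_t)
        states.append(x.copy())
    return np.array(states)

u = rng.uniform(-0.5, 0.5, 300)
S = run_reservoir(u)

# Train only a linear readout (least squares) to reproduce the input
# delayed by one step -- a simple memory task; skip a warm-up of 50 steps.
target = np.roll(u, 1)
target[0] = 0.0
w_out, *_ = np.linalg.lstsq(S[50:], target[50:], rcond=None)
pred = S[50:] @ w_out
```

Training only the readout is what makes the approach attractive for hardware: the random "reservoir" can be any sufficiently rich dynamical system, optical or electronic.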

    Has the Euro affected the choice of invoicing currency?

    We present a new approach to studying empirically the effect of the introduction of the euro on the pattern of currency invoicing. Our approach uses a compositional multinomial logit model, in which currency choice is explained by both currency-specific and country-specific determinants. We use unique quarterly panel data on the invoicing of Norwegian imports from OECD countries for the 1996-2006 period. We find that eurozone countries have substantially increased their share of home currency invoicing after the introduction of the euro, whereas the home currency share of non-eurozone countries fell slightly. In addition, the euro as a vehicle currency has overtaken the role of the US dollar in Norwegian imports. The substantial rise in producer currency invoicing by eurozone countries is primarily caused by a drop in inflation volatility and can only to a small extent be explained by an unobserved euro effect. JEL Classification: F33, F41, F42, E31, C25. Keywords: euro, invoicing currency, exchange rate risk, inflation volatility, vehicle currencies, compositional multinomial logit.
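    The core of a multinomial logit choice model is a softmax over alternative-specific utilities; a minimal sketch (utility values are illustrative, not the paper's estimates):

```python
import numpy as np

def currency_shares(V):
    """Multinomial-logit choice probabilities: V[j] is the deterministic
    utility of invoicing in currency j (currency- plus country-specific
    terms in the paper's specification)."""
    e = np.exp(V - V.max())   # subtract the max for numerical stability
    return e / e.sum()

# Illustrative utilities for three candidate currencies, e.g. EUR, USD, NOK.
shares = currency_shares(np.array([1.0, 0.5, -0.2]))
```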

    A Conceptual Model for Adoption and Diffusion Process of A New Product

    In the past several years, researchers have started to notice successful products whose sales patterns show rapidly declining diffusion patterns. These products include certain movies, computer software, TV game software, music CDs, etc. (e.g., Windows 95 (upgrade version), Final Fantasy, Terminator 2; Sawhney and Eliashberg 1996, Yamada et al. 1997, Moe and Fader 1998). These declining diffusion patterns have been rather neglected in the field of marketing (Bass 1969) because they were regarded as being peculiar to unsuccessful products, even though, before Bass (1969), Fourt and Woodlock (1960) predicted first purchases of grocery products with an exponential model, and Lekvall and Wahlbin (1973) raised in theory the possibility of various diffusion patterns, from a bell-shaped one (logistic model) to a rapidly declining one (modified exponential model), using a mixed model similar to the Bass diffusion model. Also after Bass (1969), Gatignon and Robertson (1985) discussed the same possibility with 29 propositions. Generally speaking, however, there were no studies that include rapidly declining diffusion patterns until recently, except for those cited above (e.g., Sawhney and Eliashberg 1996; Yamada et al. 1997; Moe and Fader 1998). However, the relative importance of the entertainment or content industry and IT-related industries has grown with the rise of the "networked" society. We believe that it is time to take a closer look at products showing rapidly declining diffusion patterns from the points of view of product classification and diffusion theory. Establishing a conceptual model of the adoption and diffusion process of a new product, we propose a third, "high involvement" adoption model. We call such a product an eagerly wanted product and define it as anything that can be offered to a market to satisfy an eager want or need.
We then establish operational hypotheses to test the conceptual hypothesis that an eagerly wanted product should take a rapidly declining diffusion pattern from the beginning. We tested the following operational hypotheses on the sales patterns of 254 new popular music CDs, including albums and singles, sold in one of the national chains of convenience stores in Japan. A common practice of music CD consumers in Japan is to first rent single CDs and then buy albums. H1: A popular music album CD is an eagerly wanted product, that is, its diffusion pattern is rapidly declining. H2: The fraction of rapidly declining diffusion patterns for album CDs is greater than that for single CDs. H3: The sales pattern of a new singer's debut single CD does not take a rapidly declining diffusion pattern. H4: The sales pattern of a debut single of a new group or singer produced through a well-designed process is a rapidly declining one. We also tested the sales patterns of new beer and low-malt liquor products as additional evidence for H2 and H3, because new beer products may be anticipated through promotional efforts but are not awaited as eagerly as CD albums. We obtained favorable results on all four hypotheses. As an implication of this study, a set of strategies for the development and introduction of an eagerly wanted product is proposed: (1) Let consumers be involved from the development stage (the outset); for example, (a) the ASAYAN project of TV Tokyo (see Section 4.2); (b) the use of famous artists, movie stars, and directors; (c) creating a series, etc. (2) Before the introduction of a new product, promote and publicize it as intensively and widely as possible in the target market, using a media mix, etc. (3) Set the initial price at the most reasonable level possible, or free if possible.
(4) To obtain a large potential market quickly, make as many business alliances as possible. Keywords: innovation diffusion process, product classification, diffusion pattern classification, popular music CDs.
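The contrast between a bell-shaped Bass pattern and a rapidly declining (modified exponential) pattern can be reproduced with the standard discrete Bass recursion; the parameter values below are illustrative, not estimates from the CD data.

```python
import numpy as np

def adoptions(p, q, m=1.0, periods=30):
    """Per-period adoptions under the discrete Bass recursion; q = 0
    reduces to the modified-exponential (declining) pattern of
    Fourt and Woodlock (1960)."""
    F, out = 0.0, []
    for _ in range(periods):
        f = (p + q * F) * (1 - F)   # hazard times remaining potential
        out.append(m * f)
        F += f
    return np.array(out)

declining = adoptions(p=0.3, q=0.0)   # "eagerly wanted": peak at launch
bell = adoptions(p=0.01, q=0.4)       # classic Bass: peak after launch
```

With no imitation effect (q = 0), adoptions peak in the first period and decay, matching the rapidly declining pattern the study associates with eagerly wanted products; a strong imitation effect (q >> p) produces the familiar bell shape.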

    What Makes a Pattern? Matching Decoding Methods to Data in Multivariate Pattern Analysis

    Research in neuroscience faces the challenge of integrating information across different spatial scales of brain function. A promising technique for harnessing information at a range of spatial scales is multivariate pattern analysis (MVPA) of functional magnetic resonance imaging (fMRI) data. While the prevalence of MVPA has increased dramatically in recent years, its typical implementations for the classification of mental states utilize only a subset of the information encoded in local fMRI signals. We review published studies employing multivariate pattern classification since the technique's introduction, which reveal an extensive focus on the improved detection power that linear classifiers provide over traditional analysis techniques. We demonstrate using simulations and a searchlight approach, however, that non-linear classifiers are capable of extracting distinct information about interactions within a local region. We conclude that for spatially localized analyses, such as searchlight and region-of-interest approaches, multiple classification methods should be compared in order to match fMRI analyses to the properties of local circuits.
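    The linear-versus-nonlinear point can be illustrated with a toy simulation (not the paper's searchlight analysis): two signals whose interaction, rather than either signal alone, carries the class label. A linear decoder stays at chance, while the same decoder given an interaction feature succeeds. Data sizes and seeds are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two "voxel" signals whose *interaction* (the sign of their product)
# carries the label -- information a purely linear decoder cannot use.
X = rng.normal(size=(400, 2))
y = np.sign(X[:, 0] * X[:, 1])

def lsq_accuracy(F, y):
    """Least-squares classifier on feature matrix F; returns accuracy."""
    F1 = np.hstack([F, np.ones((len(F), 1))])     # append a bias column
    w, *_ = np.linalg.lstsq(F1, y, rcond=None)
    return np.mean(np.sign(F1 @ w) == y)

acc_linear = lsq_accuracy(X, y)                           # ~ chance level
inter = (X[:, 0] * X[:, 1])[:, None]
acc_nonlinear = lsq_accuracy(np.hstack([X, inter]), y)    # ~ perfect
```

The gap between the two accuracies is exactly the kind of locally encoded interaction information that the review argues nonlinear classifiers can recover in searchlight analyses.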