    On the Choquet multiple criteria preference aggregation model: theoretical and practical insights from a real-world application

    We consider the use of the Choquet integral for evaluating projects or actions in a real-world application, starting from the case of the re-qualification of an abandoned quarry. Although the Choquet integral is a very well-known preference model with a rich and well-developed theory, its application in a multiple criteria decision aiding perspective requires some specific methodological developments. This led us to work out and implement, in practice, two new procedures: a first procedure to build interval scales, with the objective of assigning utility values on a common scale to the criteria performances, and a second one to construct a ratio scale for assigning numerical values to the capacities of the Choquet integral. This article discusses the strengths and weaknesses of the Choquet integral as they appeared in the case study, and also offers insights on the interaction of the experts within a focus group.

    Ignorance, Fixed Costs, and the Stock Market Participation Puzzle

    While the existence of fixed costs in entering asset markets is the leading rationalization of the "participation puzzle" (the fact that most households do not hold stocks, despite the diversification gains and the significant risk premium involved), most motivations of these fixed costs are as incompatible with conventional portfolio theory as the non-participation itself. Nevertheless, we believe that these motivations are empirically correct, and thus we are forced to explore alternatives to conventional portfolio theory. We find in Choquet expected utility theory a tool that is better equipped to deal with more complex forms of ignorance than expected utility is. Within such a model, we are able to express the idea that staying out of the market may be a rational response to one's own ignorance. Using a Probit model on the 2001 Survey of Consumer Finances, we show suggestive evidence in its favor. Keywords: non-additive beliefs, ambiguity, ignorance, asset market participation.

    Multimodal fuzzy fusion for enhancing the motor-imagery-based brain computer interface

    Brain-computer interface technologies, such as steady-state visually evoked potentials, P300, and motor imagery, are methods of communication between the human brain and external devices. Motor-imagery-based brain-computer interfaces are popular because they avoid unnecessary external stimuli. Although feature extraction methods have been demonstrated in several machine-intelligence systems in motor-imagery-based brain-computer interface studies, performance remains unsatisfactory. There is increasing interest in the use of fuzzy integrals, namely the Choquet and Sugeno integrals, which are appropriate for applications in which data fusion must account for possible interactions among the data. To enhance the classification accuracy of brain-computer interfaces, we adopted fuzzy integrals, applied after the classification stage of traditional brain-computer interfaces, to account for possible links between the data. We then proposed a novel classification framework called the multimodal fuzzy fusion-based brain-computer interface system. Ten volunteers performed a motor-imagery-based brain-computer interface experiment while we simultaneously acquired electroencephalography signals. The multimodal fuzzy fusion-based brain-computer interface system outperformed traditional brain-computer interface systems. Furthermore, when the motor-imagery-relevant electroencephalography alpha and beta frequency bands were used as input features, the system achieved its highest accuracy: 78.81% and 78.45% with the Choquet and Sugeno integrals, respectively. Herein, we present a novel concept for enhancing brain-computer interface systems by adopting fuzzy integrals, especially in the fusion stage for classifying brain-computer interface commands.
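The Sugeno integral used above has a simple discrete form: sort the source confidences in descending order and take a max of mins against the fuzzy measure of the top-k coalitions. A minimal sketch (the function name and the dictionary encoding of the measure are our own illustration, not the paper's implementation):

```python
def sugeno_integral(x, mu):
    """Discrete Sugeno integral of confidences x (values in [0, 1])
    with respect to a fuzzy measure mu.

    mu maps frozensets of source indices to [0, 1], with
    mu[frozenset()] == 0 and mu of the full index set == 1.
    """
    # Visit sources from the highest confidence down, growing the coalition.
    order = sorted(range(len(x)), key=lambda i: x[i], reverse=True)
    best, coalition = 0.0, frozenset()
    for idx in order:
        coalition = coalition | {idx}            # top-k sources so far
        best = max(best, min(x[idx], mu[coalition]))
    return best
```

Unlike a weighted average, the result depends on the measure assigned to coalitions of sources, which is what lets the fusion model interactions between classifiers.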

    Feature and Decision Level Fusion Using Multiple Kernel Learning and Fuzzy Integrals

    The work collected in this dissertation addresses the problem of data fusion. In other words, this is the problem of making decisions (also known as the problem of classification in the machine learning and statistics communities) when data from multiple sources are available, or when decisions/confidence levels from a panel of decision-makers are accessible. This problem has become increasingly important in recent years, especially with the ever-increasing popularity of autonomous systems outfitted with suites of sensors and the dawn of the "age of big data." While data fusion is a very broad topic, the work in this dissertation considers two specific techniques: feature-level fusion and decision-level fusion. In general, the fusion methods proposed throughout this dissertation rely on kernel methods and fuzzy integrals. Both are very powerful tools; however, they also come with challenges, some of which are summarized below and addressed in this dissertation. Kernel methods for classification are a well-studied area in which data are implicitly mapped from a lower-dimensional space to a higher-dimensional space to improve classification accuracy. However, for most kernel methods, one must still choose a kernel to use for the problem. Since there is, in general, no way of knowing which kernel is best, multiple kernel learning (MKL) is a technique for learning the aggregation of a set of valid kernels into a single (ideally) superior kernel. The aggregation can be done using weighted sums of the pre-computed kernels, but determining the summation weights is not a trivial task. Furthermore, MKL does not work well with large datasets because of limited storage space and prediction speed. These challenges are tackled by the introduction of many new algorithms in the following chapters, which also address MKL's storage and speed drawbacks, allowing MKL-based techniques to be applied to big data efficiently.
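The weighted-sum aggregation at the heart of MKL can be sketched in a few lines. This is only the fixed-weight baseline (the function name and the normalization choice are our own assumptions); learning the weights from data is the actual MKL problem the dissertation addresses:

```python
import numpy as np

def combine_kernels(gram_matrices, weights):
    """Combine precomputed Gram matrices into a single kernel via a
    convex combination -- the basic weighted-sum aggregation of MKL.

    A convex combination of valid (positive semidefinite) kernels is
    itself a valid kernel, which is what makes this aggregation safe.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # normalize to keep the sum convex
    return sum(wi * K for wi, K in zip(w, gram_matrices))
```

The storage drawback mentioned above is visible even here: every candidate kernel is a full n-by-n Gram matrix, which is what breaks down for large n.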
Some algorithms in this work are based on the Choquet fuzzy integral, a powerful nonlinear aggregation operator parameterized by a fuzzy measure (FM). These decision-level fusion algorithms learn a fuzzy measure by minimizing a sum-of-squared-error (SSE) criterion on a set of training data. The flexibility of the Choquet integral comes with a cost, however: given a set of N decision makers, the size of the FM the algorithm must learn is 2^N. This means that the training data must be diverse enough to include 2^N independent observations, which is rarely the case in practice. I address this in the following chapters via several different regularization functions, a popular technique in machine learning and statistics used to prevent overfitting and increase model generalization. Finally, it is worth noting that the aggregation behavior of the Choquet integral is not intuitive. I tackle this by proposing a quantitative visualization strategy that allows the FM and the Choquet integral's behavior to be shown simultaneously.
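For concreteness, a hedged sketch of the discrete Choquet integral (names and measure encoding are ours, not the dissertation's code). Storing the measure as an explicit dictionary over coalitions makes the 2^N cost discussed above directly visible:

```python
def choquet_integral(x, mu):
    """Discrete Choquet integral of inputs x with respect to a
    fuzzy measure mu.

    mu maps frozensets of source indices to [0, 1], with
    mu[frozenset()] == 0 and mu of the full index set == 1;
    a measure over N sources has 2**N entries.
    """
    # Sort sources by descending value; A_k is the coalition of the top k.
    order = sorted(range(len(x)), key=lambda i: x[i], reverse=True)
    total, coalition = 0.0, frozenset()
    for k, idx in enumerate(order):
        coalition = coalition | {idx}
        nxt = x[order[k + 1]] if k + 1 < len(order) else 0.0
        total += (x[idx] - nxt) * mu[coalition]   # (x_(k) - x_(k+1)) * mu(A_k)
    return total
```

When mu is additive, this reduces to an ordinary weighted sum; the non-additive entries are exactly the extra degrees of freedom the SSE learner must fit, hence the need for regularization.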

    A Framework for Selecting Architectural Tactics Using Fuzzy Measures

    Software architects cannot avoid considering quality attributes when designing a software architecture. Architectural styles such as Layers and Client-Server are often used by architects to describe the overall structure and behavior of software. Although an architectural style affects the achievement of quality attributes, these quality attributes are directly realized by design decisions called architectural tactics. While the implementation of an architectural tactic supports a specific quality attribute, it often enhances or hurts other quality attributes in the software. In this paper, a framework is proposed for selecting the architectural tactics that best achieve the required levels of quality attributes when developing transaction processing systems. The proposed framework is based on fuzzy measures using the Choquet integral approach and takes into account the impact of

    Data depth and multiple output regression, the distorted M-quantiles approach

    For a univariate distribution, its M-quantiles are obtained as solutions to asymmetric minimization problems dealing with the distance of a random variable to a fixed point. The asymmetry refers to the different weights given to the values of the random variable on either side of the fixed point. We focus on M-quantiles whose associated losses are given in terms of a power. In this setting, the classical quantiles are obtained for the first power, while the expectiles correspond to quadratic losses. The M-quantiles considered here are computed over distorted distributions, which allows one to tune the weight awarded to the more central or peripheral parts of the distribution. These distorted M-quantiles are used in the multivariate setting to introduce novel families of central regions and their associated depth functions, which are further extended to the multiple-output regression setting in the form of conditional regression regions and conditional depths.
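The asymmetric power-loss minimization described above can be sketched numerically for a sample (function name and the use of SciPy's scalar minimizer are our own assumptions; the paper works with distributions and distortions, not this empirical version):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def m_quantile(sample, tau, p):
    """tau-th empirical M-quantile under the power-p asymmetric loss.

    Loss on a residual u = x - theta:
        tau       * |u|**p  if u >= 0  (observation above the fit)
        (1 - tau) * |u|**p  if u <  0
    p = 1 recovers the classical quantile; p = 2 gives the expectile.
    """
    sample = np.asarray(sample, dtype=float)

    def loss(theta):
        u = sample - theta
        w = np.where(u >= 0, tau, 1.0 - tau)
        return np.mean(w * np.abs(u) ** p)

    res = minimize_scalar(loss, bounds=(sample.min(), sample.max()),
                          method="bounded")
    return res.x
```

For tau = 0.5 and p = 2 the loss is symmetric and quadratic, so the minimizer is the sample mean, matching the expectile case named in the abstract.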