
    Dominance-based Rough Set Approach, basic ideas and main trends

    The Dominance-based Rough Set Approach (DRSA) has been proposed as a machine learning and knowledge discovery methodology to handle Multiple Criteria Decision Aiding (MCDA). Thanks to its ability to elicit simple preference information from the decision maker (DM) and to supply easily understandable and explainable recommendations, DRSA has gained much interest over the years and is now one of the most appreciated MCDA approaches. In fact, it has also been applied beyond the MCDA domain, as a general knowledge discovery and data mining methodology for the analysis of monotonic (and also non-monotonic) data. In this contribution, we recall the basic principles and the main concepts of DRSA, with a general overview of its developments and software. We also present a historical reconstruction of the genesis of the methodology, with a specific focus on the contribution of Roman Słowiński. Comment: This research was partially supported by TAILOR, a project funded by the European Union (EU) Horizon 2020 research and innovation programme under GA No 952215. This submission is a preprint of a book chapter accepted by Springer, with very few minor differences of a merely technical nature.
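The core DRSA construction recalled above — approximating upward unions of ordered decision classes through dominance cones rather than indiscernibility classes — can be sketched as follows. This is a minimal illustrative implementation, not code from the chapter; the function names and toy data are assumptions.

```python
# Illustrative sketch of DRSA lower/upper approximation of an upward
# union Cl_t^>= = {objects with class >= t}. Not from the chapter;
# names and the toy data are hypothetical.

def dominates(x, y):
    """x dominates y if x is at least as good on every criterion."""
    return all(a >= b for a, b in zip(x, y))

def drsa_approximations(objects, classes, t):
    """objects: list of criterion-value tuples; classes: int labels.
    Returns (lower, upper) approximations of the union {class >= t},
    as sets of object indices."""
    union = {i for i, c in enumerate(classes) if c >= t}
    lower, upper = set(), set()
    for i, x in enumerate(objects):
        # D+(x): objects dominating x;  D-(x): objects dominated by x
        d_plus = {j for j, y in enumerate(objects) if dominates(y, x)}
        d_minus = {j for j, y in enumerate(objects) if dominates(x, y)}
        if d_plus <= union:      # whole dominance cone certainly in the union
            lower.add(i)
        if d_minus & union:      # cone touches the union: possibly a member
            upper.add(i)
    return lower, upper

objs = [(3, 2), (2, 3), (1, 1), (3, 3)]   # two gain criteria
cls = [2, 2, 1, 2]                        # ordered decision classes
low, up = drsa_approximations(objs, cls, t=2)
```

On this consistent toy table the lower and upper approximations coincide; inconsistencies with respect to dominance would make the lower approximation strictly smaller than the upper one.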

    Azimuthal Spin Asymmetries of Pion Electroproduction

    Azimuthal spin asymmetries, both for charged and neutral pion production in semi-inclusive deep inelastic scattering of unpolarized charged lepton beams on longitudinally and transversely polarized nucleon targets, are analyzed and calculated. Various assumptions and approximations in the quark distributions and fragmentation functions often used in these calculations are studied in detail. It is found that different approaches to the distribution and fragmentation functions may lead to quite different predictions for the azimuthal asymmetries measured in the HERMES experiments; their effects should therefore be taken into account before using the available data as a measurement of quark transversity distributions. It is also found that the unfavored quark-to-pion fragmentation functions must be taken into account for $\pi^-$ production from a proton target, although they can be neglected for $\pi^+$ and $\pi^0$ production. Pion production from a proton target is suitable to study the $u$ quark transversity distribution, whereas a combination of pion production from both proton and neutron targets can measure the flavor structure of quark transversity distributions. Comment: 31 latex pages, 13 figures, to appear in PR

    Attribute Equilibrium Dominance Reduction Accelerator (DCCAEDR) Based on Distributed Coevolutionary Cloud and Its Application in Medical Records

    © 2013 IEEE. Aimed at the tremendous challenge of attribute reduction for big data mining and knowledge discovery, we propose a new attribute equilibrium dominance reduction accelerator (DCCAEDR) based on the distributed coevolutionary cloud model. First, the framework of an N-population distributed coevolutionary MapReduce model is designed to divide the entire population into N subpopulations, sharing the reward of different subpopulations' solutions under a MapReduce cloud mechanism. Because the adaptive balancing between exploration and exploitation can be achieved in a better way, the reduction performance is guaranteed to be the same as that obtained using the whole independent data set. Second, a novel Nash equilibrium dominance strategy of elitists under the N bounded rationality regions is adopted to help the subpopulations attain the stable status of Nash equilibrium dominance. This further enhances the accelerator's robustness against complex noise in big data. Third, an approximation parallelism mechanism based on MapReduce is constructed to implement rule reduction by accelerating the computation of attribute equivalence classes. Consequently, the entire attribute reduction set with the equilibrium dominance solution can be achieved. Extensive simulation results illustrate the effectiveness and robustness of the proposed DCCAEDR accelerator for attribute reduction on big data. Furthermore, the DCCAEDR is applied to attribute reduction for traditional Chinese medical records and to segmenting cortical surfaces in neonatal brain 3-D MRI records, where it shows superior competitive results compared with representative algorithms.
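The building block the accelerator parallelizes — computing attribute equivalence classes (the indiscernibility partition) — can be sketched as below. In a MapReduce setting, extracting the attribute-value key corresponds to the map step and grouping indices per key to the reduce step. This is an assumed, simplified sequential sketch, not the paper's distributed implementation; the toy data is invented.

```python
# Hypothetical sketch: partition records into attribute equivalence
# classes, the per-split computation a MapReduce job would distribute.
from collections import defaultdict

def equivalence_classes(records, attrs):
    """Group record indices by their values on the chosen attributes."""
    groups = defaultdict(set)
    for i, rec in enumerate(records):
        key = tuple(rec[a] for a in attrs)   # map: emit (key, index)
        groups[key].add(i)                   # reduce: collect per key
    return list(groups.values())

data = [
    {"fever": "yes", "cough": "no", "dx": "flu"},
    {"fever": "yes", "cough": "no", "dx": "flu"},
    {"fever": "no", "cough": "yes", "dx": "cold"},
]
parts = equivalence_classes(data, ("fever", "cough"))
```

An attribute subset is then judged by how well its partition preserves the decision classes, which is what a reduction search optimizes over candidate subsets.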

    Parameter Selection and Uncertainty Measurement for Variable Precision Probabilistic Rough Set

    In this paper, we consider the problem of parameter selection and uncertainty measurement for a variable precision probabilistic rough set. Firstly, within the framework of the variable precision probabilistic rough set model, the relative discernibility of a variable precision rough set in probabilistic approximation space is discussed, and the conditions that make precision parameters α discernible in a variable precision probabilistic rough set are put forward. At the same time, we consider the lack of predictability of precision parameters in a variable precision probabilistic rough set, and we propose a systematic threshold selection method based on the relative discernibility of sets, using the concept of relative discernibility in probabilistic approximation space. Furthermore, a numerical example is applied to test the validity of the proposed method. Secondly, we discuss the problem of uncertainty measurement for the variable precision probabilistic rough set. The concept of classical fuzzy entropy is introduced into probabilistic approximation space, and the uncertain information that comes from the approximation space and the approximated objects is fully considered. Then, an axiomatic approach is established for uncertainty measurement in a variable precision probabilistic rough set, and several related interesting properties are also discussed. Thirdly, we study attribute reduction for the variable precision probabilistic rough set. The definition of reduction and its characteristic theorems are given for the variable precision probabilistic rough set. The main contribution of this paper is twofold: one is to propose a method of parameter selection for a variable precision probabilistic rough set; the other is to present a new approach to measuring uncertainty and a method of attribute reduction for a variable precision probabilistic rough set.
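The precision-parameter mechanism discussed above can be illustrated with a minimal sketch: in a variable precision model, an equivalence class enters the lower approximation when the fraction of its members belonging to the target set meets a threshold β, instead of requiring full inclusion. This is an assumed toy implementation, not the paper's method; names and data are hypothetical.

```python
# Illustrative sketch of a variable precision lower approximation:
# an equivalence class E is included when |E ∩ X| / |E| >= beta.
# Toy data and names are hypothetical, not from the paper.

def vprs_lower(partition, X, beta):
    """partition: list of equivalence classes (sets); X: target set.
    Returns the beta-lower approximation as a set of objects."""
    lower = set()
    for E in partition:
        inclusion = len(E & X) / len(E)   # rough membership of E in X
        if inclusion >= beta:
            lower |= E
    return lower

blocks = [{1, 2}, {3, 4, 5}, {6}]
X = {1, 2, 3, 4}
approx = vprs_lower(blocks, X, beta=0.6)  # {3,4,5} qualifies at 2/3
```

With β = 1 this reduces to the classical Pawlak lower approximation; lowering β trades certainty for tolerance to noise, which is exactly why principled selection of the precision parameter matters.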

    Parameter-Free Calculation of the Solar Proton Fusion Rate in Effective Field Theory

    Spurred by the recent complete determination of the weak currents in two-nucleon systems up to ${\cal O}(Q^3)$ in heavy-baryon chiral perturbation theory, we carry out a parameter-free calculation of the solar proton fusion rate in an effective field theory that combines the merits of the standard nuclear physics method and systematic chiral expansion. Using the tritium beta-decay rate as an input to fix the only unknown parameter in the effective Lagrangian, we can evaluate with drastically improved precision the ratio of the two-body contribution to the well-established one-body contribution; the ratio is determined to be $(0.86 \pm 0.05)\%$. This result is essentially independent of the cutoff parameter for a wide range of its variation ($500~\mathrm{MeV} \le \Lambda \le 800~\mathrm{MeV}$), a feature that substantiates the consistency of the calculation. Comment: 10 pages. The argument is considerably more sharpened with a reduced error bar.