Covering rough sets based on neighborhoods: An approach without using neighborhoods
Rough set theory, a mathematical tool for dealing with inexact or uncertain
knowledge in information systems, originally described the indiscernibility
of elements by equivalence relations. Covering rough sets are a natural
extension of classical rough sets by relaxing the partitions arising from
equivalence relations to coverings. Recently, some topological concepts such as
neighborhood have been applied to covering rough sets. In this paper, we
further investigate the covering rough sets based on neighborhoods by
approximation operations. We show that the upper approximation based on
neighborhoods can be defined equivalently without using neighborhoods. To
analyze the coverings themselves, we introduce unary and composition operations
on coverings. A notion of homomorphism is provided to relate two covering
approximation spaces. We also examine the properties of approximations
preserved by the operations and homomorphisms, respectively.
Comment: 13 pages; to appear in International Journal of Approximate Reasoning
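The neighborhood-based approximations the abstract refers to can be sketched in a few lines. This is a minimal illustration assuming the common definitions, where N(x) is the intersection of all covering blocks containing x, the lower approximation of X is {x : N(x) ⊆ X}, and the upper approximation is {x : N(x) ∩ X ≠ ∅}; the paper itself may work with variant operators.

```python
def neighborhood(x, covering):
    """Intersection of all blocks of the covering that contain x."""
    blocks = [K for K in covering if x in K]
    n = set(blocks[0])
    for K in blocks[1:]:
        n &= K
    return n

def lower_approx(X, universe, covering):
    """Objects whose neighborhood lies entirely inside X."""
    return {x for x in universe if neighborhood(x, covering) <= X}

def upper_approx(X, universe, covering):
    """Objects whose neighborhood meets X."""
    return {x for x in universe if neighborhood(x, covering) & X}

U = {1, 2, 3, 4}
C = [{1, 2, 3}, {2, 3, 4}, {1, 4}]  # a covering of U, not a partition
X = {1, 2}
print(lower_approx(X, U, C))  # -> {1}
print(upper_approx(X, U, C))  # -> {1, 2, 3}
```

Note that, unlike in the classical partition setting, the blocks here overlap; the neighborhoods N(1) = {1}, N(2) = N(3) = {2, 3}, N(4) = {4} are what the approximations are built from.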
Autonomous clustering using rough set theory
This paper proposes a clustering technique that minimises the need for subjective
human intervention and is based on elements of rough set theory. The proposed algorithm is
unified in its approach to clustering and makes use of both local and global data properties to
obtain clustering solutions. It handles single-type and mixed-attribute data sets with ease,
and results from three data sets of single and mixed attribute types are used to illustrate the
technique and establish its efficiency.
Computing fuzzy rough approximations in large scale information systems
Rough set theory is a popular and powerful machine learning tool. It is especially suitable for dealing with information systems that exhibit inconsistencies, i.e., objects that have the same values for the conditional attributes but a different value for the decision attribute. In line with the emerging granular computing paradigm, rough set theory groups objects together based on the indiscernibility of their attribute values. Fuzzy rough set theory extends rough set theory to data with continuous attributes and detects degrees of inconsistency in the data. Key to this is turning the indiscernibility relation into a gradual relation, acknowledging that objects can be similar to a certain extent. In very large datasets with millions of objects, computing the gradual indiscernibility relation (in other words, the soft granules) is very demanding, both in terms of runtime and of memory. It is, however, required for the computation of the lower and upper approximations of concepts in the fuzzy rough set analysis pipeline. Current non-distributed implementations in R are limited by memory capacity. For example, we found that a state-of-the-art non-distributed implementation in R could not handle 30,000 rows and 10 attributes on a node with 62 GB of memory. This is clearly insufficient to scale fuzzy rough set analysis to massive datasets. In this paper we present a parallel and distributed solution based on the Message Passing Interface (MPI) to compute fuzzy rough approximations in very large information systems. Our results show that our parallel approach scales with problem size to information systems with millions of objects. To the best of our knowledge, no other parallel and distributed solutions have been proposed so far in the literature for this problem.
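The fuzzy rough pipeline the abstract describes can be illustrated on a toy scale. The sketch below assumes a per-attribute similarity of 1 - |a(x) - a(y)| / range(a), aggregated by minimum, with the Lukasiewicz implicator for the lower approximation and the minimum t-norm for the upper one; these are common textbook choices, and the paper's MPI implementation may use other connectives. The n x n similarity matrix computed here is exactly the object whose memory footprint forces the distributed approach at scale.

```python
import numpy as np

def similarity(data):
    """Gradual indiscernibility: pairwise similarity matrix in [0, 1]."""
    n, m = data.shape
    rng = data.max(axis=0) - data.min(axis=0)
    R = np.ones((n, n))
    for j in range(m):
        d = 1.0 - np.abs(data[:, j, None] - data[None, :, j]) / rng[j]
        R = np.minimum(R, d)  # aggregate attributes with min
    return R

def fuzzy_approximations(R, A):
    """Lower/upper approximation of a concept A (membership vector)."""
    imp = np.minimum(1.0, 1.0 - R + A[None, :])    # Lukasiewicz implicator
    lower = imp.min(axis=1)                        # inf_y I(R(x,y), A(y))
    upper = np.minimum(R, A[None, :]).max(axis=1)  # sup_y T(R(x,y), A(y))
    return lower, upper

X = np.array([[0.1, 0.2], [0.15, 0.25], [0.9, 0.8]])
A = np.array([1.0, 1.0, 0.0])  # crisp concept: the first two objects
low, up = fuzzy_approximations(similarity(X), A)
```

With a reflexive similarity relation, lower(x) <= A(x) <= upper(x) holds for every object, and the gap between the two measures the degree of inconsistency around x.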
Improving circuit miniaturization and its efficiency using Rough Set Theory
High speed, accuracy, precision and quick response are vital requirements of the
modern digital world. The efficiency of an electronic circuit directly affects
the operation of the whole system, and different tools are needed to solve
different types of engineering problems. Improving efficiency and accuracy
while lowering power consumption in an electronic circuit has always been a
bottleneck, so the need for circuit miniaturization is ever present.
Miniaturization saves much of the time and power wasted in gate switching,
reduces wiring problems and the cross-sectional area of the chip, and
multiplies the number of transistors that can be implemented on a chip many
times over. To address this problem we propose an artificial intelligence (AI)
based approach that uses rough set theory for its implementation. Rough set
theory, proposed by Z. Pawlak in 1982, is a mathematical tool that deals with
uncertainty and vagueness; decisions can be generated with it by removing
unwanted and superfluous data. We condense the number of gates without
affecting the output of the given circuit. This paper proposes a
rough-set-based approach that reduces the number of gates in the circuit,
based on decision rules.
Comment: The International Conference on Machine Intelligence Research and
Advancement, ICMIRA-201
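The "removing unwanted and superfluous data" step is, in classical rough set terms, attribute reduction. The sketch below illustrates that idea on a toy decision table standing in for a circuit's truth table, using the textbook reduct definition (a smallest attribute subset that preserves the consistency of the decisions); the paper's actual gate-reduction procedure is not spelled out in the abstract and may differ.

```python
from itertools import combinations

def consistent(rows, attrs):
    """True if no two rows agree on attrs yet disagree on the decision."""
    seen = {}
    for *cond, dec in rows:
        key = tuple(cond[i] for i in attrs)
        if seen.setdefault(key, dec) != dec:
            return False
    return True

def reduct(rows, n_attrs):
    """A smallest attribute subset that keeps the table consistent."""
    for k in range(1, n_attrs + 1):
        for attrs in combinations(range(n_attrs), k):
            if consistent(rows, attrs):
                return attrs
    return tuple(range(n_attrs))

# Toy decision table: inputs (a, b, c) and a circuit output; the output
# here is a XOR b, so input c is superfluous and can be dropped.
table = [(0, 0, 0, 0), (0, 1, 0, 1), (1, 0, 1, 1),
         (1, 1, 1, 0), (0, 0, 1, 0), (1, 1, 0, 0)]
print(reduct(table, 3))  # -> (0, 1)
```

Dropping the attributes outside the reduct shrinks the decision rules, which in the circuit setting corresponds to eliminating gates whose inputs no longer matter.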