13,191 research outputs found

    Improving circuit miniaturization and its efficiency using Rough Set Theory

    Full text link
    High speed, accuracy, precision and quick response are vital requirements of the modern digital world, and the efficiency of an electronic circuit directly affects the operation of the whole system. Different tools are required to solve different types of engineering problems, and improving efficiency and accuracy while lowering power consumption has always been a bottleneck, so the need for circuit miniaturization is always there. Miniaturization saves much of the time and power wasted in gate switching, reduces the wiring problem, shrinks the cross-sectional area of the chip, and multiplies many fold the number of transistors that can be implemented on it. To address this problem we propose an Artificial Intelligence (AI) based approach that uses Rough Set Theory. Rough set theory, proposed by Z. Pawlak in 1982, is a mathematical tool for dealing with uncertainty and vagueness; decisions can be generated with it by removing unwanted and superfluous data. We reduce the number of gates without affecting the output of the given circuit. This paper proposes an approach, based on decision rules derived with rough set theory, that lessens the number of gates in the circuit. Comment: The International Conference on Machine Intelligence Research and Advancement, ICMIRA-201
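
    The abstract does not spell out how the decision rules are derived, so the following is only a minimal sketch, in Python, of the generic rough-set idea it relies on: treat the gate inputs of a small truth table as condition attributes, the circuit output as the decision attribute, and search for the smallest attribute subsets (reducts) that still determine the output. The truth table and attribute indices below are invented for illustration, not taken from the paper.

```python
from itertools import combinations

# Illustrative sketch only: a generic rough-set reduct computation on a small
# truth table, not the paper's specific procedure. Condition attributes are
# gate inputs; the decision attribute is the circuit output.
rows = [
    # (a, b, c) -> out   (hypothetical 3-input circuit whose output ignores c)
    ((0, 0, 0), 0),
    ((0, 0, 1), 0),
    ((0, 1, 0), 1),
    ((0, 1, 1), 1),
    ((1, 0, 0), 1),
    ((1, 0, 1), 1),
    ((1, 1, 0), 1),
    ((1, 1, 1), 1),
]
attrs = (0, 1, 2)  # indices of the inputs a, b, c

def consistent(subset):
    """True if the projection onto `subset` still determines the output."""
    seen = {}
    for inputs, out in rows:
        key = tuple(inputs[i] for i in subset)
        if key in seen and seen[key] != out:
            return False
        seen[key] = out
    return True

# Smallest attribute subsets that preserve discernibility (the reducts).
reducts = []
for size in range(1, len(attrs) + 1):
    for subset in combinations(attrs, size):
        if consistent(subset):
            reducts.append(subset)
    if reducts:
        break

print("Minimal reducts (by attribute index):", reducts)  # -> [(0, 1)]
```

    In this toy table the output never depends on the third input, so the reduct {a, b} is found; in a circuit setting, dropping the superfluous attribute corresponds to removing the gates that feed it.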

    A comprehensive study of implicator-conjunctor based and noise-tolerant fuzzy rough sets: definitions, properties and robustness analysis

    Get PDF
    © 2014 Elsevier B.V. Both rough and fuzzy set theories offer interesting tools for dealing with imperfect data: while the former allows us to work with uncertain and incomplete information, the latter provides a formal setting for vague concepts. The two theories are highly compatible, and since the late 1980s many researchers have studied their hybridization. In this paper, we critically evaluate the most relevant fuzzy rough set models proposed in the literature. To this end, we establish a formally correct and unified mathematical framework for them. Both implicator-conjunctor-based definitions and noise-tolerant models are studied. We evaluate these models on two different fronts: firstly, we discuss which properties of the original rough set model can be maintained, and secondly, we examine how robust they are against both class and attribute noise. By highlighting the benefits and drawbacks of the different fuzzy rough set models, this study is a necessary first step towards proposing and developing new models in future research. Lynn D’eer has been supported by the Ghent University Special Research Fund; Chris Cornelis was partially supported by the Spanish Ministry of Science and Technology under the project TIN2011-28488, the Andalusian Research Plans P11-TIC-7765 and P10-TIC-6858, and project PYR-2014-8 of the Genil Program of CEI BioTic GRANADA; and Lluis Godo has been partially supported by the Spanish MINECO project EdeTRI TIN2012-39348-C02-01. Peer Reviewed
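
    As a concrete illustration of the implicator-conjunctor-based definitions studied in the paper, the sketch below evaluates the fuzzy rough lower and upper approximations (R↓A)(x) = inf_y I(R(x,y), A(y)) and (R↑A)(x) = sup_y T(R(x,y), A(y)) with the Łukasiewicz implicator and t-norm. The choice of connectives and the toy relation R and fuzzy set A are assumptions made for the example, not data from the paper.

```python
# Minimal sketch of the implicator-conjunctor-based approximations
# (R down A)(x) = inf_y I(R(x,y), A(y)) and (R up A)(x) = sup_y T(R(x,y), A(y)),
# instantiated with the Lukasiewicz connectives; the relation R and fuzzy set A
# below are invented for illustration.

def luk_implicator(a, b):
    return min(1.0, 1.0 - a + b)

def luk_tnorm(a, b):
    return max(0.0, a + b - 1.0)

# Fuzzy similarity relation R over a universe {x0, x1, x2} and a fuzzy set A.
R = [
    [1.0, 0.7, 0.2],
    [0.7, 1.0, 0.5],
    [0.2, 0.5, 1.0],
]
A = [0.9, 0.4, 0.1]

n = len(A)
lower = [min(luk_implicator(R[x][y], A[y]) for y in range(n)) for x in range(n)]
upper = [max(luk_tnorm(R[x][y], A[y]) for y in range(n)) for x in range(n)]

print("lower approximation:", lower)
print("upper approximation:", upper)
```

    Swapping in other implicators and conjunctors (Kleene-Dienes, product, minimum, and so on) changes which of the classical rough set properties survive, which is exactly the kind of comparison the paper carries out.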

    Accurate and reliable segmentation of the optic disc in digital fundus images

    Get PDF
    We describe a complete pipeline for the detection and accurate automatic segmentation of the optic disc in digital fundus images. This procedure provides separation of vascular information and accurate inpainting of vessel-removed images, symmetry-based optic disc localization, and fitting of incrementally complex contour models at increasing resolutions using information related to inpainted images and vessel masks. Validation experiments, performed on a large dataset of images of healthy and pathological eyes, annotated by experts and partially graded with a quality label, demonstrate the good performance of the proposed approach. The method is able to detect the optic disc and trace its contours better than the other systems presented in the literature and tested on the same data. The average error in the obtained contour masks is reasonably close to the inter-operator errors and suitable for practical applications. The optic disc segmentation pipeline is currently integrated in a complete software suite for the semiautomatic quantification of retinal vessel properties from fundus camera images (VAMPIRE).
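
    The paper's implementation is not reproduced here; the fragment below is only a rough sketch of one step the abstract describes, removing the vascular tree and inpainting the resulting holes before optic disc localization, using OpenCV's Telea inpainting. The file names and the assumption of a precomputed binary vessel mask are hypothetical.

```python
import cv2
import numpy as np

# Rough sketch of the vessel-removal-and-inpainting step described in the
# abstract, not the paper's own method. Assumes a fundus image and a binary
# vessel mask (white vessels on black) are already available on disk.
fundus = cv2.imread("fundus.png")                              # hypothetical input image
vessel_mask = cv2.imread("vessels.png", cv2.IMREAD_GRAYSCALE)  # hypothetical precomputed mask

if fundus is None or vessel_mask is None:
    raise SystemExit("expected fundus.png and vessels.png next to this script")

# Slightly dilate the mask so vessel borders are also replaced.
kernel = np.ones((3, 3), np.uint8)
mask = cv2.dilate((vessel_mask > 0).astype(np.uint8) * 255, kernel, iterations=1)

# Fill the masked pixels from the surrounding background (Telea inpainting).
vessel_free = cv2.inpaint(fundus, mask, inpaintRadius=5, flags=cv2.INPAINT_TELEA)

cv2.imwrite("fundus_inpainted.png", vessel_free)
```

    The inpainted, vessel-free image is what the later stages (symmetry-based localization and contour fitting) would operate on.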

    3D surface profile equipment for the characterization of the pavement texture - TexScan

    Get PDF
    Loads from vehicles alter the functional and structural characteristics of road pavements, which directly affects the loss of pavement resistance and the users’ comfort and safety. These alterations require constant observation and analysis of an extensive area of the road surface with high precision. For this purpose, a new scanning prototype machine was developed that is capable of acquiring 3D road surface data and characterizing the road texture through two algorithms that compute the Estimated Texture Depth (ETD) and Texture Profile Level (L) indicators. The experimental results obtained from nine road samples validate the developed algorithms for texture analysis and show good agreement between the scanning prototype equipment and the traditional Sand Patch Method. Fundação para a Ciência e a Tecnologia (FCT) through the PhD Grant referenced SFRH/BD/18155/200
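
    The abstract does not detail the two algorithms, so the sketch below only illustrates the commonly used ISO 13473-1 style computation of the Estimated Texture Depth from the Mean Profile Depth (ETD ≈ 0.2 + 0.8 · MPD) on a single extracted profile. The synthetic profile, the 100 mm baseline length and the detrending step are assumptions made for the example, not the paper's algorithms.

```python
import numpy as np

# Illustrative sketch: ETD via the Mean Profile Depth (MPD), following the
# commonly used relation ETD = 0.2 + 0.8 * MPD (all lengths in mm).

def mean_profile_depth(profile_mm, spacing_mm, baseline_mm=100.0):
    """MPD of one texture profile (heights in mm, uniform point spacing)."""
    pts = int(round(baseline_mm / spacing_mm))
    depths = []
    for start in range(0, len(profile_mm) - pts + 1, pts):
        seg = profile_mm[start:start + pts]
        idx = np.arange(len(seg))
        seg = seg - np.polyval(np.polyfit(idx, seg, 1), idx)  # remove slope
        half = len(seg) // 2
        # Mean segment depth: average of the two half-peaks minus the mean level.
        depths.append(0.5 * (seg[:half].max() + seg[half:].max()) - seg.mean())
    return float(np.mean(depths))

# Synthetic 1 m profile sampled every 1 mm, just to exercise the function.
x = np.arange(0.0, 1000.0, 1.0)
profile = 0.4 * np.sin(2 * np.pi * x / 50.0) + 0.05 * np.random.randn(x.size)

mpd = mean_profile_depth(profile, spacing_mm=1.0)
etd = 0.2 + 0.8 * mpd
print(f"MPD = {mpd:.3f} mm, ETD = {etd:.3f} mm")
```

    In the paper's setting the profiles would be extracted from the TexScan 3D surface data rather than generated synthetically, and the resulting ETD is what gets compared against the Sand Patch Method.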

    A Noise-tolerant Approach to Fuzzy-Rough Feature Selection

    Get PDF
    In rough-set-based feature selection, the goal is to omit attributes (features) from decision systems such that objects in different decision classes can still be discerned. A popular way to evaluate attribute subsets with respect to this criterion is based on the notion of dependency degree. In the standard approach, attributes are expected to be qualitative; in the presence of quantitative attributes, the methodology can be generalized using fuzzy rough sets, to handle gradual (in)discernibility between attribute values more naturally. However, both the extended approach, as well as its crisp counterpart, exhibit a strong sensitivity to noise: a change in a single object may significantly influence the outcome of the reduction procedure. Therefore, in this paper, we consider a more flexible methodology based on the recently introduced Vaguely Quantified Rough Set (VQRS) model. The method can handle both crisp (discrete-valued) and fuzzy (real-valued) data, and encapsulates the existing noise-tolerant data reduction approach using Variable Precision Rough Sets (VPRS), as well as the traditional rough set model, as special cases.
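
    For readers unfamiliar with the dependency degree mentioned above, the sketch below computes the classical crisp version, gamma(B) = |POS_B(d)| / |U|, and runs a greedy QuickReduct-style selection on a toy decision system. The fuzzy-rough and VQRS extensions discussed in the paper are not implemented here, and the data are invented for illustration.

```python
from collections import defaultdict

# Toy decision system: (condition attribute values, decision class).
data = [
    ((0, 1, 0), "yes"),
    ((0, 1, 1), "yes"),
    ((1, 0, 0), "no"),
    ((1, 1, 0), "no"),
    ((0, 0, 1), "no"),
]
all_attrs = (0, 1, 2)

def dependency(attrs):
    """Fraction of objects whose B-indiscernibility class is decision-pure."""
    blocks = defaultdict(list)
    for values, decision in data:
        blocks[tuple(values[a] for a in attrs)].append(decision)
    pos = sum(len(b) for b in blocks.values() if len(set(b)) == 1)
    return pos / len(data)

# Greedy QuickReduct: keep adding the attribute that raises gamma the most.
reduct, best = [], 0.0
while best < dependency(all_attrs):
    gain, pick = best, None
    for a in all_attrs:
        if a not in reduct:
            g = dependency(reduct + [a])
            if g > gain:
                gain, pick = g, a
    if pick is None:          # no attribute improves gamma; stop
        break
    reduct.append(pick)
    best = gain

print("selected attributes:", reduct, "dependency:", best)
```

    The noise sensitivity criticized in the abstract is visible in this crisp scheme: flipping the decision of a single object can empty a positive region and change which attributes get selected, which is what the VQRS-based method is designed to soften.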

    An overview of decision table literature 1982-1995.

    Get PDF
    This report gives an overview of the literature on decision tables over the past 15 years. As much as possible, an author-supplied abstract, a number of keywords and a classification are provided for each reference. In some cases our own comments are added; their purpose is to show where, how and why decision tables are used. The literature is classified according to application area, theoretical versus practical character, year of publication, country of origin (not necessarily country of publication) and the language of the document. After a description of the scope of the review, the classification results and the classification by topic are presented. The main body of the paper is the ordered list of publications with abstract, classification and comments.