
    Impact of Local Congruences in Attribute Reduction

    Local congruences are equivalence relations whose equivalence classes are convex sublattices of the original lattice. In this paper, we present a study relating local congruences to attribute reduction in formal concept analysis (FCA). Specifically, we analyze the impact on the formal context when local congruences are used to complement an attribute reduction.
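    The convex-sublattice condition has a direct operational reading. As a hedged illustration (not the authors' construction), the following Python sketch checks whether a partition of a toy lattice is a local congruence, using the divisors of 12 under divisibility as a stand-in for a concept lattice; the lattice, the partitions and all names are invented for the sketch.

```python
from itertools import combinations
from math import gcd

# Toy lattice standing in for a concept lattice: divisors of 12 under divisibility.
L = [1, 2, 3, 4, 6, 12]

def meet(x, y):
    return gcd(x, y)

def join(x, y):
    return x * y // gcd(x, y)

def leq(x, y):
    return y % x == 0                 # x <= y  iff  x divides y

def is_local_congruence(partition):
    """Every block must be a convex sublattice of L."""
    for block in partition:
        for x, y in combinations(block, 2):
            if meet(x, y) not in block or join(x, y) not in block:
                return False          # not closed under meet/join
            lo, hi = (x, y) if leq(x, y) else (y, x)
            if any(leq(lo, z) and leq(z, hi) and z not in block for z in L):
                return False          # not convex: an element in between escapes the block
    return True

print(is_local_congruence([{1, 2, 4}, {3, 6, 12}]))  # True: both blocks are convex sublattices
print(is_local_congruence([{1, 12}, {2, 3, 4, 6}]))  # False: {1, 12} is not convex
```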

    Impact of local congruences in variable selection from datasets

    Formal concept analysis (FCA) is a useful mathematical tool for obtaining information from relational datasets. One of the most interesting research goals in FCA is the selection of the most representative variables of a dataset, which is called attribute reduction. Recently, the attribute reduction mechanism has been complemented with the use of local congruences in order to obtain robust clusters of concepts, which form convex sublattices of the original concept lattice. Since the application of such local congruences modifies the quotient set associated with the attribute reduction, it is fundamental to know how the original context (attributes, objects and relationship) has been modified in order to understand the impact of applying the local congruence to the attribute reduction. Partially supported by the 2014-2020 ERDF Operational Programme in collaboration with the State Research Agency (AEI) in projects TIN2016-76653-P and PID2019-108991GB-I00, with the Department of Economy, Knowledge, Business and University of the Regional Government of Andalusia in project FEDER-UCA18-108612, and by the European Cooperation in Science & Technology (COST) Action CA17124.
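    For readers unfamiliar with the machinery the abstract relies on, the following minimal Python sketch shows the standard FCA derivation operators, enumerates the formal concepts of a tiny context, and groups the original concepts into the equivalence classes induced by removing one attribute (the quotient set mentioned above). The toy data and all names are illustrative assumptions, not material from the paper.

```python
from itertools import chain, combinations

# Invented formal context: objects, attributes and the incidence relation between them.
objects = ["frog", "dog", "eagle"]
attributes = ["aquatic", "terrestrial", "flies"]
incidence = {("frog", "aquatic"), ("frog", "terrestrial"),
             ("dog", "terrestrial"),
             ("eagle", "terrestrial"), ("eagle", "flies")}

def common_attrs(objs, attrs):
    """Derivation A -> A': the attributes shared by every object in A."""
    return frozenset(m for m in attrs if all((g, m) in incidence for g in objs))

def common_objs(attrs_subset):
    """Derivation B -> B': the objects having every attribute in B."""
    return frozenset(g for g in objects if all((g, m) in incidence for m in attrs_subset))

def concepts(attrs):
    """All formal concepts (extent, intent) of the context restricted to attrs."""
    found = set()
    for B in chain.from_iterable(combinations(attrs, r) for r in range(len(attrs) + 1)):
        extent = common_objs(frozenset(B))
        found.add((extent, common_attrs(extent, attrs)))
    return found

removed = "aquatic"
kept = [a for a in attributes if a != removed]
full, reduced = concepts(attributes), concepts(kept)

# The reduction induces an equivalence on the original concepts: two concepts are
# related when their intents, restricted to the surviving attributes, generate the
# same concept of the reduced context.
classes = {}
for extent, intent in full:
    key = common_attrs(common_objs(intent - {removed}), kept)   # closure in the reduced context
    classes.setdefault(key, []).append((set(extent), set(intent)))

print(f"{len(full)} concepts before reduction, {len(reduced)} after")
for key, members in classes.items():
    print(sorted(key), "<-", members)
```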

    Identifying Non-Sublattice Equivalence Classes Induced by an Attribute Reduction in FCA

    The detection of redundant or irrelevant variables (attributes) in datasets is essential in different frameworks, such as Formal Concept Analysis (FCA). However, removing such variables can have some impact on the concept lattice, which is closely related to the algebraic structure of the obtained quotient set and its classes. This paper studies the algebraic structure of the induced equivalence classes and characterizes those classes that are convex sublattices of the original concept lattice. Particular attention is given to reductions that remove FCA's unnecessary attributes. The obtained results will be useful to other complementary reduction techniques, such as the recently introduced procedure based on local congruences.
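    A common criterion in FCA for an attribute that can be removed without altering the concept lattice is that its extent (column) equals an intersection of other attributes' extents; whether this coincides exactly with the paper's notion of unnecessary attribute is an assumption on our part. The Python sketch below applies that criterion to an invented context.

```python
from itertools import combinations

# Invented context: each attribute is given by its extent (the objects having it).
objects = {"o1", "o2", "o3", "o4"}
extent = {
    "a": {"o1", "o2"},
    "b": {"o2", "o3"},
    "c": {"o2"},                    # equals extent(a) & extent(b)
    "d": {"o1", "o2", "o3", "o4"},  # full column: the empty intersection already yields it
}

def reducible(attr):
    """True if extent(attr) is an intersection of other attributes' extents,
    i.e. the attribute can be dropped without changing the concept lattice."""
    others = [m for m in extent if m != attr]
    for r in range(len(others) + 1):
        for combo in combinations(others, r):
            inter = set(objects)
            for m in combo:
                inter &= extent[m]
            if inter == extent[attr]:
                return True
    return False

for attr in extent:
    print(attr, "reducible (removable)" if reducible(attr) else "irreducible")
# -> a irreducible, b irreducible, c reducible, d reducible
```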

    Congruences and factorization as reduction tools in formal concept analysis

    Since its introduction in the early 1980s by B. Ganter and R. Wille, Formal Concept Analysis (FCA) has been one of the mathematical tools for data analysis that has experienced the greatest development. FCA is a mathematical theory that determines conceptual structures among datasets. In particular, databases are formally interpreted in this theory through the notion of a context, which is determined by a set of objects, a set of attributes and a relation between both sets. The tools provided by FCA make it possible to handle the data appropriately and to extract relevant information from them. One of the most important lines of research is the reduction of the set of attributes contained in these datasets, preserving the essential information and removing any redundancy they may contain. Attribute reduction has also been studied in other settings, such as Rough Set Theory, as well as in the different fuzzy generalizations of both theories. In FCA, it has been shown that carrying out an attribute reduction of a formal context induces an equivalence relation on the set of concepts of the original context. This induced equivalence relation has a particularity: its equivalence classes have the structure of a join-semilattice with a maximum element, that is, in general they do not form closed algebraic structures. In this thesis we study how attribute reductions can be complemented by endowing the equivalence classes with a closed algebraic structure. The notion of congruence achieves this purpose; however, the use of this kind of equivalence relation may lead to a considerable loss of information, because the equivalence classes group together too many concepts. To address this problem, this thesis introduces a weakened notion of congruence, which we call local congruence. Local congruences give rise to equivalence classes with the structure of a convex sublattice, being more flexible when grouping concepts while preserving interesting properties from an algebraic point of view. A general discussion is presented of the main results concerning the study and application of local congruences obtained throughout the research carried out during the thesis. In particular, the notion of local congruence is introduced together with an analysis of the properties it satisfies, as well as an ordering relation on the set of its equivalence classes. In addition, we carry out an in-depth analysis of the impact that the use of local congruences has on FCA, both on the formal context and on the concept lattice. In this analysis we identify those equivalence classes of the relation induced by an attribute reduction on which the local congruence would act, grouping concepts differently in order to obtain convex sublattices. Additionally, we study the use of local congruences when the considered attribute reduction removes all the unnecessary attributes of the context, obtaining interesting results.
    We present several mechanisms that allow local congruences to be computed and applied to concept lattices, detailing the modifications made to the formal context in order to provide a reduction method based on local congruences. Furthermore, another strategy that allows us to reduce the complexity of analysing formal contexts is factorization. Factorization procedures make it possible to split a context into two or more smaller formal subcontexts, which can then be studied separately more easily. A preliminary study is presented on the factorization of fuzzy formal contexts using modal operators, which has not yet been published in a journal. These modal operators have already been used to extract independent subcontexts from a classical formal context, thus obtaining a factorization of the original context. In this thesis we also study several properties that help us to better understand how the decomposition of Boolean data tables works, and we then adapt these properties to the multi-adjoint framework. The study of these general properties in the multi-adjoint framework will be highly relevant for obtaining, in the future, a procedure that factorizes multi-adjoint formal contexts. Therefore, obtaining factorization mechanisms for multi-adjoint contexts will be key for the analysis and processing of large databases.

    TSKY: a dependable middleware solution for data privacy using public storage clouds

    Dissertation submitted for the degree of Master in Computer Engineering. This dissertation aims to take advantage of the virtues offered by data storage cloud based systems on the Internet, proposing a solution that avoids security issues by combining different providers' solutions in a vision of a cloud-of-clouds storage and computing. The solution, the TSKY System (or Trusted Sky), is implemented as a middleware system featuring a set of components designed to establish and enhance conditions for security, privacy, reliability and availability of data, with these conditions being secured and verifiable by the end-user, independently of each provider. These components implement cryptographic tools, including threshold and homomorphic cryptographic schemes, combined with encryption, replication, and dynamic indexing mechanisms. The solution allows data management and distribution functions over data kept in different storage clouds, not necessarily trusted, improving and ensuring resilience and security guarantees against Byzantine faults and attacks. The generic approach of the TSKY system model and its implemented services are evaluated in the context of a Trusted Email Repository System (TSKY-TMS System). The TSKY-TMS system is a prototype that uses the base TSKY middleware services to store mailboxes and email messages in a cloud-of-clouds.
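    The abstract mentions threshold cryptographic schemes as one of the tools for spreading trust across providers. As a minimal sketch of the general idea, not of TSKY's actual implementation, the following Python code implements a (k, n) Shamir secret sharing over a prime field: any k shares reconstruct the data while fewer reveal nothing, so each share can be placed with a different, individually untrusted storage cloud. The field size, parameters and example payload are illustrative.

```python
import secrets

# Prime modulus for the finite field (large enough for a 16-byte block per split).
PRIME = 2 ** 127 - 1

def split(secret: int, k: int, n: int):
    """Split `secret` into n shares; any k of them reconstruct it (Shamir's scheme)."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):                      # evaluate the polynomial at x = 1..n
        y = 0
        for c in reversed(coeffs):                 # Horner's rule mod PRIME
            y = (y * x + c) % PRIME
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME   # modular inverse
    return secret

# Example: a 3-of-5 split, e.g. one share per storage provider.
data = int.from_bytes(b"mailbox key", "big")
shares = split(data, k=3, n=5)
assert reconstruct(shares[:3]) == data             # any 3 shares suffice
assert reconstruct([shares[0], shares[2], shares[4]]) == data
```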

    A history of Galois fields

    This paper stresses a specific line of development of the notion of finite field, from Évariste Galois’s 1830 “Note sur la théorie des nombres,” and Camille Jordan’s 1870 Traité des substitutions et des équations algébriques, to Leonard Dickson’s 1901 Linear groups with an exposition of the Galois theory. This line of development highlights the key role played by some specific algebraic procedures. These intrinsically interlaced the indexations provided by Galois’s number-theoretic imaginaries with decompositions of the analytic representations of linear substitutions. Moreover, these procedures shed light on a key aspect of Galois’s works that had received little attention until now. The methodology of the present paper is based on investigations of intertextual references for identifying some specific collective dimensions of mathematics. We shall take as a starting point a coherent network of texts that were published mostly in France and in the U.S.A. from 1893 to 1907 (the “Galois fields network,” for short). The main shared references in this corpus were some texts published in France over the course of the 19th century, especially by Galois, Hermite, Mathieu, Serret, and Jordan. The issue of the collective dimensions underlying this network is thus especially intriguing. Indeed, the historiography of algebra has often put to the fore some specific approaches developed in Germany, with little attention to works published in France. Moreover, the “German abstract algebra” has been considered to have strongly influenced the development of the American mathematical community. Actually, this influence has precisely been illustrated by the example of Eliakim Hastings Moore’s lecture on “abstract Galois fields” at the Chicago congress in 1893. To be sure, this intriguing situation raises some issues of circulations of knowledge from Paris to Chicago. It also calls for reflection on the articulations between the individual and the collective dimensions of mathematics. Such articulations have often been analysed by appealing to categories such as nations, disciplines, or institutions (e.g., the “German algebra,” the “Chicago algebraic research school”). Yet, we shall see that these categories fail to characterize an important specific approach to Galois fields. The coherence of the Galois fields network had underlying it some collective interest for “linear groups in Galois fields.” Yet, the latter designation was pointing less to a theory, or a discipline, revolving around a specific object, i.e. GL_n(F_{p^n}) (p a prime number), than to some specific procedures. In modern parlance, general linear groups in Galois fields were introduced in this context as the maximal group in which an elementary abelian group (i.e., the additive group of a Galois field) is a normal subgroup. The Galois fields network was actually rooted in a specific algebraic culture that had developed over the course of the 19th century. We shall see that this shared culture resulted from the circulation of some specific algebraic procedures of decompositions of polynomial representations of substitutions.

    Typed Generic Traversal With Term Rewriting Strategies

    A typed model of strategic term rewriting is developed. The key innovation is that generic traversal is covered. To this end, we define a typed rewriting calculus S'_{gamma}. The calculus employs a many-sorted type system extended by designated generic strategy types gamma. We consider two generic strategy types, namely the types of type-preserving and type-unifying strategies. S'_{gamma} offers traversal combinators to construct traversals or schemes thereof from many-sorted and generic strategies. The traversal combinators model different forms of one-step traversal, that is, they process the immediate subterms of a given term without anticipating any scheme of recursion into terms. To inhabit generic types, we need to add a fundamental combinator to lift a many-sorted strategy s to a generic type gamma. This step is called strategy extension. The semantics of the corresponding combinator states that s is only applied if the type of the term at hand fits; otherwise the extended strategy fails. This approach dictates that the semantics of strategy application must be type-dependent to a certain extent. Typed strategic term rewriting with coverage of generic term traversal is a simple but expressive model of generic programming. It has applications in program transformation and program analysis. Comment: 85 pages, submitted for publication to the Journal of Logic and Algebraic Programming.
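    The ingredients named in the abstract (one-step traversal combinators, recursion schemes built from them, and strategy extension that applies a many-sorted strategy only where the type fits and fails elsewhere) can be mimicked dynamically. The sketch below is an untyped Python analogue of the type-preserving side only, with invented names; it is not the S'_{gamma} calculus itself.

```python
from dataclasses import dataclass, fields, replace
from typing import Any, Callable

# Invented term type: a tiny expression language standing in for many-sorted terms.
@dataclass
class Lit:
    val: int

@dataclass
class Var:
    name: str

@dataclass
class Add:
    left: Any
    right: Any

class StrategyFailure(Exception):
    """Raised when a strategy does not apply to the term at hand."""

def extend(f: Callable, node_type: type) -> Callable:
    """Strategy extension: lift a monomorphic function to a strategy that applies
    only when the term has the expected type and fails otherwise."""
    def strategy(term):
        if isinstance(term, node_type):
            return f(term)
        raise StrategyFailure(type(term).__name__)
    return strategy

def try_(s: Callable) -> Callable:
    """Recover from failure: behave as the identity where s does not apply."""
    def strategy(term):
        try:
            return s(term)
        except StrategyFailure:
            return term
    return strategy

def all_subterms(s: Callable) -> Callable:
    """One-step, type-preserving traversal: apply s to every immediate subterm."""
    def strategy(term):
        if not hasattr(term, "__dataclass_fields__"):
            return term               # atoms (ints, strings) have no subterms
        return replace(term, **{f.name: s(getattr(term, f.name)) for f in fields(term)})
    return strategy

def bottom_up(s: Callable) -> Callable:
    """A recursion scheme assembled from one-step traversal: rewrite leaves first."""
    def strategy(term):
        return s(all_subterms(strategy)(term))
    return strategy

# Usage: increment every literal, leave all other nodes untouched.
inc_lits = bottom_up(try_(extend(lambda lit: Lit(lit.val + 1), Lit)))
print(inc_lits(Add(Lit(1), Add(Var("x"), Lit(2)))))
# -> Add(left=Lit(val=2), right=Add(left=Var(name='x'), right=Lit(val=3)))
```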

    Efficient engineering of supervisory controllers

    Absolutism, Relativism, and Universalism in Personality Traits Across Cultures: The Case of the Big Five

    Personality is a broad concept used to organize the myriad ways that people differ psychologically from one another. There is evidence that such differences have been important to humans everywhere, in that personality-relevant terms appear in all known languages. Empirical attempts to identify the most useful individual differences and their structure have emphasized cross-cultural evidence, but rigid adherence to a Big Five model has sometimes meant ignoring heterogeneous results. We start with a framework for more precisely defining the universality versus cultural specificity of personality concepts and models, in order to better assess cross-cultural evidence. As this 50th anniversary of the IACCP is also the 50th anniversary of the first large lexical study of personality, and more or less of the Big Five model, we take the opportunity to explore how personality has been studied across contexts, both through the lexical method and in 100 articles on personality topics (most using questionnaires) identified in the pages of JCCP. Personality articles in JCCP, classified into three types based on their balance of emic and etic components, illustrate larger trends in personality psychology. With the benefit of hindsight, we reflect on what each type has to offer going forward, and we encourage cross-cultural personality psychologists to go beyond imposed etic studies that seek primarily to confirm Western models in other contexts. The kinds of insights that more integrative emic and etic approaches can bring to the study of psychology across cultures are highlighted, and a future research agenda is provided.