57 research outputs found

    Towards Intelligent Databases

    This article is a presentation of the objectives and techniques of deductive databases. The deductive approach to databases aims at extending other database paradigms, which describe applications extensionally, with intensional definitions. We first show how constructive specifications can be expressed with deduction rules, and how normative conditions can be defined using integrity constraints. We outline the principles of bottom-up and top-down query answering procedures and present the techniques used for integrity checking. We then argue that it is often desirable to manage with a database system not only database applications, but also specifications of system components. We present such meta-level specifications and discuss their advantages over conventional approaches.
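
    To make the bottom-up procedure concrete, here is a minimal sketch of naive fixpoint evaluation of deduction rules, with an integrity constraint checked over the derived facts. The predicates and facts (parent/ancestor) are invented for illustration, and this is a toy sketch rather than a full deductive engine:

        # Naive bottom-up evaluation: apply every deduction rule to the known
        # facts and repeat until no new fact can be derived (a least fixpoint).

        def is_var(term):
            # Convention for this sketch: uppercase names are variables.
            return isinstance(term, str) and term[:1].isupper()

        def substitute(atom, binding):
            return tuple(binding.get(t, t) for t in atom)

        def match_body(body, db, binding):
            # Yield every variable binding under which all body atoms hold.
            if not body:
                yield binding
                return
            first, rest = body[0], body[1:]
            for fact in db:
                if fact[0] != first[0] or len(fact) != len(first):
                    continue
                b, ok = dict(binding), True
                for t, v in zip(first[1:], fact[1:]):
                    if is_var(t):
                        if b.get(t, v) != v:
                            ok = False
                            break
                        b[t] = v
                    elif t != v:
                        ok = False
                        break
                if ok:
                    yield from match_body(rest, db, b)

        def naive_eval(facts, rules):
            db = set(facts)
            changed = True
            while changed:
                new = {substitute(head, b)
                       for head, body in rules
                       for b in match_body(body, db, {})} - db
                changed = bool(new)
                db |= new
            return db

        # Extensional facts plus an intensional, recursive definition of ancestor.
        facts = {("parent", "ann", "bob"), ("parent", "bob", "carol")}
        rules = [
            (("ancestor", "X", "Y"), [("parent", "X", "Y")]),
            (("ancestor", "X", "Z"), [("parent", "X", "Y"), ("ancestor", "Y", "Z")]),
        ]
        db = naive_eval(facts, rules)
        assert ("ancestor", "ann", "carol") in db  # derived bottom-up

        # A normative condition as an integrity constraint: no self-ancestors.
        assert not any(f[0] == "ancestor" and f[1] == f[2] for f in db)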

    Congruences and factorization as reduction tools in formal concept analysis

    Since its introduction in the early 1980s by B. Ganter and R. Wille, Formal Concept Analysis (FCA) has been one of the most actively developed mathematical tools for data analysis. FCA is a mathematical theory that determines conceptual structures among data sets. In particular, databases are formally interpreted in this theory through the notion of a context, which is determined by a set of objects, a set of attributes, and a relation between the two sets. The tools provided by FCA make it possible to manipulate the data adequately and to extract relevant information from it. One of the most important lines of research is the reduction of the set of attributes contained in these data sets, preserving the essential information and eliminating any redundancy they may contain. Attribute reduction has also been studied in other settings, such as Rough Set Theory, as well as in the various fuzzy generalizations of both theories. In FCA, it has been shown that performing an attribute reduction on a formal context induces an equivalence relation on the set of concepts of the original context. This induced equivalence relation has a particularity: its equivalence classes have the structure of a join-semilattice with a maximum element, that is, in general they do not form closed algebraic structures. In this thesis we study how attribute reductions can be complemented by endowing the equivalence classes with a closed algebraic structure. The notion of congruence achieves this purpose; however, using this kind of equivalence relation can lead to a considerable loss of information, because the equivalence classes group together too many concepts. To address this problem, this thesis introduces a weakened notion of congruence that we call a local congruence. A local congruence gives rise to equivalence classes with the structure of a convex sublattice, being more flexible when grouping concepts while retaining properties that are interesting from an algebraic point of view. We present a general discussion of the main results on the study and application of local congruences obtained throughout the research carried out during the thesis. In particular, the notion of local congruence is introduced together with an analysis of the properties it satisfies, as well as an ordering relation on the set of equivalence classes. Moreover, we carry out an in-depth analysis of the impact of using local congruences in FCA, both on the formal context and on the concept lattice. In this analysis we identify those equivalence classes of the relation induced by an attribute reduction on which the local congruence would act, grouping concepts differently in order to obtain convex sublattices. Additionally, we study the use of local congruences when the considered attribute reduction removes all unnecessary attributes from the context, obtaining interesting results.
We present several mechanisms for computing local congruences and applying them to concept lattices, detailing the modifications made to the formal context in order to provide a reduction method based on local congruences. On the other hand, factorization mechanisms are another strategy that allows us to reduce the complexity of analyzing formal contexts. Factorization procedures split a context into two or more smaller formal subcontexts, which can then be studied separately more easily. We present a preliminary study, not yet published in a journal, on the factorization of fuzzy formal contexts using modal operators. These modal operators have already been used to extract independent subcontexts from a classical formal context, thereby obtaining a factorization of the original context. In this thesis we also study several properties that help us better understand how the decomposition of Boolean data tables works, and then adapt those properties to the multi-adjoint framework. The study of these general properties in the multi-adjoint framework will be highly relevant for obtaining, in the future, a procedure for factorizing multi-adjoint formal contexts. Consequently, obtaining factorization mechanisms for multi-adjoint contexts will be key to the analysis and processing of large databases.
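
    As a minimal illustration of the notions recalled above (a toy context, not the thesis's local-congruence machinery), the following sketch builds a small formal context and enumerates its formal concepts, i.e. the pairs (A, B) of an object set and an attribute set that are fixed by the two derivation operators; the data is invented:

        # A formal context: objects, attributes, and an incidence relation.
        from itertools import chain, combinations

        objects = ["o1", "o2", "o3"]
        attributes = ["a", "b", "c"]
        incidence = {("o1", "a"), ("o1", "b"), ("o2", "b"), ("o2", "c"), ("o3", "b")}

        def intent(objs):
            # Derivation operator: attributes shared by every object in objs.
            return frozenset(a for a in attributes
                             if all((o, a) in incidence for o in objs))

        def extent(attrs):
            # Derivation operator: objects possessing every attribute in attrs.
            return frozenset(o for o in objects
                             if all((o, a) in incidence for a in attrs))

        def concepts():
            # Closing every attribute subset reaches every formal concept,
            # since each concept is determined by its (closed) intent.
            subsets = chain.from_iterable(
                combinations(attributes, r) for r in range(len(attributes) + 1))
            return {(extent(s), intent(extent(s))) for s in subsets}

        for A, B in sorted(concepts(), key=lambda c: (len(c[0]), sorted(c[0]))):
            print(sorted(A), "<->", sorted(B))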

    9th International Workshop "What can FCA do for Artificial Intelligence?" (FCA4AI 2021)

    Formal Concept Analysis (FCA) is a mathematically well-founded theory aimed at classification and knowledge discovery that can be used for many purposes in Artificial Intelligence (AI). The objective of the ninth edition of the FCA4AI workshop (see http://www.fca4ai.hse.ru/) is to investigate several issues, such as: how FCA can support various AI activities (knowledge discovery, knowledge engineering, machine learning, data mining, information retrieval, recommendation...), how FCA can be extended in order to help AI researchers solve new and complex problems in their domains, and how FCA can play a role in current trends in AI such as explainable AI and fairness of algorithms in decision making. The workshop was held in co-location with IJCAI 2021, Montréal, Canada, on August 28, 2021.

    MATLAB

    This excellent book represents the final part of a three-volume series on MATLAB-based applications in almost every branch of science. The book consists of 19 excellent, insightful articles, and readers will find the results very useful to their work. The book is organized in three parts: the first is devoted to mathematical methods in the applied sciences using MATLAB, the second to MATLAB applications of general interest, and the third to MATLAB for educational purposes. This collection of high-quality articles covers a large range of professional fields and can be used for science as well as for various educational purposes.

    Causality and dispersion relations and the role of the S-matrix in the ongoing research

    The adaptation of the Kramers-Kronig dispersion relations to the causal localization structure of QFT led to an important project in particle physics, the only one with a successful closure. The same cannot be said about the subsequent attempts to formulate particle physics as a pure S-matrix project. The feasibility of a pure S-matrix approach is critically analyzed and its serious shortcomings are highlighted. Whereas the conceptual/mathematical demands of renormalized perturbation theory are modest and misunderstandings could easily be corrected, a correct understanding of the origin of the crossing property requires the use of the mathematical theory of modular localization and its relation to the thermal KMS condition. These new concepts, which combine localization, vacuum polarization and thermal properties under the roof of modular theory, will be explained, and their potential use in a new constructive (nonperturbative) approach to QFT will be indicated. The S-matrix still plays a predominant role but, different from Heisenberg's and Mandelstam's proposals, the new project is not a pure S-matrix approach. The S-matrix plays a new role as a "relative modular invariant". Comment: 47 pages; expansion of arguments, addition of references, and correction of misprints and bad formulations.
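
    For reference, the dispersion relations the abstract starts from have the standard textbook form (for a causal response function f(ω) analytic in the upper half of the complex ω-plane; this is general background, not a formula specific to this paper):

        \operatorname{Re} f(\omega) = \frac{1}{\pi}\,\mathcal{P}\!\int_{-\infty}^{\infty}
            \frac{\operatorname{Im} f(\omega')}{\omega' - \omega}\,d\omega' ,
        \qquad
        \operatorname{Im} f(\omega) = -\frac{1}{\pi}\,\mathcal{P}\!\int_{-\infty}^{\infty}
            \frac{\operatorname{Re} f(\omega')}{\omega' - \omega}\,d\omega' ,

    where \mathcal{P} denotes the Cauchy principal value.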

    Guide to Discrete Mathematics


    Representation and parsing of multiword expressions

    This book consists of contributions related to the definition, representation and parsing of MWEs. These reflect current trends in the representation and processing of MWEs. They cover various categories of MWEs such as verbal, adverbial and nominal MWEs; various linguistic frameworks (e.g. tree-based and unification-based grammars); various languages (including English, French, Modern Greek, Hebrew and Norwegian); and various applications (namely MWE detection, parsing and automatic translation), using both symbolic and statistical approaches.

    Current trends

    Deep parsing is the fundamental process aiming at the representation of the syntactic structure of phrases and sentences. In the traditional methodology this process is based on lexicons and grammars representing, roughly, properties of words and interactions of words and structures in sentences. Several linguistic frameworks, such as Head-driven Phrase Structure Grammar (HPSG), Lexical Functional Grammar (LFG), Tree Adjoining Grammar (TAG), Combinatory Categorial Grammar (CCG), etc., offer different structures and combining operations for building grammar rules. These already contain mechanisms for expressing properties of Multiword Expressions (MWEs), which, however, need improvement in how they account for idiosyncrasies of MWEs on the one hand and their similarities to regular structures on the other. This collaborative book constitutes a survey of various attempts at representing and parsing MWEs in the context of linguistic theories and applications.
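
    As a toy illustration of the MWE detection task mentioned above (not one of the book's systems; the lexicon and sentence are invented), the sketch below marks contiguous MWEs in a token stream by longest match against a small multiword lexicon. Real systems must additionally handle discontinuous and morphologically varied MWEs:

        # Greedy longest-match detection of contiguous MWEs (toy data).
        MWE_LEXICON = {
            ("kick", "the", "bucket"),
            ("by", "and", "large"),
            ("traffic", "light"),
        }
        MAX_LEN = max(len(m) for m in MWE_LEXICON)

        def detect_mwes(tokens):
            # Group tokens, always preferring the longest lexicon match.
            out, i = [], 0
            while i < len(tokens):
                for n in range(min(MAX_LEN, len(tokens) - i), 1, -1):
                    window = tuple(tokens[i:i + n])
                    if window in MWE_LEXICON:
                        out.append("_".join(window))  # mark as one lexical unit
                        i += n
                        break
                else:
                    out.append(tokens[i])
                    i += 1
            return out

        print(detect_mwes("he did not kick the bucket".split()))
        # ['he', 'did', 'not', 'kick_the_bucket']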

    Embedded document security using sticky policies and identity based encryption

    Data sharing domains have expanded over several environments, both trusted and insecure. At the same time, data security boundaries have shrunk from internal network perimeters down to a single identity and a piece of information. Since the new EU GDPR regulations, sharing personally identifiable information requires data governance in favour of the data subject. Existing enterprise-grade IRM solutions fail to follow open standards and lack data sharing frameworks that could efficiently integrate with existing identity management and authentication infrastructures. IRM services built to meet cloud demands often offer very limited access control functionality, allowing an individual to store a document online and give read or read-write permission to another individual identified by an email address. Unfortunately, such limited information sharing controls are often introduced as the only safeguards in large enterprises, healthcare institutions and other organizations that should provide the highest possible personal data protection standards. IRM also suffers from a systems architecture vulnerability: an IRM application installed on a semi-trusted client can truly guarantee only all-or-nothing access enforcement. Since no single authority is contacted to verify each committed change, an adversary who possesses the data-encrypting and key-encrypting keys could change and re-encrypt the amended content even though only read access has been granted. Finally, of the two evaluated IRM products, one has an algorithm security lifecycle (ASL) too short to protect the shared data, and the other has a construct that severely restricts secure key-encrypting-key distribution and exposes a symmetric data-encrypting key over the network. The sticky policy with identity-based encryption (SPIBE) solution presented here was designed for secure cloud data sharing. The SPIBE challenge is to deliver a simple, standardized construct that integrates easily with popular OOXML-like document formats and provides simple access rights enforcement over protected content. It leverages a sticky policy construct using the XACML access policy language to express access conditions across different cloud data sharing boundaries. XACML is a cloud-ready standard designed for global, multi-jurisdictional use. Unlike other raw ABAC implementations, XACML offers a standardized schema and authorization protocols, which simplifies interoperability. IBE is a cryptographic scheme that protects the shared document by using an identified policy as an asymmetric key encrypting a symmetric data-encrypting key. Unlike ciphertext-policy attribute-based encryption (CP-ABE), the SPIBE policy contains not only access preferences but also a global document identifier and a unique version identifier, which makes each policy uniquely identifiable in relation to the protected document. In an IBE scheme the public key-encrypting key is known and can be shared between the parties, although the data-encrypting key is never sent over the network. Finally, SPIBE as a framework should have the potential to protect data against new threats when the ASL of a cryptographic primitive in use is too short and the algorithm should be replaced with a new, updated cryptographic primitive. IBE, as a cryptographic protocol, can be implemented with different cryptographic primitives. Identity-based encryption over isogenous pairing groups (IBE-IPG) is a post-quantum-ready construct that leverages the initial Boneh-Franklin IBE (IBE-BF) approach.
Existing IBE implementations could be updated to IBE-IPG without major system amendments. Finally, by applying a blockchain-like construct for document versioning, the framework can verify the authenticity of changes and approve only legitimate document updates, delivering the single authority for non-repudiation and authenticity assurance that other IRM solutions fail to provide.
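
    As a rough sketch of the envelope structure described above (not the actual SPIBE construction): the document travels with its sticky policy, a symmetric data-encrypting key (DEK) protects the content, and the DEK is wrapped under a key-encrypting key tied to the policy identity. In a real deployment the wrap step would use an IBE scheme (IBE-BF or IBE-IPG) with the policy string as the public key and a trusted key server issuing the matching private key; the hash-based derivation below is only a stand-in for that step, all names are invented, and the code assumes the pyca/cryptography package:

        import hashlib
        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        # The sticky policy: an identified, XACML-like access policy. The document
        # id and version id are what make each policy uniquely identifiable.
        policy = b'<Policy PolicyId="doc-42#v7">read-only for group:auditors</Policy>'

        # 1. Encrypt the document under a fresh symmetric data-encrypting key (DEK).
        dek = AESGCM.generate_key(bit_length=256)
        nonce = os.urandom(12)
        # Binding the policy as associated data makes tampering with it detectable.
        ciphertext = AESGCM(dek).encrypt(nonce, b"confidential report body", policy)

        # 2. Wrap the DEK under a key-encrypting key tied to the policy identity.
        #    STAND-IN: real SPIBE derives this via IBE from the policy string, so
        #    the private unwrapping key is issued per identity, never shipped.
        kek = hashlib.sha256(b"ibe-stand-in|" + policy).digest()
        wrap_nonce = os.urandom(12)
        wrapped_dek = AESGCM(kek).encrypt(wrap_nonce, dek, None)

        # The sticky package travels as one unit: policy + wrapped DEK + content.
        package = {"policy": policy, "wrap_nonce": wrap_nonce,
                   "wrapped_dek": wrapped_dek, "nonce": nonce,
                   "ciphertext": ciphertext}

        # 3. An authorised recipient unwraps the DEK and decrypts; AES-GCM raises
        #    InvalidTag if either the policy or the ciphertext was modified.
        kek2 = hashlib.sha256(b"ibe-stand-in|" + package["policy"]).digest()
        dek2 = AESGCM(kek2).decrypt(package["wrap_nonce"],
                                    package["wrapped_dek"], None)
        plaintext = AESGCM(dek2).decrypt(package["nonce"],
                                         package["ciphertext"],
                                         package["policy"])
        assert plaintext == b"confidential report body"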