7,608 research outputs found
Combining Clustering techniques and Formal Concept Analysis to characterize Interestingness Measures
Formal Concept Analysis (FCA) is a data analysis method that enables the
discovery of hidden knowledge in data. One kind of hidden knowledge extracted
from data is association rules. Various quality measures have been reported in
the literature for extracting only the relevant association rules. Given a
dataset, choosing a good quality measure remains a challenging task for the
user. Starting from a matrix that evaluates quality measures against semantic
properties, this paper describes how FCA can highlight quality measures with
similar behavior in order to help the user make this choice. The aim of this
article is to discover clusters of Interestingness Measures (IM) and to
validate those found by the hierarchical and partitioning clustering methods
AHC and k-means. Then, based on a theoretical study of sixty-one
interestingness measures against nineteen properties, proposed in a recent
study, FCA describes several groups of measures.
Comment: 13 pages, 2 figures
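As a minimal sketch of the kind of analysis this abstract describes, the following Python snippet enumerates the formal concepts of a toy binary context relating interestingness measures (objects) to semantic properties (attributes). The measure names and properties below are invented for illustration, not taken from the paper's actual sixty-one-measure study; the enumeration is the naive exponential one, sufficient for small contexts.

```python
from itertools import combinations

# Hypothetical binary context: which semantic properties each
# interestingness measure satisfies (toy data, not the paper's).
context = {
    "confidence": {"asymmetric", "increasing"},
    "lift":       {"symmetric", "increasing"},
    "jaccard":    {"symmetric", "bounded"},
}
attributes = set().union(*context.values())

def extent(attrs):
    """Objects possessing every attribute in attrs."""
    return {m for m, props in context.items() if attrs <= props}

def intent(objs):
    """Attributes shared by every object in objs."""
    if not objs:
        return set(attributes)
    return attributes.intersection(*(context[m] for m in objs))

# A formal concept is a pair (extent, intent) closed under the
# derivation operators; naive enumeration over attribute subsets.
concepts = set()
for r in range(len(attributes) + 1):
    for combo in combinations(sorted(attributes), r):
        objs = extent(set(combo))
        concepts.add((frozenset(objs), frozenset(intent(objs))))

# Each concept groups measures with identical shared behavior,
# which is how FCA surfaces clusters of similar measures.
for ext, itt in sorted(concepts, key=lambda c: -len(c[0])):
    print(sorted(ext), "<->", sorted(itt))
```

Measures that co-occur in a concept's extent share exactly the properties in its intent, which is the lattice-based counterpart of the AHC/k-means clusters the abstract mentions.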
A Categorical View on Algebraic Lattices in Formal Concept Analysis
Formal concept analysis has grown from a new branch of the mathematical field
of lattice theory to a widely recognized tool in Computer Science and
elsewhere. In order to fully benefit from this theory, we believe that it can
be enriched with notions such as approximation by computation or
representability. The latter are commonly studied in denotational semantics and
domain theory and captured most prominently by the notion of algebraicity, e.g.
of lattices. In this paper, we explore the notion of algebraicity in formal
concept analysis from a category-theoretical perspective. To this end, we
equip the notion of approximable concept with a suitable category and show
that the latter is equivalent to the category of algebraic lattices. At the
same time, the paper provides a relatively comprehensive account of the
representation theory of algebraic lattices in the framework of Stone duality,
relating well-known structures such as Scott information systems to further
formalisms from logic, topology, domains and lattice theory.
Comment: 36 pages
Refactorings of Design Defects using Relational Concept Analysis
Software engineers often need to identify and correct design defects, i.e.
recurring design problems that hinder development and maintenance by making
programs harder to comprehend and/or evolve. While detection of design defects
is an actively researched area, their correction, mainly a manual and
time-consuming activity, is yet to be extensively investigated for automation.
In this paper, we propose an automated approach for suggesting
defect-correcting refactorings using relational concept analysis (RCA). The
added value of RCA consists in exploiting the links between formal objects,
which abound in a software re-engineering context. We validated our approach
on instances of the Blob design defect taken from four different open-source
programs.
ASR error management for improving spoken language understanding
This paper addresses the problem of automatic speech recognition (ASR) error
detection and their use for improving spoken language understanding (SLU)
systems. In this study, the SLU task consists of automatically extracting
semantic concepts and concept/value pairs from ASR transcriptions in, for
example, a tourist information system. An approach is proposed for enriching
the set of semantic labels with error-specific labels and for using a recently
proposed neural approach based on word embeddings to compute well-calibrated
ASR confidence measures. Experimental results are reported showing that it is
possible to significantly decrease the Concept/Value Error Rate with a
state-of-the-art system, outperforming previously published results on the
same experimental data. It is also shown that, by combining an SLU approach
based on conditional random fields with a neural encoder/decoder
attention-based architecture, it is possible to effectively identify
confidence islands and uncertain semantic output segments, useful for deciding
appropriate error-handling actions in the dialogue manager strategy.
Comment: Interspeech 2017, Aug 2017, Stockholm, Sweden.
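The calibration step this abstract mentions can be illustrated, purely schematically, with temperature scaling: a standard confidence-calibration technique, used here as a stand-in for the paper's word-embedding-based neural approach. All scores and correctness labels below are invented, and the grid search is a minimal substitute for gradient-based fitting.

```python
import math

def nll(scores, labels, T):
    """Average negative log-likelihood of binary correctness labels
    under sigmoid(score / T); lower means better-calibrated (in NLL)."""
    total = 0.0
    for s, y in zip(scores, labels):
        p = 1.0 / (1.0 + math.exp(-s / T))
        p = min(max(p, 1e-12), 1.0 - 1e-12)  # avoid log(0)
        total -= y * math.log(p) + (1 - y) * math.log(1.0 - p)
    return total / len(scores)

def fit_temperature(scores, labels, grid=None):
    """Pick the temperature minimizing NLL on held-out data."""
    grid = grid or [0.25 * k for k in range(1, 41)]  # 0.25 .. 10.0
    return min(grid, key=lambda T: nll(scores, labels, T))

# Hypothetical raw scores for recognized words, with 1 = the word was
# correct and 0 = it was an ASR error (toy held-out data).
raw_scores = [4.0, 3.5, -2.0, 0.5, 5.0, -3.0, 1.0, -0.5]
correct    = [1,   1,   0,    1,   1,   0,    1,   0]

T = fit_temperature(raw_scores, correct)
confidence = [1.0 / (1.0 + math.exp(-s / T)) for s in raw_scores]
```

A dialogue manager could then threshold these calibrated confidences to decide which semantic segments to trust and which to flag for error handling, as the abstract suggests.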
What is a quantum computer, and how do we build one?
The DiVincenzo criteria for implementing a quantum computer have been seminal
in focussing both experimental and theoretical research in quantum information
processing. These criteria were formulated specifically for the circuit model
of quantum computing. However, several new models for quantum computing
(paradigms) have been proposed that do not seem to fit the criteria well. The
question, therefore, is what the general criteria for implementing quantum
computers are. To this end, a formal operational definition of a quantum computer
is introduced. It is then shown that according to this definition a device is a
quantum computer if it obeys the following four criteria: Any quantum computer
must (1) have a quantum memory; (2) facilitate a controlled quantum evolution
of the quantum memory; (3) include a method for cooling the quantum memory; and
(4) provide a readout mechanism for subsets of the quantum memory. The criteria
are met when the device is scalable and operates fault-tolerantly. We discuss
various existing quantum computing paradigms, and how they fit within this
framework. Finally, we lay out a roadmap for selecting an avenue towards
building a quantum computer. This is summarized in a decision tree intended to
help experimentalists determine the most natural paradigm given a particular
physical implementation.
- …