Minimization via duality
We show how to use duality theory to construct minimized versions of a wide class of automata. We work out three cases in detail: (a variant of) ordinary automata, weighted automata, and probabilistic automata. The basic idea is that, instead of constructing a maximal quotient, we pass to the dual, look for a minimal subalgebra, and then return to the original category. Duality ensures that the minimal subobject becomes the maximally quotiented object.
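For ordinary automata, a concrete instance of this pass-to-the-dual idea is Brzozowski's minimization algorithm: reverse the automaton and determinize it, twice. The sketch below is ours, not the paper's construction; the encoding (transition maps as dictionaries from (state, symbol) pairs to sets of states) is an illustrative choice.

```python
# Sketch of duality-style minimization a la Brzozowski:
# reverse, determinize, reverse, determinize.
# Transitions: delta maps (state, symbol) -> set of successor states.

def reverse(delta, inits, finals):
    """Reverse every transition and swap initial/final states."""
    rdelta = {}
    for (s, a), targets in delta.items():
        for t in targets:
            rdelta.setdefault((t, a), set()).add(s)
    return rdelta, set(finals), set(inits)

def determinize(alpha, delta, inits, finals):
    """Subset construction; returns (states, dfa_delta, start, finals)."""
    start = frozenset(inits)
    seen, todo, ddelta = {start}, [start], {}
    while todo:
        S = todo.pop()
        for a in alpha:
            T = frozenset(t for s in S for t in delta.get((s, a), ()))
            ddelta[(S, a)] = T
            if T not in seen:
                seen.add(T)
                todo.append(T)
    return seen, ddelta, start, {S for S in seen if S & finals}

def brzozowski_minimize(alpha, delta, inits, finals):
    """Minimize an NFA by reversing and determinizing twice."""
    for _ in range(2):
        delta, inits, finals = reverse(delta, inits, finals)
        states, ddelta, start, dfinals = determinize(alpha, delta, inits, finals)
        # re-wrap the DFA transitions as an NFA for the next pass
        delta = {k: {v} for k, v in ddelta.items()}
        inits, finals = {start}, dfinals
    return states, ddelta, start, dfinals
```

Reversal plays the role of passing to the dual; determinizing the reversed automaton and keeping only reachable subsets picks out a minimal subalgebra, and the second reversal returns to the original category with the maximal quotient.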
On the AGN radio luminosity distribution and the black hole fundamental plane
We have studied the dependence of the AGN nuclear radio (1.4 GHz) luminosity
on both the AGN 2-10 keV X-ray and the host-galaxy K-band luminosity. A
complete sample of 1268 X-ray selected AGN (both type 1 and type 2) has been
used, which is the largest catalogue of AGN belonging to statistically well
defined samples where radio, X and K band information exists. At variance with
previous studies, radio upper limits have been statistically taken into account
using a Bayesian maximum-likelihood fitting method. We find that a good fit
is obtained by assuming a plane in the 3D L_R-L_X-L_K space, namely logL_R = xi_X
logL_X + xi_K logL_K + xi_0, with a ~1 dex wide (1 sigma) spread in radio
luminosity. As already shown, no evidence of bimodality in the radio luminosity
distribution was found and therefore any definition of radio loudness in AGN is
arbitrary. Using scaling relations between the BH mass and the host galaxy
K-band luminosity, we have also derived a new estimate of the BH fundamental
plane (in the L_5GHz -L_X-M_BH space). Our analysis shows that previous
measures of the BH fundamental plane are biased by ~0.8 dex in favor of the
most luminous radio sources. Therefore, many AGN studies, where the BH
fundamental plane is used to investigate how AGN regulate their radiative and
mechanical luminosity as a function of the accretion rate, or many AGN/galaxy
co-evolution models, where radio-feedback is computed using the AGN fundamental
plane, should revise their conclusions.
Comment: Submitted to MNRAS. Revised version after minor referee comments. 12 pages, 12 figures.
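The plane logL_R = xi_X logL_X + xi_K logL_K + xi_0 is an ordinary linear model in the log-luminosities. As an illustration only, here is a plain least-squares fit on synthetic data; the paper's actual fit is a Bayesian maximum-likelihood method that accounts for radio upper limits, which least squares ignores, and every number below except the sample size is made up.

```python
import numpy as np

# Illustrative least-squares fit of the plane
#   log L_R = xi_X * log L_X + xi_K * log L_K + xi_0
# on synthetic data (censoring of radio upper limits is ignored here).
rng = np.random.default_rng(0)
n = 1268                                # sample size quoted in the abstract
logLX = rng.uniform(42, 45, n)          # hypothetical X-ray luminosities
logLK = rng.uniform(43, 46, n)          # hypothetical K-band luminosities
xi_X, xi_K, xi_0 = 0.6, 0.8, -20.0      # made-up "true" coefficients
logLR = xi_X * logLX + xi_K * logLK + xi_0 + rng.normal(0, 1.0, n)  # ~1 dex scatter

A = np.column_stack([logLX, logLK, np.ones(n)])   # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, logLR, rcond=None)  # [xi_X, xi_K, xi_0] estimates
scatter = np.std(logLR - A @ coef)                # residual spread in dex
```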
The theory of traces for systems with nondeterminism and probability
This paper studies trace-based equivalences for systems combining nondeterministic and probabilistic choices. We show how trace semantics for such processes can be recovered by instantiating a coalgebraic construction known as the generalised powerset construction. We characterise the resulting semantics and compare them to known definitions of trace equivalences appearing in the literature. Most of our results are based on the interplay between monads and their presentations via algebraic theories.
From Farkas’ lemma to linear programming: An exercise in diagrammatic algebra
Farkas’ lemma is a celebrated result on the solutions of systems of linear inequalities, which finds pervasive application in mathematics and computer science. In this work we show how to formulate and prove Farkas’ lemma in diagrammatic polyhedral algebra, a sound and complete graphical calculus for polyhedra. Furthermore, we show how linear programs can be modeled within the calculus and how some famous duality results can be proved.
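For reference, one classical (non-diagrammatic) formulation of the lemma; the diagrammatic statement proved in the paper may differ in form, and the choice of variant here is ours.

```latex
% Farkas' lemma (one classical variant): for a real matrix A and
% vector b, exactly one of the following two systems has a solution:
%   (1)  Ax = b,        x >= 0
%   (2)  A^T y >= 0,    b^T y < 0
\[
  \bigl(\exists\, x \ge 0 .\; Ax = b\bigr)
  \iff
  \bigl(\forall\, y .\; A^{\mathsf{T}} y \ge 0 \implies b^{\mathsf{T}} y \ge 0\bigr)
\]
```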
gPrune: A Constraint Pushing Framework for Graph Pattern Mining
In graph mining applications, there has been an increasingly strong urge to impose user-specified constraints on the mining results. However, unlike most traditional itemset constraints, structural constraints, such as the density or diameter of a graph, are very hard to push deep into the mining process. In this paper, we give the first comprehensive study of the pruning properties of both traditional and structural constraints, aiming to reduce not only the pattern search space but the data search space as well. A new general framework, called gPrune, is proposed to incorporate all the constraints in such a way that they recursively reinforce each other through the entire mining process. A new concept, Pattern-inseparable Data-antimonotonicity, is proposed to handle the structural constraints unique to the context of graphs, which, combined with known pruning properties, provides a comprehensive and unified classification framework for structural constraints. The exploration of these antimonotonicities in the context of graph pattern mining is a significant extension of the known classification of constraints, and deepens our understanding of the pruning properties of structural graph constraints.
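For contrast with the structural constraints studied here, the classical antimonotone case is easy to push into mining: if an itemset fails a minimum-support constraint, every superset fails it too, so the whole branch can be pruned. A minimal Apriori-style sketch, with data and names that are illustrative and not from the paper:

```python
# Level-wise itemset mining with an antimonotone constraint (support).
# Failing the constraint prunes all supersets of the failing itemset.

def support(itemset, transactions):
    """Number of transactions containing the itemset."""
    return sum(1 for t in transactions if itemset <= t)

def frequent_itemsets(transactions, min_sup):
    items = sorted(set().union(*transactions))
    result = {}
    level = [frozenset([i]) for i in items]
    while level:
        next_level = set()
        for s in level:
            sup = support(s, transactions)
            if sup >= min_sup:          # antimonotone check
                result[s] = sup
                for i in items:
                    if i not in s:
                        next_level.add(s | {i})
            # else: prune -- no superset of s can be frequent
        # Apriori filter: keep candidates whose subsets are all frequent
        level = [c for c in next_level
                 if all(c - {i} in result for i in c)]
    return result
```

Structural graph constraints such as density or diameter admit no such simple monotone behavior, which is what motivates the paper's finer classification.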
Rewriting modulo symmetric monoidal structure
String diagrams are a powerful and intuitive graphical syntax for terms of symmetric monoidal categories (SMCs). They find many applications in computer science and are becoming increasingly relevant in other fields such as physics and control theory.
An important role in many such approaches is played by equational theories of diagrams, typically oriented and applied as rewrite rules. This paper lays a comprehensive foundation for this form of rewriting. We interpret diagrams combinatorially as typed hypergraphs and establish the precise correspondence between diagram rewriting modulo the laws of SMCs on the one hand and double pushout (DPO) rewriting of hypergraphs, subject to a soundness condition called convexity, on the other. This result rests on a more general characterisation theorem in which we show that typed hypergraph DPO rewriting amounts to diagram rewriting modulo the laws of SMCs with a chosen special Frobenius structure.
We illustrate our approach with a proof of termination for the theory of non-commutative bimonoids
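To give a flavour of the combinatorial reading, here is a toy rewrite on a hypergraph-encoded diagram: hyperedges carry a label with ordered input and output nodes, and applying a rule deletes the matched left-hand side and glues in the right-hand side along the interface. The rule shown is the monoid unit law mult(unit(), x) -> x; the encoding is ours and elides the formal pushout machinery.

```python
# Toy DPO-style rewrite on a hypergraph-encoded diagram.
# Each hyperedge: {"label": str, "ins": [node, ...], "outs": [node, ...]}.

def apply_unit_law(edges):
    """One application of mult(unit(), x) -> x, if it matches."""
    for e1 in edges:
        if e1["label"] != "unit":
            continue
        u = e1["outs"][0]
        for e2 in edges:
            if e2["label"] == "mult" and e2["ins"] and e2["ins"][0] == u:
                x, y = e2["ins"][1], e2["outs"][0]
                rest = [e for e in edges if e not in (e1, e2)]
                # gluing step: identify x with y (an identity wire)
                ren = lambda n: x if n == y else n
                return [{"label": e["label"],
                         "ins": [ren(n) for n in e["ins"]],
                         "outs": [ren(n) for n in e["outs"]]}
                        for e in rest]
    return edges
```

The convexity condition in the paper is what guarantees that such a match corresponds to a genuine sub-diagram of an SMC term rather than an arbitrary sub-hypergraph.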
How to kill epsilons with a dagger: a coalgebraic take on systems with algebraic label structure
We propose an abstract framework for modeling state-based systems with internal behavior, e.g. given by silent or ϵ-transitions. Our approach employs monads with a parametrized fixpoint operator † to give a semantics to these systems and to implement a sound procedure of abstraction of the internal transitions, whose labels are seen as the unit of a free monoid. More broadly, our approach extends the standard coalgebraic framework for state-based systems by taking into account the algebraic structure of the labels of their transitions. This allows us to consider a wide range of other examples, including Mazurkiewicz traces for concurrent systems.
Funded by the ERDF through the Programme COMPETE and by the Portuguese Foundation for Science and Technology, project ref. FCOMP-01-0124-FEDER-020537 and SFRH/BPD/71956/2010. We acknowledge support by project ANR 12IS0 2001 PACE.
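The classical instance of this abstraction procedure is ϵ-elimination for nondeterministic automata: silent steps are absorbed by saturating transitions with their reflexive-transitive closure. A minimal sketch, with an encoding of our own choosing (the symbol None plays the role of ϵ):

```python
# Epsilon-elimination for an NFA.
# delta maps (state, symbol) -> set of states; symbol None is epsilon.

def eps_closure(state, delta):
    """All states reachable from `state` via epsilon-transitions."""
    closure, todo = {state}, [state]
    while todo:
        s = todo.pop()
        for t in delta.get((s, None), ()):
            if t not in closure:
                closure.add(t)
                todo.append(t)
    return closure

def remove_epsilons(states, alpha, delta, finals):
    """Return an epsilon-free transition map and adjusted final states."""
    new_delta, new_finals = {}, set()
    for s in states:
        cl = eps_closure(s, delta)
        if cl & finals:                 # s can silently reach a final state
            new_finals.add(s)
        for a in alpha:
            targets = set()
            for c in cl:
                for t in delta.get((c, a), ()):
                    targets |= eps_closure(t, delta)
            if targets:
                new_delta[(s, a)] = targets
    return new_delta, new_finals
```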
Confluence of graph rewriting with interfaces
For terminating double-pushout (DPO) graph rewriting systems, confluence is, in general, undecidable. We show that confluence is decidable for an extension of DPO rewriting to graphs with interfaces. This variant is important because it is closely related to rewriting of string diagrams. We show that our result extends, under mild conditions, to decidability of confluence for terminating rewriting systems of string diagrams in symmetric monoidal categories.
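Although the paper's decision procedure works on graphs with interfaces, the shape of a confluence check is easy to illustrate on ordinary string rewriting: in a terminating system, confluence means every one-step divergence rejoins at a common normal form. The brute-force sketch below checks this on all words up to a length bound; it is a finite approximation for illustration, not the paper's algorithm.

```python
from itertools import product

# Brute-force confluence check for a terminating string rewriting
# system: every one-step divergence must reach the same normal form.

def one_steps(w, rules):
    """All words reachable from w in exactly one rewrite step."""
    out = set()
    for lhs, rhs in rules:
        i = w.find(lhs)
        while i != -1:
            out.add(w[:i] + rhs + w[i + len(lhs):])
            i = w.find(lhs, i + 1)
    return out

def normal_form(w, rules):
    """Rewrite to a normal form (assumes the system terminates)."""
    while True:
        nxt = one_steps(w, rules)
        if not nxt:
            return w
        w = min(nxt)   # any choice is fine when the system is confluent

def looks_confluent(rules, alphabet, max_len):
    for n in range(max_len + 1):
        for w in map(''.join, product(alphabet, repeat=n)):
            nfs = {normal_form(v, rules) for v in one_steps(w, rules)}
            if len(nfs) > 1:
                return False
    return True
```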
Flexible constrained sampling with guarantees for pattern mining
Pattern sampling has been proposed as a potential solution to the infamous
pattern explosion. Instead of enumerating all patterns that satisfy the
constraints, individual patterns are sampled proportional to a given quality
measure. Several sampling algorithms have been proposed, but each of them has
its limitations when it comes to 1) flexibility in terms of quality measures
and constraints that can be used, and/or 2) guarantees with respect to sampling
accuracy. We therefore present Flexics, the first flexible pattern sampler that
supports a broad class of quality measures and constraints, while providing
strong guarantees regarding sampling accuracy. To achieve this, we leverage the
perspective on pattern mining as a constraint satisfaction problem and build
upon the latest advances in sampling solutions in SAT as well as existing
pattern mining algorithms. Furthermore, the proposed algorithm is applicable to
a variety of pattern languages, which allows us to introduce and tackle the
novel task of sampling sets of patterns. We introduce and empirically evaluate
two variants of Flexics: 1) a generic variant that addresses the well-known
itemset sampling task and the novel pattern set sampling task as well as a wide
range of expressive constraints within these tasks, and 2) a specialized
variant that exploits existing frequent itemset techniques to achieve
substantial speed-ups. Experiments show that Flexics is both accurate and
efficient, making it a useful tool for pattern-based data exploration.
Comment: Accepted for publication in the Data Mining & Knowledge Discovery journal
(ECML/PKDD 2017 journal track).
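The target distribution of a pattern sampler is easy to state: draw each constraint-satisfying pattern with probability proportional to its quality measure. The naive sketch below realizes that distribution by explicit enumeration, which is precisely what Flexics avoids by building on SAT-based sampling; the data, names, and choice of support as the quality measure are ours.

```python
import random
from itertools import combinations

# Naive baseline: enumerate all itemsets satisfying the constraint
# and sample proportional to a quality measure (here: support).

def support(itemset, transactions):
    return sum(1 for t in transactions if itemset <= t)

def sample_pattern(transactions, min_sup, rng):
    items = sorted(set().union(*transactions))
    patterns, weights = [], []
    for k in range(1, len(items) + 1):
        for combo in combinations(items, k):
            s = frozenset(combo)
            sup = support(s, transactions)
            if sup >= min_sup:              # the constraint
                patterns.append(s)
                weights.append(sup)         # the quality measure
    return rng.choices(patterns, weights=weights, k=1)[0]
```

Enumeration makes the sampler exact but exponential in the number of items; the point of Flexics is to keep the accuracy guarantees without paying this cost.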
The contribution of faint AGNs to the ionizing background at z~4
Finding the sources responsible for the hydrogen reionization is one of the
most pressing issues in cosmology. Bright QSOs are known to ionize their
surrounding neighborhood, but they are too few to ensure the required HI
ionizing background. A significant contribution by faint AGNs, however, could
solve the problem, as recently advocated on the basis of a relatively large
space density of faint active nuclei at z>4. We have carried out an exploratory
spectroscopic program to measure the HI ionizing emission of 16 faint AGNs
spanning a broad U-I color interval, with I~21-23 and 3.6<z<4.2. These AGNs are
three magnitudes fainter than the typical SDSS QSOs (M1450<~-26) which are
known to ionize their surrounding IGM at z>~4. The LyC escape fraction has been
detected with S/N ratio of ~10-120 and is between 44 and 100% for all the
observed faint AGNs, with a mean value of 74% at 3.6<z<4.2 and
-25.1<M1450<-23.3, in agreement with the value found in the literature for much
brighter QSOs (M1450<~-26) at the same redshifts. The LyC escape fraction of
our faint AGNs does not show any dependence on the absolute luminosities or on
the observed U-I colors. Assuming that the LyC escape fraction remains close to
~75% down to M1450~-18, we find that the AGN population can provide between 16
and 73% (depending on the adopted luminosity function) of the whole ionizing UV
background at z~4, measured through the Lyman forest. This contribution
increases to 25-100% if other determinations of the ionizing UV background are
adopted. Extrapolating these results to z~5-7, there are possible indications
that bright QSOs and faint AGNs can provide a significant contribution to the
reionization of the Universe, if their space density is high at M1450~-23.
Comment: Accepted for publication in A&A, 16 pages, 22 figures.