    Coherence for Skew-Monoidal Categories

    I motivate a variation (due to K. Szlachányi) of monoidal categories called skew-monoidal categories where the unital and associativity laws are not required to be isomorphisms, only natural transformations. Coherence has to be formulated differently than in the well-known monoidal case. In my (to my knowledge new) version, it becomes a statement of uniqueness of normalizing rewrites. I present a proof of this coherence theorem and also formalize it fully in the dependently typed programming language Agda. Comment: In Proceedings MSFP 2014, arXiv:1406.153
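
    As a point of comparison only (not part of the paper or its Agda formalization), here is a small Haskell sketch of the data of a skew-monoidal structure in one common orientation of the maps: the unitors and associator are plain functions with fixed directions, and no inverses are required.

```haskell
{-# LANGUAGE TypeFamilies #-}

-- Hypothetical sketch: the tensor is a binary type constructor t with an
-- associated unit type Unit t. Only the directions shown are supplied,
-- which is what makes the structure "skew" rather than monoidal.
class SkewMonoidal t where
  type Unit t
  lam   :: t (Unit t) a -> a            -- left unitor:   I (x) A -> A
  rho   :: a -> t a (Unit t)            -- right unitor:  A -> A (x) I
  assoc :: t (t a b) c -> t a (t b c)   -- associator: (A(x)B)(x)C -> A(x)(B(x)C)

-- Ordinary pairs give an instance in which all three maps happen to be
-- invertible, so every monoidal structure is in particular skew-monoidal.
instance SkewMonoidal (,) where
  type Unit (,) = ()
  lam ((), a)       = a
  rho a             = (a, ())
  assoc ((a, b), c) = (a, (b, c))
```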

    From Subfactors to Categories and Topology I. Frobenius algebras in and Morita equivalence of tensor categories

    We consider certain categorical structures that are implicit in subfactor theory. Making the connection between subfactor theory (at finite index) and category theory explicit sheds light on both subjects. Furthermore, it allows various generalizations of these structures, e.g. to arbitrary ground fields, and the proof of new results about topological invariants in three dimensions. The central notion is that of a Frobenius algebra in a tensor category A, which reduces to the classical notion if A=F-Vect, where F is a field. An object X in A with two-sided dual X^ gives rise to a Frobenius algebra in A, and under weak additional conditions we prove a converse: There exists a bicategory E with Obj(E)={X,Y} such that End_E(X,X) is equivalent to A and such that there are J: Y->X, J^: X->Y producing the given Frobenius algebra. Many properties (additivity, sphericity, semisimplicity, ...) of A carry over to E. We define weak monoidal Morita equivalence (wMe) of tensor categories and establish a correspondence between Frobenius algebras in A and tensor categories B wMe A. While considerably weaker than equivalence of tensor categories, weak monoidal Morita equivalence of A and B implies (for A,B semisimple and spherical or *-categories) that A and B have the same dimension, braided equivalent 'center' (quantum double) and define the same state sum invariants of closed oriented 3-manifolds as defined by Barrett and Westbury. If H is a finite dimensional semisimple and cosemisimple Hopf algebra then H-mod and H^-mod are wMe. The present formalism permits a fairly complete analysis of the quantum double of a semisimple spherical category, which is the subject of the companion paper math.CT/0111205. Comment: latex2e, ca. 58 pages. Requires diagrams.tex V3.88. Proof of Thm. 6.20 and one reference included.
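
    As a rough Haskell analogy (my own toy illustration; Haskell's pairs are only a cartesian stand-in for a general tensor product), the data of a Frobenius algebra on an object is a monoid structure together with a comonoid structure, subject to the Frobenius law relating the two.

```haskell
-- Hypothetical sketch of the data of a Frobenius algebra on a type m,
-- using (,) and () as a toy tensor and unit. The laws are not enforced:
-- associativity and unitality of (mu, eta), coassociativity and
-- counitality of (delta, eps), and the Frobenius condition
--   (mu x id) . (id x delta) = delta . mu = (id x mu) . (delta x id).
class Frobenius m where
  mu    :: (m, m) -> m   -- multiplication
  eta   :: () -> m       -- unit
  delta :: m -> (m, m)   -- comultiplication
  eps   :: m -> ()       -- counit
```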

    A correspondence between rooted planar maps and normal planar lambda terms

    A rooted planar map is a connected graph embedded in the 2-sphere, with one edge marked and assigned an orientation. A term of the pure lambda calculus is said to be linear if every variable is used exactly once, normal if it contains no beta-redexes, and planar if it is linear and the use of variables moreover follows a deterministic stack discipline. We begin by showing that the sequence counting normal planar lambda terms by a natural notion of size coincides with the sequence (originally computed by Tutte) counting rooted planar maps by number of edges. Next, we explain how to apply the machinery of string diagrams to derive a graphical language for normal planar lambda terms, extracted from the semantics of linear lambda calculus in symmetric monoidal closed categories equipped with a linear reflexive object or a linear reflexive pair. Finally, our main result is a size-preserving bijection between rooted planar maps and normal planar lambda terms, which we establish by explaining how Tutte decomposition of rooted planar maps (into vertex maps, maps with an isthmic root, and maps with a non-isthmic root) may be naturally replayed in linear lambda calculus, as certain surgeries on the string diagrams of normal planar lambda terms. Comment: Corrected title field in metadata.
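
    For reference, Tutte's closed formula counts rooted planar maps with n edges as 2 * 3^n * (2n)! / (n! * (n+2)!), giving 1, 2, 9, 54, 378, 2916, ... The short Haskell sketch below (my own; the function name is hypothetical) computes the map side of the correspondence; the paper's result is that the same numbers count normal planar lambda terms by their notion of size.

```haskell
-- Tutte's formula for the number of rooted planar maps with n edges.
rootedPlanarMaps :: Integer -> Integer
rootedPlanarMaps n =
  (2 * 3 ^ n * fact (2 * n)) `div` (fact n * fact (n + 2))
  where
    fact k = product [1 .. k]

-- ghci> map rootedPlanarMaps [0 .. 5]
-- [1,2,9,54,378,2916]
```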

    Topos and Stacks of Deep Neural Networks

    Every known artificial deep neural network (DNN) corresponds to an object in a canonical Grothendieck topos; its learning dynamic corresponds to a flow of morphisms in this topos. Invariance structures in the layers (like CNNs or LSTMs) correspond to Giraud's stacks. This invariance is supposed to be responsible for the generalization property, that is, extrapolation from learning data under constraints. The fibers represent pre-semantic categories (Culioli, Thom), over which artificial languages are defined, with internal logics, intuitionistic, classical or linear (Girard). The semantic functioning of a network is its ability to express theories in such a language for answering questions in output about input data. Quantities and spaces of semantic information are defined by analogy with the homological interpretation of Shannon's entropy (P. Baudot and D.B. 2015). They generalize the measures found by Carnap and Bar-Hillel (1952). Amazingly, the above semantic structures are classified by geometric fibrant objects in a closed model category in the sense of Quillen; they thus give rise to homotopical invariants of DNNs and of their semantic functioning. Intensional type theories (Martin-Löf) organize these objects and the fibrations between them. Information contents and exchanges are analyzed by Grothendieck's derivators.

    A Higher-Order Calculus for Categories

    A calculus for a fragment of category theory is presented. The types in the language denote categories and the expressions functors. The judgements of the calculus systematise categorical arguments such as: an expression is functorial in its free variables; two expressions are naturally isomorphic in their free variables. There are special binders for limits and more general ends. The rules for limits and ends support an algebraic manipulation of universal constructions as opposed to a more traditional diagrammatic approach. Duality within the calculus and applications in proving continuity are discussed with examples. The calculus gives a basis for mechanising a theory of categories in a generic theorem prover like Isabelle
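
    For readers more familiar with functional programming than with end calculi, the flavour of the end binder can be hinted at in Haskell, where the standard rank-2 encoding of an end recovers natural transformations as an end of hom-sets. The names below (End, HomP, Nat) are my own illustration, not the paper's syntax.

```haskell
{-# LANGUAGE RankNTypes #-}

-- An end of a binary type constructor p: one element of p x x,
-- chosen uniformly in x.
newtype End p = End { runEnd :: forall x. p x x }

-- The hom "profunctor" between two functors f and g.
newtype HomP f g a b = HomP { runHomP :: f a -> g b }

-- The classic example manipulated by end calculi: natural transformations
-- from f to g as the end of Hom(f -, g -).
type Nat f g = End (HomP f g)

-- Example: 'reverse' packaged as a natural transformation from [] to [].
revNat :: Nat [] []
revNat = End (HomP reverse)
```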

    Modular Normalization with Types

    With the increasing use of software in today’s digital world, software is becoming more and more complex, and the cost of developing and maintaining it has skyrocketed. It has become pressing to develop software using effective tools that reduce this cost. Programming language research aims to develop such tools using mathematically rigorous foundations. A recurring and central concept in programming language research is normalization: the process of transforming a complex expression in a language to a canonical form while preserving its meaning. Normalization has compelling benefits in theory and practice, but is extremely difficult to achieve. Several program transformations that are used to optimise programs, prove properties of languages and check program equivalence, for instance, are after all instances of normalization, but they are seldom viewed as such. Viewed through the lens of current methods, normalization lacks the ability to be broken into sub-problems and solved independently, i.e., it lacks modularity. To make matters worse, such methods rely excessively on the syntax of the language, making the resulting normalization algorithms brittle and sensitive to changes in the syntax. When the syntax of the language evolves due to modification or extension, as it almost always does in practice, the normalization algorithm may need to be revisited entirely. To circumvent these problems, normalization is currently either abandoned entirely or concrete instances of normalization are achieved using ad hoc means specific to a particular language. Continuing this trend in programming language research poses the risk of building on a weak foundation where languages either lack fundamental properties that follow from normalization or several concrete instances end up repeated in an ad hoc manner that lacks reusability. This thesis advocates for the use of type-directed Normalization by Evaluation (NbE) to develop normalization algorithms. NbE is a technique that provides an opportunity for a modular implementation of normalization algorithms by allowing us to disentangle the syntax of a language from its semantics. Types further this opportunity by allowing us to dissect a language into isolated fragments, such as functions and products, with an individual specification of syntax and semantics. To illustrate type-directed NbE in context, we develop NbE algorithms and show their applicability for typed programming language calculi in three different domains (modal types, static information-flow control and categorical combinators) and for a family of embedded domain-specific languages in Haskell.
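
    As background for readers unfamiliar with the technique, the following is a minimal untyped NbE sketch in Haskell: interpret terms into a semantic domain of values, then quote values back to beta-normal syntax. This is my own illustration of the general recipe only; the thesis develops typed, modular algorithms well beyond it.

```haskell
-- Untyped lambda terms with de Bruijn indices.
data Tm = Var Int | Lam Tm | App Tm Tm
  deriving Show

-- Semantic domain: functions for lambdas, plus stuck (neutral) terms.
data Val = VLam (Val -> Val) | VNe Ne
data Ne  = NVar Int | NApp Ne Val   -- NVar carries a de Bruijn *level*

-- Evaluation: syntax -> semantics (environment indexed by de Bruijn index).
eval :: [Val] -> Tm -> Val
eval env (Var i)   = env !! i
eval env (Lam t)   = VLam (\v -> eval (v : env) t)
eval env (App t u) = apply (eval env t) (eval env u)

apply :: Val -> Val -> Val
apply (VLam f) v = f v
apply (VNe n)  v = VNe (NApp n v)

-- Quotation: semantics -> normal forms, at binder depth k.
quote :: Int -> Val -> Tm
quote k (VLam f) = Lam (quote (k + 1) (f (VNe (NVar k))))
quote k (VNe n)  = quoteNe k n

quoteNe :: Int -> Ne -> Tm
quoteNe k (NVar l)   = Var (k - l - 1)   -- convert level back to index
quoteNe k (NApp n v) = App (quoteNe k n) (quote k v)

-- Normalization by evaluation for closed terms.
normalize :: Tm -> Tm
normalize t = quote 0 (eval [] t)
```

    For instance, normalize (App (Lam (Var 0)) (Lam (Var 0))) yields Lam (Var 0).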

    On unitary 2-representations of finite groups and topological quantum field theory

    This thesis contains various results on unitary 2-representations of finite groups and their 2-characters, as well as on pivotal structures for fusion categories. The motivation is extended topological quantum field theory (TQFT), where the 2-category of unitary 2-representations of a finite group is thought of as the '2-category assigned to the point' in the untwisted finite group model. The first result is that the braided monoidal category of transformations of the identity on the 2-category of unitary 2-representations of a finite group computes as the category of conjugation equivariant vector bundles over the group equipped with the fusion tensor product. This result is consistent with the extended TQFT hypotheses of Baez and Dolan, since it establishes that the category assigned to the circle can be obtained as the 'higher trace of the identity' of the 2-category assigned to the point. The second result is about 2-characters of 2-representations, a concept which has been introduced independently by Ganter and Kapranov. It is shown that the 2-character of a unitary 2-representation can be made functorial with respect to morphisms of 2-representations, and that in fact the 2-character is a unitarily fully faithful functor from the complexified Grothendieck category of unitary 2-representations to the category of unitary equivariant vector bundles over the group. The final result is about pivotal structures on fusion categories, with a view towards a conjecture made by Etingof, Nikshych and Ostrik. It is shown that a pivotal structure on a fusion category cannot exist unless certain involutions on the hom-sets are plus or minus the identity map, in which case a pivotal structure is the same thing as a twisted monoidal natural transformation of the identity functor on the category. Moreover the pivotal structure can be made spherical if and only if these signs can be removed. Comment: PhD thesis, University of Sheffield, submitted Oct 2008. 243 pages, many figures.

    Ribbon Proofs - A Proof System for the Logic of Bunched Implications

    Submitted for the degree of Doctor of Philosophy, Queen Mary, University of London

    From Reduction-Based to Reduction-Free Normalization

    We present a systematic construction of a reduction-free normalization function. Starting from a reduction-based normalization function, i.e., the transitive closure of a one-step reduction function, we successively subject it to refocusing (i.e., deforestation of the intermediate reduced terms), simplification (i.e., fusing auxiliary functions), refunctionalization (i.e., Church encoding), and direct-style transformation (i.e., the converse of the CPS transformation). We consider two simple examples and treat them in detail: for the first one, arithmetic expressions, we construct an evaluation function; for the second one, terms in the free monoid, we construct an accumulator-based flatten function. The resulting two functions are traditional reduction-free normalization functions. The construction builds on previous work on refocusing and on a functional correspondence between evaluators and abstract machines. It is also reversible
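
    To give a flavour of where the second example ends up, here is a Haskell approximation (my own, not the paper's exact definitions or its derivation) of an accumulator-based flatten function for terms over a free monoid: normal forms are right-nested products ending in the unit.

```haskell
-- Terms of the free monoid over a carrier type a.
data Term a = Leaf a | Unit | Mul (Term a) (Term a)
  deriving Show

-- Reduction-free normalization: flatten with an accumulator, dropping
-- units and re-associating all products to the right in a single pass.
flatten :: Term a -> Term a
flatten t = go t Unit
  where
    go (Leaf x)  acc = Mul (Leaf x) acc
    go Unit      acc = acc
    go (Mul l r) acc = go l (go r acc)

-- ghci> flatten (Mul (Mul (Leaf 1) Unit) (Leaf 2))
-- Mul (Leaf 1) (Mul (Leaf 2) Unit)
```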