2,513 research outputs found

    A compositional theory of digital circuits

    A theory is compositional if complex components can be constructed out of simpler ones on the basis of their interfaces, without inspecting their internals. Digital circuits, despite being studied for nearly a century and used at scale for about half that time, have until recently evaded a fully compositional theoretical understanding. The sticking point has been the need to avoid feedback loops that bypass memory elements, the so-called 'combinational feedback' problem: detecting such loops requires examining the internal structure of a circuit, defeating compositionality. Recent work remedied this theoretical shortcoming by showing how digital circuits can be presented compositionally as morphisms in a freely generated Cartesian traced (or dataflow) category. The focus was to support a better syntactic understanding of digital circuits, culminating in the formulation of a novel operational semantics for digital circuits. In this paper we shift the focus onto the denotational theory of such circuits, interpreting them as functions on streams with certain properties. These properties ensure that the model is fully abstract, i.e. the equational theory and the semantic model are in perfect agreement. To support this result we introduce two key equations: the first reduces circuits with combinational feedback to circuits without combinational feedback via finite unfoldings of the loop, and the second translates syntactically between open circuits with the same behaviour by reducing the problem to checking a finite number of closed circuits. The most important consequence of this new semantics is that we can now give a recipe that ensures a circuit always produces observable output, thus using the denotational model to inform and improve the operational semantics. Comment: restructured and refined presentation, 21 pages
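    To make the "functions on streams" reading concrete, here is a minimal sketch, assuming a synchronous interpretation in which streams are finite lists of booleans and feedback is closed through a unit-delay register. The helper names (delay, feedback, toggle) are illustrative choices, not taken from the paper, and the sketch sidesteps the combinational-feedback question that the paper's two equations address by always routing feedback through a register.

    ```python
    # Sketch: synchronous circuits as functions on streams.
    # Streams are finite lists of booleans; feedback passes through a
    # unit-delay register, so each output tick depends only on past inputs.

    def delay(init, xs):
        """Unit-delay register: emit `init` first, then the input one tick late."""
        return [init] + xs[:-1]

    def feedback(step, init, xs):
        """Close a loop through a register holding `init`: at each tick the
        previous register value is fed back into the combinational `step`."""
        out, state = [], init
        for x in xs:
            state = step(x, state)   # new register value = step(input, old value)
            out.append(state)
        return out

    # Example circuit: a toggle whose output flips whenever the input is high.
    def toggle(xs):
        return feedback(lambda x, q: (not q) if x else q, False, xs)

    print(toggle([True, False, True, True]))   # [True, True, False, True]
    print(delay(False, [True, True, False]))   # [False, True, True]
    ```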

    Feature extraction and classification of movie reviews


    Algebra, coalgebra, and minimization in polynomial differential equations

    We consider reasoning and minimization in systems of polynomial ordinary differential equations (ODEs). The ring of multivariate polynomials is employed as a syntax for denoting system behaviours. We endow this set with a transition system structure based on the concept of Lie derivative, thus inducing a notion of L-bisimulation. We prove that two states (variables) are L-bisimilar if and only if they correspond to the same solution in the ODE system. We then characterize L-bisimilarity algebraically, in terms of certain ideals in the polynomial ring that are invariant under Lie derivation. This characterization allows us to develop a complete algorithm, based on building an ascending chain of ideals, for computing the largest L-bisimulation containing all valid identities that are instances of a user-specified template. A specific largest L-bisimulation can be used to build a reduced system of ODEs, equivalent to the original one but minimal among all those obtainable by linear aggregation of the original equations. A computationally less demanding approximate reduction and linearization technique is also proposed. Comment: 27 pages; extended and revised version of the FOSSACS 2017 paper
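    The basic step underlying this construction, the Lie derivative of a polynomial along the vector field given by the ODEs, is easy to state concretely. The following is a small sketch using sympy; the example system x' = y, y' = -x and the name lie_derivative are illustrative assumptions, not taken from the paper.

    ```python
    # Sketch: Lie derivative of a polynomial p along a polynomial vector field,
    # i.e. the sum over variables v of (dp/dv) * v'.  A polynomial whose Lie
    # derivative is 0 is a conserved quantity of the ODE system.
    import sympy as sp

    x, y = sp.symbols('x y')
    field = {x: y, y: -x}            # the ODE system  x' = y,  y' = -x

    def lie_derivative(p, field):
        return sp.expand(sum(sp.diff(p, v) * rhs for v, rhs in field.items()))

    print(lie_derivative(x**2 + y**2, field))   # 0 -> x**2 + y**2 is invariant
    print(lie_derivative(x*y, field))           # -x**2 + y**2
    ```

    Roughly, iterating this operation and testing membership in a polynomial ideal is what the ascending-chain algorithm described in the abstract automates.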

    Apperceptive patterning: Artefaction, extensional beliefs and cognitive scaffolding

    In “Psychopower and Ordinary Madness” my ambition, as it relates to Bernard Stiegler’s recent literature, was twofold: 1) critiquing Stiegler’s work on exosomatization and artefactual posthumanism—or, more specifically, nonhumanism—to problematize approaches to media archaeology that rely upon technical exteriorization; 2) challenging how Stiegler engages with Giuseppe Longo and Francis Bailly’s conception of negative entropy. These efforts were directed by a prevalent techno-cultural qualifier: the rise of Synthetic Intelligence (including neural nets, deep learning, predictive processing and Bayesian models of cognition). This paper continues this project, but first directs a critical analytic lens at the Derridean practice of the ontologization of grammatization from which Stiegler emerges, while also distinguishing how metalanguages operate in relation to object-oriented environmental interaction by way of inferentialism. Stalking continental (Kapp, Simondon, Leroi-Gourhan, etc.) and analytic traditions (e.g., Carnap, Chalmers, Clark, Sutton, Novaes), we move from artefacts to AI and Predictive Processing so as to link theories related to technicity with philosophy of mind. Simultaneously drawing forth Robert Brandom’s conceptualization of the roles that commitments play in retrospectively reconstructing the social experiences that lead to our endorsement(s) of norms, we complement this account with Reza Negarestani’s deprivatized account of intelligence while analyzing the equipollent role between language and media (both digital and analog).