Discrete Mathematics
The purpose of the present work is to provide short and supple teaching notes
for a few-hour introductory course on elementary \textit{Enumerative
Algebraic Combinatorics}. We fully adopt the \textit{Rota way} (see, e.g.
\cite{KY}). The themes are organized into a suitable sequence that allows us to
derive any result from the preceding ones by elementary processes. Definitions
\textit{Combinatorial coefficients} are defined directly by their \textit{combinatorial
meaning}. The derivation techniques for formulae/results are founded upon
constructions and two general and elementary principles/methods:
- The \textit{bad element} method (for \textit{recursive} formulae). As the
reader should recognize, the bad element method might be regarded as a
combinatorial companion of the idea of \textit{conditional probability}.
- The \textit{overcounting} principle (for \textit{closed-form} formulae).
Therefore, \textit{no computation} is required in \textit{proofs}:
\textit{computation formulae are byproducts of combinatorial constructions}. We
tried to provide a self-contained presentation: the only prerequisite is
standard high school mathematics. We limited ourselves to the
\textit{combinatorial point of view}: we invite the reader to draw the
(obvious) \textit{probabilistic interpretations}.
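The bad-element method can be illustrated with the textbook recursion for the binomial coefficients: fix a distinguished ("bad") element of an $n$-set and split the $k$-subsets by whether or not they contain it. A minimal Python sketch of this construction (the function name is ours, not from the notes):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def binomial(n, k):
    """Count k-subsets of an n-set via the bad-element recursion:
    subsets avoiding the bad element:   binomial(n-1, k)
    subsets containing the bad element: binomial(n-1, k-1)
    """
    if k < 0 or k > n:
        return 0
    if k == 0 or k == n:
        return 1
    return binomial(n - 1, k) + binomial(n - 1, k - 1)

print(binomial(5, 2))  # → 10
```

No closed-form computation is performed: the value falls out of the construction alone, in the spirit described above.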
Emergent Design
Explorations in Systems Phenomenology in Relation to Ontology, Hermeneutics and the Meta-dialectics of Design
SYNOPSIS
A Phenomenological Analysis of Emergent Design is performed based on the foundations of General Schemas Theory. The concept of Sign Engineering is explored in terms of Hermeneutics, Dialectics, and Ontology in order to define Emergent Systems and Metasystems Engineering based on the concept of Meta-dialectics.
ABSTRACT
Phenomenology, Ontology, Hermeneutics, and Dialectics will dominate our inquiry into
the nature of the Emergent Design of the System and its inverse dual, the Meta-system. This is a speculative dissertation that attempts to produce a philosophical, mathematical, and theoretical view of the nature of Systems Engineering Design. Emergent System Design, i.e., the design of yet unheard-of and/or hitherto non-existent Systems and Metasystems, is the focus. This study is a frontal assault on the hard problem of explaining how Engineering produces new things, rather than a repetition or reordering of concepts that already exist. In this work the philosophies of E. Husserl, A. Gurwitsch, M. Heidegger, J. Derrida, G. Deleuze, A. Badiou, G. Hegel, I. Kant and other Continental Philosophers are brought to bear on different aspects of how new technological systems come into existence through the midwifery of Systems Engineering. Sign Engineering is singled out as the most important aspect of Systems Engineering. We will build on the work of Pieter Wisse and extend his theory of Sign Engineering to define Meta-dialectics in the form of Quadralectics and then Pentalectics. Along the way the various ontological levels of Being are explored in conjunction with the discovery that the Quadralectic is related to the possibility of design primarily at the Third Meta-level of Being, called Hyper Being. Design Process is dependent upon the emergent possibilities that appear in Hyper Being. Hyper Being, termed by Heidegger as Being (Being crossed-out) and termed by Derrida as Differance, also appears as the widest space within the Design Field at the third meta-level of Being and therefore provides the most leverage that is needed to produce emergent effects. Hyper Being is where possibilities appear within our worldview. Possibility is necessary for emergent events to occur. Hyper Being possibilities are extended by Wild Being propensities to allow the embodiment of new things.
We discuss how this philosophical background relates to meta-methods such as the Gurevich Abstract State Machine and the Wisse Metapattern methods, as well as real-time architectural design methods as described in the Integral Software Engineering Methodology. One aim of this research is to find the foundation for extending the ISEM methodology to become a general-purpose Systems Design Methodology. Our purpose is also to bring these philosophical considerations into the practical realm by examining P. Bourdieu’s ideas on the relationship between theoretical and practical reason and M. de Certeau’s ideas on practice. The relationship between design and implementation is seen in terms of the Set/Mass conceptual opposition. General Schemas Theory is used as a way of critiquing the dependence on Set-based mathematics as a basis for Design. The dissertation delineates a new foundation for Systems Engineering as Emergent Engineering based on General Schemas Theory, and provides an advanced theory of Design based on the understanding of the meta-levels of Being, particularly focusing upon the relationship between Hyper Being and Wild Being in the context of Pure and Process Being.
Preliminares al estudio de la huella en lingüística
The present paper constitutes a brief advance of much longer and more detailed ongoing work on the concept of “trace” in contemporary linguistic theory, particularly in syntax. It is commonly believed that the idea was coined by Noam Chomsky. However, we already detect its use, with a very precise value, in the early work of Zellig Harris on mathematical linguistics or, to be more exact, on mathematical structures of language. In its origins, rather than being an index responsible for marking the location occupied by a unit prior to its syntactic movement (which always takes the form of fronting), the trace was the result of a matrix product between n-adic functions. Thus, in Harris the trace is primarily a concept anchored in matrix calculus or, to put it differently, an algebraic notion. Chomsky’s notion, in turn, is closely related to the LISP programming language. This text seeks to provide a preliminary analysis of the conceptual complexity implied in the concept of trace, of which linguists should become aware, for otherwise they will be doomed to be entangled in misunderstandings unfruitful to our discipline for decades to come.
Long Sequence Hopfield Memory
Sequence memory is an essential attribute of natural and artificial
intelligence that enables agents to encode, store, and retrieve complex
sequences of stimuli and actions. Computational models of sequence memory have
been proposed where recurrent Hopfield-like neural networks are trained with
temporally asymmetric Hebbian rules. However, these networks suffer from
limited sequence capacity (maximal length of the stored sequence) due to
interference between the memories. Inspired by recent work on Dense Associative
Memories, we expand the sequence capacity of these models by introducing a
nonlinear interaction term, enhancing separation between the patterns. We
derive novel scaling laws for sequence capacity with respect to network size,
significantly outperforming existing scaling laws for models based on
traditional Hopfield networks, and verify these theoretical results with
numerical simulation. Moreover, we introduce a generalized pseudoinverse rule
to recall sequences of highly correlated patterns. Finally, we extend this
model to store sequences with variable timing between state transitions and
describe a biologically plausible implementation, with connections to motor
neuroscience.
Comment: NeurIPS 2023 Camera-Ready, 41 pages
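As a rough illustration of the mechanism described above (a simplified sketch, not the paper's exact model), the temporally asymmetric retrieval step can be written in NumPy: the overlap of the current state with each stored pattern is passed through a polynomial separation term `x**n`, as in Dense Associative Memories, and each term pulls the state toward the *successor* pattern in the sequence. The function name and the choice `n=3` are our assumptions:

```python
import numpy as np

def recall_next(state, patterns, n=3):
    """One asymmetric retrieval step for a stored sequence (sketch).

    patterns: (T, N) array of +/-1 patterns xi^1 ... xi^T.
    Overlaps m^mu of `state` with each pattern are sharpened by a
    polynomial nonlinearity (the dense-memory separation term), and
    each sharpened overlap weights the successor pattern xi^(mu+1).
    """
    N = patterns.shape[1]
    overlaps = patterns[:-1] @ state / N       # m^mu for mu = 1..T-1
    field = patterns[1:].T @ (overlaps ** n)   # successor-weighted sum
    return np.sign(field)

rng = np.random.default_rng(0)
seq = rng.choice([-1.0, 1.0], size=(6, 200))   # 6 random patterns, N = 200
print(np.array_equal(recall_next(seq[0], seq, n=3), seq[1]))
```

With `n = 1` this reduces to the classical asymmetric Hebbian rule; larger odd `n` suppresses crosstalk between patterns, which is the intuition behind the improved capacity scaling claimed in the abstract.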
On Musical Self-Similarity : Intersemiosis as Synecdoche and Analogy
Self-similarity, a concept borrowed from mathematics, is gradually becoming a keyword in musicology. Although a polysemic term, self-similarity often refers to the multi-scalar repetition of features in a set of relationships, and it is commonly valued as an indication of musical ‘coherence’ and ‘consistency’. In this study, Gabriel Pareyon presents a theory of musical meaning formation in the context of intersemiosis, that is, the translation of meaning from one cognitive domain to another (e.g. from mathematics to music, or to speech or graphic forms). From this perspective, the degree of coherence of a musical system relies on a synecdochic intersemiosis: a system of related signs within other comparable and correlated systems. The author analyzes the modalities of such correlations, exploring their general and particular traits, and their operational bounds. Accordingly, the notion of analogy is used as a rich concept through its two definitions from the Classical literature, proportion and paradigm, which are enormously valuable in establishing criteria of measurement, likeness and affinity. At the same time, original arguments by Benoît B. Mandelbrot (1924–2010) are revisited, alongside a systematic critique of the literature on the subject. In fact, connecting Charles S. Peirce’s ‘synechism’ with Mandelbrot’s ‘fractality’ is one of the main developments of the present study.
Graph Theory and Universal Grammar
Thesis archived under Portaria nº 227/2017 of 25 July (Registration of Foreign Degree).
In the last few years, Noam Chomsky (1994; 1995; 2000; 2001) has gone quite far in
the direction of simplifying syntax, including eliminating X-bar theory and the levels
of D-structure and S-structure entirely, as well as reducing movement rules to a
combination of the more primitive operations of Copy and Merge. What remain in
the Minimalist Program are the operations Merge and Agree and the levels of LF
(Logical Form) and PF (Phonological form).
My doctoral thesis attempts to offer an economical theory of syntactic structure
from a graph-theoretic point of view (cf. Diestel, 2005), with special emphases on the
elimination of category and projection labels and the Inclusiveness Condition
(Chomsky 1994). The major influences for the development of such a theory have
been Chris Collins’ (2002) seminal paper “Eliminating Labels”, John Bowers’ (2001)
unpublished manuscript “Syntactic Relations”, and the Cartographic Paradigm (see
Belletti, Cinque and Rizzi’s volumes with OUP for a starting point regarding this
paradigm).
A syntactic structure will be regarded here as a graph consisting of the set of
lexical items, the set of relations among them, and nothing more.
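The closing definition can be made concrete with a toy sketch (all names, the example sentence, and the direction of the relations are our illustrative assumptions, not the thesis's notation): a structure is just a pair of sets, and Merge does nothing but add an edge between two lexical items, with no category or projection labels introduced.

```python
# Hypothetical sketch: a syntactic structure as a bare graph.
# Vertices are lexical items, edges are primitive syntactic
# relations between them -- and nothing more.

def merge(graph, head, dependent):
    """Merge as pure edge addition: relate two lexical items directly,
    adding no labels (in the spirit of the Inclusiveness Condition)."""
    vertices, edges = graph
    return vertices | {head, dependent}, edges | {(head, dependent)}

g = (set(), set())            # empty structure: no items, no relations
g = merge(g, "saw", "Mary")   # "saw" relates to its object "Mary"
g = merge(g, "saw", "John")   # "saw" relates to its subject "John"
print(sorted(g[0]))           # → ['John', 'Mary', 'saw']
```

Nothing in the resulting object carries an X-bar label or a projection level; any such information would have to be recovered from the shape of the graph itself, which is the economy the thesis argues for.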