24,186 research outputs found

    Modal Logics of Topological Relations

    Full text link
    Logical formalisms for reasoning about relations between spatial regions play a fundamental role in geographical information systems, spatial and constraint databases, and spatial reasoning in AI. In analogy with Halpern and Shoham's modal logic of time intervals based on the Allen relations, we introduce a family of modal logics equipped with eight modal operators that are interpreted by the Egenhofer-Franzosa (or RCC8) relations between regions in topological spaces such as the real plane. We investigate the expressive power and computational complexity of logics obtained in this way. It turns out that our modal logics have the same expressive power as the two-variable fragment of first-order logic, but are exponentially less succinct. The complexity ranges from (undecidable and) recursively enumerable to highly undecidable, where the recursively enumerable logics are obtained by considering substructures of structures induced by topological spaces. As our undecidability results also capture logics based on the real line, they improve upon undecidability results for interval temporal logics by Halpern and Shoham. We also analyze modal logics based on the five RCC5 relations, with similar results regarding the expressive power, but weaker results regarding the complexity.
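    Since the abstract's undecidability results also cover logics based on the real line, a concrete way to see what the eight modal operators range over is to classify the RCC8 relation between two closed intervals of ℝ. The following Python sketch is only an illustration of the RCC8 relations themselves, not of the paper's logics; the function name and interval encoding are assumptions made for this example.

```python
# Illustrative sketch: the eight RCC8 (Egenhofer-Franzosa) relations,
# evaluated for closed intervals [a, b] of the real line (a < b).
# The representation and function name are assumptions for this example.

def rcc8(x, y):
    """Return the RCC8 relation holding between closed intervals x and y."""
    a, b = x
    c, d = y
    if (a, b) == (c, d):
        return "EQ"      # identical regions
    if b < c or d < a:
        return "DC"      # disconnected: no shared points
    if b == c or d == a:
        return "EC"      # externally connected: share only a boundary point
    if c <= a and b <= d:                      # x is a proper part of y
        return "TPP" if (a == c or b == d) else "NTPP"
    if a <= c and d <= b:                      # y is a proper part of x
        return "TPPi" if (a == c or b == d) else "NTPPi"
    return "PO"          # partial overlap

if __name__ == "__main__":
    print(rcc8((0, 2), (1, 3)))   # PO
    print(rcc8((0, 1), (1, 2)))   # EC
    print(rcc8((0, 1), (0, 3)))   # TPP
    print(rcc8((1, 2), (0, 3)))   # NTPP
```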

    On the possible Computational Power of the Human Mind

    Full text link
    The aim of this paper is to address the question: Can an artificial neural network (ANN) model be used as a possible characterization of the power of the human mind? We discuss what the relationship between such a model and its natural counterpart might be. A possible characterization of the mind's different power capabilities is suggested in terms of the information contained in it (its computational complexity) or achievable by it. This characterization takes advantage of recent results on natural neural networks (NNN) and on the computational power of arbitrary artificial neural networks (ANN). If neural networks are accepted as the model of the human mind's operation, these results become quite relevant. Comment: Complexity, Science and Society Conference, 2005, University of Liverpool, UK. 23 pages

    The complexity of coverability in ν-Petri nets

    Get PDF
    We show that the coverability problem in ν-Petri nets is complete for ‘double Ackermann’ time, thus closing an open complexity gap between an Ackermann lower bound and a hyper-Ackermann upper bound. The coverability problem captures the verification of safety properties in this nominal extension of Petri nets with name management and fresh name creation. Our completeness result establishes ν-Petri nets as a model of intermediate power among the formalisms of nets enriched with data, and relies on new algorithmic insights brought by the use of well-quasi-order ideals.
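    For readers unfamiliar with the coverability problem mentioned above, the sketch below shows the classic backward-coverability procedure for plain Petri nets in Python. It is a deliberately simplified illustration: it omits the name management and fresh-name creation that distinguish ν-Petri nets and drive the double-Ackermann complexity, and all names and the net encoding are assumptions made for this example.

```python
# Backward coverability for plain Petri nets (an illustrative simplification;
# nu-Petri nets additionally manage and create fresh names, which this
# sketch does not model).  A transition is a pair (pre, post) of vectors.

def backward_coverable(initial, target, transitions):
    """Can some marking >= target be reached from initial?"""
    def minimize(basis):
        # keep only the minimal elements of the upward-closed set
        return [m for m in basis
                if not any(n != m and all(ni <= mi for ni, mi in zip(n, m))
                           for n in basis)]

    basis = [tuple(target)]
    while True:
        new = []
        for m in basis:
            for pre, post in transitions:
                # minimal marking from which firing this transition covers m
                pred = tuple(p + max(mi - q, 0)
                             for mi, p, q in zip(m, pre, post))
                if not any(all(bi <= pi for bi, pi in zip(b, pred))
                           for b in basis):
                    new.append(pred)
        if not new:
            break
        basis = minimize(basis + new)
    return any(all(bi <= ii for bi, ii in zip(b, initial)) for b in basis)

if __name__ == "__main__":
    # Two places; the single transition moves a token from place 0 to place 1.
    t1 = ((1, 0), (0, 1))
    print(backward_coverable((2, 0), (0, 2), [t1]))  # True
    print(backward_coverable((1, 0), (0, 2), [t1]))  # False
```

    Termination follows from the well-quasi-ordering of markings (Dickson's lemma); the basis of predecessor markings can only grow finitely often before stabilizing.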

    Afterschool Matters Spring 2005

    Get PDF
    So You Want to Be a Superhero? How Making Comics in an Afterschool Setting Can Develop Young People's Creativity, Literacy, and Identity, by Sarita Khurana. A unique afterschool class in making comic strips and comic books, taught by a professional comic artist, encourages both literacy development and identity development in adolescent participants. 9 pages.
    "It Means Thank You": Culturally Sensitive Literacy Pedagogy in a Migrant Education Program, by Theresa McGinnis. Multilingual and multimodal literacy practices in an out-of-school migrant education program support Cambodian (ethnic Khmer) youth in using diverse modes of communication, revealing the intimate connections among literacy, language, culture, and identity. 7 pages.
    Co-constructing Space for Literacy and Identity Work with LGBTQ Youth, by Mollie V. Blackburn. Adult facilitators in afterschool programs can work with LGBTQ youth to construct a safe space in which the youth can validate their identities in the process of doing literacy work. 7 pages.
    Fabulous Fashions: Links to Learning, Literacy, and Life, by Anne L. Thompson. Students will apply themselves to learning if the context interests them. Focusing on a subject close to middle school students' hearts, such as fashion, rather than on specific academic tasks such as writing or researching, builds intrinsic motivation for learning. 9 pages.
    Embedding Seeds for Better Learning: Sneaking up on Education in a Youth Gardening Program, by Jrène Rahm and Kenneth Grimes. A 4-H program embeds science learning in an entrepreneurial program in which youth plant, harvest, and market their own produce. 9 pages.
    Doing Hair and Literacy in an Afterschool Reading and Writing Workshop for African-American Adolescent Girls, by Daneell Edwards. African-American adolescent girls who expressed little interest in literacy activities nevertheless enthusiastically engaged in reading and writing around a topic that mattered to them, doing hair, particularly when they were allowed to determine the format of the literacy activities. 9 pages.

    On the Computational Complexity and Formal Hierarchy of Second Order Recurrent Neural Networks

    Full text link
    Artificial neural networks (ANNs) with recurrence and self-attention have been shown to be Turing-complete (TC). However, existing work has shown that these ANNs require multiple turns or unbounded computation time, even with unbounded precision in weights, in order to recognize TC grammars. Moreover, under constraints such as fixed or bounded precision neurons and time, ANNs without memory are shown to struggle to recognize even context-free languages. In this work, we extend the theoretical foundation for the 2nd-order recurrent network (2nd RNN) and prove there exists a class of 2nd RNNs that is Turing-complete with bounded time. This model is capable of directly encoding a transition table into its recurrent weights, enabling bounded-time computation, and is interpretable by design. We also demonstrate that 2nd-order RNNs, without memory, under bounded weights and time constraints, outperform modern-day models such as vanilla RNNs and gated recurrent units in recognizing regular grammars. We provide an upper bound and a stability analysis on the maximum number of neurons required by 2nd-order RNNs to recognize any class of regular grammar. Extensive experiments on the Tomita grammars support our findings, demonstrating the importance of tensor connections in crafting computationally efficient RNNs. Finally, we show 2nd-order RNNs are also interpretable by extraction and can extract state machines with higher success rates than first-order RNNs. Our results extend the theoretical foundations of RNNs and offer promising avenues for future explainable AI research. Comment: 12 pages, 5 tables, 1 figure
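    The claim that a second-order RNN can directly encode a transition table into its recurrent weights can be made concrete with a small sketch. The NumPy example below is a hedged illustration, not the paper's exact construction: the gain constant H and the parity-of-zeros automaton are assumptions. It builds a weight tensor W[i, j, k] for the update h_i(t+1) = sigmoid(sum_{j,k} W[i, j, k] h_j(t) x_k(t)) from a DFA's transition function, using one-hot states and inputs.

```python
# Sketch: programming a DFA transition table into a second-order RNN.
# Update rule: h_i(t+1) = sigmoid( sum_{j,k} W[i, j, k] * h_j(t) * x_k(t) ).
# The automaton (parity of '0's over {0, 1}) and the gain H are assumptions.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# DFA: state 0 = even number of '0's seen (accepting), state 1 = odd.
delta = {(0, '0'): 1, (0, '1'): 0, (1, '0'): 0, (1, '1'): 1}
states, alphabet = [0, 1], ['0', '1']

H = 20.0  # large gain keeps the state vector close to one-hot
W = np.full((len(states), len(states), len(alphabet)), -H)
for (j, sym), i in delta.items():
    W[i, j, alphabet.index(sym)] = H   # reward exactly the tabulated transition

def accepts(word):
    h = np.eye(len(states))[0]         # start in state 0, one-hot encoded
    for sym in word:
        x = np.eye(len(alphabet))[alphabet.index(sym)]
        h = sigmoid(np.einsum('ijk,j,k->i', W, h, x))
    return h.argmax() == 0             # accept iff the dominant state is accepting

print(accepts('0101'))   # True: two '0's (even)
print(accepts('0111'))   # False: one '0' (odd)
```

    With one-hot inputs and a large gain, the saturated sigmoid keeps the hidden vector near a vertex of the hypercube at every step, which is one way to read the abstract's point about bounded-time, interpretable computation in tensor-connected RNNs.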

    On security analysis of periodic systems: expressiveness and complexity

    Get PDF
    The development of automated technological systems has seen an increase in interconnectivity among their components. This includes the Internet of Things (IoT) and Industry 4.0 (I4.0), as well as the underlying communication between sensors and controllers. This paper is a step toward a formal framework for specifying such systems and analyzing their underlying properties, including safety and security. We introduce automata systems (AS) motivated by I4.0 applications. We identify various subclasses of AS that reflect different types of requirements on I4.0. We investigate the complexity of the problem of functional correctness of these systems as well as their vulnerability to attacks. We model the presence of various levels of threats to the system by proposing a range of intruder models, based on the number of actions intruders can use.