
    Function definitions in term rewriting and applicative programming

    The frameworks of unconditional and conditional Term Rewriting and Applicative systems are explored with the objective of using them for defining functions. In particular, a new operational semantics, Tue-Reduction, is elaborated for conditional term rewriting systems. For each framework, the concept of evaluating terms that invoke defined functions is formalized. We then discuss how it may be ensured that a function definition in each of these frameworks is meaningful, by defining restrictions that may be imposed to guarantee termination, unambiguity, and completeness of definition. The three frameworks are then compared, examining when a definition may be translated from one formalism to another.
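    As a rough illustration of the general idea of defining a function by rewrite rules and evaluating terms to normal form (a generic sketch only; it is not the paper's Tue-Reduction semantics, and the Peano encoding and innermost strategy below are our own assumptions):

        # Generic sketch: a function defined by unconditional rewrite rules over
        # Peano-encoded naturals, evaluated by rewriting to normal form.
        # Terms are tuples: ("zero",), ("succ", t), ("add", t1, t2).
        #
        # Rules:  add(zero, y)    -> y
        #         add(succ(x), y) -> succ(add(x, y))

        def rewrite(term):
            """Apply one rewrite step if possible; return (new_term, changed?)."""
            if term[0] == "add":
                x, y = term[1], term[2]
                if x[0] == "zero":
                    return y, True
                if x[0] == "succ":
                    return ("succ", ("add", x[1], y)), True
            for i, sub in enumerate(term[1:], start=1):   # otherwise try subterms
                new_sub, changed = rewrite(sub)
                if changed:
                    return term[:i] + (new_sub,) + term[i + 1:], True
            return term, False

        def evaluate(term):
            """Rewrite until no rule applies; terminates because these rules do."""
            changed = True
            while changed:
                term, changed = rewrite(term)
            return term

        two = ("succ", ("succ", ("zero",)))
        one = ("succ", ("zero",))
        print(evaluate(("add", two, one)))   # succ(succ(succ(zero)))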

    Embodied Question Answering

    We present a new AI task -- Embodied Question Answering (EmbodiedQA) -- where an agent is spawned at a random location in a 3D environment and asked a question ("What color is the car?"). In order to answer, the agent must first intelligently navigate to explore the environment, gather information through first-person (egocentric) vision, and then answer the question ("orange"). This challenging task requires a range of AI skills -- active perception, language understanding, goal-driven navigation, commonsense reasoning, and grounding of language into actions. In this work, we develop the environments, end-to-end-trained reinforcement learning agents, and evaluation protocols for EmbodiedQA.
    Comment: 20 pages, 13 figures. Webpage: https://embodiedqa.org
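    A minimal sketch of the perceive-navigate-answer loop the task implies (the env, nav_policy, and answer_model interfaces below are hypothetical stand-ins, not the authors' released environments or models):

        # Hypothetical interfaces only -- not the authors' released code.
        def run_episode(env, nav_policy, answer_model, question, max_steps=100):
            """Spawn, explore from first-person frames, then answer the question."""
            frame = env.reset(question)                   # agent spawned at a random location
            frames = [frame]
            for _ in range(max_steps):
                action = nav_policy(frame, question)      # goal-driven navigation
                if action == "stop":                      # the agent decides it has seen enough
                    break
                frame = env.step(action)                  # egocentric observation after moving
                frames.append(frame)
            return answer_model(frames, question)         # e.g. "orange"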

    Automata-Based Stream Processing

    We propose an automata-theoretic framework for modularly expressing computations on streams of data. With weighted automata as a starting point, we identify three key features that are useful for an automaton model for stream processing: expressing the regular decomposition of streams whose data items are elements of a complex type (e.g., a tuple of values), allowing the hierarchical nesting of several different kinds of aggregations, and specifying modularly the parallel execution and combination of various subcomputations. The combination of these features leads to subtle efficiency considerations that concern the interaction between nondeterminism, hierarchical nesting, and parallelism. We identify a syntactic restriction where the nondeterminism is unambiguous and parallel subcomputations synchronize their outputs. For automata satisfying these restrictions, we show that there is a space- and time-efficient streaming evaluation algorithm. We also prove that when these restrictions are relaxed, the evaluation problem becomes inherently computationally expensive.
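    As a much-simplified illustration of running several streaming sub-aggregations in parallel over one stream and combining their outputs (the Fold and Parallel combinators below are invented for this sketch and are not the paper's automaton model or API):

        # Invented combinators for illustration; not the paper's automaton model.
        class Fold:
            """A streaming aggregator: initial value, per-item update, final output."""
            def __init__(self, init, step, out=lambda acc: acc):
                self.acc, self.step, self.out = init, step, out
            def feed(self, item):
                self.acc = self.step(self.acc, item)
            def result(self):
                return self.out(self.acc)

        class Parallel:
            """Run several aggregations over the same stream; combine their outputs."""
            def __init__(self, *folds):
                self.folds = folds
            def feed(self, item):
                for f in self.folds:
                    f.feed(item)
            def result(self):
                return tuple(f.result() for f in self.folds)

        # Example: average and maximum latency over a stream of tuple-valued items.
        events = [{"lat": 12}, {"lat": 30}, {"lat": 7}]
        avg = Fold((0, 0), lambda a, e: (a[0] + e["lat"], a[1] + 1), lambda a: a[0] / a[1])
        mx  = Fold(0, lambda a, e: max(a, e["lat"]))
        agg = Parallel(avg, mx)
        for e in events:
            agg.feed(e)
        print(agg.result())   # (16.33..., 30)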

    Unsupervised learning of probabilistic grammars

    Probabilistic grammars define joint probability distributions over sentences and their grammatical structures. They have been used in many areas, such as natural language processing, bioinformatics and pattern recognition, mainly for the purpose of deriving grammatical structures from data (sentences). Unsupervised approaches to learning probabilistic grammars induce a grammar from unannotated sentences, which eliminates the need for manual annotation of grammatical structures that can be laborious and error-prone. In this thesis we study unsupervised learning of probabilistic context-free grammars and probabilistic dependency grammars, both of which are expressive enough for many real-world languages but remain tractable in inference. We investigate three different approaches. The first approach is a structure search approach for learning probabilistic context-free grammars. It acquires rules of an unknown probabilistic context-free grammar through iterative coherent biclustering of the bigrams in the training corpus. A greedy procedure is used in our approach to add rules from biclusters such that each set of rules added to the grammar results in the largest increase in the posterior of the grammar given the training corpus. Our experiments on several benchmark datasets show that this approach is competitive with existing methods for unsupervised learning of context-free grammars. The second approach is a parameter learning approach for learning natural language grammars based on the idea of unambiguity regularization. We make the observation that natural language is remarkably unambiguous in the sense that each natural language sentence has a large number of possible parses but only a few of the parses are syntactically valid. We incorporate this prior information into parameter learning by means of posterior regularization. The resulting algorithm family contains classic EM and Viterbi EM, as well as a novel softmax-EM algorithm that can be implemented with a simple and efficient extension to classic EM. Our experiments show that unambiguity regularization improves natural language grammar learning, and when combined with other techniques, our approach achieves state-of-the-art grammar learning results. The third approach is grammar learning with a curriculum. A curriculum is a means of presenting training samples in a meaningful order. We introduce the incremental construction hypothesis that explains the benefits of a curriculum in learning grammars and offers some useful insights into the design of curricula as well as learning algorithms. We present results of experiments with (a) carefully crafted synthetic data that provide support for our hypothesis and (b) a natural language corpus that demonstrates the utility of curricula in unsupervised learning of real-world probabilistic grammars.
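    One common way to realize an EM / softmax-EM / Viterbi-EM family like the one described is to sharpen the E-step posterior over candidate parses by exponentiating and renormalizing it before taking expected counts. The sketch below illustrates only that generic idea; it is not claimed to be the thesis's exact algorithm or parameterization, and the gamma knob is our own naming:

        import numpy as np

        def sharpen_posterior(log_probs, gamma=1.0):
            """Exponentiate and renormalize a posterior over candidate parses.
            gamma = 1 gives the classic EM weights; large gamma approaches Viterbi EM
            (all mass on the single best parse), biasing learning toward low ambiguity."""
            scores = np.asarray(log_probs, dtype=float) * gamma
            scores -= scores.max()                 # numerical stability
            p = np.exp(scores)
            return p / p.sum()

        # Posterior over three candidate parses of one sentence (log-probabilities):
        logp = [-1.0, -1.2, -3.0]
        print(sharpen_posterior(logp, gamma=1.0))  # classic EM weighting of expected counts
        print(sharpen_posterior(logp, gamma=5.0))  # much closer to Viterbi EM (argmax)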

    The Emergence of Leader-Society Value Congruence: A Cross-Cultural Perspective

    Previous research on cross-cultural leadership has focused on the outcomes associated with leadership factors consistent with national cultural values, without exploring how leaders’ individual cultural orientations become congruent with the societal culture in different national settings. The purpose of this paper is to provide a deeper understanding of how leader-society value congruence is produced and how the degree of such congruence varies across cultures. This paper conceptually clarifies the mechanisms that mediate the influence of cultural context on leader-society value congruence; suggests that the effects of societal context are only distal antecedents of congruence between leaders’ individual and societal-level cultural values; and concludes that these effects are manifested via their impact on self-construal and communication patterns.

    Zombies and clones in diversity management


    FHJ: A Formal Model for Hierarchical Dispatching and Overriding

    Multiple inheritance is a valuable feature for Object-Oriented Programming. However, it is also tricky to get right, as illustrated by the extensive literature on the topic. A key issue is the ambiguity arising from inheriting multiple parents, which can have conflicting methods. Numerous existing works provide solutions for conflicts that arise from diamond inheritance, i.e. conflicts that arise from implementations sharing a common ancestor. However, most mechanisms are inadequate to deal with unintentional method conflicts: conflicts which arise from two unrelated methods that happen to share the same name and signature. This paper presents a new model called Featherweight Hierarchical Java (FHJ) that deals with unintentional method conflicts. In our new model, which is partly inspired by C++, conflicting methods arising from unrelated methods can coexist in the same class, and hierarchical dispatching supports unambiguous lookups in the presence of such conflicting methods. To avoid ambiguity, hierarchical information is employed in method dispatching, which uses a combination of static and dynamic type information to choose the implementation of a method at run time. Furthermore, unlike all existing inheritance models, our model supports hierarchical method overriding: that is, methods can be independently overridden along the multiple inheritance hierarchy. We give illustrative examples of our language and features and formalize FHJ as a minimal Featherweight-Java-style calculus.
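    A toy simulation of the hierarchical-dispatching idea, in the spirit of the abstract rather than the FHJ calculus itself (the class names, table encoding, and lookup rule below are our own simplifications): the same dynamic class inherits two unrelated, same-named methods, and the static type of the receiver selects which branch's implementation runs.

        # Toy lookup tables standing in for classes; not the FHJ calculus itself.
        CLASSES = {
            # name: (parents, {method name: implementation})
            "Deck":         ([],                 {"draw": lambda: "draw a card"}),
            "Drawer":       ([],                 {"draw": lambda: "draw a picture"}),
            "DrawableDeck": (["Deck", "Drawer"], {}),   # unintentional conflict: both parents define draw()
        }

        def ancestors(cls):
            """The class itself plus all transitive parents."""
            seen, stack = [], [cls]
            while stack:
                c = stack.pop()
                if c not in seen:
                    seen.append(c)
                    stack.extend(CLASSES[c][0])
            return seen

        def hierarchical_dispatch(dynamic_cls, static_cls, name):
            """Pick the implementation of `name` on the branch named by the static type,
            searching from the dynamic class toward its ancestors."""
            branch = set(ancestors(static_cls))
            for c in ancestors(dynamic_cls):
                if c in branch and name in CLASSES[c][1]:
                    return CLASSES[c][1][name]
            raise AttributeError(name)

        # Same dynamic object, different static views select different code:
        print(hierarchical_dispatch("DrawableDeck", "Deck",   "draw")())   # draw a card
        print(hierarchical_dispatch("DrawableDeck", "Drawer", "draw")())   # draw a picture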