
    Computer recognition of occluded curved line drawings

    A computer program has been designed to interpret scenes from PEANUTS cartoons, viewing each scene as a two-dimensional representation of an event in the three-dimensional world. Characters are identified by name, their orientation and body position are described, and their relationship to other objects in the scene is indicated. This research is seen as an investigation of the problems in recognising flexible non-geometric objects which are subject to self-occlusion as well as occlusion by other objects. A hierarchy of models containing both shape and relational information has been developed to deal with the flexible cartoon bodies. Although the region is the basic unit used in the analysis, the hierarchy makes use of intermediate models to group individual regions into larger, more meaningful functional units. These structures may be shared at a higher level in the hierarchy. Knowledge of model similarities may be applied to select alternative models and to conserve some results of an incorrect model application. The various groupings account for differences among the characters or modifications in appearance due to changes in attitude. Context information plays a key role in the selection of models to deal with ambiguous shapes. By emphasising relationships between regions, the need for a precise description of shape is reduced. Occlusion interferes with the model-based analysis by obscuring the essential features required by the models: both the perceived shape of the regions and the inter-relationships between them are altered. A heuristic based on the analysis of line junctions is used to confirm occlusion as the cause of the failure of a model-to-region match. This heuristic, an extension of the T-joint techniques of polyhedral domains, deals with "curved" junctions and can be applied to cases of multi-layered occlusion. The heuristic was found to be most effective in dealing with occlusion between separate objects; standard instances of self-occlusion were more effectively handled at the model level. This thesis describes the development of the program, structuring the discussion around three main problem areas: models, occlusion, and the control aspects of the system. Relevant portions of the program's analyses are used to illustrate each problem area.
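    The T-joint heuristic mentioned above extends a well-known idea from polyhedral scene analysis: where one contour passes behind another, the occluded edge terminates against the occluding contour, forming a T-shaped junction. The sketch below is an illustrative reconstruction of that test, not the thesis's actual code; the junction representation and the angle tolerance are assumptions made for the example.

```python
import math

def is_t_junction(p, neighbours, tol_deg=20.0):
    """Classify a degree-3 line junction as a T-junction.

    p: (x, y) of the junction; neighbours: three (x, y) points, one per
    incident line segment. A T-junction has two roughly collinear "bar"
    arms (the occluding contour) and one "stem" (the occluded edge).
    Returns the index of the stem arm, or None if the junction is not a T.
    """
    angles = [math.atan2(ny - p[1], nx - p[0]) for nx, ny in neighbours]
    for stem in range(3):
        a, b = [angles[i] for i in range(3) if i != stem]
        # The two bar arms should point in roughly opposite directions.
        if abs(abs(a - b) - math.pi) < math.radians(tol_deg):
            return stem
    return None

# A horizontal occluding contour with an edge ending against it from below:
print(is_t_junction((0, 0), [(-1, 0), (1, 0), (0, -1)]))  # -> 2 (the stem)
```

    For "curved" junctions, as in the cartoon domain, the same test would be applied to tangent directions estimated at the junction rather than to straight segment endpoints.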

    Discrete Mathematics and Symmetry

    Some of the most beautiful studies in Mathematics are related to Symmetry and Geometry. For this reason, we select here some contributions about such aspects and Discrete Geometry. As we know, symmetry in a system means invariance of its elements under transformations. When we consider network structures, symmetry means invariance of the adjacency of nodes under permutations of the node set. Graph isomorphism is an equivalence relation on the set of graphs; it therefore partitions the class of all graphs into equivalence classes. The underlying idea of isomorphism is that some objects have the same structure if we omit the individual character of their components. A set of graphs isomorphic to each other is called an isomorphism class of graphs. An automorphism of a graph G is an isomorphism from G onto itself, and the family of all automorphisms of G forms a permutation group.
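    As a concrete illustration of the definitions above, the following brute-force sketch (invented for this example, with no claim about the collected contributions themselves) enumerates the automorphisms of a small graph by testing every permutation of the node set for adjacency preservation.

```python
from itertools import permutations

def automorphisms(nodes, edges):
    """Enumerate automorphisms of an undirected graph by brute force.

    An automorphism is a permutation of the node set that maps the
    edge set onto itself, i.e. preserves adjacency.
    """
    edge_set = {frozenset(e) for e in edges}
    autos = []
    for perm in permutations(nodes):
        mapping = dict(zip(nodes, perm))
        if all(frozenset((mapping[u], mapping[v])) in edge_set
               for u, v in edges):
            autos.append(mapping)
    return autos

# The path graph 1-2-3 has exactly two automorphisms: the identity
# and the reflection swapping the two endpoints.
for a in automorphisms([1, 2, 3], [(1, 2), (2, 3)]):
    print(a)
```

    Together with composition, the mappings returned here form the automorphism group described in the abstract.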

    Analogy and mathematical reasoning: a survey

    We survey the Artificial Intelligence literature, and other related work, pertaining to the modelling of mathematical reasoning and its relationship with the use of analogy. In particular, we discuss the contribution of Lenat's program AM to models of mathematical discovery and concept-formation. We consider the use of similarity measures to structure a knowledge space and their role in concept acquisition.
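    The survey leaves the choice of similarity measure open; as one hedged illustration of how a set-based measure can structure a space of concepts, the sketch below uses Jaccard similarity over feature sets, with the concept descriptions invented for the example.

```python
def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: |A intersect B| / |A union B|, in [0, 1]."""
    return len(a & b) / len(a | b) if a | b else 1.0

# Toy concept descriptions as feature sets (invented for illustration):
prime = {"integer", "divisor-count-2", "greater-than-1"}
square = {"integer", "perfect-power", "greater-than-0"}
print(jaccard(prime, square))  # 0.2 -> a weak analogy between the concepts
```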

    The Significance of Evidence-based Reasoning for Mathematics, Mathematics Education, Philosophy and the Natural Sciences

    In this multi-disciplinary investigation we show how an evidence-based perspective of quantification---in terms of algorithmic verifiability and algorithmic computability---admits evidence-based definitions of well-definedness and effective computability, which yield two unarguably constructive interpretations of the first-order Peano Arithmetic PA---over the structure N of the natural numbers---that are complementary, not contradictory. The first yields the weak, standard interpretation of PA over N, which is well-defined with respect to assignments of algorithmically verifiable Tarskian truth values to the formulas of PA under the interpretation. The second yields a strong, finitary interpretation of PA over N, which is well-defined with respect to assignments of algorithmically computable Tarskian truth values to the formulas of PA under the interpretation. We situate our investigation within a broad analysis of quantification vis-à-vis:
    * Hilbert's epsilon-calculus
    * Goedel's omega-consistency
    * The Law of the Excluded Middle
    * Hilbert's omega-Rule
    * An Algorithmic omega-Rule
    * Gentzen's Rule of Infinite Induction
    * Rosser's Rule C
    * Markov's Principle
    * The Church-Turing Thesis
    * Aristotle's particularisation
    * Wittgenstein's perspective of constructive mathematics
    * An evidence-based perspective of quantification
    By showing how these are formally inter-related, we highlight the fragility of both the persisting, theistic, classical/Platonic interpretation of quantification grounded in Hilbert's epsilon-calculus, and the persisting, atheistic, constructive/Intuitionistic interpretation of quantification rooted in Brouwer's belief that the Law of the Excluded Middle is non-finitary. We then consider some consequences for mathematics, mathematics education, philosophy, and the natural sciences of an agnostic, evidence-based, finitary interpretation of quantification that challenges classical paradigms in all these disciplines.
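    Two of the items in the list above contrast neatly in standard notation. As a reference point (these are the textbook formulations, not the paper's own evidence-based definitions), Hilbert's infinitary omega-rule and the finitary induction schema of first-order PA can be written as:

```latex
% Hilbert's omega-rule is infinitary: from a proof of P(n) for
% every numeral n, conclude the universal claim.
\frac{P(0) \quad P(1) \quad P(2) \quad \cdots}{\forall n\, P(n)}
\qquad \text{(omega-rule)}

% The finitary induction schema of first-order PA, by contrast,
% needs only two premises.
\frac{P(0) \qquad \forall n\,\bigl(P(n) \rightarrow P(n+1)\bigr)}{\forall n\, P(n)}
\qquad \text{(induction)}
```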

    Examining the effect of V3 interneurons and astrocytes on embryonic stem cell-derived motor neuron maturation in vitro

    Motor function is fundamental to human survival and behaviour. Consequently, muscle impairment and paralysis can severely impact quality of life, as in patients with Amyotrophic Lateral Sclerosis (ALS) and spinal cord injury. Restoring muscle control following damage to motor circuits has proven an elusive goal. Strategies to artificially restore function to paralysed muscles are currently being investigated, and most methods rely on stimulation of host nerves. However, this is only effective in conditions where motoneurons and Neuromuscular Junctions (NMJs) remain intact, which is not the case in neurodegenerative disorders such as ALS. A novel strategy to overcome muscle paralysis involves two emerging technologies: stem cell replacement and optogenetics. Embryonic Stem Cell (ESC)-derived cells can be genetically manipulated to express light-sensitive genes such as Channelrhodopsin-2 (ChR2). When grafted into a peripheral nerve within embryoid bodies (EBs), ESC-derived, ChR2-expressing motoneurons can grow axons that form functional NMJs, enabling optical control of muscle contraction by light stimulation of the graft. Surprisingly, engraftment of purified motoneuron aggregates does not result in the formation of functional NMJs. This suggests that other, non-motoneuronal cells within EBs contribute to the ability of ESC-derived motoneurons to mature and functionally innervate host muscle. In this thesis, I examine the possibility that spontaneous activity arising from intra-graft microcircuits is necessary for motoneuron maturation. To investigate this possibility, I generated co-cultures in vitro with pure populations of ESC-derived motoneurons, astrocytes and V3 interneurons. Immunocytochemistry was used to assess the morphology and synapse formation of motoneurons when cultured alone or in combination with other cell types. The effect of co-culture on the electrophysiological properties and spontaneous activity of motoneurons was investigated by single-cell patch clamping and calcium imaging. My results show that co-culture of motoneurons with astrocytes promotes motoneuron survival and morphological maturation, accelerates their electrophysiological development, increases the density of motoneuronal cholinergic synapses, and enables the development of glutamatergic, motoneuronal spontaneous activity. Motoneurons in astrocyte-containing co-cultures also display spontaneous calcium activity, though calcium activity emerges a week later than spontaneous activity recorded by patch clamping and has different burst characteristics. In contrast to astrocytes, V3 interneurons alone have little, if any, effect on motoneuron maturation. However, together with astrocytes, V3 interneurons accelerate the emergence of spontaneous, glutamatergic activity, alter the density of glutamatergic and cholinergic synapses onto motoneurons, and lead to more mature patterns of spontaneous glutamatergic activity in the motoneuron co-cultures. These results provide insight into the influence of other cell types on motoneuron maturation and suggest that astrocytes and V3 interneurons could be added to motoneurons to produce a more mature, spontaneously active graft for use in cell replacement strategies to overcome muscle paralysis.

    Development of a stochastic simulator for biological systems based on Calculus of Looping Sequences

    Molecular Biology produces a huge amount of data concerning the behavior of the single constituents of living organisms. Nevertheless, this reductionist view is not sufficient to gain a deep comprehension of how such components interact together at the system level, generating the complex behaviors we observe in nature. This is the main motivation behind the rise of one of the most interesting and recent applications of computer science: Computational Systems Biology, a new science integrating experimental activity and mathematical modeling in order to study the organization principles and the dynamic behavior of biological systems. Among the formalisms that either have been applied to or have been inspired by biological systems are automata-based models, rewrite systems, and process calculi. Here we consider a formalism based on term rewriting, called Calculus of Looping Sequences (CLS), aimed at modeling chemical and biological systems. In order to simulate biological systems quantitatively, a stochastic extension of CLS has been developed; it allows rule schemata to be expressed with the notational simplicity of term rewriting, and it has some semantic features that are common in process calculi. In this thesis we study the implementation of a stochastic simulator for the CLS formalism. We propose an extension of Gillespie's stochastic simulation algorithm that handles rule schemata with rate functions, and we present an efficient bottom-up, pre-processing based, CLS pattern-matching algorithm. A simulator implementing the ideas introduced in this thesis has been developed in F#, a multi-paradigm programming language for the .NET framework modeled on OCaml. Although F# is a research project, still under continuous development, it offers product-quality performance. It seamlessly merges the object-oriented, functional, and imperative programming paradigms, allowing the performance, portability, and tools of the .NET framework to be exploited.
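    The thesis extends Gillespie's stochastic simulation algorithm to rule schemata with rate functions. As background, the sketch below is a minimal rendering of the classic direct method for plain mass-action reactions, not the CLS extension itself; the reaction encoding is an assumption made for the example.

```python
import math
import random

def gillespie(x, reactions, t_end, seed=0):
    """Gillespie's direct-method SSA for mass-action reactions.

    x: dict of species counts; reactions: list of (rate_constant,
    consumed, produced), with consumed/produced as dicts mapping
    species to stoichiometric counts. Returns a list of (time, state).
    """
    rng = random.Random(seed)
    t, trace = 0.0, [(0.0, dict(x))]
    while t < t_end:
        # Combinatorial propensity of each reaction in the current state.
        props = [k * math.prod(math.comb(x.get(s, 0), n)
                               for s, n in cons.items())
                 for k, cons, _ in reactions]
        total = sum(props)
        if total == 0:
            break                                # no reaction can fire
        t += rng.expovariate(total)              # time to next event
        r = rng.uniform(0, total)                # choose which reaction fires
        for (_, cons, prod), a in zip(reactions, props):
            if r < a:
                for s, n in cons.items():
                    x[s] -= n
                for s, n in prod.items():
                    x[s] = x.get(s, 0) + n
                break
            r -= a
        trace.append((t, dict(x)))
    return trace

# Toy degradation A -> 0 at rate 0.5 (species and rate invented):
print(gillespie({"A": 100}, [(0.5, {"A": 1}, {})], t_end=1.0)[-1])
```

    The CLS simulator replaces the fixed reaction list with rule schemata matched against terms, which is where the bottom-up pattern-matching algorithm of the thesis comes in.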

    Coping with Uncertainty: Noun Phrase Interpretation and Early Semantic Analysis

    A computer program which can "understand" natural language texts must have both syntactic knowledge about the language concerned and semantic knowledge of how what is written relates to its internal representation of the world. It has been a matter of some controversy how these sources of information can best be integrated to translate from an input text to a formal meaning representation. The controversy has largely concerned the question of how much syntactic analysis must be performed before any semantic analysis can take place. An extreme position in this debate is that a syntactic parse tree for a complete sentence must be produced before any investigation of that sentence's meaning is appropriate. This position has been criticised by those who see understanding as a process that takes place gradually as the text is read, rather than in sudden bursts of activity at the ends of sentences. These critics advocate a model where semantic analysis can operate on fragments of text before the global syntactic structure is determined - a strategy which we will call early semantic analysis. In this thesis, we investigate the implications of early semantic analysis in the interpretation of noun phrases. One possible approach is to say that a noun phrase is a self-contained unit and can be fully interpreted by the time it has been read. Thus it can always be determined what objects a noun phrase refers to without consulting much more than the structure of the phrase itself. This approach was taken in part by Winograd [Winograd 72], who saw the constraint that a noun phrase have a referent as a valuable aid in resolving local syntactic ambiguity. Unfortunately, Winograd's work has been criticised by Ritchie, because it is not always possible to determine what a noun phrase refers to purely on the basis of local information. In this thesis, we go further than this and claim that, because the meaning of a noun phrase can be affected by so many factors outside the phrase itself, it makes no sense to talk about "the referent" as a function of a noun phrase. Instead, the notion of "referent" is something defined by global issues of structure and consistency. Having rejected one approach to the early semantic analysis of noun phrases, we go on to develop an alternative, which we call incremental evaluation. The basic idea is that a noun phrase does provide some information about what it refers to. It should be possible to represent this partial information and gradually refine it as relevant implications of the context are followed up. Moreover, the partial information should be available to an inference system, which, amongst other things, can detect the absence of a referent and provide the advantages of Winograd's system. In our system, noun phrase interpretation does take place locally, but the point is that it does not finish there. Instead, the determination of the meaning of a noun phrase is spread over the subsequent analysis of how it contributes to the meaning of the text as a whole.
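    A toy rendering of the incremental-evaluation idea described above: the noun phrase yields partial information, represented here as a candidate set that later context refines, and an empty set signals the absence of a referent. The class, the world model, and all names are invented for illustration and are not taken from the thesis.

```python
class PartialReferent:
    """Partial information about what a noun phrase denotes.

    Interpretation starts from the phrase's own constraints and is
    refined as later context arrives; an empty candidate set signals
    the absence of a referent, which an inference system can report.
    """
    def __init__(self, candidates):
        self.candidates = set(candidates)

    def refine(self, constraint):
        self.candidates = {c for c in self.candidates if constraint(c)}
        return self

    def referent(self):
        # Only a singleton candidate set determines "the referent".
        return next(iter(self.candidates)) if len(self.candidates) == 1 else None

# "the block" initially matches every block in the world model ...
world = [{"id": 1, "kind": "block", "colour": "red"},
         {"id": 2, "kind": "block", "colour": "green"}]
phrase = PartialReferent(c["id"] for c in world if c["kind"] == "block")
# ... and later text ("... that is red") narrows it to one object.
phrase.refine(lambda i: world[i - 1]["colour"] == "red")
print(phrase.referent())  # -> 1
```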

    Computer Aided Verification

    This open access two-volume set, LNCS 11561 and 11562, constitutes the refereed proceedings of the 31st International Conference on Computer Aided Verification, CAV 2019, held in New York City, USA, in July 2019. The 52 full papers presented, together with 13 tool papers and 2 case studies, were carefully reviewed and selected from 258 submissions. The papers were organized in the following topical sections. Part I: automata and timed systems; security and hyperproperties; synthesis; model checking; cyber-physical systems and machine learning; probabilistic systems; runtime techniques; dynamical, hybrid, and reactive systems. Part II: logics, decision procedures, and solvers; numerical programs; verification; distributed systems and networks; verification and invariants; and concurrency.