55 research outputs found

    Structural Properties of the Caenorhabditis elegans Neuronal Network

    Despite recent interest in reconstructing neuronal networks, complete wiring diagrams at the level of individual synapses remain scarce, and the insights into function they can provide remain unclear. Even for Caenorhabditis elegans, whose neuronal network is relatively small and stereotypical from animal to animal, published wiring diagrams are neither accurate nor complete, nor self-consistent. Using materials from White et al. and new electron micrographs, we assemble whole, self-consistent gap junction and chemical synapse networks of the hermaphrodite C. elegans. We propose a method to visualize the wiring diagram that reflects network signal flow. We calculate statistical and topological properties of the network, such as degree distributions, synaptic multiplicities, and small-world properties, that help in understanding network signal propagation. We identify neurons that may play central roles in information processing, and network motifs that could serve as functional modules of the network. We explore the propagation of neuronal activity in response to sensory or artificial stimulation using linear systems theory and find several activity patterns that could serve as substrates of previously described behaviors. Finally, we analyze the interaction between the gap junction and chemical synapse networks. Since several statistical properties of the C. elegans network, such as multiplicity and motif distributions, are similar to those found in mammalian neocortex, they likely point to general principles of neuronal networks. The wiring diagram reported here can help in understanding the mechanistic basis of behavior by generating predictions about future experiments involving genetic perturbations, laser ablations, or monitoring of the propagation of neuronal activity in response to stimulation.
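    As a rough illustration of the kind of graph statistics mentioned above (degree distributions, synaptic multiplicities encoded as edge weights, small-world measures), the sketch below runs networkx on a toy directed edge list. The neuron names and multiplicities are placeholders, not the reconstructed C. elegans wiring diagram.

```python
# A minimal sketch (not the paper's pipeline) of the kind of graph statistics
# described above, using networkx on a toy directed "chemical synapse" graph.
# The edge list below is illustrative only, not real C. elegans connectivity.
import networkx as nx

edges = [
    ("ASHL", "AVAL", 3), ("ASHL", "AVDL", 2), ("AVAL", "VA01", 5),
    ("AVDL", "AVAL", 4), ("PLML", "AVDL", 1), ("AVAL", "DA01", 2),
]

G = nx.DiGraph()
for pre, post, multiplicity in edges:
    # 'weight' stands in for synaptic multiplicity (number of synapses per pair)
    G.add_edge(pre, post, weight=multiplicity)

# Degree distributions of the chemical synapse network
print("in-degrees:", dict(G.in_degree()))
print("out-degrees:", dict(G.out_degree()))

# Small-world-style statistics are usually computed on the undirected skeleton
U = G.to_undirected()
print("clustering coefficient:", nx.average_clustering(U))
if nx.is_connected(U):
    print("characteristic path length:", nx.average_shortest_path_length(U))
```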

    Transforming non textually aligned SPMD programs into textually aligned SPMD programs by using rewriting rules

    The problem of analyzing parallel programs that access shared memory and use barrier synchronization is known to be hard. For a special case of such programs with minimal SPMD (Single Program Multiple Data) constructs, a formal definition of textually aligned barriers, together with an operational semantics, has been proposed in previous work. The textual alignment of synchronization barriers so defined prevents deadlocks; however, not all SPMD programs satisfy the textual alignment property. We propose a set of transformation rules, based on rewriting techniques, that turns a non-textually aligned program into a textually aligned one, so that a simple static analysis for deadlock detection can be applied. We show that the rewrite rules form a terminating, confluent system, and we prove that the transformation rules preserve the semantics of the programs.
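    As a toy illustration of the idea (not the paper's actual rule set), the sketch below applies one rewrite in this spirit: when both branches of a conditional end in a barrier, the two textually distinct barriers are hoisted into a single, textually aligned barrier after the conditional, preserving the synchronization behavior.

```python
# A minimal sketch of one barrier-hoisting rewrite, on a toy tuple-based AST.
# The AST shape and the rule itself are illustrative assumptions.

def rewrite(stmt):
    """Rewrite ('if', cond, then_branch, else_branch) so that a trailing
    'sync' barrier present in both branches is hoisted after the conditional,
    making the barrier textually aligned for all processes."""
    if isinstance(stmt, tuple) and stmt[0] == "if":
        _, cond, then_b, else_b = stmt
        if then_b and else_b and then_b[-1] == "sync" and else_b[-1] == "sync":
            return [("if", cond, then_b[:-1], else_b[:-1]), "sync"]
    return [stmt]

# Non-textually aligned: each process meets a syntactically different barrier
program = [
    ("if", "pid == 0", ["work_A", "sync"], ["work_B", "sync"]),
]

aligned = [out for stmt in program for out in rewrite(stmt)]
print(aligned)
# [('if', 'pid == 0', ['work_A'], ['work_B']), 'sync']
```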

    The Role of Information in Multi-Agent Decision Making

    Networked multi-agent systems have become an integral part of many engineering systems. Collaborative decision making in multi-agent systems poses many challenges. In this thesis, we study the impact of information, and its availability to agents, on collaborative decision making in multi-agent systems. We first consider the problem of detecting Markov and Gaussian models from observed data using two observers. In the Markov case, there are two Markov chains and two observers; each observer observes a different function of the state of the true, unknown Markov chain. Given the observations, the aim is to determine which of the two Markov chains generated them. We formulate a block binary hypothesis testing problem for each observer and show that each observer's decision is a function of its local likelihood ratio. We present a consensus scheme for the observers to agree on their beliefs, and the asymptotic convergence of the consensus decision to the true hypothesis is proven. A similar framework is considered for the detection of Gaussian models using two observers: a sequential hypothesis testing problem is formulated for each observer and solved using the local likelihood ratio, and we present a consensus scheme that accounts for the random and asymmetric stopping times of the observers. The notion of "value of information" is introduced to understand the "usefulness" of the information exchanged to achieve consensus. Next, we consider the binary hypothesis testing problem with two observers. There are two possible states of nature, and two synchronous observers collect observations that are statistically related to the true state of nature. Given the observations, the objective of the observers is to collaboratively find the true state of nature. We consider centralized and decentralized approaches to solve the problem. Each approach has two phases: (1) probability space construction, in which the true hypothesis is known and observations are collected to build empirical joint distributions between the hypothesis and the observations; and (2) hypothesis testing, in which, given a new set of observations, hypothesis testing problems are formulated for the observers to find their individual beliefs about the true hypothesis. Consensus schemes for the observers to agree on their beliefs about the true hypothesis are presented. The rate of decay of the probability of error in the centralized approach and the rate of decay of the probability of agreeing on the wrong belief in the decentralized approach are compared, and numerical results comparing the two approaches are presented. Not all propositions from the set of events for an agent in a multi-agent system may be simultaneously verifiable. We study the concepts of event-state-operation structure and relationship of incompatibility from the literature and use them as tools to study the structure of the set of events. We present an example from multi-agent hypothesis testing in which the set of events does not form a Boolean algebra but does form an ortholattice. A possible construction of a 'noncommutative probability space' that accounts for incompatible events (events which cannot be simultaneously verified) is discussed. As a possible decision-making problem in such a probability space, we consider the binary hypothesis testing problem and present two approaches. In the first approach, we represent the available data as coming from measurements modeled via projection-valued measures (PVMs) and retrieve the results of the underlying detection problem solved using classical probability models. In the second approach, we represent the measurements using positive operator-valued measures (POVMs). We prove that the minimum probability of error achieved in the second approach is the same as in the first. Finally, we consider the binary hypothesis testing problem with learning of empirical distributions. The true distributions of the observations under either hypothesis are unknown, so empirical distributions are estimated from observations, and a sequence of detection problems is solved using the sequence of empirical distributions. The convergence of the information state and optimal detection cost under the empirical distributions to those under the true distributions is shown, and numerical results on the convergence of the optimal detection cost are presented.
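    As a small, self-contained illustration of the local decision rule described for the Markov case (assuming, for simplicity, a single observer that sees the chain's state directly and toy transition matrices), the sketch below computes a block log-likelihood ratio and thresholds it at zero.

```python
# A minimal sketch, under assumed toy transition matrices, of a block
# likelihood-ratio decision between two Markov chain hypotheses.
import numpy as np

P0 = np.array([[0.9, 0.1],   # hypothesis H0: transition matrix
               [0.2, 0.8]])
P1 = np.array([[0.5, 0.5],   # hypothesis H1: transition matrix
               [0.5, 0.5]])

def log_likelihood(states, P, init=np.array([0.5, 0.5])):
    """Log-likelihood of an observed state sequence under a Markov chain (P, init)."""
    ll = np.log(init[states[0]])
    for s, t in zip(states[:-1], states[1:]):
        ll += np.log(P[s, t])
    return ll

rng = np.random.default_rng(0)
# Simulate a block of observations from the true chain (here H0)
states = [0]
for _ in range(200):
    states.append(rng.choice(2, p=P0[states[-1]]))

# Local decision: compare the block log-likelihood ratio to a threshold (0 here)
llr = log_likelihood(states, P1) - log_likelihood(states, P0)
decision = "H1" if llr > 0 else "H0"
print(f"log-likelihood ratio = {llr:.2f} -> decide {decision}")
```

    In the thesis, each observer sees only a function of the state; the same recipe then applies with the likelihood of the observed sequence replacing the state-sequence likelihood, followed by the consensus step on the observers' beliefs.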

    Engineering Physics and Mathematics Division progress report for period ending December 31, 1994


    Code Optimization in the Polyhedron Model - Improving the Efficiency of Parallel Loop Nests

    A safe basis for automatic loop parallelization is the polyhedron model, which represents the iteration domain of a loop nest as a polyhedron in $\mathbb{Z}^n$. However, turning the parallel loop program in the model into efficient code meets with several obstacles, due to which performance may deteriorate seriously, especially on distributed-memory architectures. We introduce a fine-grained model of the computation performed and show how this model can be applied to create efficient code.
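    As an illustration of the polyhedron-model view (a toy example under stated assumptions, not the dissertation's code generator), the sketch below encodes the iteration domain of a triangular loop nest as the set of integer points satisfying a system of affine inequalities.

```python
# A minimal sketch of the polyhedron-model view: the iteration domain of
#     for i in 0..N-1:
#         for j in 0..i:
#             S(i, j)
# is the set of integer points (i, j) satisfying affine inequalities.
import numpy as np

N = 5
# Constraints A @ (i, j) + b >= 0 encode  0 <= i <= N-1  and  0 <= j <= i
A = np.array([[ 1,  0],
              [-1,  0],
              [ 0,  1],
              [ 1, -1]])
b = np.array([0, N - 1, 0, 0])

domain = [(i, j)
          for i in range(N)
          for j in range(N)
          if np.all(A @ np.array([i, j]) + b >= 0)]
print(domain)   # the 15 points of the triangular iteration domain for N = 5
```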

    Compiling concurrency correctly: verifying software transactional memory

    Concurrent programming is notoriously difficult, but with multi-core processors becoming the norm, it is now a reality that every programmer must face. Concurrency has traditionally been managed using low-level mutual exclusion (locks), which are error-prone and do not naturally support the compositional style of programming that is becoming indispensable for today's large-scale software projects. A novel, high-level approach that has emerged in recent years is software transactional memory (STM), which avoids the need for explicit locking and instead presents the programmer with a declarative approach to concurrency. However, its implementation is much more complex and subtle, and ensuring its correctness places significant demands on the compiler writer. This thesis considers the problem of formally verifying an implementation of STM. Utilising a minimal language incorporating only the features that we are interested in studying, we first explore various STM design choices, along with the issue of compiler correctness, via the use of automated testing tools. We then outline a new approach to concurrent compiler correctness using the notion of bisimulation, implemented using the Agda theorem prover. Finally, we show how bisimulation can be used to establish the correctness of a low-level implementation of software transactional memory.
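    As a rough sketch of the log-based STM idea discussed above (a toy in Python, not the thesis's minimal language, its Agda development, or its compiler), the code below runs a transaction against a private read/write log and commits atomically only if its read set is still consistent, retrying otherwise.

```python
# A minimal, assumption-laden sketch of log-based transactional memory:
# transactions read and write through a private log and commit atomically
# only if every value they read is unchanged at commit time.
import threading

_store = {"x": 0, "y": 0}
_lock = threading.Lock()          # a global lock stands in for a real commit protocol

def atomically(transaction):
    """Run `transaction(read, write)` with a read/write log; retry on conflict."""
    while True:
        reads, writes = {}, {}

        def read(var):
            if var in writes:              # read-your-own-writes
                return writes[var]
            reads.setdefault(var, _store[var])
            return reads[var]

        def write(var, val):
            writes[var] = val

        result = transaction(read, write)

        with _lock:
            # Validate: every value we read must be unchanged at commit time
            if all(_store[v] == val for v, val in reads.items()):
                _store.update(writes)
                return result
        # Otherwise another transaction interfered: retry from scratch

def transfer(read, write):
    write("x", read("x") - 1)
    write("y", read("y") + 1)

atomically(transfer)
print(_store)   # {'x': -1, 'y': 1}
```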