375,005 research outputs found

    Active Virtual Network Management Prediction: Complexity as a Framework for Prediction, Optimization, and Assurance

    Research into active networking has provided the incentive to revisit what have traditionally been classified as distinct properties and characteristics of information transfer, such as protocol versus service; at a more fundamental level, this paper considers the blending of computation and communication by means of complexity. The specific service examined in this paper is network self-prediction enabled by Active Virtual Network Management Prediction. Computation/communication is analyzed via Kolmogorov Complexity. The result is a mechanism for understanding and improving the performance of active networking, and of Active Virtual Network Management Prediction in particular. The Active Virtual Network Management Prediction mechanism allows information, in various states of algorithmic and static form, to be transported in the service of prediction for network management. The results are generally applicable to algorithmic transmission of information. Kolmogorov Complexity is used and experimentally validated as a theory describing the relationship among algorithmic compression, complexity, and prediction accuracy within an active network. Finally, the paper concludes with a complexity-based framework for Information Assurance that attempts to take a holistic view of vulnerability analysis.
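
    Kolmogorov Complexity is uncomputable, so compression is the standard practical proxy for it. The following sketch (my own illustration, not code from the paper) compares the compression ratio of a highly regular trace with that of incompressible noise; the traces and the choice of the zlib compressor are illustrative assumptions.

        import os
        import zlib

        def compression_ratio(data: bytes) -> float:
            # Upper-bound proxy for Kolmogorov Complexity: compressed size over
            # original size. Lower ratios indicate more algorithmic structure,
            # i.e. traffic that should be easier to predict.
            if not data:
                return 1.0
            return len(zlib.compress(data, 9)) / len(data)

        # Hypothetical traces: a regular management trace versus pure noise.
        regular_trace = b"GET /status 200\n" * 512
        noisy_trace = os.urandom(len(regular_trace))

        print(compression_ratio(regular_trace))  # small ratio: structured, predictable
        print(compression_ratio(noisy_trace))    # near 1.0: incompressible, hard to predict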

    Self-directedness, integration and higher cognition

    In this paper I discuss connections between self-directedness, integration and higher cognition. I present a model of self-directedness as a basis for approaching higher cognition from a situated cognition perspective. According to this model, increases in sensorimotor complexity create pressure for integrative higher-order control and learning processes that acquire information about the context in which action occurs. This generates complex, articulated abstractive information processing, which forms the major basis for higher cognition. I present evidence indicating that the same integrative characteristics found in lower cognitive processes such as motor adaptation are present in a range of higher cognitive processes, including conceptual learning. This account helps explain situated cognition phenomena in humans because the integrative processes by which the brain adapts to control interaction are relatively agnostic concerning the source of the structure participating in the process. Thus, from the perspective of the motor control system, using a tool is not fundamentally different from simply controlling an arm.

    Behavioral Effects in Individual Decisions of Network Formation

    Network formation constitutes an important part of many social and economic processes, but relatively little is known about how individuals make their linking decisions. This article provides an experimental investigation of behavioral effects in individual decisions of network formation. Our findings demonstrate that individuals systematically simplify more complex components of network payoff in their linking decisions. Specifically, they focus on only part of the normative payoff, namely their own direct payoff, and tend to ignore indirect payoffs and payoffs for others in the network. Additionally, individuals use descriptive behavioral traits of link choice alternatives to guide their choices. They are sensitive to whether an alternative involves link deletion or creation and whether it concerns an isolated or a central node. Furthermore, we find that complexity of one type can moderate how individuals deal with a complex feature of another type. These behavioral effects have important implications for researchers and managers working in areas that involve network formation. (JEL: A)

    Scalable Greedy Algorithms for Transfer Learning

    In this paper we consider the binary transfer learning problem, focusing on how to select and combine sources from a large pool to yield good performance on a target task. Constraining our scenario to the real world, we do not assume direct access to the source data, but rather employ the source hypotheses trained from them. We propose an efficient algorithm that selects relevant source hypotheses and feature dimensions simultaneously, building on the literature on the best subset selection problem. Our algorithm achieves state-of-the-art results on three computer vision datasets, substantially outperforming both transfer learning and popular feature selection baselines in a small-sample setting. We also present a randomized variant that achieves the same results with a computational cost independent of the number of source hypotheses and feature dimensions. Finally, we theoretically prove that, under reasonable assumptions on the source hypotheses, our algorithm can learn effectively from few examples.
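
    A minimal sketch of the greedy idea, under my own simplifying assumptions rather than the paper's exact objective: each pre-trained source hypothesis is represented only by its real-valued predictions on a small target validation set, and sources are added one at a time to a simple average whenever they improve validation accuracy.

        import numpy as np

        def greedy_select_sources(source_scores, y_val, k):
            # source_scores: list of arrays, each holding one source hypothesis'
            #                real-valued predictions on the target validation set.
            # y_val:         target validation labels in {-1, +1}.
            # Returns the indices of up to k greedily selected source hypotheses.
            selected = []
            combined = np.zeros_like(y_val, dtype=float)
            for _ in range(k):
                best_j, best_acc = None, -1.0
                for j, s in enumerate(source_scores):
                    if j in selected:
                        continue
                    trial = (combined * len(selected) + s) / (len(selected) + 1)
                    acc = np.mean(np.sign(trial) == y_val)
                    if acc > best_acc:
                        best_j, best_acc = j, acc
                selected.append(best_j)
                combined = (combined * (len(selected) - 1) + source_scores[best_j]) / len(selected)
            return selected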

    Reconstruction of the primordial Universe by a Monge-Ampère-Kantorovich optimisation scheme

    A method for the reconstruction of the primordial density fluctuation field is presented. Various previous approaches to this problem rendered non-unique solutions. Here, it is demonstrated that the initial positions of dark matter fluid elements, under the hypothesis that their displacement is the gradient of a convex potential, can be reconstructed uniquely. In our approach, the cosmological reconstruction problem is reformulated as an assignment problem in optimisation theory. When tested against numerical simulations, our scheme yields excellent reconstruction on scales larger than a few megaparsecs. (Comment: 14 pages, 10 figures)
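
    A hedged sketch of the assignment-problem reformulation: with a quadratic transport cost between present-day positions and candidate initial grid sites, a standard solver returns the unique optimal one-to-one pairing. The toy data and the use of scipy's Hungarian-algorithm solver are my illustrative assumptions, not the paper's data or implementation.

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        rng = np.random.default_rng(0)

        # Toy data: N present-day particle positions and N candidate initial sites.
        N = 64
        final_pos = rng.uniform(0.0, 100.0, size=(N, 3))      # illustrative box, in Mpc
        initial_sites = rng.uniform(0.0, 100.0, size=(N, 3))

        # Quadratic cost |x_final - q_initial|^2, the Monge-Ampere-Kantorovich choice.
        cost = ((final_pos[:, None, :] - initial_sites[None, :, :]) ** 2).sum(axis=-1)

        rows, cols = linear_sum_assignment(cost)  # unique optimal assignment
        # Particle rows[i] is reconstructed as having started near initial_sites[cols[i]].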

    Complexity, parallel computation and statistical physics

    The intuition that a long history is required for the emergence of complexity in natural systems is formalized using the notion of depth. The depth of a system is defined in terms of the number of parallel computational steps needed to simulate it. Depth provides an objective, irreducible measure of history applicable to systems of the kind studied in statistical physics. It is argued that physical complexity cannot occur in the absence of substantial depth and that depth is a useful proxy for physical complexity. The ideas are illustrated for a variety of systems in statistical physics. (Comment: 21 pages, 7 figures)
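
    A toy illustration of depth (mine, not the paper's): summing n numbers by a pairwise tree reduction needs only about log2(n) parallel steps even though the total work is linear, so depth and work can differ sharply.

        import math

        def parallel_reduction_depth(values):
            # Count the parallel steps of a pairwise (tree) reduction: each step
            # combines disjoint pairs simultaneously, so depth ~ ceil(log2(n)).
            depth = 0
            while len(values) > 1:
                values = [values[i] + values[i + 1] if i + 1 < len(values) else values[i]
                          for i in range(0, len(values), 2)]
                depth += 1
            return depth

        n = 1024
        assert parallel_reduction_depth(list(range(n))) == math.ceil(math.log2(n))  # 10 steps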

    Computational statistics using the Bayesian Inference Engine

    This paper introduces the Bayesian Inference Engine (BIE), a general, parallel, optimised software package for parameter inference and model selection. The package is motivated by the analysis needs of modern astronomical surveys and by the need to organise and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasises hybrid tempered MCMC schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialisation system that stores the byte-level image of the running inference and previously characterised posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible, object-oriented framework that implements every aspect of Bayesian inference. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical details and download information are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU GPL.
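
    The BIE's own hybrid tempered MCMC schemes are not spelled out here, so the following is a generic parallel-tempering Metropolis sketch meant only to show the idea of heated chains plus swap moves for a multimodal posterior; the bimodal target, temperature ladder, and proposal scale are illustrative assumptions, not the BIE's API.

        import numpy as np

        rng = np.random.default_rng(1)

        def log_post(x):
            # Illustrative bimodal target: mixture of Gaussians at -3 and +3.
            return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

        temps = [1.0, 2.0, 4.0, 8.0]        # temperature ladder (assumption)
        chains = np.zeros(len(temps))       # one walker per temperature
        samples = []

        for step in range(20000):
            # Metropolis update within each tempered chain (target tempered by 1/T).
            for i, T in enumerate(temps):
                prop = chains[i] + rng.normal(scale=1.0)
                if np.log(rng.uniform()) < (log_post(prop) - log_post(chains[i])) / T:
                    chains[i] = prop
            # Occasionally propose swapping the states of adjacent temperatures.
            if step % 10 == 0:
                i = rng.integers(len(temps) - 1)
                delta = (1.0 / temps[i] - 1.0 / temps[i + 1]) * (log_post(chains[i + 1]) - log_post(chains[i]))
                if np.log(rng.uniform()) < delta:
                    chains[i], chains[i + 1] = chains[i + 1], chains[i]
            samples.append(chains[0])       # keep only the untempered (T = 1) chain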

    The evolutionary origins of volition

    It appears to be a straightforward implication of distributed cognition principles that there is no integrated executive control system (e.g. Brooks 1991, Clark 1997). If distributed cognition is taken as a credible paradigm for cognitive science, this in turn presents a challenge to volition, because the concept of volition assumes integrated information processing and action control. For instance, the process of forming a goal should integrate information about the available action options, and if the goal is acted upon, these processes should control motor behavior. If there were no executive system, then it would seem that processes of action selection and performance couldn't be functionally integrated in the right way. The apparently centralized decision and action control processes of volition would be an illusion arising from the competitive and cooperative interaction of many relatively simple cognitive systems. Here I make a case that this conclusion is not well-founded. Prima facie, it is not clear that distributed organization can achieve coherent functional activity when there are many complex interacting systems, there is high potential for interference between systems, and there is a need for focus. Resolving conflict and providing focus are key reasons why executive systems have been proposed (Baddeley 1986, Norman and Shallice 1986, Posner and Raichle 1994). This chapter develops an extended theoretical argument based on this idea, according to which selective pressures operating in the evolution of cognition favor high-order control organization with a 'highest-order' control system that performs executive functions.

    Optimal Testing for Planted Satisfiability Problems

    We study the problem of detecting planted solutions in a random satisfiability formula. Adopting the formalism of hypothesis testing in statistical analysis, we describe the minimax optimal rates of detection. Our analysis relies on the study of the number of satisfying assignments, for which we prove new results. We also address algorithmic issues, and give a computationally efficient test with optimal statistical performance. This result is compared to an average-case hypothesis on the hardness of refuting satisfiability of random formulas.
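
    Since the analysis turns on the number of satisfying assignments, a brute-force toy (feasible only for tiny instances, and not the paper's efficient test) can contrast a uniform random formula with one drawn around a planted assignment; the simple planted generator below is my illustrative assumption.

        import itertools
        import random

        def random_clause(n, k, planted=None):
            # Draw a k-clause over n variables; if a planted assignment is given,
            # resample until the clause is satisfied by it (a simple planted model).
            while True:
                variables = random.sample(range(n), k)
                clause = [(v, random.choice([True, False])) for v in variables]
                if planted is None or any(planted[v] == sign for v, sign in clause):
                    return clause

        def count_sat(n, clauses):
            # Brute-force count of satisfying assignments (exponential in n).
            return sum(
                all(any(bits[v] == sign for v, sign in c) for c in clauses)
                for bits in itertools.product([False, True], repeat=n)
            )

        n, k, m = 12, 3, 60
        planted = [random.choice([True, False]) for _ in range(n)]
        uniform_formula = [random_clause(n, k) for _ in range(m)]
        planted_formula = [random_clause(n, k, planted) for _ in range(m)]

        # The planted formula is satisfiable by construction and typically has more
        # satisfying assignments than a uniform formula at the same clause density.
        print(count_sat(n, uniform_formula), count_sat(n, planted_formula))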