
    Canonical Models and the Law of Requisite Variety

    The law of requisite variety from cybernetics is shown to be related to the reachability and observability properties of a dynamical control system. In particular, it is established that the transmission of all input variety to the system output is possible if and only if the system is canonical, i.e., completely reachable and completely observable.
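
    As an illustrative aside (not part of the paper; the matrices A, B, C below are arbitrary), the canonical property of a linear time-invariant system x' = Ax + Bu, y = Cx can be checked numerically by testing whether both the controllability and observability matrices have full rank:

        import numpy as np

        def ctrb(A, B):
            # Controllability (reachability) matrix [B, AB, A^2 B, ...]
            n = A.shape[0]
            return np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])

        def obsv(A, C):
            # Observability matrix [C; CA; CA^2; ...]
            n = A.shape[0]
            return np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])

        A = np.array([[0.0, 1.0], [-2.0, -3.0]])
        B = np.array([[0.0], [1.0]])
        C = np.array([[1.0, 0.0]])

        n = A.shape[0]
        reachable = np.linalg.matrix_rank(ctrb(A, B)) == n
        observable = np.linalg.matrix_rank(obsv(A, C)) == n
        print("canonical (completely reachable and observable):", reachable and observable)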

    Negative Energy Modes and Gravitational Instability of Interpenetrating Fluids

    We study the longitudinal instabilities of two interpenetrating fluids interacting only through gravity. When one of the constituents is of relatively low density, it is possible to have a band of unstable wave numbers well separated from those involved in the usual Jeans instability. If the initial streaming is large enough and there is no linear instability, the indefinite sign of the free energy can lead to explosive interactions between positive and negative energy modes in the nonlinear regime. The effect of dissipation on the negative energy modes is also examined.
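
    For context only (this is the standard single-fluid result, not the two-fluid dispersion relation derived in the paper), the usual Jeans instability follows from

        \omega^2 = c_s^2 k^2 - 4\pi G \rho_0 ,
        \qquad
        k_J = \frac{\sqrt{4\pi G \rho_0}}{c_s} ,

    so that modes with wave number k < k_J have \omega^2 < 0 and grow exponentially; the two-fluid case studied here adds a separate band of unstable wave numbers when one constituent has relatively low density.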

    Towards Autopoietic Computing

    A key challenge in modern computing is to develop systems that address complex, dynamic problems in a scalable and efficient way: the growing complexity of software makes efficient, flexible systems increasingly difficult to design and maintain. Biological systems are thought to possess robust, scalable processing paradigms that automatically manage complex, dynamic problem spaces, and they exhibit several properties that may be useful in computer systems. The biological properties of self-organisation, self-replication, self-management, and scalability are addressed in an interesting way by autopoiesis, a descriptive theory of the cell founded on the concept that a system's circular organisation defines its boundary with its environment. In this paper, therefore, we review the main concepts of autopoiesis and then discuss how they could be related to fundamental concepts and theories of computation. The paper is conceptual in nature, and the emphasis is on reviewing other people's work in this area as part of a longer-term strategy to develop a formal theory of autopoietic computing. Comment: 10 pages, 3 figures.

    Multi-Agent Complex Systems and Many-Body Physics

    Multi-agent complex systems, comprising populations of decision-making particles, have many potential applications across the biological, informational and social sciences. We show that the time-averaged dynamics in such systems bear a striking resemblance to conventional many-body physics. For the specific example of the Minority Game, this analogy enables us to obtain analytic expressions which are in excellent agreement with numerical simulations. Comment: Accepted for publication in Europhysics Letters.
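
    As a hedged illustration (the parameters and implementation below are illustrative assumptions, not taken from the paper), the basic Minority Game can be simulated directly: an odd number of agents repeatedly choose one of two sides, the minority side wins, and each agent plays whichever of its fixed random strategies has the best running score.

        import numpy as np

        rng = np.random.default_rng(0)
        N, M, S, T = 101, 3, 2, 2000   # agents, memory length, strategies per agent, rounds

        # Each strategy maps each of the 2**M possible histories to an action in {-1, +1}.
        strategies = rng.choice([-1, 1], size=(N, S, 2 ** M))
        scores = np.zeros((N, S))
        history = int(rng.integers(0, 2 ** M))   # last M winning sides, encoded as bits

        attendance = []
        for t in range(T):
            # Each agent plays its currently best-scoring strategy on the current history.
            best = scores.argmax(axis=1)
            actions = strategies[np.arange(N), best, history]
            A = int(actions.sum())
            attendance.append(A)

            # The minority side wins; strategies that predicted it gain a point.
            winner = -np.sign(A) if A != 0 else int(rng.choice([-1, 1]))
            scores += (strategies[:, :, history] == winner)

            # Append the new winning side to the history bit string.
            history = ((history << 1) | (1 if winner == 1 else 0)) % (2 ** M)

        # The variance of the attendance A is the quantity usually compared with theory.
        print("sigma^2 / N =", np.var(attendance) / N)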

    Towards model-based control of Parkinson's disease

    Modern model-based control theory has led to transformative improvements in our ability to track the nonlinear dynamics of systems that we observe, and to engineer control systems of unprecedented efficacy. In parallel with these developments, our ability to build computational models that embody our expanding knowledge of the biophysics of neurons and their networks is maturing at a rapid rate. In the treatment of human dynamical disease, the use of deep brain stimulators for Parkinson's disease is gaining increasing acceptance. Thus, the confluence of these three developments (control theory, computational neuroscience and deep brain stimulation) offers a unique opportunity to create novel approaches to the treatment of this disease. This paper explores the relevant state of the art in science, medicine and engineering, and proposes a strategy for model-based control of Parkinson's disease. We present a set of preliminary calculations employing basal ganglia computational models, structured within an unscented Kalman filter for tracking observations and prescribing control. Based upon these findings, we offer suggestions for future research and development.
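
    As a hedged sketch (the oscillator dynamics below are a generic stand-in, not the basal ganglia models used in the paper), an unscented Kalman filter tracks a hidden nonlinear state by propagating deterministically chosen sigma points through the process and measurement functions:

        import numpy as np

        def sigma_points(x, P, alpha=1.0, beta=0.0, kappa=None):
            # Scaled sigma points and their mean/covariance weights.
            n = len(x)
            kappa = 3.0 - n if kappa is None else kappa
            lam = alpha ** 2 * (n + kappa) - n
            S = np.linalg.cholesky((n + lam) * P)
            pts = np.vstack([x, x + S.T, x - S.T])          # shape (2n + 1, n)
            Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
            Wc = Wm.copy()
            Wm[0] = lam / (n + lam)
            Wc[0] = lam / (n + lam) + 1.0 - alpha ** 2 + beta
            return pts, Wm, Wc

        def ukf_step(x, P, z, f, h, Q, R):
            # One predict/update cycle of an unscented Kalman filter.
            pts, Wm, Wc = sigma_points(x, P)
            Xf = np.array([f(p) for p in pts])              # propagated sigma points
            x_pred = Wm @ Xf
            P_pred = Q + sum(Wc[i] * np.outer(Xf[i] - x_pred, Xf[i] - x_pred) for i in range(len(Wc)))

            Zf = np.array([h(p) for p in Xf])               # predicted measurements
            z_pred = Wm @ Zf
            S = R + sum(Wc[i] * np.outer(Zf[i] - z_pred, Zf[i] - z_pred) for i in range(len(Wc)))
            Pxz = sum(Wc[i] * np.outer(Xf[i] - x_pred, Zf[i] - z_pred) for i in range(len(Wc)))
            K = Pxz @ np.linalg.inv(S)
            return x_pred + K @ (z - z_pred), P_pred - K @ S @ K.T

        # Toy stand-in: a lightly damped oscillator observed through its first coordinate.
        dt = 0.01
        f = lambda x: x + dt * np.array([x[1], -x[0] - 0.1 * x[1]])
        h = lambda x: np.array([x[0]])
        Q, R = 1e-4 * np.eye(2), np.array([[0.05]])

        rng = np.random.default_rng(1)
        x_true, x_est, P_est = np.array([1.0, 0.0]), np.zeros(2), np.eye(2)
        for _ in range(500):
            x_true = f(x_true)
            z = h(x_true) + rng.normal(0.0, R[0, 0] ** 0.5, 1)
            x_est, P_est = ukf_step(x_est, P_est, z, f, h, Q, R)
        print("true state:", x_true, " estimated state:", x_est)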

    Deep-Manager: a versatile tool for optimal feature selection in live-cell imaging analysis

    One of the major problems in bioimaging, often highly underestimated, is whether features extracted for a discrimination or regression task will remain valid for a broader set of similar experiments or in the presence of unpredictable perturbations during the image acquisition process. This issue is even more important in the context of deep learning features, owing to the lack of an a priori known relationship between the black-box descriptors (deep features) and the phenotypic properties of the biological entities under study. In this regard, the widespread use of descriptors such as those coming from pre-trained Convolutional Neural Networks (CNNs) is hindered by the fact that they are devoid of apparent physical meaning and strongly subject to unspecific biases, i.e., features that do not depend on the cell phenotypes but rather on acquisition artifacts, such as brightness or texture changes, focus shifts, autofluorescence or photobleaching. The proposed Deep-Manager software platform offers the possibility to efficiently select those features having lower sensitivity to unspecific disturbances and, at the same time, high discriminating power. Deep-Manager can be used in the context of both handcrafted and deep features. The performance of the method is demonstrated on five different case studies, ranging from selecting handcrafted green fluorescence protein intensity features in chemotherapy-related breast cancer cell death investigation to addressing problems in the context of Deep Transfer Learning. Deep-Manager, freely available at https://github.com/BEEuniroma2/Deep-Manager, is suitable for use in many fields of bioimaging and is conceived to be constantly upgraded with novel image acquisition perturbations and modalities.
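
    As a hedged sketch of the selection idea only (not the actual Deep-Manager implementation; the perturbation, features and thresholds below are illustrative assumptions), each feature can be scored by its sensitivity to a synthetic acquisition perturbation and by its power to discriminate the two classes, keeping features that do well on both counts:

        import numpy as np

        def feature_scores(extract, images_a, images_b, perturb):
            # Score each feature by class discrimination and by sensitivity
            # to a synthetic acquisition perturbation (illustrative only).
            Fa = np.array([extract(im) for im in images_a])      # (n_a, n_features)
            Fb = np.array([extract(im) for im in images_b])
            Fa_p = np.array([extract(perturb(im)) for im in images_a])

            # Discrimination: standardised mean difference between the classes.
            pooled = np.sqrt(0.5 * (Fa.var(axis=0) + Fb.var(axis=0))) + 1e-12
            discrimination = np.abs(Fa.mean(axis=0) - Fb.mean(axis=0)) / pooled

            # Sensitivity: relative change of each feature under the perturbation.
            sensitivity = np.abs(Fa_p - Fa).mean(axis=0) / (np.abs(Fa).mean(axis=0) + 1e-12)
            return discrimination, sensitivity

        # Illustrative usage with random "images"; the third feature (coefficient of
        # variation) is invariant to a brightness rescaling, the first two are not.
        rng = np.random.default_rng(0)
        extract = lambda im: np.array([im.mean(), im.std(), im.std() / im.mean()])
        brighten = lambda im: 1.2 * im                           # synthetic brightness shift
        imgs_a = [rng.random((64, 64)) for _ in range(20)]
        imgs_b = [rng.random((64, 64)) ** 2 for _ in range(20)]

        disc, sens = feature_scores(extract, imgs_a, imgs_b, brighten)
        keep = np.flatnonzero((disc > 1.0) & (sens < 0.1))       # arbitrary thresholds
        print("selected feature indices:", keep)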

    Designing Research

    The aim of this chapter is to set out a process that researchers can follow to design a robust quantitative research study of occupant behavior in buildings. Central to this approach is an emphasis on intellectual clarity about what is being measured and why. To help achieve this clarity, researchers are encouraged to literally draw these relationships out in the form of a concept map capturing the theoretical model of cause and effect between occupant motivations and energy use. Having captured diagrammatically how the system is thought to work, the next step is to formulate research questions or hypotheses capturing the relationships between variables in the theoretical model, and to start to augment the diagram with the measurands (things that can actually be measured) that are good proxies for each concept. Once these are identified, the diagram can be further augmented with one or more methods of measuring each measurand. The chapter argues that it is necessary to carefully define concepts and their presumed relationships, and to clearly state research questions and identify what the researcher intends to measure, before starting data collection. The chapter also explains the ideas of reliability, validity, and uncertainty, and why knowledge about them is essential for any researcher.

    Computing the Noncomputable

    We explore, in the framework of Quantum Computation, the notion of computability, which holds a central position in Mathematics and Theoretical Computer Science. A quantum algorithm that exploits quantum adiabatic processes is considered for Hilbert's tenth problem, which is equivalent to the Turing halting problem and known to be mathematically noncomputable. Generalised quantum algorithms are also considered for some other mathematical noncomputables, in the same and in different noncomputability classes. The key element of all these algorithms is the measurability of both the values of physical observables and of the quantum-mechanical probability distributions for these values. It is argued that computability, and thus the limits of Mathematics, ought to be determined not solely by Mathematics itself but also by physical principles. Comment: Extensively revised and enlarged with: 2 new subsections, 4 new figures, 1 new reference, and a short biography as requested by the journal editor.
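
    For context (this is the generic adiabatic quantum computation scheme, not a detail specific to this paper), such algorithms encode the sought answer in the ground state of a problem Hamiltonian H_P and interpolate slowly from an easily prepared initial Hamiltonian H_I,

        H(t) = \left(1 - \tfrac{t}{T}\right) H_I + \tfrac{t}{T}\, H_P , \qquad 0 \le t \le T ,

    where the adiabatic theorem keeps the system close to the instantaneous ground state provided the total time T is large compared with the inverse square of the minimum spectral gap encountered along the interpolation.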