
    Canonical Models and the Law of Requisite Variety

    The law of requisite variety from cybernetics is shown to be related to the reachability and observability properties of a dynamical control system. In particular, it is established that the transmission of all input variety to the system output is possible if and only if the system is canonical, i.e., completely reachable and completely observable.
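
    The canonical-system condition is a pair of standard rank tests. As a minimal illustrative sketch (ours, not taken from the paper): for a linear system x_{k+1} = A x_k + B u_k, y_k = C x_k, complete reachability and complete observability hold exactly when the Kalman controllability and observability matrices have full rank.

        # Minimal sketch (illustrative, not from the paper): rank tests for
        # complete reachability and observability of a linear system
        #   x_{k+1} = A x_k + B u_k,   y_k = C x_k.
        # The system is "canonical" when both tests pass.
        import numpy as np

        def is_canonical(A, B, C):
            n = A.shape[0]
            # Controllability (reachability) matrix [B, AB, ..., A^(n-1)B]
            ctrb = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])
            # Observability matrix [C; CA; ...; CA^(n-1)]
            obsv = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])
            return (np.linalg.matrix_rank(ctrb) == n
                    and np.linalg.matrix_rank(obsv) == n)

        # Example: a two-state system that is both reachable and observable.
        A = np.array([[0.0, 1.0], [-1.0, -0.5]])
        B = np.array([[0.0], [1.0]])
        C = np.array([[1.0, 0.0]])
        print(is_canonical(A, B, C))  # True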

    Multi-Agent Complex Systems and Many-Body Physics

    Multi-agent complex systems comprising populations of decision-making particles have many potential applications across the biological, informational and social sciences. We show that the time-averaged dynamics in such systems bear a striking resemblance to conventional many-body physics. For the specific example of the Minority Game, this analogy enables us to obtain analytic expressions which are in excellent agreement with numerical simulations. Comment: Accepted for publication in Europhysics Letters.
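
    For readers unfamiliar with the model, a minimal Minority Game runs as follows (an illustrative sketch, not the authors' code): an odd number of agents each hold a few fixed random strategies mapping the recent history of winning outcomes to an action, play their currently best-scoring strategy, and the minority side wins each round.

        # Minimal Minority Game sketch (illustrative; not the authors' code).
        # N agents, each with S fixed random strategies over the last m winning
        # outcomes; the side in the minority wins each round.
        import numpy as np

        rng = np.random.default_rng(1)
        N, S, m, T = 101, 2, 3, 2000       # odd N so a minority always exists
        P = 2 ** m                         # number of distinct histories
        strategies = rng.choice([-1, 1], size=(N, S, P))
        scores = np.zeros((N, S))
        history = 0                        # last m outcomes packed into an int
        attendance = []

        for _ in range(T):
            best = scores.argmax(axis=1)               # each agent's best strategy
            actions = strategies[np.arange(N), best, history]
            A = actions.sum()
            attendance.append(A)
            winner = -np.sign(A)                       # minority action wins
            scores += (strategies[:, :, history] == winner)  # virtual scoring
            history = (2 * history + (winner > 0)) % P

        print(np.mean(np.square(attendance)) / N)      # volatility sigma^2 / N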

    Towards Autopoietic Computing

    A key challenge in modern computing is to develop systems that address complex, dynamic problems in a scalable and efficient way, because the increasing complexity of software makes designing and maintaining efficient and flexible systems increasingly difficult. Biological systems are thought to possess robust, scalable processing paradigms that can automatically manage complex, dynamic problem spaces, and several of their properties may be useful in computer systems. The biological properties of self-organisation, self-replication, self-management, and scalability are addressed in an interesting way by autopoiesis, a descriptive theory of the cell founded on the concept of a system's circular organisation to define its boundary with its environment. In this paper, therefore, we review the main concepts of autopoiesis and then discuss how they could be related to fundamental concepts and theories of computation. The paper is conceptual in nature and the emphasis is on the review of other people's work in this area as part of a longer-term strategy to develop a formal theory of autopoietic computing. Comment: 10 pages, 3 figures.

    Towards model-based control of Parkinson's disease

    Modern model-based control theory has led to transformative improvements in our ability to track the nonlinear dynamics of systems that we observe, and to engineer control systems of unprecedented efficacy. In parallel with these developments, our ability to build computational models to embody our expanding knowledge of the biophysics of neurons and their networks is maturing at a rapid rate. In the treatment of human dynamical disease, our employment of deep brain stimulators for the treatment of Parkinson’s disease is gaining increasing acceptance. Thus, the confluence of these three developments—control theory, computational neuroscience and deep brain stimulation—offers a unique opportunity to create novel approaches to the treatment of this disease. This paper explores the relevant state of the art of science, medicine and engineering, and proposes a strategy for model-based control of Parkinson’s disease. We present a set of preliminary calculations employing basal ganglia computational models, structured within an unscented Kalman filter for tracking observations and prescribing control. Based upon these findings, we offer suggestions for future research and development.
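
    As a flavour of the tracking step, here is a minimal unscented Kalman filter sketch (ours, using the filterpy library; a hypothetical 5 Hz oscillator stands in for a basal ganglia model, and all parameters are illustrative, not the paper's):

        # Minimal UKF tracking sketch (ours, via filterpy; the oscillator is a
        # hypothetical stand-in for a basal ganglia model).
        import numpy as np
        from filterpy.kalman import MerweScaledSigmaPoints, UnscentedKalmanFilter

        dt = 0.01
        w = 2 * np.pi * 5.0                    # 5 Hz, a tremor-band rhythm

        def fx(x, dt):
            # Exact discretisation of a harmonic oscillator, x = [pos, vel].
            th = w * dt
            A = np.array([[np.cos(th), np.sin(th) / w],
                          [-w * np.sin(th), np.cos(th)]])
            return A @ x

        def hx(x):
            # Only the first state variable is observed (an LFP-like signal).
            return x[:1]

        points = MerweScaledSigmaPoints(n=2, alpha=0.1, beta=2.0, kappa=0.0)
        ukf = UnscentedKalmanFilter(dim_x=2, dim_z=1, dt=dt,
                                    fx=fx, hx=hx, points=points)
        ukf.x = np.array([1.0, 0.0])
        ukf.P *= 0.1
        ukf.R = np.array([[0.05]])
        ukf.Q = np.eye(2) * 1e-4

        rng = np.random.default_rng(0)
        x_true = np.array([1.0, 0.0])
        for _ in range(1000):
            x_true = fx(x_true, dt)
            z = hx(x_true) + rng.normal(0.0, 0.05, size=1)
            ukf.predict()
            ukf.update(z)
        print(ukf.x)   # tracked state estimate after 10 s of noisy data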

    Deep-Manager: a versatile tool for optimal feature selection in live-cell imaging analysis

    One of the major problems in bioimaging, often highly underestimated, is whether features extracted for a discrimination or regression task will remain valid for a broader set of similar experiments or in the presence of unpredictable perturbations during the image acquisition process. Such an issue is even more important in the context of deep learning features, owing to the lack of an a priori known relationship between the black-box descriptors (deep features) and the phenotypic properties of the biological entities under study. In this regard, the widespread use of descriptors, such as those coming from pre-trained Convolutional Neural Networks (CNNs), is hindered by the fact that they are devoid of apparent physical meaning and strongly subject to unspecific biases, i.e., features that do not depend on the cell phenotypes but rather on acquisition artifacts, such as brightness or texture changes, focus shifts, autofluorescence or photobleaching. The proposed Deep-Manager software platform offers the possibility to efficiently select those features having lower sensitivity to unspecific disturbances and, at the same time, high discriminating power. Deep-Manager can be used in the context of both handcrafted and deep features. The performance of the method is demonstrated using five different case studies, ranging from selecting handcrafted green fluorescence protein intensity features in chemotherapy-related breast cancer cell death investigation to addressing problems related to Deep Transfer Learning. Deep-Manager, freely available at https://github.com/BEEuniroma2/Deep-Manager, is suitable for use in many fields of bioimaging and is conceived to be constantly upgraded with novel image acquisition perturbations and modalities.
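
    The selection principle can be sketched in a few lines (our illustration with hypothetical thresholds, not the Deep-Manager implementation; see the repository above for the actual tool): score each feature by its discriminating power between phenotypes and by its sensitivity to a simulated acquisition perturbation, and keep features that do well on both.

        # Illustrative sketch of perturbation-aware feature selection (not the
        # Deep-Manager implementation; thresholds are hypothetical).
        import numpy as np
        from sklearn.metrics import roc_auc_score

        def select_features(X, y, X_pert, auc_min=0.7, sens_max=0.1):
            # X: (samples, features); X_pert: the same samples re-rendered
            # under a perturbation such as a brightness shift.
            keep = []
            for j in range(X.shape[1]):
                f, fp = X[:, j], X_pert[:, j]
                auc = roc_auc_score(y, f)
                auc = max(auc, 1.0 - auc)       # direction-agnostic discrimination
                sens = np.mean(np.abs(fp - f)) / (np.std(f) + 1e-12)
                if auc >= auc_min and sens <= sens_max:
                    keep.append(j)
            return keep

        # Toy usage: 200 cells, 50 features; feature 0 is informative and stable.
        rng = np.random.default_rng(0)
        y = rng.integers(0, 2, 200)
        X = rng.normal(size=(200, 50))
        X[:, 0] += 1.5 * y
        X_pert = X + rng.normal(scale=0.02, size=X.shape)
        print(select_features(X, y, X_pert))    # expected to include feature 0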

    Computing the Noncomputable

    We explore in the framework of Quantum Computation the notion of computability, which holds a central position in Mathematics and Theoretical Computer Science. A quantum algorithm that exploits quantum adiabatic processes is considered for Hilbert's tenth problem, which is equivalent to the Turing halting problem and known to be mathematically noncomputable. Generalised quantum algorithms are also considered for some other mathematical noncomputables in the same and in different noncomputability classes. The key element of all these algorithms is the measurability of both the values of physical observables and of the quantum-mechanical probability distributions for these values. It is argued that computability, and thus the limits of Mathematics, ought to be determined not solely by Mathematics itself but also by physical principles. Comment: Extensively revised and enlarged with: 2 new subsections, 4 new figures, 1 new reference, and a short biography as requested by the journal editor.
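
    For orientation, the generic shape of such an adiabatic algorithm (a standard schematic, not quoted from the paper) interpolates between an easily prepared Hamiltonian H_0 and a problem Hamiltonian H_P whose ground state encodes the answer; for a Diophantine equation D(a_1, ..., a_k) = 0, the problem Hamiltonian can be taken as the squared polynomial in number operators:

        % Standard adiabatic interpolation (schematic):
        H(t) = \left(1 - \frac{t}{T}\right) H_0 + \frac{t}{T}\, H_P,
        \qquad 0 \le t \le T,
        % with the Diophantine equation D(a_1, \dots, a_k) = 0 encoded via
        % number operators N_i acting on Fock states:
        H_P = \bigl( D(N_1, \dots, N_k) \bigr)^2 .

    A solution exists exactly when the ground-state energy of H_P is zero, which is what the adiabatic evolution and the final measurement probe.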

    Smoke consequences of new wildfire regimes driven by climate change

    Smoke from wildfires has adverse biological and social consequences, and various lines of evidence suggest that smoke from wildfires in the future may be more intense and widespread, demanding that methods be developed to address its effects on people, ecosystems, and the atmosphere. In this paper, we present the essential ingredients of a modeling system for projecting smoke consequences in a rapidly warming climate that is expected to change wildfire regimes significantly. We describe each component of the system, offer suggestions for the elements of a modeling agenda, and provide some general guidelines for making choices among potential components. We address a prospective audience of researchers whom we expect to be fluent already in building some or many of these components, so we neither prescribe nor advocate particular models or software. Instead, our intent is to highlight fruitful ways of thinking about the task as a whole and its components, while providing substantial, if not exhaustive, documentation from the primary literature as reference. This paper provides a guide to the complexities of smoke modeling under climate change, and a research agenda for developing a modeling system that is equal to the task while being feasible with current resources.
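
    To make the shape of such a system concrete, the components might be wired into a chain as below (our schematic, with hypothetical component names and toy numbers; the paper deliberately prescribes no particular models or software):

        # Schematic smoke-projection modeling chain (hypothetical component
        # names and toy numbers; the paper prescribes no particular models).
        from dataclasses import dataclass
        from typing import Callable

        @dataclass
        class SmokePipeline:
            climate_scenario: Callable[[int], dict]      # year -> climate fields
            fire_regime: Callable[[dict], dict]          # climate -> fire activity
            fuels_and_emissions: Callable[[dict], dict]  # fire -> emitted species
            transport_chemistry: Callable[[dict], dict]  # emissions -> concentrations
            effects: Callable[[dict], dict]              # concentrations -> impacts

            def project(self, year: int) -> dict:
                state = self.climate_scenario(year)
                for stage in (self.fire_regime, self.fuels_and_emissions,
                              self.transport_chemistry, self.effects):
                    state = stage(state)
                return state

        # Toy wiring with stub components, just to show the data flow:
        pipe = SmokePipeline(
            climate_scenario=lambda yr: {"tmax_anomaly_C": 0.03 * (yr - 2000)},
            fire_regime=lambda c: {"area_burned_kha": 100 * (1 + c["tmax_anomaly_C"])},
            fuels_and_emissions=lambda f: {"pm25_kt": 0.02 * f["area_burned_kha"]},
            transport_chemistry=lambda e: {"pm25_ugm3": 5.0 * e["pm25_kt"]},
            effects=lambda x: {"days_unhealthy": 2.0 * x["pm25_ugm3"]},
        )
        print(pipe.project(2050))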

    Proximity curves for potential-based clustering

    The concept of proximity curve and a new algorithm are proposed for obtaining clusters in a finite set of data points in finite-dimensional Euclidean space. Each point is endowed with a potential constructed by means of a multi-dimensional Cauchy density, contributing to an overall anisotropic potential function. Guided by the steepest descent algorithm, the data points are successively visited and removed one by one, and at each stage the overall potential is updated and the magnitude of its local gradient is calculated. The result is a finite sequence of tuples, the proximity curve, whose pattern is analysed to give rise to a deterministic clustering. The finite set of all such proximity curves, in conjunction with a simulation study of their distribution, results in a probabilistic clustering represented by a distribution on the set of dendrograms. A two-dimensional synthetic data set is used to illustrate the proposed potential-based clustering idea. It is shown that the results achieved are plausible, since both the ‘geographic distribution’ of the data points and the ‘topographic features’ imposed by the potential function are well reflected in the suggested clustering. For validation, experiments are conducted on the Iris data set, a standard classification and clustering benchmark. The results are consistent with the proposed theoretical framework and the properties of the data, and they open new approaches for processing data from different perspectives and for interpreting the contribution of data attributes to patterns.
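
    One plausible reading of the procedure, condensed to a sketch (ours, not the authors' code): each point contributes a Cauchy-type kernel to the overall potential; at each stage the gradient magnitude of the potential is evaluated at every remaining point, the steepest point is visited and removed, the potential is updated, and the recorded magnitudes form the proximity curve.

        # Illustrative proximity-curve sketch (one plausible reading of the
        # abstract, not the authors' algorithm).
        import numpy as np

        def potential_gradient(x, pts, gamma=1.0):
            # Gradient at x of phi(x) = sum_i 1 / (1 + ||p_i - x||^2 / gamma^2),
            # a sum of Cauchy-type kernels centred at the data points.
            diff = pts - x
            d2 = np.sum(diff ** 2, axis=1)
            w = 2.0 / (gamma ** 2 * (1.0 + d2 / gamma ** 2) ** 2)
            return (w[:, None] * diff).sum(axis=0)

        def proximity_curve(points, gamma=1.0):
            remaining = list(range(len(points)))
            curve = []
            while remaining:
                pts = points[remaining]
                grads = [np.linalg.norm(potential_gradient(points[i], pts, gamma))
                         for i in remaining]
                k = int(np.argmax(grads))        # visit the steepest point next
                curve.append((remaining[k], grads[k]))
                del remaining[k]                 # removal updates the potential
            return curve

        rng = np.random.default_rng(0)
        data = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
        for idx, g in proximity_curve(data)[:5]:
            print(idx, round(float(g), 3))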

    A framework for the local information dynamics of distributed computation in complex systems

    The nature of distributed computation has often been described in terms of the component operations of universal computation: information storage, transfer and modification. We review the first complete framework that quantifies each of these individual information dynamics on a local scale within a system, and describes the manner in which they interact to create non-trivial computation where "the whole is greater than the sum of the parts". We describe the application of the framework to cellular automata, a simple yet powerful model of distributed computation. This is an important application, because the framework is the first to provide quantitative evidence for several important conjectures about distributed computation in cellular automata: that blinkers embody information storage, particles are information transfer agents, and particle collisions are information modification events. The framework is also shown to contrast the computations conducted by several well-known cellular automata, highlighting the importance of information coherence in complex computation. The results reviewed here provide important quantitative insights into the fundamental nature of distributed computation and the dynamics of complex systems, as well as impetus for the framework to be applied to the analysis and design of other systems. Comment: 44 pages, 8 figures.
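
    To give the flavour of the local measures, here is a minimal plug-in estimate of one of them, local active information storage, for an elementary cellular automaton (an illustrative sketch of that single measure, not the full framework):

        # Minimal plug-in estimate of local active information storage,
        #   a(i, t+1) = log2 [ p(x_{t+1} | k past states) / p(x_{t+1}) ],
        # counted over a run of an elementary cellular automaton
        # (illustrative sketch, not the full framework).
        import numpy as np
        from collections import Counter

        def run_ca(rule, width=200, steps=600, seed=0):
            rng = np.random.default_rng(seed)
            lut = np.array([(rule >> i) & 1 for i in range(8)])
            rows = [rng.integers(0, 2, width)]
            for _ in range(steps - 1):
                x = rows[-1]
                idx = 4 * np.roll(x, 1) + 2 * x + np.roll(x, -1)
                rows.append(lut[idx])
            return np.array(rows)

        def local_ais(ca, k=4):
            T, W = ca.shape
            joint, past_c, next_c = Counter(), Counter(), Counter()
            samples = []
            for t in range(k - 1, T - 1):
                for i in range(W):
                    past = tuple(ca[t - k + 1:t + 1, i])  # k past states of cell i
                    nxt = ca[t + 1, i]
                    samples.append((t, i, past, nxt))
                    joint[(past, nxt)] += 1
                    past_c[past] += 1
                    next_c[nxt] += 1
            n = len(samples)
            ais = np.zeros((T, W))
            for t, i, past, nxt in samples:
                p_cond = joint[(past, nxt)] / past_c[past]
                ais[t + 1, i] = np.log2(p_cond / (next_c[nxt] / n))
            return ais

        ais = local_ais(run_ca(rule=110))
        print(ais.max(), ais.min())  # high in regular domains, locally
                                     # negative where particles pass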

    In-flight radiometric calibration of the Metis Visible Light channel using stars and comparison with STEREO-A/COR2 data

    Context. We present the results of the in-flight radiometric calibration performed for the Visible Light (VL) channel of the Metis coronagraph on board Solar Orbiter. Aims. The radiometric calibration is a fundamental step in building the official pipeline of the instrument, devoted to producing the calibrated data in physical units (L2 data). Methods. To obtain the radiometric calibration factor (ϵ_VL), we used stellar targets transiting the Metis field of view. We derived ϵ_VL by determining the signal of each calibration star by means of aperture photometry and calculating its expected flux in the Metis band pass. The analyzed data set covers the time range from the beginning of the Cruise Phase of the mission (June 2020) until March 2021. Results. Considering the uncertainties, the estimated factor ϵ_VL is in good agreement with that obtained during the on-ground calibration campaign. This implies that up to March 2021 there was no measurable reduction in the VL channel throughput. Finally, we compared the total and polarized brightness visible-light images of the solar corona acquired with Metis and STEREO-A/COR2 during the November 2020 superior conjunction of these instruments. Good general agreement was obtained between the images of the two instruments for both the total and the polarized brightness.
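
    Schematically, such a stellar calibration reduces to comparing background-subtracted aperture signals with expected in-band fluxes (our simplified illustration, not the Metis pipeline):

        # Simplified stellar calibration sketch (ours, not the Metis pipeline):
        # epsilon = expected in-band flux / background-subtracted aperture
        # signal, averaged over the calibration stars.
        import numpy as np

        def aperture_signal(image, cx, cy, r_ap=5.0, r_in=8.0, r_out=12.0):
            yy, xx = np.indices(image.shape)
            r = np.hypot(xx - cx, yy - cy)
            sky = np.median(image[(r >= r_in) & (r <= r_out)])  # local background
            ap = r <= r_ap
            return image[ap].sum() - sky * ap.sum()             # sky-subtracted DN

        def calibration_factor(images, positions, expected_fluxes):
            # expected_fluxes: catalogue stellar spectra folded with the
            # instrument band pass (assumed available from pre-flight data).
            eps = [flux / aperture_signal(img, x, y)
                   for img, (x, y), flux in zip(images, positions, expected_fluxes)]
            return np.mean(eps), np.std(eps)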