Analytical computation of the epidemic threshold on temporal networks
The time variation of contacts in a networked system may fundamentally alter
the properties of spreading processes and affect the condition for large-scale
propagation, as encoded in the epidemic threshold. Despite great interest
in the problem across the physics, applied mathematics, computer science, and
epidemiology communities, a full theoretical understanding is still missing,
currently limited to cases where time-scale separation holds between
spreading and network dynamics or to specific temporal network models. We
consider a Markov chain description of the Susceptible-Infectious-Susceptible
process on an arbitrary temporal network. By adopting a multilayer perspective,
we develop a general analytical derivation of the epidemic threshold in terms
of the spectral radius of a matrix that encodes both network structure and
disease dynamics. The accuracy of the approach is confirmed on a set of
temporal models and empirical networks and against numerical results. In
addition, we explore how the threshold changes when varying the overall time of
observation of the temporal network, so as to provide insights on the optimal
time window for data collection of empirical temporal networked systems. Our
framework is both of fundamental and practical interest, as it offers novel
understanding of the interplay between temporal networks and spreading
dynamics.
Comment: 22 pages, 6 figures
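The threshold computation described above can be sketched numerically. In one common formulation of this approach, each network snapshot contributes a transition factor (1-mu)I + lam*A_t, and the epidemic threshold is the transmissibility lam at which the spectral radius of the product of these factors crosses 1. The sketch below assumes that formulation (dense NumPy matrices, bisection on lam) and is an illustration, not the paper's exact derivation:

```python
import numpy as np

def spectral_radius_propagator(adjacencies, lam, mu):
    """Spectral radius of the product over snapshots of (1-mu)I + lam*A_t,
    for SIS dynamics with transmissibility lam and recovery probability mu."""
    n = adjacencies[0].shape[0]
    P = np.eye(n)
    for A in adjacencies:
        P = ((1.0 - mu) * np.eye(n) + lam * A) @ P
    return max(abs(np.linalg.eigvals(P)))

def epidemic_threshold(adjacencies, mu, lo=1e-6, hi=1.0, tol=1e-6):
    """Bisect for the critical lam at which the spectral radius equals 1.
    Assumes the radius grows monotonically with lam (true for nonnegative A_t)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if spectral_radius_propagator(adjacencies, mid, mu) < 1.0:
            lo = mid  # subcritical: raise the lower bound
        else:
            hi = mid  # supercritical: lower the upper bound
    return 0.5 * (lo + hi)
```

For a temporal network whose two snapshots are both the complete graph on 4 nodes, each factor has largest eigenvalue (1-mu) + 3*lam, so with mu = 0.5 the threshold is lam = 1/6, which the bisection recovers.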
Computing, Modelling, and Scientific Practice: Foundational Analyses and Limitations
This dissertation examines aspects of the interplay between computing and scientific practice. The appropriate foundational framework for such an endeavour is real computability rather than classical computability theory, because the physical sciences, engineering, and applied mathematics mostly employ functions defined on continuous domains. But, contrary to the case of computation over the natural numbers, there is no universally accepted framework for real computation; rather, there are two incompatible approaches, computable analysis and the BSS model, both claiming to formalise algorithmic computation and to offer foundations for scientific computing.
The dissertation consists of three parts. In the first part, we examine what notion of 'algorithmic computation' underlies each approach and how it is respectively formalised. It is argued that the very existence of the two rival frameworks indicates that 'algorithm' is not one unique concept in mathematics but is used in more than one way. We test this hypothesis for consistency with mathematical practice as well as with key foundational works that aim to define the term. As a result, new connections between certain subfields of mathematics and computer science are drawn, and a distinction between 'algorithms' and 'effective procedures' is proposed.
In the second part, we focus on the second goal of the two rival approaches to real computation; namely, to provide foundations for scientific computing. We examine both frameworks in detail, what idealisations they employ, and how they relate to floating-point arithmetic systems used in real computers. We explore limitations and advantages of both frameworks, and answer questions about which one is preferable for computational modelling and which one for addressing general computability issues.
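The gap between idealised real computation and floating-point arithmetic mentioned here can be made concrete: IEEE 754 doubles form a finite set of dyadic rationals, so field axioms that both computable analysis and the BSS model take for granted fail on actual machines. A minimal Python illustration (not drawn from the dissertation itself):

```python
from fractions import Fraction

# 0.1 has no exact binary representation: the stored double is a
# nearby dyadic rational, not the real number 1/10.
assert Fraction(0.1) != Fraction(1, 10)

# Addition of doubles is not associative, unlike addition of reals.
lhs = (0.1 + 0.2) + 0.3
rhs = 0.1 + (0.2 + 0.3)
assert lhs != rhs

# Absorption: adding 1.0 to 1e17 is a no-op at double precision
# (the unit in the last place exceeds 1), so the computed difference
# is 0.0 even though the real-number answer is 1.0.
x = 1e17
assert (x + 1.0) - x == 0.0
```

Both rival frameworks idealise away exactly these effects, which is why their relation to floating-point systems requires separate examination.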
In the third part, analog computing and its relation to analogue (physical) modelling in science are investigated. Based on some paradigmatic cases of the former, a certain view about the nature of computation is defended, and the indispensable role of representation in it is emphasized and accounted for. We also propose a novel account of the distinction between analog and digital computation and, based on it, we compare analog computational modelling to physical modelling. It is concluded that the two practices, despite their apparent similarities, are orthogonal.
Cohesion, Consensus and Extreme Information in Opinion Dynamics
Opinion formation is an important element of social dynamics. It has been widely studied in recent years with tools from physics, mathematics and computer science. Here, a continuous model of opinion dynamics for multiple possible choices is analysed. Its main features are the inclusion of disagreement and the possibility of modulating external information/media effects, from both one and multiple sources. The aim is to identify the effect of the initial cohesion of the population, the interplay between cohesion and media extremism, and the effect of using multiple external sources of information that can influence the system. As numerical simulations show, final consensus, especially in the presence of an external message, depends strongly on these factors. When no external input is present, consensus or segregation is determined by the initial cohesion of the population. Interestingly, when only one external source of information is present, consensus can in general be obtained only when this source is extremely neutral, i.e., no single opinion is strongly promoted, or in the special case of a large initial cohesion and low exposure to the external message. On the contrary, when multiple external sources are allowed, consensus can emerge with one of them even when it is not extremely neutral, i.e., it carries a strong message, for a large range of initial conditions.
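The kind of dynamics the abstract describes can be illustrated with a toy continuous-opinion simulation. The sketch below is not the paper's model: the function name, parameters, and update rule (pairwise attraction plus occasional pull toward an external message, with initial cohesion controlling the spread of starting opinions) are all illustrative assumptions:

```python
import numpy as np

def simulate_opinions(n=200, steps=500, mu=0.1, external=0.0,
                      exposure=0.1, cohesion=0.5, seed=0):
    """Toy continuous opinion model on [-1, 1].
    cohesion in [0, 1]: higher values narrow the initial opinion spread.
    external: position of the media message; exposure: chance per step
    that a random agent is pulled toward it."""
    rng = np.random.default_rng(seed)
    # initial cohesion controls the width of the starting distribution
    x = rng.uniform(-(1.0 - cohesion), 1.0 - cohesion, size=n)
    for _ in range(steps):
        i, j = rng.integers(n, size=2)
        x[i] += mu * (x[j] - x[i])  # attraction toward interaction partner
        x[j] += mu * (x[i] - x[j])
        if rng.random() < exposure:  # media exposure event
            k = rng.integers(n)
            x[k] += mu * (external - x[k])
        np.clip(x, -1.0, 1.0, out=x)
    return x

# High initial cohesion with a neutral external message: opinions
# stay in a narrow band around the centre.
final = simulate_opinions(cohesion=0.8, external=0.0)
```

Even this crude sketch reproduces the qualitative dependence on initial cohesion: a wide starting distribution (low cohesion) takes far longer to contract than a narrow one.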
Some resonances between Eastern thought and Integral Biomathics in the framework of the WLIMES formalism for modelling living systems
Forty-two years ago, Capra published "The Tao of Physics" (Capra, 1975). In this book (page 17) he writes: "The exploration of the atomic and subatomic world in the twentieth century has … necessitated a radical revision of many of our basic concepts" and that, unlike "classical" physics, sub-atomic and quantum "modern physics" shows resonances with Eastern thought and "leads us to a view of the world which is very similar to the views held by mystics of all ages and traditions." This article stresses an analogous situation in biology with respect to a new theoretical approach for studying living systems, Integral Biomathics (IB), which also exhibits some resonances with Eastern thought. Building on earlier research in cybernetics1 and theoretical biology,2 IB has been developed since 2011 by over 100 scientists from a number of disciplines who have been exploring a substantial set of theoretical frameworks. From that effort, the need for a robust core model utilizing advanced mathematics and computation adequate for understanding the behavior of organisms as dynamic wholes was identified. To this end, the authors of this article have proposed WLIMES (Ehresmann and Simeonov, 2012), a formal theory for modeling living systems integrating both the Memory Evolutive Systems (Ehresmann and Vanbremeersch, 2007) and the Wandering Logic Intelligence (Simeonov, 2002b). Its principles will be recalled here with respect to their resonances with Eastern thought.
XFlow: Benchmarking Flow Behaviors over Graphs
The occurrence of diffusion on a graph is a prevalent and significant
phenomenon, as evidenced by the spread of rumors, influenza-like viruses, smart
grid failures, and similar events. Comprehending the behaviors of flow is a
formidable task, due to the intricate interplay between the distribution of
seeds that initiate flow propagation, the propagation model, and the topology
of the graph. The study of networks encompasses a diverse range of academic
disciplines, including mathematics, physics, social science, and computer
science. Network research is nonetheless highly specialized and
compartmentalized, and cooperation across these communities remains limited.
From a machine learning standpoint, there is no cohesive platform for
assessing algorithms across the various domains. One of the primary
obstacles to current research in this field is the
absence of a comprehensive curated benchmark suite to study the flow behaviors
under network scenarios.
To address this disparity, we propose the implementation of a novel benchmark
suite that encompasses a variety of tasks, baseline models, graph datasets, and
evaluation tools. In addition, we present a comprehensive analytical framework
that offers a generalized approach to numerous flow-related tasks across
diverse domains, serving as a blueprint and roadmap. Drawing upon the outcomes
of our empirical investigation, we analyze the advantages and disadvantages of
current foundational models, and we underscore potential avenues for further
study. The datasets, code, and baseline models have been made available for the
public at: https://github.com/XGraphing/XFlo
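One propagation model a benchmark of this kind would typically cover is the independent cascade, in which each newly activated node gets a single chance to activate each inactive neighbour. The minimal, self-contained sketch below is an illustration of that model, not XFlow's actual API:

```python
import random

def independent_cascade(adj, seeds, p=0.2, rng=None):
    """One run of the independent-cascade model on a graph given as an
    adjacency dict {node: [neighbours]}. Each newly activated node gets
    exactly one chance to activate each inactive neighbour with prob p."""
    rng = rng or random.Random(0)
    active = set(seeds)
    frontier = list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj.get(u, []):
                if v not in active and rng.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt  # only freshly activated nodes spread next round
    return active

# Toy graph: a path 0-1-2-3 with an extra leaf 4 attached to node 1.
adj = {0: [1], 1: [0, 2, 4], 2: [1, 3], 3: [2], 4: [1]}
spread = independent_cascade(adj, seeds=[0], p=1.0)
```

With p = 1.0 the cascade deterministically reaches every node connected to the seed set, which makes the extreme cases easy to check; intermediate p values require averaging over many runs, which is exactly the kind of evaluation a shared benchmark standardizes.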