45 research outputs found
Cascading dominates large-scale disruptions in transport over complex networks
The core functionality of many socio-technical systems, such as supply chains, (inter)national trade and human mobility, concerns transport over large, geographically spread complex networks. The dynamical intertwining of many heterogeneous operational elements, agents and locations is an oft-cited generic factor that makes these systems prone to large-scale disruptions: initially localised perturbations amplify and spread over the network, leading to a complete standstill of transport. Our level of understanding of such phenomena, let alone the ability to anticipate or predict their evolution in time, remains rudimentary. We approach the problem with a prime example: railways. Analysing the spreading of train delays on the network with a physical model, supported by data, reveals that the emergence of large-scale disruptions rests on the dynamic interdependencies among multiple ‘layers’ of operational elements (resources and services). These interdependencies provide pathways for the so-called delay-cascading mechanism, which is activated when, constrained by the local unavailability of on-time resources, already-delayed ones are used to operate new services. Cascading locally amplifies delays, which in turn are transported over the network and give rise to new constraints elsewhere. This mechanism is a rich addition to well-understood ones in, e.g., epidemiological spreading or the spreading of rumours and opinions over (contact) networks, and stimulates a rethinking of spreading dynamics on complex networks. Building these concepts into the model enables it to predict the evolution of large-scale disruptions in the railways 30-60 minutes in advance. For transport systems, our work suggests that alleviating constraints, as well as adopting a modular operational approach, would arrest cascading and therefore be effective measures against large-scale disruptions.
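To make the cascading mechanism concrete, the toy script below is an illustrative sketch, not the model of the paper; all names and parameters are invented. A service inherits the delay of whichever vehicle is available at a station, picks up a small conflict penalty if it departs late, and carries the delay to the next station.

```python
# Toy cascading-delay model (illustrative only; all names and parameters are invented).
import random

random.seed(1)

N_STATIONS = 20
N_SERVICES = 200
TURNAROUND_BUFFER = 3.0   # minutes of delay absorbed between arrival and departure
CONFLICT_PENALTY = 2.0    # extra minutes when a late departure loses its scheduled slot

# resources[s]: arrival delays (minutes) of the vehicles currently parked at station s
resources = {s: [0.0, 0.0] for s in range(N_STATIONS)}
resources[0] = [15.0, 0.0]                    # one initially delayed vehicle at station 0

for _ in range(N_SERVICES):
    s = random.randrange(N_STATIONS)          # a new service is scheduled from station s
    if not resources[s]:
        continue                              # no vehicle available: service cannot run
    inherited = min(resources[s])             # dispatcher uses the least-delayed vehicle
    resources[s].remove(inherited)
    departure_delay = max(0.0, inherited - TURNAROUND_BUFFER)
    if departure_delay > 0.0:
        departure_delay += CONFLICT_PENALTY   # local amplification of the delay
    destination = (s + 1) % N_STATIONS        # the delay is transported to the next station
    resources[destination].append(departure_delay)

total = sum(sum(delays) for delays in resources.values())
print(f"total residual delay after {N_SERVICES} services: {total:.1f} min")
```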
A reliable ensemble based approach to semi-supervised learning
Semi-supervised learning (SSL) methods attempt to achieve better classification of unseen data through the use of unlabeled data than can be achieved by learning from the available labeled data alone. Most SSL methods require the user to familiarize themselves with novel, complex concepts and to ensure that the underlying assumptions made by these methods match the problem structure, or they risk a decrease in predictive performance. In this paper, we present the reliable semi-supervised ensemble learning (RESSEL) method, which exploits unlabeled data by using it to generate diverse classifiers through self-training and combines these classifiers into an ensemble for prediction. Our method functions as a wrapper around a supervised base classifier and refrains from introducing additional problem-dependent assumptions. We conduct experiments on a number of commonly used data sets to demonstrate its merit. The results show that RESSEL improves significantly upon the supervised alternatives, provided that the base classifier is able to produce adequate probability-based rankings. RESSEL is shown to be reliable in that it delivers results comparable to supervised learning methods when this requirement is not met, while it also broadens the range of good parameter values. Furthermore, RESSEL is demonstrated to outperform existing self-labeled wrapper approaches.
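A minimal sketch of a self-training ensemble wrapper in the spirit of RESSEL is given below; it is a plausible reconstruction built on scikit-learn, not the authors' reference implementation, and all function names and parameters are illustrative.

```python
# Sketch of a RESSEL-like self-training ensemble (plausible reconstruction, not the
# authors' implementation). Each ensemble member is seeded with a different bootstrap
# of the labeled data, repeatedly pseudo-labels its most confident unlabeled points,
# and the members' probability estimates are averaged at prediction time.
import numpy as np
from sklearn.base import clone
from sklearn.linear_model import LogisticRegression

def self_training_member(base, X_lab, y_lab, X_unlab, n_add=10, n_rounds=5, rng=None):
    rng = rng if rng is not None else np.random.default_rng()
    idx = rng.integers(0, len(X_lab), len(X_lab))       # bootstrap for diversity
    X, y = X_lab[idx], y_lab[idx]
    pool = np.array(X_unlab, copy=True)
    clf = clone(base).fit(X, y)
    for _ in range(n_rounds):
        if len(pool) == 0:
            break
        proba = clf.predict_proba(pool)
        top = np.argsort(proba.max(axis=1))[-n_add:]    # most confident unlabeled points
        X = np.vstack([X, pool[top]])
        y = np.concatenate([y, clf.classes_[proba[top].argmax(axis=1)]])
        pool = np.delete(pool, top, axis=0)
        clf = clone(base).fit(X, y)                     # retrain with pseudo-labels added
    return clf

def ensemble_predict(members, X):
    # average probability estimates over the ensemble, then take the majority class
    mean_proba = np.mean([m.predict_proba(X) for m in members], axis=0)
    return members[0].classes_[mean_proba.argmax(axis=1)]

# usage sketch (X_lab, y_lab, X_unlab, X_test are assumed to be NumPy arrays):
# members = [self_training_member(LogisticRegression(max_iter=1000),
#                                 X_lab, y_lab, X_unlab,
#                                 rng=np.random.default_rng(seed))
#            for seed in range(10)]
# y_pred = ensemble_predict(members, X_test)
```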
Super slowing down in the bond-diluted Ising model
In models in statistical physics, the dynamics often slows down tremendously near the critical point. Usually, the correlation time τ at the critical point increases with system size L in power-law fashion: τ ∼ L^z, which defines the critical dynamical exponent z. We show that this also holds for the two-dimensional bond-diluted Ising model in the regime p > p_c, where p is the parameter denoting the bond concentration, but with a dynamical critical exponent z(p) which shows a strong p dependence. Moreover, we show numerically that z(p), as obtained from the autocorrelation of the total magnetization, diverges when the percolation threshold p_c = 1/2 is approached: z(p) − z(1) ∼ (p − p_c)^(−2). We refer to this observed extremely fast increase of the correlation time with size as super slowing down. Independent measurement data from the mean-square deviation of the total magnetization, which exhibits anomalous diffusion at the critical point, support this result.
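The scaling relation τ ∼ L^z suggests a straightforward way to estimate z(p) from measured correlation times; the sketch below fits a power law in log-log space to placeholder numbers (not data from the paper).

```python
# Fit of the power law tau ~ L^z in log-log space (illustrative numbers, not data
# from the paper).
import numpy as np

L   = np.array([16, 32, 64, 128])              # system sizes (placeholders)
tau = np.array([40.0, 190.0, 880.0, 4100.0])   # correlation times (placeholders)

z, log_a = np.polyfit(np.log(L), np.log(tau), 1)   # slope = dynamical exponent z
print(f"estimated dynamical exponent z = {z:.2f}")

# Repeating this fit for several bond concentrations p yields z(p), whose divergence
# near p_c can then be compared with (p - p_c)**(-2).
```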
Discontinuous evolution of the structure of stretching polycrystalline graphene
Polycrystalline graphene has an inherent tendency to buckle, i.e., to develop an out-of-plane, three-dimensional structure. A force applied to stretch a piece of polycrystalline graphene influences this out-of-plane structure. Even if the graphene is well relaxed, this happens in a nonlinear fashion: occasionally, a tiny increase in the stretching force induces a significant displacement, in close analogy to avalanches, which in turn can create vibrations in the surrounding medium. We establish this effect in computer simulations: by continuously changing the strain, we follow the displacements of the carbon atoms, which turn out to exhibit a discontinuous evolution. Furthermore, the displacements exhibit hysteretic behavior upon the change from low to high stress and back. These behaviors open up another direction in studying the dynamical elasticity of polycrystalline quasi-two-dimensional systems, and in particular the implications for their mechanical and thermal properties.
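A hypothetical post-processing sketch of how such discontinuous jumps could be detected in a displacement-versus-strain signal is given below; the data and the jump criterion are placeholders, not the simulation protocol of the paper.

```python
# Hypothetical detection of avalanche-like displacement jumps during a quasi-static
# strain ramp (placeholder signal and threshold; not the paper's protocol).
import numpy as np

strain = np.linspace(0.0, 0.05, 500)               # imposed strain steps
# placeholder displacement signal: smooth drift plus one abrupt jump near strain 0.03
disp = 0.2 * strain + 0.03 * (strain > 0.03)

increments = np.diff(disp)
threshold = 10 * np.median(np.abs(increments) + 1e-12)   # crude jump criterion
jumps = np.where(increments > threshold)[0]

for i in jumps:
    print(f"discontinuous jump of {increments[i]:.4f} at strain {strain[i + 1]:.4f}")
```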
Structural dynamics of polycrystalline graphene
The exceptional properties of the two-dimensional material graphene make it attractive for many functional applications, for which large-area samples, which are typically polycrystalline, are required. Here, we study the mechanical properties of polycrystalline graphene in computer simulations and connect these to experimentally relevant mechanical properties. In particular, we study the fluctuations in the lateral dimensions of the periodic simulation cell. We show that over short timescales, both the area A and the aspect ratio B of the rectangular periodic box show diffusive behavior under zero external field during the dynamical evolution, with diffusion coefficients D_A and D_B that are related to each other. At longer times, fluctuations in A are bounded, while those in B are not. This makes the direct determination of D_B much more accurate, from which D_A can then be derived indirectly. We then show that the dynamic behavior of polycrystalline graphene under external forces can also be derived from D_A and D_B via the Nernst-Einstein relation. Additionally, we study how the diffusion coefficients depend on the structural properties of the polycrystalline graphene, in particular the density of defects.
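A minimal sketch of extracting a diffusion coefficient such as D_B from the mean-square deviation of a scalar time series is shown below, assuming MSD(t) ≈ 2 D t at short lag times; this illustrates the standard procedure, not the paper's analysis code.

```python
# Estimating a diffusion coefficient from the mean-square deviation of a scalar
# time series, assuming MSD(t) ~ 2 D t at short lag times (standard procedure,
# not the paper's analysis code).
import numpy as np

def msd(x, max_lag):
    """Mean-square deviation of a scalar time series for lags 1..max_lag."""
    return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in range(1, max_lag + 1)])

def diffusion_coefficient(x, dt, max_lag=50):
    lags = np.arange(1, max_lag + 1) * dt
    slope, _ = np.polyfit(lags, msd(np.asarray(x), max_lag), 1)   # MSD = 2 D t
    return slope / 2.0

# usage sketch: a synthetic random walk standing in for the aspect ratio B(t)
rng = np.random.default_rng(0)
B = np.cumsum(rng.normal(0.0, 1e-4, 100_000))
print(f"D_B estimate: {diffusion_coefficient(B, dt=1.0):.2e}")
```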
Approximate dynamical eigenmodes of the Ising model with local spin-exchange moves
We establish that the Fourier modes of the magnetization serve as the dynamical eigenmodes for the two-dimensional Ising model at the critical temperature with local spin-exchange moves, i.e., Kawasaki dynamics. We obtain the dynamical scaling properties for these modes and use them to calculate the time evolution of two dynamical quantities for the system, namely, the autocorrelation function and the mean-square deviation of the line magnetizations. At intermediate times 1 ≪ t ≪ L^(z_c), where z_c = 4 − η = 15/4 is the dynamical critical exponent of the model, we find that the line magnetization undergoes anomalous diffusion. Following our recent work on anomalous diffusion in spin models, we demonstrate that the generalized Langevin equation with a memory kernel consistently describes the anomalous diffusion, verifying the corresponding fluctuation-dissipation theorem with the calculation of the force autocorrelation function.
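The toy script below illustrates Kawasaki (local spin-exchange) dynamics for the 2D Ising model and the Fourier modes of the line magnetizations; it is an illustrative sketch, not the authors' simulation code.

```python
# Toy Kawasaki (spin-exchange) dynamics for the 2D Ising model at T_c, plus the
# Fourier modes of the line magnetizations (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(0)
L = 32
T_C = 2.0 / np.log(1.0 + np.sqrt(2.0))     # critical temperature (J = k_B = 1)
spins = rng.choice([-1, 1], size=(L, L))   # total magnetization is conserved by the moves

def local_field(s, i, j):
    """Sum of the four nearest-neighbour spins of site (i, j), periodic boundaries."""
    return s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]

def kawasaki_sweep(s, T):
    for _ in range(L * L):
        i, j = rng.integers(0, L, 2)
        di, dj = [(1, 0), (-1, 0), (0, 1), (0, -1)][rng.integers(0, 4)]
        k, m = (i + di) % L, (j + dj) % L
        if s[i, j] == s[k, m]:
            continue                        # exchanging equal spins changes nothing
        # energy change of swapping the antiparallel pair (their mutual bond is unchanged)
        dE = 2 * s[i, j] * ((local_field(s, i, j) - s[k, m]) - (local_field(s, k, m) - s[i, j]))
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i, j], s[k, m] = s[k, m], s[i, j]

def line_mode(s, n=1):
    """n-th Fourier mode of the line (row) magnetizations."""
    return np.fft.fft(s.sum(axis=1))[n]

for _ in range(5):
    kawasaki_sweep(spins, T_C)
print("lowest line-magnetization Fourier mode:", line_mode(spins))
```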
Efficient Structural Relaxation of Polycrystalline Graphene Models
Large samples of experimentally produced graphene are polycrystalline. For the study of this material, it helps to have realistic computer samples that are also polycrystalline. A common approach to producing such samples in computer simulations is based on the method of Wooten, Winer, and Weaire, originally introduced for the simulation of amorphous silicon. We introduce an early-rejection variation of their method, applied to graphene, which exploits the local nature of the structural changes to achieve a significant speed-up in the relaxation of the material without compromising the dynamics. We test it on a 3200-atom sample, obtaining a speed-up of between one and two orders of magnitude. We also introduce a further variation, called early decision, specifically for relaxing large samples even faster, and we test it on two samples of 10,024 and 20,000 atoms, obtaining a further speed-up of an order of magnitude. Furthermore, we provide a graphical manipulation tool to remove unwanted artifacts in a sample, such as bond crossings.
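The runnable toy below sketches only the control-flow idea behind an early-rejection Metropolis move: fix the acceptance threshold first and abandon the expensive energy evaluation as soon as a running lower bound exceeds it. The harmonic bond energy and all parameters are placeholders; this is not the method or potential used in the paper.

```python
# Toy early-rejection Metropolis move (control-flow illustration only; the harmonic
# bond energy and all parameters are placeholders, not the method of the paper).
import math
import random

random.seed(2)
K_BOND, R0, TEMPERATURE = 20.0, 1.42, 0.1     # toy force constant, bond length, k_B T

def bond_energy(r):
    return 0.5 * K_BOND * (r - R0) ** 2       # non-negative toy bond term

def metropolis_early_reject(old_bonds, new_bonds):
    """Accept or reject a proposed set of bond lengths, aborting early when possible."""
    e_old = sum(bond_energy(r) for r in old_bonds)
    # Metropolis: accept iff e_new <= e_old - T * ln(u); fix the threshold up front.
    threshold = e_old - TEMPERATURE * math.log(random.random())
    e_new = 0.0
    for r in new_bonds:                       # every term is >= 0, so the partial sum
        e_new += bond_energy(r)               # is a lower bound on the full new energy
        if e_new > threshold:
            return False                      # early rejection: skip the remaining terms
    return True

old = [1.42, 1.40, 1.45]
new = [1.42, 1.80, 1.45]                      # proposal with one strongly stretched bond
print("move accepted:", metropolis_early_reject(old, new))
```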
