
    Bursting activity spreading through asymmetric interactions

    People communicate with those who share a common background or interest through social networking services (SNSs). News and messages propagate through the inhomogeneous connections of an SNS via sharing and follow-up comments. Such human activity is known to lead to endogenous bursting in the rate of message occurrences. We analyze a multi-dimensional self-exciting process to reveal how the bursting activity depends on the topology of connections and on the distribution of interaction strength across those connections. We determine the critical conditions for the cases where interaction strength is regulated at either the input or the output side of each person. Under input regulation, the network may exhibit bursting at infinitesimal interaction strength if the dispersion of degrees diverges, as in scale-free networks. In contrast, under output regulation, the critical interaction strength, represented by the average number of events added by a single event, is a constant $1 - 1/\sqrt{2} \approx 0.3$, independent of the degree dispersion. Thus, the stability of human activity depends crucially not only on the topology of connections but also on the manner in which interactions are distributed among the connections. Comment: 8 pages, 8 figures
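    As a rough illustration of the kind of stability analysis involved, the sketch below checks the standard linear-stability condition for a multivariate Hawkes process: the spectral radius of the branching matrix (the expected number of events each event directly triggers) must stay below its critical value. The toy network, the parameter values, and the way each node's total strength is spread over its incoming ("input") or outgoing ("output") connections are assumptions for illustration only; the sketch does not reproduce the paper's derivation or its specific critical values.

```python
# Sketch only: generic spectral-radius stability check for a multivariate
# Hawkes process. B[i, j] is the expected number of events of node i directly
# triggered by a single event of node j; the process remains non-explosive
# while the spectral radius of B stays below one. The network model and both
# normalisation rules are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 300
# Toy directed network with heterogeneous (heavy-tailed) connection propensities.
p = np.minimum((rng.pareto(1.5, n) + 1) * 4 / n, 1.0)
A = (rng.random((n, n)) < p[:, None]).astype(float)   # A[i, j] = 1: j can excite i
np.fill_diagonal(A, 0.0)

def branching_matrix(A, strength, regulate):
    """Spread each node's total interaction strength over its incoming
    connections ("input" regulation) or outgoing connections ("output")."""
    B = np.zeros_like(A)
    axis = 1 if regulate == "input" else 0
    deg = A.sum(axis=axis, keepdims=True)
    np.divide(A * strength, deg, out=B, where=deg > 0)
    return B

for mode in ("input", "output"):
    B = branching_matrix(A, strength=0.3, regulate=mode)
    rho = np.max(np.abs(np.linalg.eigvals(B)))
    print(f"{mode:>6} regulation: spectral radius = {rho:.3f} "
          f"({'subcritical' if rho < 1 else 'supercritical'})")
```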

    Reactive point processes: A new approach to predicting power failures in underground electrical systems

    Reactive point processes (RPPs) are a new statistical model designed for predicting discrete events in time based on past history. RPPs were developed to handle an important problem within the domain of electrical grid reliability: short-term prediction of electrical grid failures ("manhole events"), including outages, fires, explosions and smoking manholes, which threaten public safety and the reliability of electrical service in cities. RPPs incorporate self-exciting, self-regulating and saturating components. The self-excitement occurs as a result of a past event, which causes a temporary rise in vulnerability to future events. The self-regulation occurs as a result of an external inspection, which temporarily lowers vulnerability to future events. RPPs can saturate when too many events or inspections occur close together, which ensures that the probability of an event stays within a realistic range. Two of the operational challenges for power companies are (i) making continuous-time failure predictions and (ii) cost/benefit analysis for decision making and proactive maintenance. RPPs are naturally suited to handling both of these challenges. We use the model to predict power-grid failures in Manhattan over a short-term horizon, and to provide a cost/benefit analysis of different proactive maintenance programs. Comment: Published at http://dx.doi.org/10.1214/14-AOAS789 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
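    A minimal sketch of the ingredients described above, assuming exponential decay kernels and a tanh saturation (the actual functional forms and fitted parameters in the paper may differ): the conditional intensity rises after past events, falls after inspections, and saturates so that it stays in a realistic range.

```python
# Sketch only, in the spirit of a reactive point process: exponential memory
# kernels and a tanh saturation are assumptions, not the paper's fitted model.
import math

def rpp_intensity(t, event_times, inspection_times,
                  base_rate=0.05, excite=1.0, regulate=0.8,
                  decay_event=0.10, decay_inspect=0.05, cap=3.0):
    """Conditional intensity at time t given past failures and inspections."""
    excitation = sum(math.exp(-decay_event * (t - s)) for s in event_times if s < t)
    regulation = sum(math.exp(-decay_inspect * (t - s)) for s in inspection_times if s < t)
    # Saturation: tanh bounds the accumulated effect of closely spaced events
    # or inspections, keeping the intensity within a realistic range.
    boost = excite * math.tanh(excitation) - regulate * math.tanh(regulation)
    return base_rate * max(0.0, min(1.0 + boost, cap))

# Example: two past failures and one recent inspection.
print(rpp_intensity(10.0, event_times=[1.0, 8.5], inspection_times=[9.0]))
```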

    The Block Point Process Model for Continuous-Time Event-Based Dynamic Networks

    We consider the problem of analyzing timestamped relational events between a set of entities, such as messages between users of an online social network. Such data are often analyzed using static or discrete-time network models, which discard a significant amount of information by aggregating events over time to form network snapshots. In this paper, we introduce a block point process model (BPPM) for continuous-time event-based dynamic networks. The BPPM is inspired by the well-known stochastic block model (SBM) for static networks. We show that networks generated by the BPPM follow an SBM in the limit of a growing number of nodes. We use this property to develop principled and efficient local search and variational inference procedures initialized by regularized spectral clustering. We fit BPPMs with exponential Hawkes processes to analyze several real network data sets, including a Facebook wall post network with over 3,500 nodes and 130,000 events. Comment: To appear at The Web Conference 201
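    The sketch below illustrates the block idea in hypothetical form: events on each ordered pair of nodes are generated by a univariate Hawkes process with an exponential kernel whose baseline rate depends only on the two nodes' block memberships, simulated here with Ogata-style thinning. The block count, parameter values, and simulation routine are assumptions for illustration, not the paper's inference procedure.

```python
# Sketch only: a block point process where the baseline event rate of an
# ordered node pair depends only on the blocks of the two nodes.
import numpy as np

rng = np.random.default_rng(1)

def simulate_exp_hawkes(mu, alpha, beta, t_max):
    """Simulate a univariate Hawkes process with exponential kernel by thinning."""
    t, events = 0.0, []
    while t < t_max:
        lam_bar = mu + alpha * sum(np.exp(-beta * (t - s)) for s in events)
        t += rng.exponential(1.0 / lam_bar)           # candidate next event time
        lam_t = mu + alpha * sum(np.exp(-beta * (t - s)) for s in events)
        if t < t_max and rng.random() < lam_t / lam_bar:
            events.append(t)                          # accept candidate
    return events

n_nodes, n_blocks = 6, 2
blocks = rng.integers(0, n_blocks, n_nodes)           # block membership per node
mu = np.array([[0.50, 0.05],                          # baseline rate per block pair
               [0.05, 0.30]])
alpha, beta = 0.4, 1.0                                # shared excitation parameters

for i in range(n_nodes):
    for j in range(n_nodes):
        if i != j:
            ts = simulate_exp_hawkes(mu[blocks[i], blocks[j]], alpha, beta, t_max=10.0)
            if ts:
                print(f"{i}->{j} (blocks {blocks[i]},{blocks[j]}): {len(ts)} events")
```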

    Modelling Direct Messaging Networks with Multiple Recipients for Cyber Deception

    Cyber deception is emerging as a promising approach to defending networks and systems against attackers and data thieves. However, despite being relatively cheap to deploy, the generation of realistic content at scale is very costly, because rich, interactive deceptive technologies are largely hand-crafted. With recent improvements in machine learning, we now have the opportunity to bring scale and automation to the creation of realistic and enticing simulated content. In this work, we propose a framework to automate the generation of email and instant-messaging-style group communications at scale. Such messaging platforms within organisations contain a great deal of valuable information inside private communications and document attachments, making them an enticing target for an adversary. We address two key aspects of simulating this type of system: modelling when and with whom participants communicate, and generating topical, multi-party text to populate simulated conversation threads. We present the LogNormMix-Net Temporal Point Process as an approach to the first of these, building upon the intensity-free modelling approach of Shchur et al. to create a generative model for unicast and multi-cast communications. We demonstrate the use of fine-tuned, pre-trained language models to generate convincing multi-party conversation threads. A live email server is simulated by uniting our LogNormMix-Net TPP (to generate the communication timestamps, senders and recipients) with the language model, which generates the contents of the multi-party email threads. We evaluate the generated content with respect to a number of realism-based properties that encourage a model to learn to generate content that will engage the attention of an adversary in order to achieve a deception outcome.
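    A minimal sketch of the intensity-free idea underlying LogNormMix-style temporal point processes: inter-event times are drawn from a mixture of log-normal distributions whose parameters would, in the full model, be produced by a neural network conditioned on the communication history. The fixed mixture parameters, the hypothetical user names, and the uniform recipient rule below are illustrative assumptions only.

```python
# Sketch only: inter-event times drawn from a log-normal mixture, standing in
# for the history-conditioned mixture a LogNormMix-style model would output.
# User names, mixture parameters, and the recipient rule are hypothetical.
import numpy as np

rng = np.random.default_rng(2)

def sample_inter_event_time(weights, means, stds):
    """Draw one inter-event time from a mixture of log-normal distributions."""
    k = rng.choice(len(weights), p=weights)
    return float(np.exp(rng.normal(means[k], stds[k])))

users = ["alice", "bob", "carol", "dave"]             # hypothetical participants
weights, means, stds = [0.6, 0.4], [0.0, 2.0], [0.5, 1.0]

t = 0.0
for _ in range(5):
    t += sample_inter_event_time(weights, means, stds)
    sender = rng.choice(users)
    # Multi-cast: each other user is independently included as a recipient.
    recipients = [u for u in users if u != sender and rng.random() < 0.5]
    print(f"t={t:7.2f}  {sender} -> {', '.join(recipients) or '(no recipients)'}")
```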

    Correlated bursts and the role of memory range

    Inhomogeneous temporal processes in natural and social phenomena have been described in terms of bursts: rapidly occurring events within short time periods, alternating with long periods of low activity. In addition to the analysis of heavy-tailed inter-event time distributions, higher-order correlations between inter-event times, called correlated bursts, have been studied only recently. As the possible mechanisms underlying such correlated bursts are far from fully understood, we devise a simple model for correlated bursts using a self-exciting point process with a variable memory range. Here the probability that a new event occurs is determined by a memory function that is the sum of decaying memories of past events. In order to incorporate the noise and/or limited memory capacity of real systems, we apply two memory-loss mechanisms, retaining either a fixed or a variable number of memories. Using theoretical analysis and numerical simulations, we find that an excessive memory effect may lead to a Poissonian process, which implies that there exists an intermediate memory range that generates correlated bursts of a magnitude comparable to empirical findings. Our results thus provide a deeper understanding of how long-range memory affects correlated bursts. Comment: 9 pages, 7 figures
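    A minimal sketch of a self-exciting process with a finite memory range in the spirit of the model described above: the event probability at each step is a baseline plus the sum of decaying memories of the most recently remembered events, and the memory holds at most a fixed number of past events (the fixed-capacity loss mechanism). The power-law kernel and all parameter values are assumptions for illustration.

```python
# Sketch only: a discrete-time self-exciting process whose event probability
# is a memory function summing decaying contributions of remembered events,
# with a fixed memory capacity (oldest memories are forgotten first).
import random

random.seed(0)

def simulate(steps=10_000, base=0.001, strength=0.3, exponent=1.5, capacity=20):
    memory, events = [], []
    for t in range(steps):
        # Memory function: baseline plus decaying memories of past events.
        p = base + strength * sum((t - s + 1) ** (-exponent) for s in memory)
        if random.random() < min(p, 1.0):
            events.append(t)
            memory.append(t)
            if len(memory) > capacity:    # fixed-capacity memory loss
                memory.pop(0)
    return events

events = simulate()
gaps = [b - a for a, b in zip(events, events[1:])]
print(f"{len(events)} events; mean inter-event time "
      f"{sum(gaps) / max(len(gaps), 1):.1f} steps")
```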