
    Predictive intelligence to the edge through approximate collaborative context reasoning

    We focus on Internet of Things (IoT) environments in which a network of sensing and computing devices is responsible for locally processing contextual data, reasoning, and collaboratively inferring the appearance of a specific phenomenon (event). Pushing processing and knowledge inference to the edge of the IoT network allows the complexity of the event reasoning process to be distributed into many manageable pieces and physically located at the source of the contextual information. This enables huge volumes of rich data streams to be processed in real time that would be prohibitively complex and costly to deliver to a traditional centralized Cloud system. We propose a lightweight, energy-efficient, distributed, adaptive, multiple-context-perspective event reasoning model under uncertainty on each IoT device (sensor/actuator). Each device senses and processes context data and infers events based on different local context perspectives: (i) expert knowledge on event representation, (ii) outlier inference, and (iii) deviation from the locally predicted context. This novel approximate reasoning paradigm is achieved through a contextualized, collaborative belief-driven clustering process, where clusters of devices are formed according to their belief in the presence of events. Our distributed and federated intelligence model efficiently identifies any localized abnormality in the contextual data in light of event reasoning by aggregating local degrees of belief, and updates and adjusts its knowledge in response to contextual data outliers and novelty detection. We provide a comprehensive experimental and comparative assessment of our model over real contextual data against other localized and centralized event detection models, and show the benefits stemming from its adoption: up to three orders of magnitude less energy consumption with a high quality of inference.
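    As a rough illustration of the belief-driven clustering idea described above (the function names, belief formula, and thresholds below are hypothetical, not taken from the paper), each device could turn deviation from its locally predicted context into a degree of belief, and devices could then be grouped by whether they believe the event is present:

    ```python
    import numpy as np

    def local_belief(readings, predicted, threshold=2.0):
        """Hypothetical local belief: the fraction of readings that deviate
        from the locally predicted context by more than `threshold`
        standard deviations (one of the three local perspectives)."""
        z = np.abs((readings - predicted) / (readings.std() + 1e-9))
        return float((z > threshold).mean())

    def belief_clusters(beliefs, cut=0.5):
        """Crude stand-in for collaborative belief-driven clustering:
        split device indices by whether their belief in the event
        exceeds a cut-off."""
        event = [i for i, b in enumerate(beliefs) if b >= cut]
        no_event = [i for i, b in enumerate(beliefs) if b < cut]
        return event, no_event
    ```

    In the paper's setting this grouping would be computed collaboratively from exchanged degrees of belief rather than by a single cut-off, but the sketch shows the basic mechanics: local evidence becomes a belief, and beliefs drive the clustering.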

    Training Gaussian Mixture Models at Scale via Coresets

    How can we train a statistical mixture model on a massive data set? In this work we show how to construct coresets for mixtures of Gaussians. A coreset is a weighted subset of the data which guarantees that models fitting the coreset also provide a good fit for the original data set. We show that, perhaps surprisingly, Gaussian mixtures admit coresets of size polynomial in the dimension and the number of mixture components, while being independent of the data set size. Hence, one can harness computationally intensive algorithms to compute a good approximation on a significantly smaller data set. More importantly, such coresets can be efficiently constructed in both distributed and streaming settings and do not impose restrictions on the data-generating process. Our results rely on a novel reduction of statistical estimation to problems in computational geometry and on new combinatorial complexity results for mixtures of Gaussians. Empirical evaluation on several real-world datasets suggests that our coreset-based approach enables a significant reduction in training time with negligible approximation error.
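    As a toy sketch of the coreset idea (not the paper's actual construction, which bounds each point's sensitivity via a bicriteria clustering solution), one can importance-sample points with a probability mixing a uniform term and a distance term, then attach inverse-probability weights so that weighted sums over the sample approximate sums over the full data set:

    ```python
    import numpy as np

    def toy_coreset(X, m, seed=0):
        """Toy coreset by importance sampling: points far from the data
        mean are sampled more often and down-weighted in compensation.
        Real GMM coresets use per-point sensitivity bounds; this only
        illustrates the sample-and-reweight mechanics."""
        rng = np.random.default_rng(seed)
        d2 = ((X - X.mean(axis=0)) ** 2).sum(axis=1)
        p = 0.5 / len(X) + 0.5 * d2 / d2.sum()  # uniform + distance mixture
        idx = rng.choice(len(X), size=m, replace=True, p=p)
        w = 1.0 / (m * p[idx])                  # inverse-probability weights
        return X[idx], w
    ```

    Any estimator that can consume per-point weights (e.g. a weighted EM step) can then be run on the small pair `(X[idx], w)` instead of the full data set.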

    Stacking Gravitational Wave Signals from Soft Gamma Repeater Bursts

    Soft gamma repeaters (SGRs) have unique properties that make them intriguing targets for gravitational wave (GW) searches. They are nearby, their burst emission mechanism may involve neutron star crust fractures and excitation of quasi-normal modes, and they burst repeatedly and sometimes spectacularly. A recent LIGO search for transient GW from these sources placed upper limits on a set of almost 200 individual SGR bursts. These limits were within the theoretically predicted range of some models. We present a new search strategy which builds upon the method used there by "stacking" potential GW signals from multiple SGR bursts. We assume that the variation in the time difference between burst electromagnetic emission and burst GW emission is small relative to the GW signal duration, and we time-align GW excess power time-frequency tilings containing individual burst triggers to their corresponding electromagnetic emissions. Using Monte Carlo simulations, we confirm that gains in GW energy sensitivity of N^{1/2} are possible, where N is the number of stacked SGR bursts. Estimated sensitivities for a mock search for gravitational waves from the 2006 March 29 storm from SGR 1900+14 are also presented, for two GW emission models, "fluence-weighted" and "flat" (unweighted). Comment: 17 pages, 16 figures, submitted to PR
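    The N^{1/2} scaling can be seen in a toy Monte Carlo (this is generic incoherent-stacking statistics, not the LIGO excess-power pipeline): summing N time-aligned power measurements grows the signal term as N while the background fluctuation grows only as sqrt(N), so the power SNR improves by sqrt(N):

    ```python
    import numpy as np

    def stacked_power_snr(signal_power, noise_std, n, trials=20000, seed=0):
        """Toy incoherent stacking: sum n excess-power measurements.
        The signal contribution grows as n, the background fluctuation
        as sqrt(n), so the power SNR improves by a factor of sqrt(n)."""
        rng = np.random.default_rng(seed)
        background = rng.normal(0.0, noise_std, (trials, n)).sum(axis=1)
        return n * signal_power / background.std()
    ```

    For example, stacking 16 bursts of equal power should improve the power SNR by about sqrt(16) = 4, which is the sense in which energy sensitivity gains of N^{1/2} arise.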