
    Detecting change points in the large-scale structure of evolving networks

    Interactions among people or objects are often dynamic in nature and can be represented as a sequence of networks, each providing a snapshot of the interactions over a brief period of time. An important task in analyzing such evolving networks is change-point detection, in which we both identify the times at which the large-scale pattern of interactions changes fundamentally and quantify how large and what kind of change occurred. Here, we formalize for the first time the network change-point detection problem within an online probabilistic learning framework and introduce a method that can reliably solve it. This method combines a generalized hierarchical random graph model with a Bayesian hypothesis test to quantitatively determine if, when, and precisely how a change point has occurred. We analyze the detectability of our method using synthetic data with known change points of different types and magnitudes, and show that this method is more accurate than several previously used alternatives. Applied to two high-resolution evolving social networks, this method identifies a sequence of change points that align with known external "shocks" to these networks.
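    The abstract describes the mechanics only at a high level: fit a network model to a window of snapshots and test whether splitting the window at some candidate time explains the data better than a single model. The Python sketch below captures that sliding-window logic under simplifying assumptions; it swaps the paper's generalized hierarchical random graph and Bayesian hypothesis test for a plain Bernoulli edge-density model and a log-likelihood-ratio score, and all names, synthetic data, and thresholds are illustrative rather than the authors' implementation.

    import numpy as np

    def bernoulli_loglik(snapshots):
        """Log-likelihood of a stack of binary snapshots under one
        Bernoulli edge-density model (a stand-in for the paper's GHRG)."""
        edges = snapshots.sum()
        total = snapshots.size
        p = np.clip(edges / total, 1e-9, 1 - 1e-9)
        return edges * np.log(p) + (total - edges) * np.log(1 - p)

    def change_point_score(window):
        """Compare 'no change' against the best split of a window of
        snapshots; return the split index and its log-likelihood-ratio."""
        null = bernoulli_loglik(window)
        best_tau, best_score = None, -np.inf
        for tau in range(1, len(window)):
            alt = bernoulli_loglik(window[:tau]) + bernoulli_loglik(window[tau:])
            if alt - null > best_score:
                best_tau, best_score = tau, alt - null
        return best_tau, best_score

    # Synthetic sequence: 20 sparse snapshots followed by 20 denser ones.
    rng = np.random.default_rng(0)
    seq = np.concatenate([rng.random((20, 30, 30)) < 0.05,
                          rng.random((20, 30, 30)) < 0.20])
    for t in range(len(seq) - 16):
        tau, score = change_point_score(seq[t:t + 16])
        if score > 50.0:  # illustrative threshold, not the paper's Bayesian test
            print(f"possible change near snapshot {t + tau}, score = {score:.1f}")
            break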

    Polygons vs. clumps of discs: a numerical study of the influence of grain shape on the mechanical behaviour of granular materials

    We performed a series of numerical vertical compression tests on assemblies of 2D granular material using a Discrete Element code and studied the results with regard to grain shape. The samples consist of 5,000 grains made from either 3 overlapping discs (clumps, i.e. grains with concavities) or six-edged polygons (convex grains). These two grain types have similar external envelopes, controlled by a geometrical parameter α. In this paper, the numerical procedure applied is briefly presented, followed by a description of the granular model used. Observations and mechanical analyses of dense and loose granular assemblies under isotropic loading are made. The mechanical response of our numerical granular samples is then studied in the framework of the classical vertical compression test with constant lateral stress (biaxial test). The comparison of the macroscopic responses of dense and loose samples with various grain shapes shows that, when α is considered a concavity parameter, it is a relevant variable for increasing the mechanical performance of dense samples. When α is considered a deviation of the grain envelope from perfect sphericity, it can control the mechanical performance at large strains. Finally, we present some remarks concerning the kinematics of the deformed samples: while some polygon samples subjected to vertical compression present large damage zones (for any polygon shape), dense samples made of clumps always exhibit thin reflecting shear bands. This paper was written as part of the CEGEO research project (www.granuloscience.com). Comment: This version of the paper does not include figures; visit the journal web site to download the final version with the figures.
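    For readers unfamiliar with the biaxial test mentioned above, the macroscopic response is usually summarised by the stress ratio and the volumetric strain as functions of the axial strain. The short Python sketch below shows only that post-processing step, applied to fabricated stress and geometry histories; it is not the Discrete Element simulation itself, and every array and value here is made up for illustration.

    import numpy as np

    # Fabricated histories standing in for quantities a DEM code would log
    # during a biaxial test (vertical compression at constant lateral stress).
    axial_strain = np.linspace(0.0, 0.10, 200)                 # eps_1
    sigma_1 = 100e3 * (1.0 + 4.0 * axial_strain)               # vertical stress [Pa]
    sigma_3 = np.full_like(axial_strain, 100e3)                # constant lateral stress [Pa]
    height = 1.0 * (1.0 - axial_strain)                        # sample height [m]
    width = 1.0 * (1.0 + 0.4 * axial_strain)                   # sample width [m]

    # Classical macroscopic descriptors used to compare dense and loose samples.
    stress_ratio = sigma_1 / sigma_3                           # peak value ~ strength
    volumetric_strain = 1.0 - (height * width) / (1.0 * 1.0)   # 2D (area) strain, contraction > 0

    print("peak stress ratio sigma_1/sigma_3:", stress_ratio.max())
    print("final volumetric strain:", volumetric_strain[-1])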

    Making Good on LSTMs' Unfulfilled Promise

    LSTMs promise much for financial time-series analysis and for temporal and cross-sectional inference, but we find that they do not deliver in a real-world financial management task. We examine an alternative called Continual Learning (CL), a memory-augmented approach which can provide transparent explanations, i.e. which memory did what and when. This work has implications for many financial applications, including credit, time-varying fairness in decision making, and more. We make three important new observations. Firstly, as well as being more explainable, time-series CL approaches outperform LSTMs and a simple sliding-window learner using feed-forward neural networks (FFNN). Secondly, we show that CL based on a sliding-window learner (FFNN) is more effective than CL based on a sequential learner (LSTM). Thirdly, we examine how real-world time-series noise impacts several similarity approaches used in CL memory addressing. We provide these insights using an approach called Continual Learning Augmentation (CLA), tested on a complex real-world problem: emerging-market equities investment decision making. CLA provides a test-bed as it can be based on different types of time-series learners, allowing LSTM and FFNN learners to be tested side by side. CLA is also used to test several distance approaches used in a memory recall-gate: Euclidean distance (ED), dynamic time warping (DTW), auto-encoders (AE) and a novel hybrid approach, warp-AE. We find that ED underperforms DTW and AE, but warp-AE shows the best overall performance in a real-world financial task.
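    The recall-gate comparison in CLA is easiest to picture as a nearest-neighbour lookup over stored memory keys under an interchangeable distance. The Python sketch below implements only the two simplest distances named in the abstract, Euclidean distance and dynamic time warping, plus a toy recall function; the auto-encoder and warp-AE variants are omitted, and all names and data are hypothetical rather than the authors' CLA code.

    import numpy as np

    def euclidean(a, b):
        """Euclidean distance between two equal-length series."""
        return float(np.linalg.norm(a - b))

    def dtw(a, b):
        """Plain dynamic-time-warping distance (no window constraint)."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
        return float(cost[n, m])

    def recall(memory_keys, query, metric, top_k=1):
        """Return the top_k stored keys closest to the query under `metric`."""
        return sorted(memory_keys, key=lambda k: metric(np.asarray(k), query))[:top_k]

    # Toy memory: keys are short return histories; the gate recalls the most
    # similar past context before reusing whatever model state was stored with it.
    memory_keys = [np.array([0.01, -0.02, 0.03]), np.array([0.10, 0.12, 0.08])]
    query = np.array([0.02, -0.01, 0.02])
    print(recall(memory_keys, query, euclidean))
    print(recall(memory_keys, query, dtw))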

    A Wavelet-Based Algorithm for the Spatial Analysis of Poisson Data

    Wavelets are scalable, oscillatory functions that deviate from zero only within a limited spatial regime and have average value zero. In addition to their use as source characterizers, wavelet functions are rapidly gaining currency within the source detection field. Wavelet-based source detection involves the correlation of scaled wavelet functions with binned, two-dimensional image data. If the chosen wavelet function exhibits the property of vanishing moments, significantly non-zero correlation coefficients will be observed only where there are high-order variations in the data; e.g., they will be observed in the vicinity of sources. In this paper, we describe the mission-independent, wavelet-based source detection algorithm WAVDETECT, part of the CIAO software package. Aspects of our algorithm include: (1) the computation of local, exposure-corrected normalized (i.e. flat-fielded) background maps; (2) the correction for exposure variations within the field-of-view; (3) its applicability within the low-counts regime, as it does not require a minimum number of background counts per pixel for the accurate computation of source detection thresholds; (4) the generation of a source list in a manner that does not depend upon a detailed knowledge of the point spread function (PSF) shape; and (5) error analysis. These features make our algorithm considerably more general than previous methods developed for the analysis of X-ray image data, especially in the low-count regime. We demonstrate the algorithm's robustness by applying it to various images. Comment: Accepted for publication in Ap. J. Supp. (v. 138, Jan. 2002). 61 pages, 23 figures, expands to 3.8 Mb. Abstract abridged for astro-ph submission.
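    As a rough illustration of the core correlation step described above, the Python sketch below builds a zero-mean Mexican-hat kernel, correlates it with a fake binned Poisson image at a single scale, and keeps pixels whose coefficients exceed a crude global threshold. WAVDETECT itself derives per-pixel thresholds from exposure-corrected background maps and iterates over several scales, so the scale, threshold, and test image here are purely illustrative and not the algorithm's actual procedure.

    import numpy as np
    from scipy.signal import fftconvolve

    def mexican_hat_2d(sigma, size):
        """2D Mexican-hat wavelet kernel; the mean is subtracted so that a
        flat background correlates to approximately zero."""
        ax = np.arange(size) - (size - 1) / 2.0
        x, y = np.meshgrid(ax, ax)
        r2 = (x**2 + y**2) / (2.0 * sigma**2)
        kernel = (1.0 - r2) * np.exp(-r2)
        return kernel - kernel.mean()

    # Fake binned Poisson image with one bright point-like source.
    rng = np.random.default_rng(1)
    image = rng.poisson(2.0, size=(128, 128)).astype(float)
    image[60:63, 70:73] += 25.0

    # Correlate at one wavelet scale and keep pixels above a crude threshold.
    coeffs = fftconvolve(image, mexican_hat_2d(sigma=2.0, size=17), mode="same")
    threshold = coeffs.mean() + 5.0 * coeffs.std()
    print("candidate source pixels:", int((coeffs > threshold).sum()))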

    Cumulative object categorization in clutter

    In this paper we present an approach based on scene- or part-graphs for geometrically categorizing touching and occluded objects. We use additive RGBD feature descriptors and hashing of graph configuration parameters to describe the spatial arrangement of constituent parts. The experiments presented show quantitatively that this method outperforms our earlier part-voting and sliding-window classification. We evaluated our approach on cluttered scenes and on a 3D dataset containing over 15,000 Kinect scans of more than 100 objects grouped into general geometric categories. Additionally, color, geometric, and combined features were compared for the categorization tasks.
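    The abstract's "hashing of graph configuration parameters" with additive descriptors can be pictured as quantising pairwise part relations into hash keys and summing per-part descriptors under each key. The Python sketch below is only a guess at that idea, with made-up parts, bin sizes, and descriptor contents; it is not the authors' pipeline and omits the RGBD feature extraction entirely.

    import numpy as np
    from collections import defaultdict
    from itertools import combinations

    DESC_DIM = 8  # length of the per-part feature descriptor (illustrative)

    def pair_key(part_a, part_b, dist_bin=0.05, angle_bin=np.pi / 8):
        """Hash key for one pair of parts: quantised centroid distance and
        quantised angle between their surface normals (bin sizes made up)."""
        d = np.linalg.norm(part_a["centroid"] - part_b["centroid"])
        cos = np.clip(np.dot(part_a["normal"], part_b["normal"]), -1.0, 1.0)
        return (int(d / dist_bin), int(np.arccos(cos) / angle_bin))

    def build_table(scenes):
        """Accumulate additively summed part descriptors under each pair key."""
        table = defaultdict(lambda: defaultdict(lambda: np.zeros(DESC_DIM)))
        for label, parts in scenes:
            for a, b in combinations(parts, 2):
                table[pair_key(a, b)][label] += a["descriptor"] + b["descriptor"]
        return table

    # Made-up parts: each has a centroid, a unit normal and a descriptor.
    rng = np.random.default_rng(2)
    def fake_part():
        n = rng.normal(size=3)
        return {"centroid": rng.random(3), "normal": n / np.linalg.norm(n),
                "descriptor": rng.random(DESC_DIM)}

    scenes = [("box", [fake_part() for _ in range(4)]),
              ("cylinder", [fake_part() for _ in range(4)])]
    print("distinct configuration keys:", len(build_table(scenes)))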