    Improved Parallel Algorithms for Spanners and Hopsets

    We use exponential start time clustering to design faster and more work-efficient parallel graph algorithms involving distances. Previous algorithms usually rely on graph decomposition routines with strict restrictions on the diameters of the decomposed pieces. We weaken these bounds in favor of stronger local probabilistic guarantees. This allows more direct analyses of the overall process, giving:
    * Linear work parallel algorithms that construct spanners with $O(k)$ stretch and size $O(n^{1+1/k})$ in unweighted graphs, and size $O(n^{1+1/k} \log k)$ in weighted graphs.
    * Hopsets that lead to the first parallel algorithm for approximating shortest paths in undirected graphs with $O(m\,\mathrm{polylog}\,n)$ work
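
    As a rough, hedged sketch of the clustering primitive named above (a simplified reading, not the paper's exact routine): every vertex draws an exponential shift, clusters grow from shifted start times, and each vertex joins whichever cluster reaches it first in a single shifted multi-source search. The adjacency-list representation, the parameter beta, and the seeding are illustrative assumptions.

    ```python
    import heapq
    import random

    def exp_start_time_clustering(adj, beta, seed=0):
        """Cluster an unweighted graph given as an adjacency list
        (dict: vertex -> iterable of neighbours).

        Each vertex v draws delta_v ~ Exp(beta) and starts growing a
        cluster at time max_delta - delta_v; every vertex joins the
        cluster whose shifted search reaches it first."""
        rng = random.Random(seed)
        delta = {v: rng.expovariate(beta) for v in adj}
        max_delta = max(delta.values())

        # Priority queue of (shifted arrival time, vertex, cluster centre).
        pq = [(max_delta - delta[v], v, v) for v in adj]
        heapq.heapify(pq)
        centre = {}

        while pq:
            t, u, c = heapq.heappop(pq)
            if u in centre:            # already claimed by an earlier cluster
                continue
            centre[u] = c
            for w in adj[u]:
                if w not in centre:    # unweighted edge: arrive at time t + 1
                    heapq.heappush(pq, (t + 1, w, c))
        return centre

    # Tiny usage example on the path graph 0-1-2-3-4.
    if __name__ == "__main__":
        path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
        print(exp_start_time_clustering(path, beta=0.5))
    ```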

    Fuzzy Clustering Image Segmentation Based on Particle Swarm Optimization

    Image segmentation is the technique of partitioning an image into regions with distinct characteristics and extracting the objects of interest, and it is a key step from image processing to image analysis. Building on a comprehensive study of image segmentation technology, this paper analyzes the advantages and disadvantages of existing fuzzy clustering algorithms and combines particle swarm optimization (PSO), with its global search and rapid convergence, with a fuzzy clustering (FC) algorithm, approaching the problem from the perspective of the particle swarm and of constraints on the fuzzy memberships. The resulting PSO-FC image segmentation algorithm effectively avoids getting trapped in local optima and improves the stability and reliability of the clustering. Experimental results show that the new PSO-FC algorithm achieves excellent image segmentation results
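
    As a hedged illustration of the PSO-plus-fuzzy-clustering idea summarized above (a simplified reading, not the paper's exact formulation), the sketch below lets each particle encode a set of cluster centres for grey-level pixel intensities, scores particles with the standard fuzzy c-means objective, and updates the swarm with a basic global-best PSO; the inertia and acceleration constants and the 1-D intensity features are illustrative choices.

    ```python
    import numpy as np

    def fcm_objective(pixels, centres, m=2.0, eps=1e-12):
        """Fuzzy c-means objective J_m for 1-D intensities, together with
        the memberships it implies (closed form given the centres)."""
        d2 = (pixels[:, None] - centres[None, :]) ** 2 + eps   # (N, c)
        u = d2 ** (-1.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)                      # memberships
        return float(np.sum((u ** m) * d2)), u

    def pso_fcm(pixels, c=3, n_particles=20, iters=50, seed=0):
        """Optimise the c cluster centres with a basic global-best PSO."""
        pixels = np.asarray(pixels, dtype=float).ravel()
        rng = np.random.default_rng(seed)
        lo, hi = pixels.min(), pixels.max()
        pos = rng.uniform(lo, hi, size=(n_particles, c))
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        pbest_val = np.array([fcm_objective(pixels, p)[0] for p in pos])
        gbest = pbest[pbest_val.argmin()].copy()

        w, c1, c2 = 0.7, 1.5, 1.5      # illustrative PSO constants
        for _ in range(iters):
            r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, lo, hi)
            vals = np.array([fcm_objective(pixels, p)[0] for p in pos])
            better = vals < pbest_val
            pbest[better], pbest_val[better] = pos[better], vals[better]
            gbest = pbest[pbest_val.argmin()].copy()

        _, u = fcm_objective(pixels, gbest)
        return gbest, u.argmax(axis=1)     # centres and hard pixel labels
    ```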

    Spike Clustering and Neuron Tracking over Successive Time Windows

    This paper introduces a new methodology for tracking signals from individual neurons over time in multiunit extracellular recordings. The core of our strategy relies upon an extension of a traditional mixture model approach, with parameter optimization via expectation-maximization (EM), to incorporate clustering results from the preceding time period in a Bayesian manner. EM initialization is also achieved by utilizing these prior clustering results. After clustering, we match the current and prior clusters to track persisting neurons. Applications of this spike sorting method to recordings from macaque parietal cortex show that it provides significantly more consistent clustering and tracking results
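
    A minimal sketch of the windowed workflow described above, assuming spike waveforms are already reduced to feature vectors: EM in each window is warm-started from the previous window's mixture parameters (a stand-in for the Bayesian carry-over of prior clustering results), and current clusters are matched to prior ones by their means with the Hungarian algorithm so that persisting neurons keep a stable identity.

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment
    from scipy.spatial.distance import cdist
    from sklearn.mixture import GaussianMixture

    def fit_window(spike_features, k, prev_gmm=None, seed=0):
        """Fit a k-component Gaussian mixture to one time window of spike
        features, seeding EM with the previous window's means and weights
        when a prior model is available."""
        kwargs = dict(n_components=k, covariance_type="full", random_state=seed)
        if prev_gmm is not None:
            kwargs.update(means_init=prev_gmm.means_,
                          weights_init=prev_gmm.weights_)
        return GaussianMixture(**kwargs).fit(spike_features)

    def match_clusters(prev_gmm, curr_gmm):
        """Pair current clusters with the previous window's clusters by
        minimising the total distance between their means, so labels can
        be propagated to track persisting neurons."""
        cost = cdist(prev_gmm.means_, curr_gmm.means_)
        prev_idx, curr_idx = linear_sum_assignment(cost)
        return {int(c): int(p) for p, c in zip(prev_idx, curr_idx)}
    ```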

    Non-Parametric Probabilistic Image Segmentation

    We propose a simple probabilistic generative model for image segmentation. Like other probabilistic algorithms (such as EM on a mixture of Gaussians), the proposed model is principled, provides both hard and probabilistic cluster assignments, and can naturally incorporate prior knowledge. While previous probabilistic approaches are restricted to parametric models of clusters (e.g., Gaussians), we eliminate this limitation. The suggested approach does not make heavy assumptions on the shape of the clusters and can thus handle complex structures. Our experiments show that the suggested approach outperforms previous work on a variety of image segmentation tasks
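
    For contrast with the non-parametric model proposed above, here is a small sketch of the parametric baseline the abstract mentions, EM on a mixture of Gaussians over per-pixel colour features, which already yields both hard labels and soft (probabilistic) assignments; the feature choice and component count are illustrative.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def gmm_segment(image, n_segments=4, seed=0):
        """Parametric baseline: treat each pixel's colour as a sample, fit
        a Gaussian mixture by EM, and return hard labels plus per-pixel
        soft assignments. `image` is an (H, W, C) array."""
        h, w, c = image.shape
        feats = image.reshape(-1, c).astype(float)
        gmm = GaussianMixture(n_components=n_segments,
                              random_state=seed).fit(feats)
        soft = gmm.predict_proba(feats).reshape(h, w, n_segments)
        return soft.argmax(axis=-1), soft
    ```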

    A robust automatic clustering scheme for image segmentation using wavelets


    Automated Global Feature Analyzer - A Driver for Tier-Scalable Reconnaissance

    For the purposes of space flight, reconnaissance field geologists have trained to become astronauts. However, the initial forays to Mars and other planetary bodies have been done by purely robotic craft. Therefore, training and equipping a robotic craft with the sensory and cognitive capabilities of a field geologist to form a science craft is a necessary prerequisite. Numerous steps are necessary in order for a science craft to be able to map, analyze, and characterize a geologic field site, as well as effectively formulate working hypotheses. We report on the continued development of the integrated software system AGFA: Automated Global Feature Analyzer®, originated by Fink at Caltech and his collaborators in 2001. AGFA is an automatic and feature-driven target characterization system that operates in an imaged operational area, such as a geologic field site on a remote planetary surface. AGFA performs automated target identification and detection through segmentation, providing for feature extraction, classification, and prioritization within mapped or imaged operational areas at different length scales and resolutions, depending on the vantage point (e.g., spaceborne, airborne, or ground). AGFA extracts features such as target size, color, albedo, vesicularity, and angularity. Based on the extracted features, AGFA summarizes the mapped operational area numerically and flags targets of "interest", i.e., targets that exhibit sufficient anomaly within the feature space. AGFA enables automated science analysis aboard robotic spacecraft, and, embedded in tier-scalable reconnaissance mission architectures, is a driver of future intelligent and autonomous robotic planetary exploration
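
    A toy sketch of the pipeline the abstract describes, i.e. segmentation followed by per-target feature extraction and anomaly-based flagging of targets of "interest"; the two features (size and mean intensity as a crude albedo proxy) and the z-score rule are illustrative assumptions, not AGFA's actual implementation.

    ```python
    import numpy as np

    def extract_features(labels, image):
        """Per-target features from a labelled segmentation: pixel count
        (size) and mean intensity (a crude albedo proxy). `labels` is an
        integer map over the image, with 0 meaning background."""
        feats = []
        for t in range(1, labels.max() + 1):
            mask = labels == t
            feats.append([mask.sum(), image[mask].mean()])
        return np.asarray(feats, dtype=float)

    def flag_interesting(features, z_thresh=2.0):
        """Flag targets whose feature vector is anomalous within the field:
        any standardised feature more than z_thresh standard deviations
        from the mean over all targets. Returns 1-based target ids."""
        z = (features - features.mean(axis=0)) / (features.std(axis=0) + 1e-12)
        return np.where(np.any(np.abs(z) > z_thresh, axis=1))[0] + 1
    ```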