
    Clustering of loose groups and galaxies from the Perseus--Pisces Survey

    We investigate the clustering properties of loose groups in the Perseus--Pisces redshift Survey (PPS). Previous analyses based on the CfA and SSRS surveys led to apparently contradictory results. We investigate the source of such discrepancies, finding satisfactory explanations for them. Furthermore, we find a definite signal of group clustering, whose amplitude $A_G$ exceeds the amplitude $A_g$ of galaxy clustering ($A_G = 14.5^{+3.8}_{-3.0}$, $A_g = 7.42^{+0.20}_{-0.19}$ for the most significant case; distances are measured in $h^{-1}$ Mpc). Groups are identified with the adaptive Friends--Of--Friends (FOF) algorithms HG (Huchra \& Geller 1982) and NW (Nolthenius \& White 1987), systematically varying all search parameters. Correlation strength is especially sensitive to the sky link $D_L$ (increasing for stricter normalization $D_0$) and to the depth $m_{\rm lim}$ of the galaxy data. It is only moderately dependent on the galaxy luminosity function $\phi(L)$, while it is almost insensitive to the redshift link $V_L$ (both to the normalization $V_0$ and to the scaling recipes HG or NW). Comment: 28 pages (LaTeX aasms4 style) + 5 Postscript figures; ApJ submitted on May 4th, 1996; group catalogs available upon request ([email protected])
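    As an illustration of the friends-of-friends linking the abstract refers to, the following is a minimal sketch assuming a fixed sky link D0 and redshift link V0; the adaptive HG/NW scalings and the luminosity-function weighting are deliberately omitted, and all names and the data layout are illustrative rather than taken from the paper.

        import numpy as np

        def fof_groups(ra, dec, cz, D0, V0, H0=100.0):
            """Toy friends-of-friends grouping with a fixed sky link D0 (Mpc/h)
            and redshift link V0 (km/s); adaptive HG/NW scaling is omitted."""
            ra, dec, cz = map(np.asarray, (ra, dec, cz))
            n = len(ra)
            d = cz / H0                     # rough low-redshift distance estimate
            parent = list(range(n))         # union-find forest over galaxies

            def find(i):
                while parent[i] != i:
                    parent[i] = parent[parent[i]]
                    i = parent[i]
                return i

            def union(i, j):
                parent[find(i)] = find(j)

            ra_r, dec_r = np.radians(ra), np.radians(dec)
            for i in range(n):
                for j in range(i + 1, n):
                    # Angular separation between galaxies i and j.
                    cos_t = (np.sin(dec_r[i]) * np.sin(dec_r[j]) +
                             np.cos(dec_r[i]) * np.cos(dec_r[j]) *
                             np.cos(ra_r[i] - ra_r[j]))
                    theta = np.arccos(np.clip(cos_t, -1.0, 1.0))
                    # Projected (sky) and line-of-sight (redshift) separations.
                    D_perp = 0.5 * (d[i] + d[j]) * theta
                    V_par = abs(cz[i] - cz[j])
                    if D_perp <= D0 and V_par <= V0:
                        union(i, j)          # i and j are "friends": merge groups

            return np.array([find(i) for i in range(n)])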

    Adaptive Unified Differential Evolution for Clustering

    Various clustering methods for obtaining optimal information continue to evolve; one such development is the Evolutionary Algorithm (EA). Adaptive Unified Differential Evolution (AuDE) is a development of Differential Evolution (DE), one of the EA techniques. AuDE has self-adaptive control parameters for the scale factor (F) and crossover rate (Cr). It also has a single mutation strategy that represents the most commonly used standard mutation strategies from previous studies. The AuDE clustering method was tested on 4 datasets. The Silhouette Index and CS Measure are the fitness functions used to measure the quality of the clustering results. The quality of the AuDE clustering results is then compared against the quality of the clustering results obtained with the DE method. The results show that the AuDE mutation strategy can expand the search for cluster centers produced by DE, so that better clustering quality can be obtained. The quality ratio of AuDE to DE is 1:0.816 using the Silhouette Index, whereas using the CS Measure the ratio is 0.565:1. The execution time required by AuDE is better, though not significantly so: the ratio is 0.99:1 using the Silhouette Index, whereas using the CS Measure the ratio is 0.184:1.
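    The following is a minimal sketch of differential-evolution-based clustering of the kind described above, where each individual encodes a set of cluster centers and F and Cr self-adapt per individual (a common jDE-style rule is used here as an assumption; the actual AuDE mutation strategy is not reproduced, and the fitness is the within-cluster sum of squared distances rather than the Silhouette Index or CS Measure).

        import numpy as np

        def de_cluster(X, k, pop_size=20, gens=100, seed=0):
            """Toy DE clustering: each individual encodes k cluster centres;
            F and Cr self-adapt per individual (jDE-style rule, an assumption)."""
            rng = np.random.default_rng(seed)
            n, d = X.shape
            lo, hi = X.min(axis=0), X.max(axis=0)
            pop = rng.uniform(lo, hi, size=(pop_size, k, d))
            F = np.full(pop_size, 0.5)
            Cr = np.full(pop_size, 0.9)

            def fitness(centres):
                # Within-cluster sum of squared distances (lower is better).
                dist = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
                return dist.min(axis=1).sum()

            fit = np.array([fitness(p) for p in pop])
            for _ in range(gens):
                for i in range(pop_size):
                    # Occasionally resample the control parameters (self-adaptation).
                    Fi = rng.uniform(0.1, 1.0) if rng.random() < 0.1 else F[i]
                    Cri = rng.random() if rng.random() < 0.1 else Cr[i]
                    a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                         size=3, replace=False)
                    mutant = pop[a] + Fi * (pop[b] - pop[c])    # DE/rand/1 mutation
                    mask = rng.random((k, d)) < Cri             # binomial crossover
                    trial = np.where(mask, mutant, pop[i])
                    f_trial = fitness(trial)
                    if f_trial <= fit[i]:                       # greedy selection
                        pop[i], fit[i], F[i], Cr[i] = trial, f_trial, Fi, Cri
            best = pop[np.argmin(fit)]
            labels = ((X[:, None, :] - best[None, :, :]) ** 2).sum(-1).argmin(axis=1)
            return best, labels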

    AMD-DBSCAN: An Adaptive Multi-density DBSCAN for datasets of extremely variable density

    DBSCAN has been widely used among density-based clustering algorithms. However, with the increasing demand for multi-density clustering, the traditional DBSCAN cannot achieve good clustering results on multi-density datasets. To address this problem, an adaptive multi-density DBSCAN algorithm (AMD-DBSCAN) is proposed in this paper. An improved parameter adaptation method is proposed in AMD-DBSCAN to search for multiple parameter pairs (i.e., Eps and MinPts), which are the key parameters that determine the clustering results and performance, thereby allowing the model to be applied to multi-density datasets. Moreover, only one hyperparameter is required for AMD-DBSCAN, avoiding complicated repetitive initialization operations. Furthermore, the variance of the number of neighbors (VNN) is proposed to measure the difference in density between clusters. The experimental results show that AMD-DBSCAN reduces execution time by an average of 75% due to its lower algorithmic complexity compared with the traditional adaptive algorithm. In addition, AMD-DBSCAN improves accuracy by 24.7% on average over the state-of-the-art design on multi-density datasets of extremely variable density, while having no performance loss in single-density scenarios. Our code and datasets are available at https://github.com/AlexandreWANG915/AMD-DBSCAN. Comment: Accepted at DSAA202
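    The following is a minimal sketch of the general multi-density idea only (estimate Eps from a k-distance statistic, cluster, then re-run on the remaining noise at a lower density), not the AMD-DBSCAN parameter-search procedure or the VNN measure from the paper; it assumes scikit-learn is available.

        import numpy as np
        from sklearn.cluster import DBSCAN
        from sklearn.neighbors import NearestNeighbors

        def adaptive_dbscan(X, k=4):
            """Toy multi-pass DBSCAN: estimate Eps from the k-distance values,
            cluster, then retry the points left as noise with a new estimate."""
            labels = np.full(len(X), -1)
            remaining = np.arange(len(X))
            next_label = 0
            for _ in range(2):                      # two density levels for brevity
                if len(remaining) <= k:
                    break
                pts = X[remaining]
                # k-distance: distance from each point to its k-th nearest neighbour.
                nn = NearestNeighbors(n_neighbors=k + 1).fit(pts)
                kdist = nn.kneighbors(pts)[0][:, -1]
                eps = float(np.median(kdist))       # simple adaptive Eps estimate
                sub = DBSCAN(eps=eps, min_samples=k).fit_predict(pts)
                for c in set(sub) - {-1}:
                    labels[remaining[sub == c]] = next_label
                    next_label += 1
                remaining = remaining[sub == -1]    # retry the noise at lower density
            return labels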

    Fuzzy adaptive resonance theory: Applications and extensions

    Adaptive Resonance Theory (ART) is a powerful clustering tool for learning arbitrary patterns in a self-organizing manner. In this research, two papers are presented that examine the extensibility and applications of ART. The first paper examines a means to boost ART performance by assigning each cluster its own vigilance value, instead of a single value for the whole ART module. A Particle Swarm Optimization technique is used to search for desirable vigilance values. The second paper shows how ART, and clustering in general, can be a useful tool for preprocessing time series data. Clustering quantization attempts to group data meaningfully for preprocessing purposes, and improves results over the absence of quantization with statistical significance. --Abstract, page iv
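    The following is a minimal sketch of Fuzzy ART with a per-category vigilance value, the mechanism the first paper builds on; the PSO search over vigilances is not reproduced (all categories start from the same value here), and inputs are assumed to be scaled to [0, 1].

        import numpy as np

        def fuzzy_art(X, rho_init=0.75, alpha=0.001, beta=1.0):
            """Toy Fuzzy ART with one vigilance value per category (rhos list)."""
            I = np.hstack([X, 1.0 - X])                 # complement coding
            weights, rhos, labels = [], [], []
            for x in I:
                # Rank existing categories by the choice (activation) function.
                order = sorted(range(len(weights)),
                               key=lambda j: -np.minimum(x, weights[j]).sum()
                                             / (alpha + weights[j].sum()))
                chosen = -1
                for j in order:                          # vigilance (match) test
                    match = np.minimum(x, weights[j]).sum() / x.sum()
                    if match >= rhos[j]:
                        chosen = j
                        break
                if chosen == -1:                         # no resonance: new category
                    weights.append(x.copy())
                    rhos.append(rho_init)
                    chosen = len(weights) - 1
                else:                                    # fast/slow learning update
                    weights[chosen] = (beta * np.minimum(x, weights[chosen])
                                       + (1 - beta) * weights[chosen])
                labels.append(chosen)
            return np.array(labels), weights, rhos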

    A New Adaptive Elastic Net Method for Cluster Analysis

    Clustering is inherently a highly challenging research problem. The elastic net algorithm was initially designed to solve the traveling salesman problem, but has since been verified to be an efficient tool for data clustering in n-dimensional space. In this paper, by introducing a nearest-neighbor learning method and a local-search-preferred strategy, we propose a new self-organizing neural network (NN) approach, called the Adaptive Clustering Elastic Net (ACEN), to solve cluster analysis problems. ACEN consists of an adaptive clustering elastic net phase and a local-search-preferred phase. The first phase is used to find a cyclic permutation of the points that minimizes the total distance between adjacent points, and adopts the Euclidean distance as the criterion to assign each point. The local-search-preferred phase aims to minimize the total dissimilarity within each cluster. Simulations were performed on a large number of homogeneous and nonhomogeneous artificial clusters in n dimensions and on a set of standard problems publicly available from UCI. Simulation results show that, compared with classical partitional clustering methods, ACEN provides better clustering solutions and does so more efficiently.
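    The following is a minimal sketch of the classical Durbin--Willshaw elastic net update that ACEN builds on, with a hard nearest-node assignment at the end; the nearest-neighbor learning and local-search-preferred phases of ACEN itself are not reproduced, and 2-D data are assumed for the ring initialization.

        import numpy as np

        def elastic_net_ring(X, m, iters=200, K0=0.2, decay=0.99,
                             alpha=0.2, beta=2.0):
            """Classical elastic net: m ring nodes are pulled towards data points
            and kept close to their ring neighbours while temperature K anneals."""
            centre = X.mean(axis=0)
            # Initialise the ring as a small circle around the data centroid (2-D).
            t = np.linspace(0, 2 * np.pi, m, endpoint=False)
            Y = centre + 0.1 * X.std() * np.c_[np.cos(t), np.sin(t)]
            K = K0
            for _ in range(iters):
                d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
                # Soft assignment of each point to ring nodes at temperature K.
                w = np.exp(-d2 / (2 * K * K))
                w /= w.sum(axis=1, keepdims=True) + 1e-12
                attraction = (w[:, :, None] * (X[:, None, :] - Y[None, :, :])).sum(0)
                tension = np.roll(Y, 1, axis=0) - 2 * Y + np.roll(Y, -1, axis=0)
                Y += alpha * attraction + beta * K * tension
                K *= decay                       # anneal the temperature
            # Hard assignment: each point goes to its nearest ring node.
            labels = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1).argmin(axis=1)
            return Y, labels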

    Planning as Optimization: Dynamically Discovering Optimal Configurations for Runtime Situations

    The large number of possible configurations of modern software-based systems, combined with the large number of possible environmental situations of such systems, prohibits enumerating all adaptation options at design time and necessitates planning at runtime to dynamically identify an appropriate configuration for a situation. While numerous planning techniques exist, they typically assume a detailed state-based model of the system and that the situations that warrant adaptations are known. Both of these assumptions can be violated in complex, real-world systems. As a result, adaptation planning must rely on simple models that capture what can be changed (input parameters) and observed in the system and environment (output and context parameters). We therefore propose planning as optimization: the use of optimization strategies to discover optimal system configurations at runtime for each distinct situation that is also dynamically identified at runtime. We apply our approach to CrowdNav, an open-source traffic routing system with the characteristics of a real-world system. We identify situations via clustering and conduct an empirical study that compares Bayesian optimization and two types of evolutionary optimization (NSGA-II and novelty search) in CrowdNav.
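    The following is a minimal sketch of the overall loop the abstract describes (cluster observed context parameters into situations, then search the input parameters per situation); plain random search stands in for the Bayesian and evolutionary optimizers compared in the paper, and the evaluate(config, situation) callback, bounds format, and other names are assumptions for illustration only.

        import numpy as np
        from sklearn.cluster import KMeans

        def plan_as_optimization(contexts, evaluate, bounds,
                                 n_situations=3, budget=50, seed=0):
            """Toy planning-as-optimization loop: one configuration per situation.
            `contexts` holds observed context parameters, `bounds` is a list of
            (low, high) pairs for the input parameters, `evaluate` returns a cost."""
            rng = np.random.default_rng(seed)
            km = KMeans(n_clusters=n_situations, n_init=10, random_state=seed)
            km.fit(contexts)                         # situation = context cluster
            lo, hi = np.array(bounds).T
            best = {}
            for s in range(n_situations):
                centre = km.cluster_centers_[s]
                best_cfg, best_cost = None, np.inf
                for _ in range(budget):
                    cfg = rng.uniform(lo, hi)        # candidate configuration
                    cost = evaluate(cfg, centre)     # assumed system/simulator callback
                    if cost < best_cost:
                        best_cfg, best_cost = cfg, cost
                best[s] = (best_cfg, best_cost)
            return km, best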