
    A Novel Normalized-Cut Solver with Nearest Neighbor Hierarchical Initialization

    Normalized-Cut (N-Cut) is a famous model of spectral clustering. Traditional N-Cut solvers are two-stage: 1) calculating the continuous spectral embedding of the normalized Laplacian matrix; 2) discretization via $K$-means or spectral rotation. However, this paradigm brings two vital problems: 1) two-stage methods solve a relaxed version of the original problem, so they cannot obtain good solutions for the original N-Cut problem; 2) solving the relaxed problem requires eigenvalue decomposition, which has $\mathcal{O}(n^3)$ time complexity ($n$ is the number of nodes). To address these problems, we propose a novel N-Cut solver designed based on the famous coordinate descent method. Since the vanilla coordinate descent method also has $\mathcal{O}(n^3)$ time complexity, we design various accelerating strategies to reduce the time complexity to $\mathcal{O}(|E|)$ ($|E|$ is the number of edges). To avoid reliance on random initialization, which brings uncertainty to clustering, we propose an efficient initialization method that gives deterministic outputs. Extensive experiments on several benchmark datasets demonstrate that the proposed solver obtains larger objective values of N-Cut while achieving better clustering performance than traditional solvers.
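    For context, the two-stage baseline that this abstract argues against can be sketched in a few lines of Python. The sketch below is illustrative only: it assumes a dense affinity matrix W, uses off-the-shelf eigendecomposition and k-means, and is not the proposed coordinate-descent solver; the function name two_stage_ncut is hypothetical.

```python
# Minimal sketch of the traditional two-stage N-Cut pipeline
# (spectral embedding + heuristic k-means discretization).
import numpy as np
from scipy.linalg import eigh
from sklearn.cluster import KMeans

def two_stage_ncut(W, k):
    """Cluster an n x n symmetric, nonnegative affinity matrix W into k groups."""
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    # Normalized Laplacian: L = I - D^{-1/2} W D^{-1/2}
    L = np.eye(W.shape[0]) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    # Stage 1: continuous spectral embedding via eigendecomposition (O(n^3))
    _, U = eigh(L, subset_by_index=[0, k - 1])  # k smallest eigenvectors
    # Stage 2: heuristic discretization by k-means on the normalized rows of U
    rows = U / np.maximum(np.linalg.norm(U, axis=1, keepdims=True), 1e-12)
    return KMeans(n_clusters=k, n_init=10).fit_predict(rows)
```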

    Discretize Relaxed Solution of Spectral Clustering via a Non-Heuristic Algorithm

    Spectral clustering and its extensions usually consist of two steps: (1) constructing a graph and computing the relaxed solution; (2) discretizing the relaxed solution. Although the former has been extensively investigated, the discretization techniques are mainly heuristic methods, e.g., k-means and spectral rotation. Unfortunately, the goal of the existing methods is not to find a discrete solution that minimizes the original objective. In other words, their primary drawback is the neglect of the original objective when computing the discrete solution. Inspired by first-order optimization algorithms, we propose to develop a first-order term to bridge the original problem and the discretization algorithm, which, to the best of our knowledge, is the first non-heuristic method. Since the non-heuristic method is aware of the original graph cut problem, the final discrete solution is more reliable and achieves a preferable loss value. We also show theoretically that the continuous optimum is beneficial to discretization algorithms, even though simply finding its closest discrete solution, an existing heuristic, is unreliable. Extensive experiments demonstrate the superiority of our method.
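    To make the heuristic being criticized concrete, the sketch below shows the common spectral-rotation rounding scheme: alternately rotate the relaxed solution and snap it to the nearest cluster-indicator matrix. This is the existing baseline the abstract refers to, not the paper's objective-aware discretization; all names and parameters are illustrative.

```python
# Spectral rotation: heuristically round a relaxed spectral solution F (n x k,
# orthonormal columns) to discrete cluster labels.
import numpy as np

def spectral_rotation(F, n_iter=30):
    n, k = F.shape
    R = np.eye(k)  # orthogonal rotation of the embedding
    for _ in range(n_iter):
        # Discretize: each point takes the largest entry of its rotated row
        labels = np.argmax(F @ R, axis=1)
        Y = np.zeros((n, k))
        Y[np.arange(n), labels] = 1.0
        # Re-fit the rotation by orthogonal Procrustes: R = argmin ||F R - Y||_F
        U, _, Vt = np.linalg.svd(F.T @ Y)
        R = U @ Vt
    return labels
```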

    Light trapping in ultrathin plasmonic solar cells

    We report on the design, fabrication, and measurement of ultrathin film a-Si:H solar cells with nanostructured plasmonic back contacts, which demonstrate enhanced short circuit current densities compared to cells having flat or randomly textured back contacts. The primary photocurrent enhancement occurs in the spectral range from 550 nm to 800 nm. We use angle-resolved photocurrent spectroscopy to confirm that the enhanced absorption is due to coupling to guided modes supported by the cell. Full-field electromagnetic simulation of the absorption in the active a-Si:H layer agrees well with the experimental results. Furthermore, the nanopatterns were fabricated via an inexpensive, scalable, and precise nanopatterning method. These results should guide the design of optimized, non-random nanostructured back reflectors for thin film solar cells.

    Hashing for Similarity Search: A Survey

    Similarity search (nearest neighbor search) is the problem of finding, in a large database, the data items whose distances to a query item are the smallest. Various methods have been developed to address this problem, and recently a lot of effort has been devoted to approximate search. In this paper, we present a survey of one of the main solutions, hashing, which has been widely studied since the pioneering work on locality sensitive hashing. We divide the hashing algorithms into two main categories: locality sensitive hashing, which designs hash functions without exploring the data distribution, and learning to hash, which learns hash functions according to the data distribution. We review them from various aspects, including hash function design, distance measures, and search schemes in the hash coding space.
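    As a concrete illustration of the first category, the sketch below implements random-hyperplane locality sensitive hashing for cosine similarity: nearby vectors (in angle) are likely to receive the same binary code, so a query only needs to be compared against items in its bucket. The class name, parameters, and data are illustrative and not taken from any particular method in the survey.

```python
# Random-hyperplane LSH: each bit is the sign of a projection onto a random hyperplane.
import numpy as np

class RandomHyperplaneLSH:
    def __init__(self, dim, n_bits=16, seed=0):
        rng = np.random.default_rng(seed)
        self.planes = rng.standard_normal((n_bits, dim))

    def hash(self, x):
        """Return an n_bits binary code; angularly close vectors tend to collide."""
        return tuple((self.planes @ x > 0).astype(np.int8))

# Usage: bucket database items by code, then search only within the query's bucket.
db = np.random.default_rng(1).standard_normal((1000, 64))
lsh = RandomHyperplaneLSH(dim=64)
buckets = {}
for i, v in enumerate(db):
    buckets.setdefault(lsh.hash(v), []).append(i)
query = db[0] + 0.01 * np.random.default_rng(2).standard_normal(64)
candidates = buckets.get(lsh.hash(query), [])
```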