
    Coherent optical control of polarization with a critical metasurface

    We describe the mechanism by which a metamaterial surface can act as an ideal phase-controlled rotatable linear polarizer. With equal-power linearly polarized beams incident on each side of the surface, varying the relative phase rotates the polarization angles of the output beams while keeping the polarization exactly linear. The explanation is based on coupled-mode theory and the idea of coherent perfect absorption into auxiliary polarization channels. The polarization-rotating behavior occurs at a critical point of the coupled-mode theory, which can be associated with the exceptional point of a parity-time (PT) symmetric effective Hamiltonian.
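    For readers unfamiliar with exceptional points, a minimal two-mode PT-symmetric model illustrates the kind of critical behavior referred to here; the matrix and parameters below (mode frequency \(\omega_0\), gain/loss rate \(\gamma\), coupling \(\kappa\)) are our own illustrative notation, not taken from the paper:
    \[
    H_{\mathrm{eff}} = \begin{pmatrix} \omega_0 + i\gamma & \kappa \\ \kappa & \omega_0 - i\gamma \end{pmatrix},
    \qquad
    \lambda_{\pm} = \omega_0 \pm \sqrt{\kappa^2 - \gamma^2}.
    \]
    The eigenvalues (and eigenvectors) coalesce at \(\kappa = \gamma\), the exceptional point; the critical point of the coupled-mode theory mentioned in the abstract is of this type.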

    Top-N Recommender System via Matrix Completion

    Top-N recommender systems have been investigated widely in both industry and academia, yet the recommendation quality is still far from satisfactory. In this paper, we propose a simple yet promising algorithm: we fill in the user-item matrix under a low-rank assumption while keeping the originally observed entries unchanged. To do so, a nonconvex rank relaxation, rather than the nuclear norm, is adopted to provide a better rank approximation, and an efficient optimization strategy is designed. A comprehensive set of experiments on real datasets demonstrates that our method pushes the accuracy of Top-N recommendation to a new level.
    Comment: AAAI 201
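    As a rough illustration of this kind of approach (the penalty, the update rule, and all names below are our own assumptions, not the paper's algorithm), matrix completion with a nonconvex log-style singular-value penalty can be sketched as follows:

```python
import numpy as np

def complete_matrix(R, observed_mask, gamma=1.0, lam=0.1, n_iters=50):
    """Fill a user-item matrix under a low-rank prior (illustrative sketch).

    A nonconvex log surrogate of the rank, sum_i log(1 + sigma_i / gamma),
    is reduced by repeated weighted singular-value shrinkage while the
    observed entries are kept fixed.
    """
    X = R.copy()
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        # Small singular values get larger weights than under the nuclear
        # norm, driving the spectrum toward genuinely low rank.
        weights = 1.0 / (s + gamma)
        s_shrunk = np.maximum(s - lam * weights, 0.0)
        X = (U * s_shrunk) @ Vt
        # Keep the original (observed) ratings unchanged.
        X[observed_mask] = R[observed_mask]
    return X

# Example: recommend the top-5 unobserved items per user by the filled scores.
rng = np.random.default_rng(0)
R = rng.random((20, 30)) * (rng.random((20, 30)) > 0.7)
mask = R > 0
scores = complete_matrix(R, mask)
top5 = np.argsort(np.where(mask, -np.inf, scores), axis=1)[:, ::-1][:, :5]
```

    Compared with plain nuclear-norm shrinkage, the weight 1/(σ + γ) penalizes small singular values more aggressively, which is the sense in which such nonconvex relaxations approximate the rank more closely.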

    Twin Learning for Similarity and Clustering: A Unified Kernel Approach

    Many similarity-based clustering methods work in two separate steps: similarity matrix computation and subsequent spectral clustering. However, similarity measurement is challenging because it is usually affected by many factors, e.g., the choice of similarity metric, neighborhood size, scale of data, noise, and outliers. As a result, the learned similarity matrix is often not suitable, let alone optimal, for the subsequent clustering. In addition, nonlinear similarity structure exists in much real-world data but has not been effectively exploited by most existing methods. To tackle these two challenges, we propose a model that simultaneously learns the cluster indicator matrix and the similarity information in kernel spaces in a principled way. We show theoretical relationships to kernel k-means, k-means, and spectral clustering methods. Then, to address the practical issue of how to select the most suitable kernel for a particular clustering task, we further extend our model with a multiple kernel learning ability. With this joint model, we can automatically accomplish three subtasks: finding the best cluster indicator matrix, the most accurate similarity relations, and the optimal combination of multiple kernels. By leveraging the interactions between these three subtasks in a joint framework, each subtask can be iteratively boosted using the results of the others toward an overall optimal solution. Extensive experiments are performed to demonstrate the effectiveness of our method.
    Comment: Published in AAAI 201
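    A schematic stand-in for this kind of alternating scheme (the specific updates, kernels, and regularizer below are our own placeholder choices, not the model from the paper) could look like:

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.cluster import KMeans

def twin_learn(X, n_clusters, gammas=(0.1, 1.0, 10.0), n_iters=10, reg=1.0):
    """Alternate between learning an affinity in kernel space and
    extracting a cluster indicator from it (illustrative only)."""
    n = X.shape[0]
    kernels = [rbf_kernel(X, gamma=g) for g in gammas]
    w = np.ones(len(kernels)) / len(kernels)            # kernel weights
    for _ in range(n_iters):
        K = sum(wi * Ki for wi, Ki in zip(w, kernels))  # combined kernel
        # Affinity update: ridge-regularized self-expression in kernel
        # space, S = (K + reg*I)^{-1} K, then symmetrized.
        S = np.linalg.solve(K + reg * np.eye(n), K)
        S = 0.5 * (np.abs(S) + np.abs(S.T))
        # Indicator update: spectral embedding of the learned affinity.
        L = np.diag(S.sum(axis=1)) - S
        _, vecs = np.linalg.eigh(L)
        F = vecs[:, :n_clusters]
        # Kernel-weight update: favor kernels that align with S.
        align = np.array([np.sum(Ki * S) for Ki in kernels])
        w = align / align.sum()
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(F)
    return labels, S, w
```

    The point of the sketch is only the interaction structure: the affinity, the indicator, and the kernel weights each get updated from the current values of the other two, which is how a joint model lets the subtasks boost one another.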

    The Gaussian Multiple Access Diamond Channel

    In this paper, we study the capacity of the diamond channel. We focus on the special case where the channels between the source node and the two relay nodes are two separate links with finite capacities and the link from the two relay nodes to the destination node is a Gaussian multiple access channel. We call this model the Gaussian multiple access diamond channel. We first propose an upper bound on the capacity. This upper bound is a single-letterization of an n-letter upper bound proposed by Traskov and Kramer, and is tighter than the cut-set bound. As for the lower bound, we propose an achievability scheme based on sending correlated codes through the multiple access channel with a superposition structure. We then specialize this achievable rate to the Gaussian multiple access diamond channel. Noting the similarity between the upper and lower bounds, we provide necessary and sufficient conditions under which the proposed upper and lower bounds meet. Thus, for a Gaussian multiple access diamond channel that satisfies these conditions, we have found its capacity.
    Comment: submitted to IEEE Transactions on Information Theory
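    For reference, the cut-set bound that the paper's upper bound improves on can be written, in our own notation (finite link capacities \(C_1, C_2\), relay powers \(P_1, P_2\), unit noise variance at the destination; these normalizations are our assumptions), as
    \[
    C \le \min\Big\{\, C_1 + C_2,\;
    C_1 + \tfrac{1}{2}\log(1 + P_2),\;
    C_2 + \tfrac{1}{2}\log(1 + P_1),\;
    \tfrac{1}{2}\log\big(1 + P_1 + P_2 + 2\sqrt{P_1 P_2}\big) \,\Big\}.
    \]
    The four terms correspond to the four cuts of the diamond network; the last cut allows full relay cooperation, which is exactly the slack a tighter upper bound and a correlated-coding lower bound aim to close.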

    Unified Spectral Clustering with Optimal Graph

    Spectral clustering has found extensive use in many areas. Most traditional spectral clustering algorithms work in three separate steps: similarity graph construction; continuous label learning; and discretization of the learned labels by k-means clustering. This common practice has two potential flaws that may lead to severe information loss and performance degradation. First, the predefined similarity graph might not be optimal for subsequent clustering, and it is well accepted that the similarity graph strongly affects the clustering results. To this end, we propose to automatically learn similarity information from data while enforcing the constraint that the similarity matrix has exactly c connected components when there are c clusters. Second, the discrete solution may deviate from the spectral solution, since the k-means method is well known to be sensitive to the initialization of cluster centers. In this work, we transform the candidate solution into a new one that better approximates the discrete solution. Finally, these three subtasks are integrated into a unified framework, with each subtask iteratively boosted by using the results of the others toward an overall optimal solution. It is known that the performance of a kernel method is largely determined by the choice of kernel. To tackle the practical problem of how to select the most suitable kernel for a particular data set, we further extend our model to incorporate multiple kernel learning ability. Extensive experiments demonstrate the superiority of our proposed method compared to existing clustering approaches.
    Comment: Accepted by AAAI 201
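    The exact-c-connected-components constraint is commonly encoded through the graph Laplacian; in our notation (not necessarily the paper's), for a nonnegative similarity matrix S on n points,
    \[
    L_S = D - \frac{S + S^{\top}}{2}, \qquad D_{ii} = \sum_{j} \frac{s_{ij} + s_{ji}}{2},
    \]
    and the graph of S has exactly c connected components if and only if \(\operatorname{rank}(L_S) = n - c\), i.e. the c smallest eigenvalues of \(L_S\) are zero while the (c+1)-th is strictly positive. Enforcing this rank condition while learning S is what lets the learned graph be partitioned directly into c clusters.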

    LogDet Rank Minimization with Application to Subspace Clustering

    A low-rank matrix representation is desired in many machine learning and computer vision problems. Most recent studies use the nuclear norm as a convex surrogate of the rank operator. However, the nuclear norm simply adds all singular values together, and thus the rank may not be well approximated in practical problems. In this paper, we propose to use a log-determinant (LogDet) function as a smooth and closer, though non-convex, approximation to the rank for obtaining a low-rank representation in subspace clustering. An augmented Lagrange multiplier strategy is applied to iteratively optimize the LogDet-based non-convex objective function on potentially large-scale data. By making use of the angular information of the principal directions of the resulting low-rank representation, an affinity graph matrix is constructed for spectral clustering. Experimental results on motion segmentation and face clustering data demonstrate that the proposed method often outperforms state-of-the-art subspace clustering algorithms.
    Comment: 10 pages, 4 figures
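    To see why such a surrogate tracks the rank more closely than the nuclear norm, compare the two on the singular values \(\sigma_i(X)\) (the particular LogDet variant written here is one common choice and may differ from the paper's exact definition):
    \[
    \|X\|_{*} = \sum_{i} \sigma_i(X),
    \qquad
    \log\det\!\big(I + X^{\top} X\big) = \sum_{i} \log\!\big(1 + \sigma_i(X)^2\big),
    \]
    so large singular values contribute only logarithmically rather than linearly, and the penalty behaves much more like a count of nonzero singular values, i.e. \(\operatorname{rank}(X)\).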