
    Learning Two-layer Neural Networks with Symmetric Inputs

    We give a new algorithm for learning a two-layer neural network under a general class of input distributions. Assuming there is a ground-truth two-layer network y = Aσ(Wx) + ξ, where A, W are weight matrices, ξ represents noise, and the number of neurons in the hidden layer is no larger than the input or output dimension, our algorithm is guaranteed to recover the parameters A, W of the ground-truth network. The only requirement on the input x is that it is symmetric, which still allows highly complicated and structured input. Our algorithm is based on the method-of-moments framework and extends several results in tensor decompositions. We use spectral algorithms to avoid the complicated non-convex optimization in learning neural networks. Experiments show that our algorithm can robustly learn the ground-truth neural network with a small number of samples for many symmetric input distributions.
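    To make the model in this abstract concrete, the following is a minimal sketch of the ground-truth network y = Aσ(Wx) + ξ together with one simple way to produce a symmetric input distribution (x and −x equally likely). The dimensions, the ReLU choice for σ, and the shifted-Gaussian base distribution are illustrative assumptions; the paper's method-of-moments recovery algorithm itself is not reproduced here.

```python
import numpy as np

# Minimal sketch of the ground-truth model y = A*sigma(Wx) + xi from the abstract.
# Dimensions, the ReLU choice for sigma, and the base input distribution are
# illustrative assumptions; the method-of-moments recovery algorithm is not shown.

rng = np.random.default_rng(0)
d, k, m = 10, 5, 8                      # input dim, hidden neurons (k <= d, m), output dim
W = rng.standard_normal((k, d))         # ground-truth first-layer weights
A = rng.standard_normal((m, k))         # ground-truth second-layer weights

def symmetric_inputs(n):
    """Symmetrize an arbitrary base distribution: if z is any random vector and
    s = +/-1 with equal probability, then x = s*z satisfies p(x) = p(-x)."""
    z = rng.standard_normal((n, d)) + 2.0           # arbitrary (shifted) base distribution
    s = rng.choice([-1.0, 1.0], size=(n, 1))
    return s * z

def forward(x, noise_std=0.01):
    """Evaluate y = A sigma(Wx) + xi, with sigma taken to be ReLU."""
    hidden = np.maximum(W @ x.T, 0.0)               # sigma(Wx)
    xi = noise_std * rng.standard_normal((m, x.shape[0]))
    return (A @ hidden + xi).T

X = symmetric_inputs(1000)
Y = forward(X)
print(X.shape, Y.shape)                 # (1000, 10) (1000, 8)
```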

    The Relationship between the Factors of the Ownership Structure and the Earnings Per Share

    One hundred listed companies on the Shanghai Stock Exchange were selected randomly in 2008. Two linearly independent factors, factor 1 and factor 2, were extracted by analyzing the top ten shareholders of the 100 listed companies, and the scores of the two factors were computed for each company. Each factor was divided into a larger class and a smaller class according to the order of its factor scores, and the earnings per share were likewise divided into a larger class and a smaller class according to their rank. A log-linear model was established with the two factors as the independent variables and the earnings per share as the dependent variable. Finally, a single-factor multivariate covariance analysis was performed on the three variables. Factor 1 had a significant effect on the earnings per share, but factor 2 and the interaction of factor 1 and factor 2 did not. Key words: log-linear model; covariance analysis; ownership structure; factor; earnings per share
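    As a rough illustration of the workflow described in this abstract (factor extraction, rank-based dichotomization, then a log-linear model), here is a hedged sketch on synthetic data. The `holdings` and `eps` arrays are random placeholders standing in for the 2008 shareholder and earnings data, and scikit-learn's FactorAnalysis plus a Poisson GLM are assumed stand-ins for the authors' exact procedures, not a reproduction of them.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
import statsmodels.api as sm

# Sketch of the described pipeline on synthetic placeholders:
# factor extraction -> rank-based dichotomization -> log-linear model on counts.
rng = np.random.default_rng(1)
holdings = rng.random((100, 10))        # % held by the top 10 shareholders (synthetic)
eps = rng.normal(0.3, 0.2, 100)         # earnings per share (synthetic)

# 1. Extract two factors from the ownership-structure variables.
scores = FactorAnalysis(n_components=2, random_state=0).fit_transform(holdings)

# 2. Split each factor score and EPS into "larger" / "smaller" classes by rank.
df = pd.DataFrame({
    "f1": scores[:, 0] > np.median(scores[:, 0]),
    "f2": scores[:, 1] > np.median(scores[:, 1]),
    "eps": eps > np.median(eps),
})

# 3. Fit a log-linear (Poisson) model to the resulting 2x2x2 contingency table.
counts = df.value_counts().rename("n").reset_index()
model = sm.GLM.from_formula("n ~ f1 * f2 * eps", data=counts,
                            family=sm.families.Poisson()).fit()
print(model.summary())
```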

    The Necessary and Sufficient Conditions of Separability for Multipartite Pure States

    In this paper we present the necessary and sufficient conditions of separability for multipartite pure states. These conditions are very simple, and they do not require Schmidt decomposition or tracing-out operations. We also give a necessary condition for a local unitary equivalence class for a bipartite system in terms of the determinant of the matrix of amplitudes, and we explore a variance as a measure of entanglement for multipartite pure states. Comment: Submitted to PRL in Sep. 2004, paper No. LV9637. Submitted to SIAM Journal on Computing in Jan. 2005, paper No. SICOMP 44687. Under review now.
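    The abstract's mention of the determinant of the matrix of amplitudes can be illustrated with the standard two-qubit special case: a bipartite pure state is a product state exactly when its amplitude matrix has rank 1, which for two qubits means the 2×2 determinant vanishes. The sketch below checks only this textbook bipartite criterion; it is not the paper's general multipartite condition.

```python
import numpy as np

# Two-qubit special case: |psi> = sum_{ij} c_ij |i>|j> is separable (a product
# state) exactly when the 2x2 amplitude matrix C = [c_ij] has rank 1, i.e. det C = 0.
def is_separable_two_qubit(c, tol=1e-12):
    """c is the 2x2 matrix of amplitudes of a normalized two-qubit pure state."""
    return abs(np.linalg.det(c)) < tol

product = np.outer([1.0, 0.0], [np.sqrt(0.5), np.sqrt(0.5)])   # |0> (|0>+|1>)/sqrt(2)
bell = np.array([[1.0, 0.0], [0.0, 1.0]]) / np.sqrt(2)         # (|00>+|11>)/sqrt(2)

print(is_separable_two_qubit(product))  # True  -> product state
print(is_separable_two_qubit(bell))     # False -> entangled (Bell state)
```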