    Effective Results on the Waring Problem for Finite Simple Groups

    Let G be a finite quasisimple group of Lie type. We show that there are regular semisimple elements x, y in G, with x of prime order and the order of y divisible by at most two primes, such that the product of the conjugacy classes of x and y contains all non-central elements of G. In fact, in all but four cases, y can be chosen to be of square-free order. Using this result, we prove an effective version of one of the main results of Larsen, Shalev and Tiep by showing that, given any positive integer m, if the order of a finite simple group S is at least f(m) for a specified function f, then every element of S is a product of two mth powers. Furthermore, the verbal width of the mth power word on any finite simple group S is at most g(m) for a specified function g. We also show that, given any two non-trivial words v, w, if G is a finite quasisimple group of large enough order, then v(G)w(G) contains all non-central elements of G.
    Comment: Note the title change from the previous version.
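    A hedged LaTeX restatement of the two effective Waring-type claims (notation is mine, not the paper's; f and g are the unspecified effective bounds mentioned in the abstract, and S ranges over finite simple groups):

    % Sketch only: f(m) and g(m) are the abstract's effective bounds, kept abstract here.
    \[
      |S| \ge f(m) \;\Longrightarrow\; S = \{\, x^m y^m : x, y \in S \,\},
    \]
    \[
      \langle x^m : x \in S \rangle
        = \{\, x_1^m x_2^m \cdots x_{g(m)}^m : x_i \in S \,\}
      \qquad \text{for every finite simple group } S .
    \]

    The first line is the effective product-of-two-powers statement; the second expresses that the verbal width of the mth power word is at most g(m), i.e. every element of the corresponding verbal subgroup is a product of at most g(m) mth powers.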

    Mixing and non-mixing local minima of the entropy contrast for blind source separation

    In this paper, both non-mixing and mixing local minima of the entropy are analyzed from the viewpoint of blind source separation (BSS); they correspond, respectively, to acceptable and spurious solutions of the BSS problem. The contribution of this work is twofold. First, a Taylor expansion is used to show that the \textit{exact} output entropy cost function has a non-mixing minimum when the output is proportional to \textit{any} of the non-Gaussian sources, and not only when it is proportional to the lowest-entropy source. Second, in order to prove that mixing entropy minima exist when the source densities are strongly multimodal, an entropy approximator is proposed. Its major advantage is that an error bound can be provided. Although this approximator (and the associated bound) is used here in the BSS context, it can be applied to estimate the entropy of any random variable with a multimodal density.
    Comment: 11 pages, 6 figures. To appear in IEEE Transactions on Information Theory.
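    As a minimal numerical sketch of the first point (assuming two independent uniform sources and a plain histogram entropy estimate; the code below is illustrative and not from the paper), scanning unit-norm mixtures of the sources shows the estimated output entropy attaining its minima at the non-mixing directions:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    s1 = rng.uniform(-1.0, 1.0, n)   # non-Gaussian (uniform) source 1
    s2 = rng.uniform(-1.0, 1.0, n)   # non-Gaussian (uniform) source 2

    def entropy_estimate(x, bins=200):
        """Plug-in histogram estimate of the differential entropy of x."""
        p, edges = np.histogram(x, bins=bins, density=True)
        widths = np.diff(edges)
        mask = p > 0
        return -np.sum(p[mask] * np.log(p[mask]) * widths[mask])

    # Unit-norm mixtures y(t) = cos(t)*s1 + sin(t)*s2; t = 0 and t = pi/2
    # are the non-mixing directions (output proportional to one source).
    thetas = np.linspace(0.0, np.pi / 2, 91)
    H = [entropy_estimate(np.cos(t) * s1 + np.sin(t) * s2) for t in thetas]

    # The estimated entropy is smallest at the endpoints, illustrating that
    # non-mixing outputs are (local) minima of the entropy contrast.
    print(thetas[np.argmin(H)], min(H))

    (Exhibiting the mixing minima discussed in the second part would require strongly multimodal source densities, which this sketch does not attempt.)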