18 research outputs found

    Implementation of the Generalized^2 Linear^2 Models method in the Octave system

    In my thesis I implemented a procedure that reduces the resources required to process large data sets: approximating a large data matrix with a reduced-component estimate lowers the computational cost. Several solutions exist for this, but because of the diversity of data types a general solution is the G^2L^2M method, which subsumes several models. I believe the members of the information society are best served when developments are accessible to more than a restricted group, so I chose a mathematical system that has an implementation for every platform, behaves identically across them, and, not least, is freely available. The GNU Octave system satisfies all of these criteria. The implemented algorithm is also compatible with MATLAB, so users of that system can run my program as well.
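    The core idea of the abstract above, approximating a large data matrix by a reduced-component (low-rank) estimate to cut computational cost, can be sketched briefly. The snippet below is a minimal illustration using truncated SVD in Python; it is an assumed stand-in for the general idea, not the thesis's actual G^2L^2M Octave code.

```python
import numpy as np

def low_rank_approx(X, k):
    """Best rank-k approximation of X in the Frobenius norm (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :k] * s[:k] @ Vt[:k, :]

# Build a 50x40 matrix of exact rank 3; its rank-3 approximation recovers it.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))
A2 = low_rank_approx(A, 3)
err = np.linalg.norm(A - A2)
```

    Storing the two thin factors instead of the full matrix is what makes downstream computation cheaper.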

    Poisson noise reduction with non-local PCA

    Photon-limited imaging arises when the number of photons collected by a sensor array is small relative to the number of detector elements. Photon limitations are an important concern for many applications such as spectral imaging, night vision, nuclear medicine, and astronomy. Typically a Poisson distribution is used to model these observations, and the inherent heteroscedasticity of the data combined with standard noise removal methods yields significant artifacts. This paper introduces a novel denoising algorithm for photon-limited images which combines elements of dictionary learning and sparse patch-based representations of images. The method employs both an adaptation of Principal Component Analysis (PCA) for Poisson noise and recently developed sparsity-regularized convex optimization algorithms for photon-limited images. A comprehensive empirical evaluation of the proposed method helps characterize the performance of this approach relative to other state-of-the-art denoising methods. The results reveal that, despite its conceptual simplicity, Poisson PCA-based denoising appears to be highly competitive in very low light regimes. Comment: erratum: the image "man" is wrongly named "pepper" in the journal version
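    The heteroscedasticity mentioned above is easy to demonstrate: for Poisson data the variance equals the mean, so noise level grows with intensity. The sketch below shows this and applies the classical Anscombe variance-stabilizing transform, a common baseline in this literature; note that the paper itself adapts PCA to the Poisson likelihood rather than using Anscombe stabilization.

```python
import numpy as np

def anscombe(x):
    """Anscombe transform: maps Poisson(lam) data to roughly unit variance."""
    return 2.0 * np.sqrt(x + 3.0 / 8.0)

rng = np.random.default_rng(1)
lam = 50.0
samples = rng.poisson(lam, size=200_000).astype(float)

raw_var = samples.var()             # close to lam: variance scales with mean
stab_var = anscombe(samples).var()  # close to 1 after stabilization
```

    Standard Gaussian denoisers assume constant noise variance, which is exactly what raw Poisson data violates; this is why Poisson-specific methods (or stabilizing transforms) are needed.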

    Closed-form supervised dimensionality reduction with generalized linear models


    Generalized Low Rank Models

    Principal components analysis (PCA) is a well-known technique for approximating a tabular data set by a low rank matrix. Here, we extend the idea of PCA to handle arbitrary data sets consisting of numerical, Boolean, categorical, ordinal, and other data types. This framework encompasses many well known techniques in data analysis, such as nonnegative matrix factorization, matrix completion, sparse and robust PCA, k-means, k-SVD, and maximum margin matrix factorization. The method handles heterogeneous data sets, and leads to coherent schemes for compressing, denoising, and imputing missing entries across all data types simultaneously. It also admits a number of interesting interpretations of the low rank factors, which allow clustering of examples or of features. We propose several parallel algorithms for fitting generalized low rank models, and describe implementations and numerical results. Comment: 84 pages, 19 figures
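    A generalized low rank model is typically fit by alternating minimization over the two factors. The sketch below shows the simplest instance, with quadratic loss and no regularization, where each subproblem is a closed-form least-squares solve and the method reduces to PCA; other losses would handle Boolean, categorical, or ordinal columns. This is an illustrative assumption, not the paper's parallel implementation.

```python
import numpy as np

def glrm_quadratic(A, k, iters=50, seed=0):
    """Alternating least squares for min ||A - X @ Y||_F over X (m x k), Y (k x n)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    X = rng.standard_normal((m, k))
    Y = rng.standard_normal((k, n))
    for _ in range(iters):
        # Each factor update is a least-squares problem with the other fixed.
        X = np.linalg.lstsq(Y.T, A.T, rcond=None)[0].T
        Y = np.linalg.lstsq(X, A, rcond=None)[0]
    return X, Y

rng = np.random.default_rng(2)
A = rng.standard_normal((30, 4)) @ rng.standard_normal((4, 20))  # exact rank 4
X, Y = glrm_quadratic(A, 4)
resid = np.linalg.norm(A - X @ Y) / np.linalg.norm(A)
```

    Swapping the quadratic loss for, say, a hinge or logistic loss on selected columns is what turns this PCA-like scheme into a model for heterogeneous data.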

    Binary Component Decomposition. Part I: The Positive-Semidefinite Case

    This paper studies the problem of decomposing a low-rank positive-semidefinite matrix into symmetric factors with binary entries, either {±1} or {0,1}. This research answers fundamental questions about the existence and uniqueness of these decompositions. It also leads to tractable factorization algorithms that succeed under a mild deterministic condition. A companion paper addresses the related problem of decomposing a low-rank rectangular matrix into a binary factor and an unconstrained factor.
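    A toy rank-one case makes the decomposition problem concrete: if M = z z^T for a {±1} vector z, the binary factor can be read off from the sign pattern of the leading eigenvector, and z is unique only up to global sign since -z gives the same matrix. This hypothetical sketch is for intuition only and is not the paper's algorithm, which handles general low rank.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 12
z = rng.choice([-1.0, 1.0], size=n)
M = np.outer(z, z)  # positive semidefinite, rank 1, entries in {±1}

# Leading eigenvector of M is proportional to z; its signs recover the factor
# up to the unavoidable global sign ambiguity (z and -z yield the same M).
w, V = np.linalg.eigh(M)
z_hat = np.sign(V[:, -1])
ok = bool(np.allclose(np.outer(z_hat, z_hat), M))
```

    For rank greater than one the sign pattern of eigenvectors no longer suffices, which is where the paper's existence, uniqueness, and algorithmic results come in.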