63,906 research outputs found

    A Review on Non Linear Dimensionality Reduction Techniques for Face Recognition

    Principal Component Analysis (PCA) has gained much attention among researchers as a way to address the problem of high-dimensional data sets. During the last decade, non-linear variants of PCA have been used to reduce dimensionality along a non-linear hyperplane. This paper reviews the various non-linear techniques as applied to real and artificial data. It is observed that non-linear PCA outperforms its linear counterpart in most cases; however, exceptions are noted.
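    The review does not tie itself to a particular implementation, but a common non-linear variant of PCA is kernel PCA. The following minimal sketch (assuming scikit-learn; the toy data and parameters are illustrative, not from the paper) contrasts linear PCA with RBF-kernel PCA on data with non-linear structure.

    # Illustrative sketch, not from the reviewed paper: linear PCA vs. kernel PCA,
    # a common non-linear variant, on a toy dataset with non-linear structure.
    import numpy as np
    from sklearn.datasets import make_circles
    from sklearn.decomposition import PCA, KernelPCA

    # Two concentric circles: not linearly separable in the original 2-D space.
    X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

    # Linear PCA rotates and ranks axes but cannot unfold the circular structure.
    X_pca = PCA(n_components=2).fit_transform(X)

    # Kernel PCA with an RBF kernel projects onto non-linear principal components,
    # along which the two circles separate.
    X_kpca = KernelPCA(n_components=2, kernel="rbf", gamma=10.0).fit_transform(X)

    # Crude comparison: correlation of the leading component with the class label.
    for name, Z in (("linear PCA", X_pca), ("kernel PCA", X_kpca)):
        corr = np.corrcoef(Z[:, 0], y)[0, 1]
        print(f"{name}: |corr(first component, label)| = {abs(corr):.2f}")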

    Importance of Dimensionality Reduction in Image Processing

    This paper presents a survey of various compression techniques. Linear Discriminant Analysis (LDA) is a method used in statistics, pattern recognition and machine learning to find a linear combination of features that classifies an object into two or more classes; this yields a dimensionality reduction prior to later classification. Principal Component Analysis (PCA) uses an orthogonal transformation to convert a set of correlated variables into a set of values of linearly uncorrelated variables called principal components. The purpose of the review is to explore the possibility of image compression for multiple images.
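    As a concrete illustration of the two reductions described above (a minimal sketch assuming scikit-learn and its bundled Iris data; the dataset choice is an assumption, not from the paper), note that PCA is unsupervised while LDA uses the class labels:

    # Minimal sketch: PCA (unsupervised orthogonal transform to uncorrelated
    # components) vs. LDA (supervised projection that separates the classes).
    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    X, y = load_iris(return_X_y=True)   # 150 samples, 4 features, 3 classes

    X_pca = PCA(n_components=2).fit_transform(X)                            # labels ignored
    X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)  # labels used

    print("original shape:", X.shape)
    print("PCA-reduced shape:", X_pca.shape)
    print("LDA-reduced shape:", X_lda.shape)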

    Information Retrieval Performance Enhancement Using The Average Standard Estimator And The Multi-criteria Decision Weighted Set

    Information retrieval is much more challenging than traditional small-collection document retrieval. The main difference is the importance of correlations between related concepts in complex data structures. These structures have been studied by several information retrieval systems. This research began with a comprehensive review and comparison of several techniques for matrix dimensionality estimation and their respective effects on enhancing retrieval performance using singular value decomposition and latent semantic analysis. Two novel techniques were introduced in this research to enhance intrinsic dimensionality estimation: the Multi-criteria Decision Weighted model, which estimates matrix intrinsic dimensionality for large document collections, and the Average Standard Estimator (ASE), which estimates data intrinsic dimensionality based on the singular value decomposition (SVD). ASE estimates the level of significance of the singular values resulting from the decomposition. ASE assumes that variables with deep relations have sufficient correlation and that only relationships with high singular values are significant and should be maintained. Experimental results over all possible dimensions indicated that ASE improved matrix intrinsic dimensionality estimation by accounting for both the magnitude of decrease in singular values and random noise distracters. Analysis based on selected performance measures indicates that for each document collection there is a region of lower dimensionalities associated with improved retrieval performance; however, the various performance measures clearly disagreed on which model was associated with the best performance. The introduction of the multi-weighted model and Analytical Hierarchy Processing (AHP) analysis helped to rank the dimensionality estimation techniques and to satisfy the overall model goals by balancing contradicting constraints and information retrieval priorities. ASE provided the best estimate of MEDLINE intrinsic dimensionality among all other dimensionality estimation techniques, and further improved precision and relative relevance by 10.2% and 7.4%, respectively. AHP analysis indicates that ASE and the weighted model ranked best among the other methods, with 30.3% and 20.3% in satisfying overall model goals for MEDLINE and 22.6% and 25.1% for CRANFIELD. The weighted model improved MEDLINE relative relevance by 4.4%, while the scree plot, the weighted model, and ASE provided better estimates of data intrinsic dimensionality for the CRANFIELD collection than the Kaiser-Guttman and percentage-of-variance criteria. The ASE dimensionality estimation technique provided a better estimate of CISI intrinsic dimensionality than all other tested methods, since all methods except ASE tend to underestimate the intrinsic dimensionality of the CISI document collection. ASE improved CISI average relative relevance and average search length by 28.4% and 22.0%, respectively. This research provided evidence that a system using a weighted multi-criteria performance evaluation technique results in better overall performance than a single-criterion ranking model. Thus, the weighted multi-criteria model with dimensionality reduction provides a more efficient implementation for information retrieval than a full-rank model.
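    For readers unfamiliar with the SVD/LSA pipeline the abstract builds on, the sketch below shows a rank-k truncation of a term-document matrix. The tiny corpus and the 90%-energy heuristic for choosing k are illustrative assumptions only; they stand in for, and are not, the thesis's ASE or weighted multi-criteria model.

    # Illustrative LSA sketch: SVD of a TF-IDF term-document matrix and a simple
    # energy-threshold rule for picking the reduced dimensionality k.
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer

    docs = [
        "information retrieval with latent semantic analysis",
        "singular value decomposition reduces matrix rank",
        "dimensionality estimation for document collections",
        "retrieval performance depends on the chosen dimensionality",
    ]

    # Documents as rows, terms as columns.
    A = TfidfVectorizer().fit_transform(docs).toarray()

    # Thin SVD; LSA keeps only the k largest singular values/vectors.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    # One simple estimator of k: the smallest rank capturing 90% of the
    # spectrum's energy (a stand-in heuristic, not the thesis's ASE).
    energy = np.cumsum(s**2) / np.sum(s**2)
    k = int(np.searchsorted(energy, 0.90)) + 1

    # Rank-k approximation used for retrieval in the reduced latent space.
    A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
    print("singular values:", np.round(s, 3))
    print(f"chosen k = {k}, reconstruction error = {np.linalg.norm(A - A_k):.3f}")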

    “Brainland” vs. “flatland”: How many dimensions do we need in brain dynamics?: Comment on the paper “The unreasonable effectiveness of small neural ensembles in high-dimensional brain” by Alexander N. Gorban et al.

    In their review article (this issue) [1], Gorban, Makarov and Tyukin make a successful effort to show, in biological, physical and mathematical problems, how a high-dimensional brain can organise reliable and fast learning in a high-dimensional world of data using reduction tools. In fact, this paper, and several recent studies, focus on the crucial problem of how the brain manages the information it receives, how that information is organised, and how mathematics can learn from this and use dimension-related techniques in other fields. Moreover, the opposite problem is also relevant: how we can recover high-dimensional information from low-dimensional data, the problem of embedding dimensions (the other side of dimensionality reduction). The human brain is a real open problem and a great challenge to human knowledge. The way memory is encoded is a fundamental problem in neuroscience. As the authors note, the idea of the blessing of dimensionality (and the opposite curse of dimensionality) is becoming more and more relevant in machine learning.