    Big Data Dimensional Analysis

    Collecting and analyzing large amounts of data is a growing challenge within the scientific community. The growing gap between data and users calls for innovative tools that address the challenges posed by big data volume, velocity, and variety. One of the main challenges associated with big data variety is automatically understanding the underlying structures and patterns of the data. Such an understanding is a prerequisite to applying advanced analytics to the data. Further, big data sets often contain anomalies and errors that are difficult to know a priori. Current approaches to understanding data structure are drawn from traditional database ontology design. These approaches work, but often require too much human involvement to keep pace with the volume, velocity, and variety of data encountered by big data systems. Dimensional Data Analysis (DDA) is a proposed technique that allows big data analysts to quickly understand the overall structure of a big dataset and identify anomalies. DDA exploits structures that exist in a wide class of data to quickly determine the nature of the data and its statistical anomalies, and it leverages the schemas already employed in big data databases today. This paper presents DDA, applies it to a number of data sets, and measures its performance. The overhead of DDA is low, and it can be applied to existing big data systems without greatly impacting their computing requirements.

    Comment: From IEEE HPEC 201
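
    The abstract gives no implementation details, so the following is only a loose, hypothetical illustration of the kind of per-dimension profiling that DDA aims to automate: summarizing each column's structure and surfacing statistical anomalies. It is a minimal sketch in Python using pandas; the function name profile_columns, the chosen summary statistics, and the toy data are assumptions, not the paper's method.

    import pandas as pd

    def profile_columns(df: pd.DataFrame, top_k: int = 5) -> pd.DataFrame:
        """Summarize each column: dtype, cardinality, null share, and most common values."""
        rows = []
        for col in df.columns:
            counts = df[col].value_counts(dropna=True)
            rows.append({
                "column": col,
                "dtype": str(df[col].dtype),
                "unique_values": df[col].nunique(dropna=True),
                "null_fraction": df[col].isna().mean(),
                "top_values": counts.head(top_k).to_dict(),
            })
        return pd.DataFrame(rows)

    # Toy example: the out-of-range reading and the missing sensor id stand out
    # immediately in the per-column summary.
    df = pd.DataFrame({"sensor": ["a", "a", "b", "a", None],
                       "reading": [1.0, 1.1, 0.9, 99.0, 1.0]})
    print(profile_columns(df))

    A summary of this kind can be computed in a single pass per column, which is consistent with the abstract's claim that the overhead of such structural analysis can be kept low.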

    N-Dimensional Principal Component Analysis

    In this paper, we first briefly introduce multidimensional Principal Component Analysis (PCA) techniques, and then amend our previous N-dimensional PCA (ND-PCA) scheme by introducing multidirectional decomposition into the ND-PCA implementation. In the high-dimensional case, the PCA technique is usually extended to an arbitrary n-dimensional space via the Higher-Order Singular Value Decomposition (HO-SVD). Due to the size of the tensor, an HO-SVD implementation usually produces a huge matrix along some direction of the tensor, which is typically beyond the capacity of an ordinary PC. The novelty of this paper is to amend our previous ND-PCA scheme to deal with this challenge and to prove that the revised ND-PCA scheme can provide a near-optimal linear solution under the given error bound. To evaluate the numerical properties of the revised ND-PCA scheme, experiments are performed on a set of 3D volume datasets.
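
    As background for the HO-SVD step mentioned above (not the paper's revised ND-PCA scheme), here is a minimal truncated HO-SVD sketch in Python/NumPy: one SVD per mode of the tensor, keeping the leading singular vectors. The function names unfold and hosvd, the chosen ranks, and the random test volume are illustrative assumptions.

    import numpy as np

    def unfold(tensor, mode):
        # Mode-n unfolding: bring `mode` to the front and flatten the remaining axes.
        return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

    def hosvd(tensor, ranks):
        # Truncated Higher-Order SVD: per-mode SVDs of the unfoldings, then a core projection.
        factors = []
        for mode, r in enumerate(ranks):
            U, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
            factors.append(U[:, :r])
        core = tensor
        for mode, U in enumerate(factors):
            # Mode-n product with U.T projects the data onto each mode's retained subspace.
            core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
        return core, factors

    # Illustration on a random 3D volume, truncated to multilinear rank (8, 8, 8).
    volume = np.random.rand(32, 32, 32)
    core, factors = hosvd(volume, ranks=(8, 8, 8))
    approx = core
    for mode, U in enumerate(factors):
        approx = np.moveaxis(np.tensordot(U, np.moveaxis(approx, mode, 0), axes=1), 0, mode)
    print("relative reconstruction error:", np.linalg.norm(volume - approx) / np.linalg.norm(volume))

    The mode-n unfoldings computed here are exactly the matrices that can grow prohibitively large for high-dimensional data, which is the bottleneck the paper's multidirectional decomposition is meant to address.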