
    Integration of a big data emerging on large sparse simulation and its application on green computing platform

    Analyzing large data and verifying a big data set are challenging steps toward understanding the fundamental concepts behind them. Many big data analysis techniques suffer from the poor scalability, variational inequality, instability, slow convergence, and weak accuracy of large-scale numerical algorithms. These limitations open a wide opportunity for numerical analysts to develop efficient, novel parallel algorithms. Big data analytics plays an important role in science and engineering for extracting patterns, trends, and actionable information from large data sets and for improving decision-making strategies. A large data set may consist of large-scale data collected via sensor networks, signals transformed into digital images, high-resolution sensing systems, industry forecasts, and existing customer records used to predict trends and prepare for new demand. This paper proposes three types of big data analytics, in accordance with the analytics requirements, involving large-scale numerical simulation and mathematical modeling for solving complex problems. The first is big data analytics for the theory and fundamentals of nanotechnology numerical simulation. The second is big data analytics for enhancing digital images in 3D visualization and for performance analysis of embedded systems based on the large sparse data sets generated by the device. The last is the extraction of patterns from electroencephalogram (EEG) data sets for detecting horizontal and vertical eye movements. Thus, the process of examining big data analytics is to investigate the behavior of hidden patterns and unknown correlations, identify anomalies, discover structure inside unstructured data, extract the essence, predict trends, and perform multi-dimensional visualization and real-time observation using mathematical models.
    Parallel algorithms, mesh generation, domain-function decomposition approaches, inter-node communication design, mapping of subdomains, numerical analysis, and parallel performance evaluation (PPE) are the processes of the big data analytics implementation. The superiority of parallel numerical methods such as AGE, Brian, and IADE was demonstrated for solving large sparse models on a green computing platform by utilizing obsolete computers, old-generation servers and outdated hardware, distributed virtual memory, and multi-processors. The integration of low-cost message-passing communication software with the green computing platform is capable of increasing the PPE by up to 60% compared with the limited memory of a single processor. In conclusion, large-scale numerical algorithms with strong scalability, equality, stability, convergence, and accuracy are important features in analyzing big data simulation.
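The parallel performance evaluation (PPE) named above can be illustrated with the standard textbook measures. The sketch below is not the paper's code; it assumes the usual definitions of speedup S = T1/Tp, efficiency E = S/p, and effectiveness F = S/(p·Tp), with purely illustrative timings.

```python
def ppe_metrics(t_serial: float, t_parallel: float, p: int) -> dict:
    """Standard parallel performance measures for a run on p processors."""
    speedup = t_serial / t_parallel              # S = T1 / Tp
    efficiency = speedup / p                     # E = S / p
    effectiveness = speedup / (p * t_parallel)   # F = S / (p * Tp)
    return {"speedup": speedup,
            "efficiency": efficiency,
            "effectiveness": effectiveness}

# Illustrative timings: 120 s sequentially vs. 20 s on 8 processors.
print(ppe_metrics(120.0, 20.0, 8))
```

With these assumed timings the speedup is 6.0 and the efficiency 0.75; an efficiency well below 1 typically points to communication cost or load imbalance.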

    Parallelization of multidimensional hyperbolic partial differential equation on détente instantanée contrôlée dehydration process

    The purpose of this research is to propose new modified mathematical models that enhance the previous model for simulating, visualizing, and predicting heat and mass transfer in the dehydration process using the instant controlled pressure drop (DIC) technique. The main contribution of this research is the mathematical models, which are formulated from the regression model (Haddad et al., 2007) into a multidimensional hyperbolic partial differential equation (HPDE) involving the dependent parameters moisture content, temperature, and pressure, and the independent parameters time and the dimension of the region. The HPDE model is formulated in one, two, and three dimensions, and the finite difference method with the central difference formula is used to discretize the mathematical models. The implementation of numerical methods to solve the resulting system of linear equations, namely the Alternating Group Explicit method with the Brian (AGEB) and Douglas-Rachford (AGED) variants, the Red-Black Gauss-Seidel (RBGS) method, and the Jacobi (JB) method, is another contribution of this research. The sequential algorithm is developed using Matlab R2011a. The numerical results are analyzed in terms of execution time, number of iterations, maximum error, root mean square error, and computational complexity. The grid generation process produces fine-grained large sparse data by minimizing the interval size and increasing the dimension of the model and the number of time steps. Another contribution is the implementation of a parallel algorithm to increase the computational speedup and reduce the computational complexity. The parallelization of the mathematical model is run on a Matlab Distributed Computing Server under the Linux operating system. The parallel performance of the multidimensional simulation is evaluated in terms of execution time, speedup, efficiency, effectiveness, temporal performance, granularity, computational complexity, and communication cost.
In conclusion, the thesis showed that the multidimensional HPDE can be parallelized and that the PAGEB method is an alternative solution for large sparse simulation. Based on the numerical results and parallel performance evaluations, the parallel algorithm reduces the execution time and computational complexity compared with the sequential algorithm.
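As a minimal illustration of the central-difference discretization described above (a sketch, not the thesis implementation), one explicit time step for the 1D wave equation u_tt = c²·u_xx can be written as follows; the grid, the Courant number r² = (c·Δt/Δx)², and the zero Dirichlet boundaries are all assumptions for the example.

```python
import numpy as np

def wave_step(u_prev, u_curr, r2):
    """Advance u_tt = c^2 u_xx by one time level with the explicit
    central-difference scheme.  r2 = (c*dt/dx)^2; both ends are held at 0."""
    u_next = np.zeros_like(u_curr)
    # Central difference in time and space at every interior node.
    u_next[1:-1] = (2.0 * u_curr[1:-1] - u_prev[1:-1]
                    + r2 * (u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2]))
    return u_next

x = np.linspace(0.0, 1.0, 51)
u0 = np.sin(np.pi * x)        # initial displacement
u1 = wave_step(u0, u0, 0.25)  # zero initial velocity: u_prev = u_curr
```

The scheme is stable for r² ≤ 1; each time level depends only on the two previous ones, which is what makes domain decomposition of such explicit sweeps straightforward.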

    The AGEB Algorithm for Solving the Heat Equation in Three Space Dimensions and Its Parallelization Using PVM

    In this paper, a new algorithm in the class of the AGE method, based on the Brian variant (AGEB) of the ADI, is developed to solve the heat equation in three space dimensions. The method is iterative, convergent, stable, and second-order accurate with respect to space and time. It is inherently explicit and is therefore well suited for parallel implementation on the PVM, where the data decomposition is run asynchronously and concurrently at every time level. Its performance is assessed in terms of speedup, efficiency, and effectiveness.
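The AGEB scheme itself interleaves alternating implicit half-sweeps and is beyond a short sketch, but the 7-point stencil it operates on can be shown with a plain explicit step of the 3D heat equation. This is an assumed simplification, not the AGEB algorithm; r = α·Δt/h² is the mesh ratio, and zero boundaries are assumed.

```python
import numpy as np

def heat3d_step(u, r):
    """One explicit time step of u_t = alpha * (u_xx + u_yy + u_zz) on a
    uniform grid, using the 7-point stencil; boundary values are left fixed."""
    u_new = u.copy()
    u_new[1:-1, 1:-1, 1:-1] = u[1:-1, 1:-1, 1:-1] + r * (
        u[2:, 1:-1, 1:-1] + u[:-2, 1:-1, 1:-1]     # x-neighbours
        + u[1:-1, 2:, 1:-1] + u[1:-1, :-2, 1:-1]   # y-neighbours
        + u[1:-1, 1:-1, 2:] + u[1:-1, 1:-1, :-2]   # z-neighbours
        - 6.0 * u[1:-1, 1:-1, 1:-1])
    return u_new

# A unit point source spreads evenly to its six neighbours when r = 1/6.
u = np.zeros((5, 5, 5))
u[2, 2, 2] = 1.0
u = heat3d_step(u, 1.0 / 6.0)
```

Because each updated node reads only its six neighbours from the previous time level, the grid can be partitioned into subdomains that exchange only face values, which is the property the abstract exploits for asynchronous PVM decomposition.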