
    Novel Metaknowledge-based Processing Technique for Multimedia Big Data clustering challenges

    Past research has challenged us with the task of showing relational patterns between text-based data and then clustering for predictive analysis using the Golay Code technique. We focus on a novel approach to extracting metaknowledge from multimedia datasets. Our collaboration has been an ongoing study of the relational patterns between datapoints based on metafeatures extracted from metaknowledge in multimedia datasets. The metafeatures selected are those suited to the mining technique we applied, the Golay Code algorithm. In this research paper we summarize findings on optimizing the metaknowledge representation as a 23-bit encoding of structured and unstructured multimedia data in order to … (Comment: IEEE Multimedia Big Data, BigMM 2015)
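    The abstract does not spell out the nearest-codeword clustering step, so as a minimal illustrative sketch the smaller perfect Hamming(7,4) code can stand in for the (23,12) Golay code: every 7-bit metafeature vector lies within Hamming distance 1 of exactly one of 16 codewords, and that codeword serves as the cluster label. All names below are illustrative, not taken from the paper.

```python
from itertools import product

# Generator matrix of the Hamming(7,4) code (identity block plus parity block).
# The paper's Golay (23,12) code plays the same role at 23 bits, with 4096
# codewords instead of 16.
G = [
    [1, 0, 0, 0, 0, 1, 1],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 1, 1, 0],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(msg):
    """Encode a 4-bit message into a 7-bit codeword (mod-2 matrix product)."""
    return tuple(sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G))

CODEWORDS = [encode(msg) for msg in product([0, 1], repeat=4)]

def hamming(a, b):
    """Number of bit positions where the two vectors differ."""
    return sum(x != y for x, y in zip(a, b))

def cluster_id(vector):
    """Assign a 7-bit metafeature vector to its nearest codeword (cluster)."""
    return min(range(len(CODEWORDS)),
               key=lambda i: hamming(vector, CODEWORDS[i]))
```

    Because the code is perfect, every possible input vector is within distance 1 of a unique codeword, so cluster assignment never has ties; the same property at 23 bits (distance 3 spheres tiling the space) is what makes the Golay code attractive for clustering.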

    Allocating the chains of consecutive additions for optimal fixed-point data path synthesis

    Minimization of computational errors in a fixed-point data path is often a difficult task. Many signal processing algorithms use chains of consecutive additions. We propose an analysis technique applicable to fixed-point data path synthesis. The technique allocates the chains of consecutive additions in order to predict the growing width of the data path and to minimize design complexity and computational errors
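    The abstract does not give the allocation algorithm itself, but the bookkeeping it relies on can be sketched: summing the operands of an addition chain exactly requires the accumulator width to grow logarithmically with the chain length. The function name and signature below are illustrative assumptions.

```python
def chain_output_width(operand_width, chain_length):
    """Bit width needed to hold the exact sum produced by a chain of
    chain_length consecutive additions of unsigned operands that are
    operand_width bits wide. The chain combines chain_length + 1 operands,
    so the sum can reach (chain_length + 1) * (2**operand_width - 1)."""
    max_sum = (chain_length + 1) * (2 ** operand_width - 1)
    return max_sum.bit_length()
```

    For example, a chain of three additions over 8-bit operands (four operands in total) needs a 10-bit result: the width grows by ceil(log2(4)) = 2 bits, which is exactly the kind of growth a synthesis tool must predict to avoid both overflow and over-allocated hardware.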

    Multispectral scanner data processing over Sam Houston National Forest

    The Edit 9 forest scene computer processing technique and its capability to map timber types in the Sam Houston National Forest are evaluated. Special efforts were made to evaluate existing computer processing techniques for mapping timber types using ERTS-1 and aircraft data, and to provide an opportunity to open up new research and development areas in forestry data

    An automatic technique for visual quality classification for MPEG-1 video

    The Centre for Digital Video Processing at Dublin City University developed Fischlar [1], a web-based system for recording, analysis, browsing and playback of digitally captured television programs. One major issue for Fischlar is the automatic evaluation of video quality in order to avoid processing and storage of corrupted data. In this paper we propose an automatic classification technique that detects the video content quality in order to provide a decision criterion for the processing and storage stages

    Radiometric correction procedure study

    A comparison of MSS radiometric processing techniques identified as a preferred radiometric processing technique a procedure which equalizes the mean and standard deviation of detector-specific histograms of uncalibrated scene data. Evaluation of MSS calibration data demonstrated that the relationship between detector responses is essentially linear over the range of intensities typically observed in MSS data, and that the calibration wedge data possess a high degree of temporal stability. An analysis of the preferred radiometric processing technique showed that it could be incorporated into the MDP-MSS system without a major redesign of the system, and with minimal impact on system throughput
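    The preferred technique described above, equalizing the mean and standard deviation of detector-specific histograms, amounts to a per-detector linear (gain/offset) correction. A minimal sketch, assuming scan line i of a band comes from detector i mod 6 (Landsat MSS used 6 detectors per band; the exact line-to-detector mapping here is an assumption):

```python
from statistics import mean, pstdev

def equalize_detectors(lines, n_detectors=6):
    """Rescale every detector's scan lines so their mean and standard
    deviation match the scene-wide values, removing detector-to-detector
    striping. lines is a list of scan lines (lists of pixel values);
    scan line i is assumed to come from detector i % n_detectors."""
    pixels = [p for line in lines for p in line]
    ref_m, ref_s = mean(pixels), pstdev(pixels)
    # Per-detector statistics over all of that detector's scan lines.
    stats = []
    for d in range(n_detectors):
        det_pixels = [p for i, line in enumerate(lines)
                      if i % n_detectors == d for p in line]
        stats.append((mean(det_pixels), pstdev(det_pixels)))
    # Linear correction: standardize per detector, then restore reference scale.
    return [[(p - stats[i % n_detectors][0]) / stats[i % n_detectors][1]
             * ref_s + ref_m for p in line]
            for i, line in enumerate(lines)]
```

    Because the correction is linear, it is consistent with the abstract's finding that detector responses are essentially linear over the observed intensity range, and it can be applied as a simple per-line lookup in a production system.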

    Forecasting Stock Time-Series using Data Approximation and Pattern Sequence Similarity

    Time series analysis is the process of building a model using statistical techniques to represent characteristics of time series data. Processing and forecasting huge time series data is a challenging task. This paper presents Approximation and Prediction of Stock Time-series data (APST), a two-step approach to predicting the direction of change of stock price indices. First, it performs data approximation using a technique called Multilevel Segment Mean (MSM). In the second phase, prediction is performed on the approximated data using Euclidean distance and the Nearest-Neighbour technique. The computational cost of data approximation is O(n ni) and the computational cost of the prediction task is O(m |NN|). Thus, the proposed method is more accurate and time-efficient than the existing Label Based Forecasting (LBF) method [1]. (Comment: 11 pages)
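    The paper's exact MSM and matching procedures are not given in the abstract, but the two phases can be sketched: one level of segment-mean approximation, followed by a 1-nearest-neighbour search (Euclidean distance) that returns the direction taken after the most similar historical pattern. Function names and the single-level simplification are assumptions of this sketch.

```python
from math import dist  # Euclidean distance between two sequences (Python 3.8+)

def segment_means(series, seg_len):
    """One MSM level: replace each block of seg_len points by its mean,
    shrinking the series before the (more expensive) prediction phase."""
    return [sum(series[i:i + seg_len]) / len(series[i:i + seg_len])
            for i in range(0, len(series), seg_len)]

def predict_direction(history, query, pattern_len):
    """1-NN prediction: find the historical window closest to the query
    pattern and return the direction of the point that followed it
    (+1 for up or flat, -1 for down)."""
    best, best_d = None, float("inf")
    for i in range(len(history) - pattern_len):
        d = dist(history[i:i + pattern_len], query)
        if d < best_d:
            best_d, best = d, i
    nxt, last = history[best + pattern_len], history[best + pattern_len - 1]
    return 1 if nxt >= last else -1
```

    Running the neighbour search on the approximated series rather than the raw one is what drives the stated cost down from a function of n to a function of the (much smaller) number of segments.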

    A vector scanning processing technique for pulsed laser velocimetry

    Pulsed laser sheet velocimetry yields nonintrusive measurements of two-dimensional velocity vectors across an extended planar region of a flow. Current processing techniques offer high-precision (1 pct) velocity estimates, but can require several hours of processing time on specialized array processors. Under some circumstances, a simple, fast, less accurate (approx. 5 pct) data reduction technique that also gives unambiguous velocity vector information is acceptable. A direct space domain processing technique was examined and found to be far superior to the other known techniques in achieving the objectives listed above. It employs a new data coding and reduction technique in which the particle time history information is used directly. Further, it has no 180 deg directional ambiguity. A complex convection vortex flow was recorded and completely processed in under 2 minutes on an 80386-based PC, producing a 2-D velocity vector map of the flow field. Hence, using this new space domain vector scanning (VS) technique, pulsed laser velocimetry data can be reduced quickly and reasonably accurately, without specialized array processing hardware
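    The abstract does not describe the vector scanning coding scheme itself, so the following is only a generic space-domain illustration, not the paper's algorithm: match each particle position from the first pulse to its nearest neighbour in the second pulse and convert the displacement into a velocity vector. Keeping the two pulses in separate, time-ordered lists is what removes the 180 deg directional ambiguity of a doubly exposed image.

```python
from math import hypot

def velocity_vectors(pulse1, pulse2, dt, max_disp):
    """Generic space-domain particle matching: pair each (x, y) position
    recorded at the first pulse with its nearest neighbour (within max_disp)
    at the second pulse, and return (position, velocity) pairs. Because the
    pulses are time-ordered, the sign of the displacement is unambiguous."""
    vectors = []
    for (x1, y1) in pulse1:
        candidates = [(x2, y2) for (x2, y2) in pulse2
                      if hypot(x2 - x1, y2 - y1) <= max_disp]
        if not candidates:
            continue  # no plausible match for this particle
        x2, y2 = min(candidates, key=lambda p: hypot(p[0] - x1, p[1] - y1))
        vectors.append(((x1, y1), ((x2 - x1) / dt, (y2 - y1) / dt)))
    return vectors
```

    Each particle costs only a bounded neighbourhood search in the image plane, which is the kind of work an ordinary PC of the era could finish in minutes, in contrast to the correlation methods that required array processors.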