    Modified repeated median filters

    We discuss moving window techniques for fast extraction of a signal comprising monotonic trends and abrupt shifts from a noisy time series with irrelevant spikes. Running medians remove spikes and preserve shifts, but they deteriorate in trend periods. Modified trimmed mean filters use a robust scale estimate such as the median absolute deviation about the median (MAD) to select an adaptive amount of trimming. Application of robust regression, particularly of the repeated median, has been suggested for improving upon the median in trend periods. We combine these ideas and construct modified filters based on the repeated median offering better shift preservation. All these filters are compared w.r.t. fundamental analytical properties and in basic data situations. An algorithm for the update of the MAD running in time O(log n) for window width n is presented as well.
    Keywords: signal extraction, robust filtering, drifts, jumps, outliers, computational geometry, update algorithm
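    As a minimal illustration of the underlying building block (not the modified filters constructed in the paper, and without the fast O(log n) update machinery), a naive moving-window repeated median in Python might look as follows; the function name, window handling and quadratic per-window cost are our own choices.

        import numpy as np

        def repeated_median_filter(y, width):
            """Naive moving-window repeated median (illustrative sketch only)."""
            assert width % 2 == 1, "use an odd window width"
            y = np.asarray(y, dtype=float)
            h = width // 2
            t = np.arange(-h, h + 1)                    # time grid centred at 0
            out = np.full(len(y), np.nan)
            for c in range(h, len(y) - h):
                w = y[c - h:c + h + 1]
                # repeated median slope: median over i of the median over j != i of pairwise slopes
                slopes = [np.median((w[i] - np.delete(w, i)) / (t[i] - np.delete(t, i)))
                          for i in range(width)]
                beta = np.median(slopes)
                out[c] = np.median(w - beta * t)        # level estimate at the window centre
            return out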

    Multi Stage based Time Series Analysis of User Activity on Touch Sensitive Surfaces in Highly Noise Susceptible Environments

    This article proposes a multi-stage framework for time series analysis of user activity on touch-sensitive surfaces in noisy environments. Multiple methods are combined in this framework, including the moving average, the moving median, linear regression, kernel density estimation, partial differential equations and the Kalman filter. The proposed three-stage filter, consisting of partial differential equation based denoising, a Kalman filter and a moving average, provides ~25% better noise reduction than the other methods according to the Mean Squared Error (MSE) criterion in highly noise susceptible environments. Apart from synthetic data, we also obtained real-world data such as handwriting and finger/stylus drags on touch screens in the presence of high noise, e.g. unauthorized charger noise or display noise, and validated our algorithms. Furthermore, the proposed algorithm performs qualitatively better than the existing solutions for touch panels of high-end handheld devices available in the consumer electronics market.
    Comment: 9 pages (including 9 figures and 3 tables); International Journal of Computer Applications (published
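    The paper's PDE-based denoising stage is not reproduced here; as a rough sketch of how such a cascade can be wired together, the following combines a moving median, a scalar random-walk Kalman filter and a moving average. Function names, window widths and noise variances are illustrative assumptions, not the authors' settings.

        import numpy as np

        def moving_median(x, width=5):
            """Stage 1: moving median to suppress impulsive spikes."""
            h = width // 2
            padded = np.pad(x, h, mode="edge")
            return np.array([np.median(padded[i:i + width]) for i in range(len(x))])

        def scalar_kalman(x, process_var=1e-3, meas_var=1e-1):
            """Stage 2: scalar random-walk Kalman filter for recursive smoothing."""
            est, p = float(x[0]), 1.0
            out = np.empty(len(x))
            for i, z in enumerate(x):
                p += process_var                  # predict step
                k = p / (p + meas_var)            # Kalman gain
                est += k * (z - est)              # update step
                p *= 1.0 - k
                out[i] = est
            return out

        def moving_average(x, width=5):
            """Stage 3: moving average for final smoothing."""
            return np.convolve(x, np.ones(width) / width, mode="same")

        def denoise(x):
            return moving_average(scalar_kalman(moving_median(np.asarray(x, float))))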

    Robust Filters for Intensive Care Monitoring: Beyond the Running Median

    Current alarm systems in intensive care units create a very high rate of false positive alarms because most of them simply compare the physiological measurements to fixed thresholds. An improvement can be expected when the actual measurements are replaced by smoothed estimates of the underlying signal. However, classical filtering procedures are not appropriate for signal extraction here, as standard assumptions such as stationarity do not hold: the measured time series often show long periods without change, but also upward or downward trends, sudden shifts and numerous large measurement artefacts. Alternative approaches are needed to extract the relevant information from the data, i.e. the underlying signal of the monitored variables and the relevant patterns of change, such as abrupt shifts and trends. This article reviews recent research on filter-based online signal extraction methods which are designed for application in intensive care.
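    For reference, the running median that these methods extend can be computed online over a sliding window; the sketch below keeps the window in a sorted list (linear-time insertion and removal per update), which is enough for illustration but slower than the data structures used in practice, and it is only the baseline, not the extended robust filters the article reviews.

        from bisect import insort, bisect_left
        from collections import deque

        class RunningMedian:
            """Online running median over a sliding window (baseline method only)."""

            def __init__(self, width):
                self.width = width
                self.window = deque()
                self.items = []                   # window contents kept sorted

            def update(self, x):
                self.window.append(x)
                insort(self.items, x)
                if len(self.window) > self.width:
                    old = self.window.popleft()
                    del self.items[bisect_left(self.items, old)]
                n, mid = len(self.items), len(self.items) // 2
                if n % 2:
                    return self.items[mid]
                return 0.5 * (self.items[mid - 1] + self.items[mid])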

    Deep neural networks for direct, featureless learning through observation: the case of 2d spin models

    We demonstrate the capability of a convolutional deep neural network in predicting the nearest-neighbor energy of the 4x4 Ising model. Using its success at this task, we motivate the study of the larger 8x8 Ising model, showing that the deep neural network can learn the nearest-neighbor Ising Hamiltonian after seeing only a vanishingly small fraction of configuration space. Additionally, we show that the neural network has learned both the energy and magnetization operators with sufficient accuracy to replicate the low-temperature Ising phase transition. We then demonstrate the ability of the neural network to learn other spin models, teaching the convolutional deep neural network to accurately predict the long-range interaction of a screened Coulomb Hamiltonian, a sinusoidally attenuated screened Coulomb Hamiltonian, and a modified Potts model Hamiltonian. For the long-range interaction, we demonstrate that the neural network recovers the phase transition with accuracy equivalent to the numerically exact method, and its benefits become apparent: it is able to make predictions with a high degree of accuracy, and to do so 1600 times faster than a CUDA-optimized exact calculation. Finally, we demonstrate how the neural network succeeds at these tasks by examining the weights learned in a simplified demonstration.
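    As a rough sketch of the kind of setup described (the authors' layer sizes and training details are not reproduced), a small convolutional regressor for 4x4 spin configurations could look like the following PyTorch snippet; the exact nearest-neighbour energy with periodic boundaries is included only to generate training labels.

        import torch
        import torch.nn as nn

        class IsingEnergyNet(nn.Module):
            """Small CNN regressor for 4x4 spin lattices (illustrative architecture)."""

            def __init__(self):
                super().__init__()
                self.net = nn.Sequential(
                    nn.Conv2d(1, 16, kernel_size=2, padding=1),   # local spin-spin features
                    nn.ReLU(),
                    nn.Conv2d(16, 32, kernel_size=2),
                    nn.ReLU(),
                    nn.Flatten(),
                    nn.Linear(32 * 4 * 4, 64),
                    nn.ReLU(),
                    nn.Linear(64, 1),                             # predicted energy
                )

            def forward(self, spins):                             # spins: (batch, 1, 4, 4), values +/-1
                return self.net(spins)

        def ising_energy(spins):
            """Exact nearest-neighbour Ising energy with periodic boundaries (label generator)."""
            right = torch.roll(spins, shifts=1, dims=-1)
            down = torch.roll(spins, shifts=1, dims=-2)
            return -(spins * right + spins * down).sum(dim=(-1, -2))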

    Comparison of Algorithms for Baseline Correction of LIBS Spectra for Quantifying Total Carbon in Brazilian Soils

    LIBS is a promising and versatile technique for multi-element analysis that usually takes less than a minute and requires minimal sample preparation and no reagents. Despite recent advances in elemental quantification, LIBS still faces issues regarding the baseline produced by background radiation, which adds non-linear interference to the emission lines. In order to create a calibration model to quantify elements using LIBS spectra, the baseline has to be properly corrected. In this paper, we compared the performance of three filters to remove random noise and five methods to correct the baseline of LIBS spectra for the quantification of total carbon in soil samples. All combinations of filters and methods were tested, and their parameters were optimized to yield the best correlation between the corrected spectra and the carbon content in a training sample set. All combinations with the optimized parameters were then compared on a separate test sample set. A combination of the Savitzky-Golay filter and the 4S Peak Filling method resulted in the best correction: a Pearson correlation coefficient of 0.93 with a root mean square error of 0.21. The result was better than using a linear regression model with the carbon emission line at 193.04 nm (correlation of 0.91 with an error of 0.26). The procedure proposed here opens a new possibility to correct the baseline of LIBS spectra and to create multivariate methods based on a given spectral range.
    Comment: 13 pages, 5 figures
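    A minimal sketch of the smoothing step is given below; the Savitzky-Golay filter is one of the filters compared in the paper, while the rolling-minimum baseline is only a crude stand-in for the 4S Peak Filling method, which is not reproduced here. Window lengths and polynomial orders are illustrative assumptions.

        import numpy as np
        from scipy.signal import savgol_filter
        from scipy.ndimage import minimum_filter1d

        def correct_spectrum(intensity, smooth_window=11, smooth_order=3, base_window=101):
            """Denoise a LIBS spectrum and subtract a crude baseline estimate."""
            y = np.asarray(intensity, dtype=float)
            smoothed = savgol_filter(y, window_length=smooth_window, polyorder=smooth_order)
            baseline = minimum_filter1d(smoothed, size=base_window)          # rough lower envelope
            baseline = savgol_filter(baseline, window_length=base_window, polyorder=2)
            return smoothed - baseline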

    Selection from read-only memory with limited workspace

    Given an unordered array of N elements drawn from a totally ordered set and an integer k in the range from 1 to N, in the classic selection problem the task is to find the k-th smallest element in the array. We study the complexity of this problem in the space-restricted random-access model: the input array is stored in read-only memory, and the algorithm has access to a limited amount of workspace. We prove that the linear-time prune-and-search algorithm presented in most textbooks on algorithms can be modified to use Θ(N) bits instead of Θ(N) words of extra space. Prior to our work, the best known algorithm, by Frederickson, could perform the task with Θ(N) bits of extra space in O(N lg* N) time. Our result separates the space-restricted random-access model and the multi-pass streaming model, since we can surpass the Ω(N lg* N) lower bound known for the latter model. We also generalize our algorithm to the case when the size of the workspace is Θ(S) bits, where lg^3 N ≤ S ≤ N. The running time of our generalized algorithm is O(N lg*(N/S) + N (lg N)/lg S), slightly improving over the O(N lg*(N (lg N)/S) + N (lg N)/lg S) bound of Frederickson's algorithm. To obtain the improvements mentioned above, we developed a new data structure, called the wavelet stack, that we use for repeated pruning. We expect the wavelet stack to be a useful tool in other applications as well.
    Comment: 16 pages, 1 figure, preliminary version appeared in COCOON-201
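    For context, the classic in-memory prune-and-search selection that the paper modifies can be sketched as follows; it copies the input and therefore uses Θ(N) words of extra space, which is exactly the cost the bit-efficient variant and the wavelet stack are designed to avoid. The wavelet stack itself is not reproduced here.

        def select(arr, k):
            """Return the k-th smallest element (1-indexed) in worst-case linear time
            using the textbook median-of-medians prune-and-search scheme."""
            a = list(arr)                             # working copy: Theta(N) words of space
            assert 1 <= k <= len(a)
            while True:
                if len(a) <= 5:
                    return sorted(a)[k - 1]
                # pivot = median of the medians of groups of five
                medians = [sorted(a[i:i + 5])[len(a[i:i + 5]) // 2] for i in range(0, len(a), 5)]
                pivot = select(medians, (len(medians) + 1) // 2)
                lo = [x for x in a if x < pivot]
                hi = [x for x in a if x > pivot]
                eq = len(a) - len(lo) - len(hi)       # elements equal to the pivot
                if k <= len(lo):
                    a = lo
                elif k <= len(lo) + eq:
                    return pivot
                else:
                    k -= len(lo) + eq
                    a = hi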