
    An Efficient Approach of Removing the High Density Salt and Pepper Noise

    Images are often corrupted by impulse noise, also known as salt and pepper noise, in which each corrupted pixel takes either the maximum or the minimum gray level. Among the standard remedies, the median filter has been established as a reliable method for removing salt and pepper noise without harming edge details. However, the major problem of the standard Median Filter (MF) is that it is effective only at low noise densities: when the noise level exceeds 50%, the edge details of the original image are no longer preserved. The Adaptive Median Filter (AMF) likewise performs well only at low noise densities. In our proposed method, we first apply the Stationary Wavelet Transform (SWT) to the noisy image, which separates it into four bands: LL, LH, HL and HH. For the LL band we slide a 3x3 window, read the pixels inside it, and compute their minimum, maximum and median values. Our algorithm then classifies the pixels inside the window as noisy or noise-free and replaces the noisy ones. The higher bands are smoothed by soft thresholding. Finally, all coefficients are recombined by the inverse stationary wavelet transform. The performance of the proposed algorithm is tested at various levels of noise corruption and compared with standard filters, namely the standard median filter (SMF) and the weighted median filter (WMF). Our proposed method removes low- to medium-density impulse noise with detail preservation up to a noise density of 70%, and it gives better Peak Signal-to-Noise Ratio (PSNR) and Mean Square Error (MSE) values.
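    The windowed noise-detection step the abstract describes can be sketched as follows. This is a simplified illustration only (the SWT decomposition and soft thresholding are omitted, and the function name is hypothetical): pixels at the extreme gray levels are treated as noise candidates and replaced by the median of the noise-free pixels in their 3x3 window.

```python
import numpy as np

def remove_impulse_noise(img, low=0, high=255):
    """Replace suspected salt (high) / pepper (low) pixels with the
    median of the noise-free pixels in their 3x3 neighbourhood."""
    padded = np.pad(img, 1, mode="edge")      # replicate borders so every pixel has a full window
    out = img.copy()
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            if img[r, c] == low or img[r, c] == high:       # noise candidate
                win = padded[r:r + 3, c:c + 3]              # 3x3 window
                clean = win[(win != low) & (win != high)]   # noise-free pixels
                if clean.size:                              # replace with their median
                    out[r, c] = np.median(clean)
    return out
```

In the proposed method this operation would be applied to the LL band only, with the higher-frequency bands handled by soft thresholding before the inverse transform.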

    Robust Time Series Dissimilarity Measure for Outlier Detection and Periodicity Detection

    Full text link
    Dynamic time warping (DTW) is an effective dissimilarity measure in many time series applications. Despite its popularity, it is prone to noise and outliers, which lead to the singularity problem and bias in the measurement. Moreover, the time complexity of DTW is quadratic in the length of the time series, making it inapplicable in real-time applications. In this paper, we propose a novel time series dissimilarity measure named RobustDTW that reduces the effects of noise and outliers. Specifically, RobustDTW estimates the trend and optimizes the time warp in an alternating manner, using our temporal graph trend filtering. To improve efficiency, we propose a multi-level framework that estimates the trend and the warp function at a lower resolution and then repeatedly refines them at higher resolutions. We further extend RobustDTW to periodicity detection and outlier time series detection. Experiments on real-world datasets demonstrate the superior performance of RobustDTW compared to DTW variants in both outlier time series detection and periodicity detection.
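    For reference, the classic DTW dissimilarity that RobustDTW builds on can be written as a short dynamic program. This sketch (function name hypothetical) makes the quadratic cost explicit: the table has one cell per pair of time points.

```python
import numpy as np

def dtw_distance(x, y):
    """Classic DTW via dynamic programming: O(len(x) * len(y)) time,
    which is the quadratic complexity the abstract refers to."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)   # D[i, j] = cost of best warp of x[:i] onto y[:j]
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # extend the cheaper of: insertion, deletion, or match
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

RobustDTW replaces the raw series in this recursion with trend estimates obtained by temporal graph trend filtering, alternating between trend estimation and warp optimization.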

    Blazars in the Fermi Era: The OVRO 40-m Telescope Monitoring Program

    The Large Area Telescope (LAT) aboard the Fermi Gamma-ray Space Telescope provides an unprecedented opportunity to study gamma-ray blazars. To capitalize on this opportunity, beginning in late 2007, about a year before the start of LAT science operations, we began a large-scale, fast-cadence 15 GHz radio monitoring program with the 40-m telescope at the Owens Valley Radio Observatory (OVRO). This program began with the 1158 northern (declination>-20 deg) sources from the Candidate Gamma-ray Blazar Survey (CGRaBS) and now encompasses over 1500 sources, each observed twice per week with a ~4 mJy (minimum) and 3% (typical) uncertainty. Here, we describe this monitoring program and our methods, and present radio light curves from the first two years (2008 and 2009). As a first application, we combine these data with a novel measure of light curve variability amplitude, the intrinsic modulation index, through a likelihood analysis to examine the variability properties of subpopulations of our sample. We demonstrate that, with high significance (7-sigma), gamma-ray-loud blazars detected by the LAT during its first 11 months of operation vary with about a factor of two greater amplitude than do the gamma-ray-quiet blazars in our sample. We also find a significant (3-sigma) difference between variability amplitude in BL Lacertae objects and flat-spectrum radio quasars (FSRQs), with the former exhibiting larger variability amplitudes. Finally, low-redshift (z<1) FSRQs are found to vary more strongly than high-redshift FSRQs, with 3-sigma significance. These findings represent an important step toward understanding why some blazars emit gamma-rays while others, with apparently similar properties, remain silent. (Comment: 23 pages, 24 figures. Submitted to ApJ.)
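    The paper's intrinsic modulation index is a likelihood-based quantity that accounts for measurement uncertainties; as a simpler stand-in, the plain (observed) modulation index of a light curve is just the sample standard deviation of the flux density divided by its mean. A minimal sketch (function name hypothetical):

```python
import numpy as np

def modulation_index(flux):
    """Observed modulation index of a light curve: sample standard
    deviation over mean flux density. The intrinsic version used in
    the paper additionally models measurement errors via a likelihood
    analysis; this simpler quantity ignores them."""
    flux = np.asarray(flux, dtype=float)
    return flux.std(ddof=1) / flux.mean()
```

A perfectly steady source gives an index of zero, and larger values indicate stronger fractional variability, which is the quantity compared between the gamma-ray-loud and gamma-ray-quiet subsamples.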

    Keyframe-based visual–inertial odometry using nonlinear optimization

    Combining visual and inertial measurements has become popular in mobile robotics, since the two sensing modalities offer complementary characteristics that make them the ideal choice for accurate visual–inertial odometry or simultaneous localization and mapping (SLAM). While historically the problem has been addressed with filtering, advancements in visual estimation suggest that nonlinear optimization offers superior accuracy, while remaining tractable in complexity thanks to the sparsity of the underlying problem. Taking inspiration from these findings, we formulate a rigorously probabilistic cost function that combines reprojection errors of landmarks and inertial terms. The problem is kept tractable, thus ensuring real-time operation, by limiting the optimization to a bounded window of keyframes through marginalization. Keyframes may be spaced in time by arbitrary intervals, while still being related by linearized inertial terms. We present evaluation results on complementary datasets recorded with our custom-built stereo visual–inertial hardware that accurately synchronizes accelerometer and gyroscope measurements with imagery. A comparison of both a stereo and a monocular version of our algorithm, with and without online extrinsics estimation, is shown with respect to ground truth. Furthermore, we compare the performance to an implementation of a state-of-the-art stochastic cloning sliding-window filter. This competitive reference implementation performs tightly coupled filtering-based visual–inertial odometry. While our approach admittedly demands more computation, we show its superior performance in terms of accuracy.
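    The reprojection-error half of the cost function described above can be sketched with a minimal pinhole camera model. This is an illustrative simplification, not the authors' actual formulation (which also weights residuals by their information matrices and adds linearized inertial terms); the function name is hypothetical.

```python
import numpy as np

def reprojection_error(K, T_cw, p_world, z_obs):
    """Residual between an observed keypoint z_obs (2-vector, pixels)
    and the projection of landmark p_world (3-vector) through the
    world-to-camera pose T_cw (4x4) and intrinsics K (3x3).
    Minimizing the sum of squared residuals over poses and landmarks
    is the visual part of a visual-inertial cost function."""
    p_cam = T_cw[:3, :3] @ p_world + T_cw[:3, 3]  # transform landmark into camera frame
    uvw = K @ p_cam                               # pinhole projection (homogeneous)
    return z_obs - uvw[:2] / uvw[2]               # pixel-space residual
```

In a keyframe-based optimizer, one such residual per landmark observation is stacked with the inertial residuals between consecutive keyframes, and older states are marginalized to keep the window bounded.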

    Importance sampling schemes for evidence approximation in mixture models

    The marginal likelihood is a central tool for drawing Bayesian inference about the number of components in mixture models. It is often approximated, since the exact form is unavailable. A bias in the approximation may be due to incomplete exploration by a simulated Markov chain (e.g., a Gibbs sequence) of the collection of posterior modes, a phenomenon also known as lack of label switching: all possible label permutations must be visited by the chain for it to converge and hence overcome the bias. In an importance sampling approach, imposing label switching on the importance function results in an exponential increase of the computational cost with the number of components. In this paper, two importance sampling schemes are proposed through different choices of importance function: an MLE proposal and a Rao-Blackwellised importance function. The second scheme is called dual importance sampling. We demonstrate that dual importance sampling is a valid estimator of the evidence and, moreover, show that the statistical efficiency of the estimates increases. To reduce the induced computational demand, the original importance function is approximated; a suitable approximation can produce an estimate with the same precision at a reduced computational workload. (Comment: 24 pages, 5 figures.)
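    The basic importance-sampling evidence estimator underlying these schemes can be illustrated on a toy conjugate-Gaussian model where the evidence is known in closed form. This sketch is not the paper's mixture-model estimator: the exact Gaussian posterior used as importance function here stands in for the MLE and Rao-Blackwellised proposals, and all names are hypothetical.

```python
import numpy as np

def log_npdf(x, mu, var):
    """Log density of N(mu, var) at x."""
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def is_evidence(y, n_samples=100_000, seed=0):
    """Importance-sampling estimate of the evidence p(y) for the toy
    model y ~ N(theta, 1) with prior theta ~ N(0, 1). The importance
    function q is the exact posterior N(y/2, 1/2), so the weights
    p(y|theta) p(theta) / q(theta) are constant and the estimate is
    exact; a cruder q would give the same mean with higher variance."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(y / 2, np.sqrt(0.5), n_samples)  # draws from q
    log_w = (log_npdf(y, theta, 1.0)          # likelihood
             + log_npdf(theta, 0.0, 1.0)      # prior
             - log_npdf(theta, y / 2, 0.5))   # importance function
    return np.exp(log_w).mean()               # Monte Carlo average of weights
```

For this model the true evidence is N(y; 0, 2), so the estimator can be checked directly; in the mixture setting the evidence has no such closed form, which is where the choice of importance function becomes the central design question.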