210,763 research outputs found

    Approximation Algorithms for Confidence Bands for Time Series

    Confidence intervals are a standard technique for analyzing data. When applied to time series, confidence intervals are computed for each time point separately. Alternatively, we can compute confidence bands, where we are required to find the smallest area enveloping k time series, where k is a user parameter. Confidence bands can then be used to detect abnormal time series, not just individual observations within a time series. We show that, despite the problem being NP-hard, it is possible to find an optimal confidence band for some values of k. We do this by considering a different problem: discovering regularized bands, where we minimize the envelope area minus the number of included time series weighted by a parameter α. Unlike normal confidence bands, this problem can be solved exactly using a minimum cut. By varying α we can obtain solutions for various k. If we have a constraint k for which we cannot find an appropriate α, we demonstrate a simple algorithm that yields an O(√n) approximation guarantee by connecting the problem to the minimum k-union problem. This connection also implies that we cannot approximate the problem better than O(n^(1/4)) under some (mild) assumptions. Finally, we consider a variant where, instead of minimizing the area, we minimize the maximum width. Here, we demonstrate a simple 2-approximation algorithm and show that we cannot achieve a better approximation guarantee. (Peer reviewed)
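    The optimization target can be illustrated with a naive greedy baseline (a minimal sketch with illustrative data, not the paper's exact min-cut algorithm for regularized bands): given n time series on a common grid, repeatedly drop the series whose removal shrinks the envelope area the most, until k series remain.

```python
import numpy as np

def envelope_area(series):
    # Area between the pointwise max and min of the included time series.
    return float(np.sum(series.max(axis=0) - series.min(axis=0)))

def greedy_confidence_band(series, k):
    """Greedy heuristic: drop the series whose removal most reduces the
    envelope area until only k remain. Returns the kept indices and the
    lower/upper band. This is a baseline sketch, not the exact method."""
    idx = list(range(len(series)))
    while len(idx) > k:
        best = min(idx, key=lambda i: envelope_area(
            series[[j for j in idx if j != i]]))
        idx.remove(best)
    kept = series[idx]
    return idx, kept.min(axis=0), kept.max(axis=0)
```

    On a toy example with one outlying series, the greedy pass discards the outlier first, which is the intended behavior even though the heuristic carries no optimality guarantee.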

    A Robust Determination of the Time Delay in 0957+561A,B and a Measurement of the Global Value of Hubble's Constant

    Photometric monitoring of the gravitational lens system 0957+561A,B in the g and r bands with the Apache Point Observatory (APO) 3.5 m telescope during 1996 shows a sharp g-band event in the trailing (B) image light curve at the precise time predicted from the observation of an event during 1995 in the leading (A) image with a delay of 415 days. This success confirms the "short delay," and the lack of any feature at a delay near 540 days rejects the "long delay" for this system, resolving a long-standing controversy. A series of statistical analyses of our light curve data yield a best-fit delay of 417 +/- 3 days (95% confidence interval). Recent improvements in the modeling of the lens system (consisting of a galaxy and cluster) allow us to derive the global (at z = 0.36) value of Hubble's constant H_0 using Refsdal's method, a simple and direct distance determination based on securely understood physics and geometry. The result is H_0 = 63 +/- 12 km/s/Mpc (for Omega = 1), where this 95% confidence interval is dominated by remaining lens model uncertainties. (Comment: accepted by ApJ, AASTeX 4.0 preprint, 4 PostScript figures)
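    The basic idea of measuring a delay between two light curves can be conveyed with a grid search (a toy sketch on synthetic data; the paper's statistical analyses are more careful and must handle irregular sampling, noise, and microlensing):

```python
import numpy as np

def estimate_delay(t, flux_a, flux_b, delays):
    """For each candidate delay d, evaluate the trailing curve at t + d
    (so B(t + d) should track A(t)) and score the mean squared mismatch
    over epochs where the shifted curve is still interpolable."""
    best, best_cost = None, np.inf
    for d in delays:
        valid = (t + d) <= t[-1]
        b_shifted = np.interp(t[valid] + d, t, flux_b)
        cost = np.mean((flux_a[valid] - b_shifted) ** 2)
        if cost < best_cost:
            best, best_cost = d, cost
    return best
```

    With a noiseless sinusoidal pair offset by 10 time units, the search recovers the true delay exactly; real data would require an error model on top of this.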

    1SXPS: A deep Swift X-ray Telescope point source catalog with light curves and spectra

    We present the 1SXPS (Swift-XRT Point Source) catalog of 151,524 X-ray point sources detected by the Swift-XRT in 8 years of operation. The catalog covers 1905 square degrees distributed approximately uniformly on the sky. We analyze the data in two ways. First we consider all observations individually, for which we have a typical sensitivity of ~3e-13 erg/cm2/s (0.3--10 keV). Then we co-add all data covering the same location on the sky: these images have a typical sensitivity of ~9e-14 erg/cm2/s (0.3--10 keV). Our sky coverage is nearly 2.5 times that of 3XMM-DR4, although the catalog is a factor of ~1.5 less sensitive. The median position error is 5.5" (90% confidence), including systematics. Our source detection method improves on that used in previous XRT catalogs, and we report >68,000 new X-ray sources. The goals and observing strategy of the Swift satellite allow us to probe source variability on multiple timescales, and we find ~30,000 variable objects in our catalog. For every source we give positions, fluxes, time series (in four energy bands and two hardness ratios), estimates of the spectral properties, spectra and spectral fits for the brightest sources, and variability probabilities in multiple energy bands and timescales. (Comment: 27 pages, 19 figures; accepted for publication in ApJS. The accompanying website, http://www.swift.ac.uk/1SXPS, is live; the VizieR entry should be available shortly.)
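    As a small illustration of one derived quantity mentioned above, a hardness ratio compares counts in a hard and a soft energy band (the normalization below is the common (H - S)/(H + S) convention; the specific band definitions of 1SXPS are not assumed here):

```python
def hardness_ratio(soft, hard):
    """Generic X-ray hardness ratio HR = (H - S) / (H + S), where `soft`
    and `hard` are background-subtracted counts in two bands. Values near
    +1 indicate a hard source, values near -1 a soft one."""
    total = soft + hard
    if total <= 0:
        raise ValueError("need positive total counts")
    return (hard - soft) / total
```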

    Inference for functional time series with applications to yield curves and intraday cumulative returns

    Spring 2016. Includes bibliographical references. Econometric and financial data often take the form of a functional time series. Examples include yield curves, intraday price curves and term structure curves. Before an attempt is made to statistically model or predict such a series, we must address whether or not it can be assumed stationary or trend stationary. We develop extensions of the KPSS stationarity test to functional time series. Motivated by the problem of a change in the mean structure of yield curves, we also introduce several change point methods applied to dynamic factor models. For all testing procedures, we include a complete asymptotic theory, a simulation study, illustrative data examples, as well as details of the numerical implementation of the testing procedures. The impact of scheduled macroeconomic announcements has been shown to account for sizable fractions of total annual realized stock returns. To assess this impact, we develop methods of derivative estimation which utilize a functional analogue of local-polynomial smoothing. The resulting confidence bands are then used to find time intervals of statistically increasing cumulative returns.
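    The flavor of the stationarity tests being extended can be seen in the scalar KPSS statistic, which one might apply to, say, the score series of a functional principal component (a simplified sketch: the plain sample variance stands in for a proper long-run variance estimator, and the functional extension in the paper works with partial sums of the curves themselves):

```python
import numpy as np

def kpss_statistic(x):
    """Level-stationarity KPSS statistic: normalized sum of squared
    partial sums of the demeaned series. Large values suggest the series
    is not stationary around a constant level."""
    x = np.asarray(x, float)
    n = len(x)
    e = x - x.mean()
    s = np.cumsum(e)
    return np.sum(s ** 2) / (n ** 2 * e.var())
```

    On simulated data, the statistic is markedly larger for a random walk than for white noise, which is exactly the contrast the test exploits.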

    Inference for the autocovariance of a functional time series under conditional heteroscedasticity

    Most methods for analyzing functional time series rely on the estimation of lagged autocovariance operators or surfaces. As in univariate time series analysis, testing whether or not such operators are zero is an important diagnostic step that is well understood when the data, or model residuals, form a strong white noise. When functional data are constructed from dense records of, for example, asset prices or returns, a weak white noise model allowing for conditional heteroscedasticity is often more realistic. Applying inferential procedures for the autocovariance based on a strong white noise to such data often leads to the erroneous conclusion that the data exhibit significant autocorrelation. We develop methods for performing inference for the lagged autocovariance operators of stationary functional time series that are valid under general conditional heteroscedasticity conditions. These include a portmanteau test to assess the cumulative significance of empirical autocovariance operators up to a user-selected maximum lag, as well as methods for obtaining confidence bands for a functional version of the autocorrelation that are useful in model selection/validation. We analyze the efficacy of these methods through a simulation study, and apply them to functional time series derived from asset price data of several representative assets. In this application, we found that strong white noise tests often suggest that such series exhibit significant autocorrelation, whereas our tests, which account for functional conditional heteroscedasticity, show that these data are in fact uncorrelated in a function space.
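    With curves stored on a common grid, the empirical lag-h autocovariance surface these tests build on can be sketched as follows (a discretized version; the squared Frobenius norm of the matrix, scaled by the grid spacing, approximates the Hilbert-Schmidt norm that portmanteau-type statistics aggregate over lags):

```python
import numpy as np

def autocov_surface(X, h):
    """Empirical lag-h autocovariance surface of a functional time series
    stored as an (n_curves, n_gridpoints) array: the (grid, grid) matrix
    C_h(s, u) = (1/n) * sum_t (X_t(s) - mean(s)) (X_{t+h}(u) - mean(u))."""
    Xc = X - X.mean(axis=0)
    n = len(X)
    return Xc[:n - h].T @ Xc[h:] / n
```

    For i.i.d. noise curves, the lag-1 surface is close to zero while the lag-0 surface estimates the (nonzero) covariance operator, so its norm dominates.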

    Sparsely Observed Functional Time Series: Estimation and Prediction

    Functional time series analysis, whether based on time- or frequency-domain methodology, has traditionally been carried out under the assumption of complete observation of the constituent series of curves, assumed stationary. Nevertheless, as is often the case with independent functional data, it may well happen that the data available to the analyst are not the actual sequence of curves, but relatively few and noisy measurements per curve, potentially at different locations in each curve's domain. Under this sparse sampling regime, neither the established estimators of the time series' dynamics, nor their corresponding theoretical analysis, will apply. The subject of this paper is to tackle the problem of estimating the dynamics and of recovering the latent process of smooth curves in the sparse regime. Assuming smoothness of the latent curves, we construct a consistent nonparametric estimator of the series' spectral density operator and use it to develop a frequency-domain recovery approach that predicts the latent curve at a given time by borrowing strength from the (estimated) dynamic correlations in the series across time. Further to predicting the latent curves from their noisy point samples, the method fills in gaps in the sequence (curves nowhere sampled), denoises the data, and serves as a basis for forecasting. Means of providing corresponding confidence bands are also investigated. A simulation study interestingly suggests that, in the presence of smoothness, sparse observation over a longer time period may provide better performance than dense observation over a shorter period. The methodology is further illustrated by application to an environmental data set on fair-weather atmospheric electricity, which naturally leads to a sparse functional time series.
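    The notion of recovering a latent smooth curve from sparse noisy samples can be sketched with a kriging-style best linear predictor (a static, within-curve simplification that assumes a known covariance kernel; the paper's method instead estimates the spectral density operator and borrows strength across time):

```python
import numpy as np

def recover_curve(grid, t_obs, y_obs, cov, noise_var):
    """Best linear recovery of a latent curve on `grid` from sparse noisy
    samples (t_obs, y_obs), given a covariance kernel cov(s, u) and the
    observation noise variance. Prediction = K_go (K_oo + sigma^2 I)^-1 y."""
    K_oo = cov(t_obs[:, None], t_obs[None, :]) + noise_var * np.eye(len(t_obs))
    K_go = cov(grid[:, None], t_obs[None, :])
    return K_go @ np.linalg.solve(K_oo, y_obs)
```

    With near-zero noise, the predictor essentially interpolates the observations, while larger `noise_var` shrinks the fit toward the prior mean; kernel and noise level here are illustrative assumptions.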

    On vector autoregressive modeling in space and time

    Despite the fact that it provides a potentially useful analytical tool, allowing for the joint modeling of dynamic interdependencies within a group of connected areas, until recently the VAR approach had received little attention in regional science and spatial economic analysis. This paper aims to contribute to this field by dealing with the issues of parameter identification and estimation and of structural impulse response analysis. In particular, there is a discussion of the adaptation of the recursive identification scheme (which represents one of the more common approaches in the time series VAR literature) to a space-time environment. Parameter estimation is subsequently based on the Full Information Maximum Likelihood (FIML) method, a standard approach in structural VAR analysis. As a convenient tool to summarize the information conveyed by regional dynamic multipliers, with a specific emphasis on the scope of spatial spillover effects, a synthetic space-time impulse response function (STIR) is introduced, portraying average effects as a function of displacement in time and space. Asymptotic confidence bands for the STIR estimates are also derived from bootstrap estimates of the standard errors. Finally, to provide a basic illustration of the methodology, the paper presents an application of a simple bivariate fiscal model fitted to data for Italian NUTS 2 regions. Keywords: structural VAR model, spatial econometrics, identification, space-time impulse response analysis
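    The impulse responses that a STIR summarizes can be sketched for a reduced-form VAR(1) (a minimal illustration: the response at horizon h to a unit shock is the h-th power of the coefficient matrix; the recursive identification and the averaging by spatial displacement used in the paper are omitted):

```python
import numpy as np

def impulse_responses(A, horizons):
    """Impulse responses of the VAR(1) y_t = A y_{t-1} + e_t: returns
    [I, A, A^2, ..., A^horizons], where entry (i, j) of the h-th matrix
    is the horizon-h response of variable i to a unit shock in j."""
    out, P = [], np.eye(len(A))
    for _ in range(horizons + 1):
        out.append(P.copy())
        P = A @ P
    return out
```

    A STIR-type summary would then average these entries over pairs of regions at the same spatial displacement, tracing shock propagation in both time and space.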