Continuous record Laplace-based inference about the break date in structural change models
Building upon the continuous record asymptotic framework recently introduced by Casini and Perron (2018a) for inference in structural change models, we propose a Laplace-based (Quasi-Bayes) procedure for constructing the estimate and confidence set for the date of a structural change. It is defined by an integration-based rather than an optimization-based method. A transformation of the least-squares criterion function is evaluated in order to derive a proper distribution, referred to as the Quasi-posterior. For a given choice of loss function, the Laplace-type estimator is the minimizer of the expected risk, with the expectation taken under the Quasi-posterior. Besides providing an alternative estimate that is more precise, with lower mean absolute error (MAE) and lower root-mean-squared error (RMSE) than the usual least-squares one, the Quasi-posterior distribution can be used to construct asymptotically valid inference using the concept of Highest Density Region. The resulting Laplace-based inferential procedure is shown to have lower MAE and RMSE, and its confidence sets strike the best balance between empirical coverage rates and average lengths relative to traditional long-span methods, whether the break size is small or large.
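The integration-based idea can be illustrated in a few lines: exponentiating the negative least-squares criterion over candidate break dates yields a Quasi-posterior, whose mean is the Laplace-type estimate under quadratic loss. This is a minimal sketch, not the paper's full procedure; the function name `laplace_break_date` and the simple mean-shift setup are illustrative assumptions.

```python
import numpy as np

def laplace_break_date(y, x):
    """Quasi-posterior over candidate break dates from the least-squares
    criterion, and the Laplace-type estimate under quadratic loss
    (the Quasi-posterior mean). Illustrative sketch only."""
    T = len(y)
    ssr = np.full(T, np.inf)
    for tb in range(2, T - 2):  # trim so both regimes have observations
        s = 0.0
        for seg in (slice(0, tb), slice(tb, T)):
            b = np.linalg.lstsq(x[seg], y[seg], rcond=None)[0]
            r = y[seg] - x[seg] @ b
            s += r @ r
        ssr[tb] = s
    # Quasi-posterior: exp(-criterion), normalized; subtract min for stability
    finite = np.isfinite(ssr)
    w = np.zeros(T)
    w[finite] = np.exp(-(ssr[finite] - ssr[finite].min()))
    post = w / w.sum()
    est = float(np.arange(T) @ post)  # expected-risk minimizer, quadratic loss
    return est, post
```

Under absolute-error loss the same Quasi-posterior would instead be summarized by its median.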
Continuous record asymptotics for structural change models
For a partial structural change in a linear regression model with a single break, we develop a continuous record asymptotic framework to build inference methods for the break date. We have T observations with a sampling frequency h over a fixed time horizon [0, N], and let T → ∞ with h ↓ 0 while keeping the time span N fixed. We impose very mild regularity conditions on an underlying continuous-time model assumed to generate the data. We consider the least-squares estimate of the break date and establish consistency and convergence rate. We provide a limit theory for shrinking magnitudes of shifts and locally increasing variances. The asymptotic distribution corresponds to the location of the extremum of a function of the quadratic variation of the regressors and of a Gaussian centered martingale process over a certain time interval. We can account for the asymmetric informational content provided by the pre- and post-break regimes and show how the location of the break and shift magnitude are key ingredients in shaping the distribution. We consider a feasible version based on plug-in estimates, which provides a very good approximation to the finite sample distribution. We use the concept of Highest Density Region to construct confidence sets. Overall, our method is reliable and delivers accurate coverage probabilities and relatively short average length of the confidence sets. Importantly, it does so irrespective of the size of the break.
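The sampling scheme can be mimicked in a short simulation: with the span N held fixed, shrinking h adds observations without lengthening the horizon, and both the regressor and the errors are generated as Brownian increments of scale sqrt(h). The scalar-regressor setup, parameter values, and function name below are assumptions for the sketch, not the paper's design.

```python
import numpy as np

def simulate_and_estimate(h, N=1.0, frac=0.4, beta=1.0, delta=1.0,
                          sigma=0.5, seed=0):
    """Simulate T = N/h observations of y_i = x_i*beta + x_i*delta*1{i >= Tb}
    + e_i on the fixed span [0, N], where x_i and e_i are Brownian increments
    of scale sqrt(h): halving h doubles T while the span stays fixed.
    Returns the least-squares estimate of the break fraction Tb/T."""
    rng = np.random.default_rng(seed)
    T = int(N / h)
    x = np.sqrt(h) * rng.standard_normal(T)
    e = sigma * np.sqrt(h) * rng.standard_normal(T)
    tb_true = int(frac * T)
    y = x * beta + x * delta * (np.arange(T) >= tb_true) + e
    ssr = np.full(T, np.inf)
    for tb in range(2, T - 2):  # trim so both regimes are non-empty
        s = 0.0
        for seg in (slice(0, tb), slice(tb, T)):
            b = (x[seg] @ y[seg]) / (x[seg] @ x[seg])  # segment OLS slope
            r = y[seg] - b * x[seg]
            s += r @ r
        ssr[tb] = s
    return int(np.argmin(ssr)) * h / N  # estimated break fraction
```

Running it at successively smaller h illustrates the continuous record limit: the time span never grows, yet the break-fraction estimate remains accurate.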
Theory of Evolutionary Spectra for Heteroskedasticity and Autocorrelation Robust Inference in Possibly Misspecified and Nonstationary Models
We develop a theory of evolutionary spectra for heteroskedasticity and
autocorrelation robust (HAR) inference when the data may not satisfy
second-order stationarity. Nonstationarity is a common feature of economic time
series which may arise either from parameter variation or model
misspecification. In such a context, the theories that support HAR inference
are either not applicable or do not provide accurate approximations. HAR tests
standardized by existing long-run variance estimators then may display size
distortions and little or no power. This issue can be more severe for methods
that use long bandwidths (i.e., fixed-b HAR tests). We introduce a class of
nonstationary processes that have a time-varying spectral representation which
evolves continuously except at a finite number of time points. We present an
extension of the classical heteroskedasticity and autocorrelation consistent
(HAC) estimators that applies two smoothing procedures. One is over the lagged
autocovariances, akin to classical HAC estimators, and the other is over time.
The latter element is important to flexibly account for nonstationarity. We
name them double kernel HAC (DK-HAC) estimators. We show the consistency of the
estimators and obtain an optimal DK-HAC estimator under the mean squared error
(MSE) criterion. Overall, HAR tests standardized by the proposed DK-HAC
estimators are competitive with fixed-b HAR tests, when the latter work well,
with regards to size control even when there is strong dependence. Notably, in
those empirically relevant situations in which previous HAR tests are
undersized and have little or no power, the DK-HAC estimator leads to tests
that have good size and power.
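The two smoothing directions can be sketched compactly: a kernel over the lagged autocovariances, as in classical HAC estimation, and a second kernel over time applied to the products v_t v_{t-l}. The bandwidth choices below are ad-hoc rules of thumb for illustration, not the MSE-optimal rules derived in the paper, and the function name is an assumption.

```python
import numpy as np

def dk_hac(v, n_lags=None, b_time=0.1):
    """Sketch of a double-kernel HAC (DK-HAC) estimate for a scalar series:
    a Bartlett kernel smooths over lags, while a Gaussian kernel smooths the
    products v_t * v_{t-l} over time before averaging, so that the local
    autocovariance may vary through the sample. Bandwidths are illustrative."""
    v = np.asarray(v, dtype=float)
    T = len(v)
    v = v - v.mean()
    if n_lags is None:
        n_lags = int(4 * (T / 100) ** (2 / 9))  # rule-of-thumb lag bandwidth
    times = np.arange(T) / T
    J = 0.0
    for l in range(-n_lags, n_lags + 1):
        k1 = 1.0 - abs(l) / (n_lags + 1)        # Bartlett weight (lag kernel)
        prod = v[abs(l):] * v[:T - abs(l)]      # v_t * v_{t-|l|}
        u = times[abs(l):]
        c_local = np.empty_like(prod)
        for i, ui in enumerate(u):              # Gaussian time-kernel smoothing
            w = np.exp(-0.5 * ((u - ui) / b_time) ** 2)
            c_local[i] = np.sum(w * prod) / np.sum(w)
        J += k1 * c_local.mean()                # integrate local autocovariance
    return J
```

For a stationary series the time smoothing is harmless (it roughly averages out), which is why the sketch reduces to a classical HAC estimate in that case; its value shows up when the local autocovariances drift or break over the sample.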
Change-Point Analysis of Time Series with Evolutionary Spectra
This paper develops change-point methods for the spectrum of a locally
stationary time series. We focus on series with a bounded spectral density that
changes smoothly under the null hypothesis but exhibits change-points or becomes
less smooth under the alternative. We address two local problems. The first is
the detection of discontinuities (or breaks) in the spectrum at unknown dates
and frequencies. The second involves abrupt yet continuous changes in the
spectrum over a short time period at an unknown frequency without signifying a
break. Both problems can be cast into changes in the degree of smoothness of
the spectral density over time. We consider estimation and minimax-optimal
testing. We determine the optimal rate for the minimax distinguishable
boundary, i.e., the minimum break magnitude such that we are able to uniformly
control type I and type II errors. We propose a novel procedure for the
estimation of the change-points based on a wild sequential top-down algorithm
and show its consistency under shrinking shifts and possibly growing number of
change-points. Our method can be used across many fields and a companion
program is made available in popular software packages.
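The first local problem, a discontinuity in the spectrum at an unknown date, can be given a didactic stand-in: compare the integrated local spectrum (equivalently, by Parseval, the local variance) of adjacent windows and flag the largest jump. This toy detector is an illustration only and is not the paper's wild sequential top-down estimator, which also localizes the frequency of the break.

```python
import numpy as np

def detect_power_break(x, window=128):
    """Toy illustration of spectral change-point detection: compute the
    integrated local spectrum (= local variance, by Parseval) over adjacent
    non-overlapping windows and return the sample index where the largest
    jump between neighboring windows occurs. Didactic stand-in only."""
    n = len(x) // window
    power = np.array([np.var(x[i * window:(i + 1) * window]) for i in range(n)])
    jumps = np.abs(np.diff(power))        # spectrum jump between neighbors
    k = int(np.argmax(jumps)) + 1         # window index right of the break
    return k * window                     # estimated break date (sample index)
```

A smooth but rapid change in the spectrum, the second local problem, would produce several moderate adjacent jumps rather than a single dominant one, which is why distinguishing the two requires the smoothness-based formulation of the paper.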
Generalized Laplace Inference in Multiple Change-Points Models
Under the classical long-span asymptotic framework we develop a class of
Generalized Laplace (GL) inference methods for the change-point dates in a
linear time series regression model with multiple structural changes analyzed
in, e.g., Bai and Perron (1998). The GL estimator is defined by an integration
rather than optimization-based method and relies on the least-squares criterion
function. It is interpreted as a classical (non-Bayesian) estimator and the
inference methods proposed retain a frequentist interpretation. This approach
provides a better approximation of the uncertainty about the change-points in the
data relative to existing methods. On the theoretical side, depending
on some input (smoothing) parameter, the class of GL estimators exhibits a dual
limiting distribution; namely, the classical shrinkage asymptotic distribution,
or a Bayes-type asymptotic distribution. We propose an inference method based
on Highest Density Regions using the latter distribution. We show that it has
attractive theoretical properties not shared by the other popular alternatives,
i.e., it is bet-proof. Simulations confirm that these theoretical properties
translate to better finite-sample performance.
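Once a discrete (quasi-)posterior over candidate dates is available, the Highest Density Region used for inference is straightforward to compute; a minimal sketch (function name assumed):

```python
import numpy as np

def hdr(post, level=0.95):
    """Highest Density Region for a discrete (quasi-)posterior over candidate
    change-point dates: greedily include the highest-mass dates until the
    target coverage is reached. Unlike an equal-tailed interval, the HDR can
    be a union of disjoint intervals, which matters for multimodal posteriors."""
    order = np.argsort(post)[::-1]            # dates by decreasing mass
    cum = np.cumsum(post[order])
    m = int(np.searchsorted(cum, level)) + 1  # smallest set reaching coverage
    return np.sort(order[:m])
```

By construction this is the smallest set of dates achieving the requested coverage under the given posterior, which is why HDR-based sets tend to be short.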
Semi-Partitioned Scheduling of Dynamic Real-Time Workload: A Practical Approach Based on Analysis-Driven Load Balancing
Recent work showed that semi-partitioned scheduling can achieve near-optimal schedulability performance, is simpler to implement than global scheduling, and is lighter in terms of runtime overhead, making it an excellent choice for implementing real-world systems. However, semi-partitioned scheduling typically leverages an off-line design phase to allocate tasks across the available processors, which requires a priori knowledge of the workload. Conversely, several simple global schedulers, such as global earliest-deadline first (G-EDF), can transparently support dynamic workload without requiring a task-allocation phase. Nonetheless, such schedulers exhibit poor worst-case performance. This work proposes a semi-partitioned approach to efficiently schedule dynamic real-time workload on a multiprocessor system. A linear-time approximation for the C=D splitting scheme under partitioned EDF scheduling is first presented to reduce the complexity of online scheduling decisions. Then, a load-balancing algorithm is proposed for admitting new real-time workload into the system with limited workload re-allocation. A large-scale experimental study shows that the linear-time approximation incurs very limited utilization loss compared to the exact technique and that the proposed approach achieves very high schedulability performance, consistently improving on G-EDF and pure partitioned EDF scheduling.
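The online admission and load-balancing flow can be illustrated with a toy utilization-based sketch. The function `admit`, the least-loaded-first policy, and the per-CPU utilization bound of 1 (implicit-deadline partitioned EDF) are assumptions for illustration; a real semi-partitioned scheduler must size the split pieces with a schedulability analysis such as the C=D scheme, which this sketch deliberately omits.

```python
def admit(task_util, loads, cap=1.0):
    """Toy admission/load-balancing step for dynamic workload under
    partitioned EDF with implicit deadlines (utilization bound `cap` per CPU).
    A new task goes to the least-loaded CPU; if it fits nowhere whole, its
    utilization is split across the two least-loaded CPUs. Sketches only the
    allocation flow: splitting by utilization alone does NOT guarantee EDF
    schedulability -- real systems size the pieces via C=D analysis."""
    order = sorted(range(len(loads)), key=loads.__getitem__)
    first = order[0]
    if loads[first] + task_util <= cap:          # fits whole: no split needed
        loads[first] += task_util
        return [(first, task_util)]
    second = order[1]
    spare = cap - loads[first]
    rest = task_util - spare
    if loads[second] + rest <= cap:              # split across two CPUs
        loads[first] += spare
        loads[second] += rest
        return [(first, spare), (second, rest)]
    return []                                    # reject: no feasible placement
```

Splitting only when whole-task placement fails mirrors the "limited workload re-allocation" goal: most admissions leave existing assignments untouched.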