
    VERA monitoring of the radio jet 3C 84 during 2007--2013: detection of non-linear motion

    We present a kinematic study of the subparsec-scale radio jet of the radio galaxy 3C 84/NGC 1275 with the VLBI Exploration of Radio Astrometry (VERA) array at 22 GHz, covering 80 epochs from 2007 October to 2013 December. The averaged radial velocity of the bright component "C3" with reference to the radio core is found to be 0.27 ± 0.02c between 2007 October and 2013 December. This constant velocity of C3 is naturally explained by the advancing motion of the head of the mini-radio lobe. We also find a non-linear component in the motion of C3 with respect to the radio core. We briefly discuss possible origins of this non-linear motion. Comment: 11 pages, 7 figures, 8 tables (tables 1-5 are supplementary); accepted for publication in PAS
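    As a rough worked example of how such a speed is obtained (not taken from the paper), the sketch below converts an assumed proper motion of a jet component into an apparent transverse speed in units of c; both the proper-motion value and the ~75 Mpc distance to NGC 1275 are illustrative assumptions, and relativistic and redshift corrections are ignored.

```python
# Illustrative conversion from a fitted proper motion to an apparent speed
# in units of c. The distance to NGC 1275 (~75 Mpc) and the proper-motion
# value below are assumptions for the sake of the example, not values taken
# from the paper's fits; (1+z) corrections are neglected.

import numpy as np

PC_PER_YR_C = 0.3066                       # light travels ~0.3066 pc per year
MAS_TO_RAD = np.pi / (180.0 * 3600.0 * 1000.0)

def apparent_speed(mu_mas_per_yr, distance_mpc):
    """Apparent transverse speed (in units of c) for a proper motion mu."""
    distance_pc = distance_mpc * 1.0e6
    pc_per_yr = mu_mas_per_yr * MAS_TO_RAD * distance_pc
    return pc_per_yr / PC_PER_YR_C

# Example: ~0.23 mas/yr at an assumed 75 Mpc gives roughly 0.27c,
# comparable to the average radial velocity of C3 quoted above.
print(round(apparent_speed(0.23, 75.0), 2))
```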

    Inference on periodicity of circadian time series

    Estimation of the period length of time-course data from cyclical biological processes, such as those driven by the circadian pacemaker, is crucial for inferring the properties of the biological clock found in many living organisms. We propose a methodology for period estimation based on spectrum resampling (SR) techniques. Simulation studies show that SR outperforms a currently used routine based on Fourier approximations and is more robust to non-sinusoidal and noisy cycles. In addition, a simple fit to the oscillations using linear least squares is available, together with a non-parametric test for detecting changes in period length which allows for period estimates with different variances, as frequently encountered in practice. The proposed methods are motivated by and applied to various data examples from chronobiology.
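    The sketch below illustrates only the generic building blocks mentioned above: estimating the period from the periodogram peak and resampling to gauge its uncertainty. It is not the paper's SR procedure; the trend-plus-residual bootstrap used here is a simplification introduced purely for illustration.

```python
# Minimal sketch of periodogram-based period estimation with a crude
# resampling step to gauge uncertainty; treat it as an illustration of the
# general idea, not as the spectrum-resampling (SR) method of the paper.

import numpy as np

def dominant_period(x, dt=1.0):
    """Period (in units of dt) at the periodogram peak."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt)
    k = np.argmax(power[1:]) + 1               # skip the zero frequency
    return 1.0 / freqs[k]

def resampled_periods(x, dt=1.0, n_boot=500, seed=None):
    """Resample residuals around a smoothed trend and re-estimate the period."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    trend = np.convolve(x, np.ones(5) / 5, mode="same")
    resid = x - trend
    return np.array([
        dominant_period(trend + rng.choice(resid, size=len(x), replace=True), dt)
        for _ in range(n_boot)
    ])

# Usage: noisy ~24 h circadian rhythm sampled hourly over six days
t = np.arange(0, 24 * 6)
y = np.sin(2 * np.pi * t / 24.3) + 0.3 * np.random.default_rng(0).normal(size=t.size)
ps = resampled_periods(y, dt=1.0, seed=1)
print(dominant_period(y), np.percentile(ps, [2.5, 97.5]))
```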

    Change-Point Testing and Estimation for Risk Measures in Time Series

    We investigate methods of change-point testing and confidence-interval construction for nonparametric estimators of expected shortfall and related risk measures in weakly dependent time series. A key aspect of our work is the ability to detect general multiple structural changes in the tails of time-series marginal distributions. Unlike extant approaches that detect tail structural changes using quantities such as the tail index, our approach does not require parametric modeling of the tail and detects more general changes in the tail. Additionally, our methods are based on the recently introduced self-normalization technique for time series, allowing for statistical analysis without the issues of consistent standard-error estimation. The theoretical foundations for our methods are functional central limit theorems, which we develop under weak assumptions. An empirical study of S&P 500 returns and US 30-Year Treasury bonds illustrates the practical use of our methods in detecting and quantifying market instability via the tails of financial time series during times of financial crisis.
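    A minimal sketch of the basic ingredient, a nonparametric expected-shortfall estimate, together with a crude scan for the split point that maximises the before/after difference, is given below; the paper's self-normalized test statistics and their limit theory are not reproduced, and the function names are hypothetical.

```python
# Minimal sketch: nonparametric expected shortfall (ES) compared across a
# moving split of the sample, as a crude illustration of looking for a change
# in the tail. The self-normalized change-point statistics are not reproduced.

import numpy as np

def expected_shortfall(returns, alpha=0.05):
    """Average of the worst alpha-fraction of returns (left tail)."""
    r = np.sort(np.asarray(returns, dtype=float))
    k = max(1, int(np.floor(alpha * len(r))))
    return r[:k].mean()

def es_split_gap(returns, alpha=0.05, min_frac=0.1):
    """Largest |ES(before) - ES(after)| over candidate split points."""
    r = np.asarray(returns, dtype=float)
    n = len(r)
    lo, hi = int(min_frac * n), int((1 - min_frac) * n)
    gaps = [abs(expected_shortfall(r[:k], alpha) - expected_shortfall(r[k:], alpha))
            for k in range(lo, hi)]
    return max(gaps), lo + int(np.argmax(gaps))

# Usage with simulated returns whose tail widens halfway through the sample
rng = np.random.default_rng(1)
x = np.concatenate([rng.standard_t(10, 500) * 0.01, rng.standard_t(3, 500) * 0.02])
print(es_split_gap(x))
```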

    Comparing Spectral Densities in Replicated Time Series by Smoothing Spline ANOVA

    Comparing several groups of populations based on replicated data is one of the main concerns in statistical analysis. A specific type of data, time series such as earthquake waves, presents difficulties because of the correlation within the data. Spectral analysis alleviates this problem because the discrete Fourier transform renders the data nearly independent under general conditions. The goal of our research is to develop general, user-friendly statistical methods to compare group spectral density functions. To accomplish this, we consider two main problems: how can we construct an estimate from the replicated time series of each group, and what method can be used to compare the estimated functions? For the first part, we present smooth estimates of spectral densities from time series data obtained by replication across subjects (units) (Wahba 1990; Guo et al. 2003). We assume that each spectral density lies in some reproducing kernel Hilbert space and apply penalized least squares methods to estimate the spectral density within a smoothing spline ANOVA framework. For the second part, we consider confidence intervals to determine the frequencies where one spectral density may differ from another. These confidence intervals are the independent simultaneous confidence interval and the bootstrap confidence interval (Babu et al. 1983; Olshen et al. 1989). Finally, as an application, we consider replicated time series data consisting of the shear (S) waves of 8 earthquakes and 8 explosions (Shumway & Stoffer 2006).
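    The sketch below illustrates the first step described above under simplifying assumptions: replicate log-periodograms are averaged within a group and smoothed, with a generic smoothing spline standing in for the penalized reproducing-kernel (smoothing spline ANOVA) estimator used in the paper.

```python
# Sketch of per-replicate periodograms, averaged within a group and smoothed.
# A generic smoothing spline is a stand-in for the penalized smoothing spline
# ANOVA estimator described in the abstract above.

import numpy as np
from scipy.interpolate import UnivariateSpline

def log_periodogram(x):
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    power = np.abs(np.fft.rfft(x)) ** 2 / n
    freqs = np.fft.rfftfreq(n)
    return freqs[1:], np.log(power[1:])        # drop the zero frequency

def group_spectrum(replicates, smoothing=None):
    """Average log-periodogram across replicated series, then smooth it."""
    specs = [log_periodogram(x) for x in replicates]
    freqs = specs[0][0]
    mean_logp = np.mean([s[1] for s in specs], axis=0)
    return freqs, UnivariateSpline(freqs, mean_logp, s=smoothing)

# Usage: two groups of replicated AR(1)-like series with different dependence
rng = np.random.default_rng(2)
def ar1(phi, n=512):
    e = rng.normal(size=n); x = np.zeros(n)
    for t in range(1, n): x[t] = phi * x[t - 1] + e[t]
    return x

f, fit_a = group_spectrum([ar1(0.5) for _ in range(8)])
f, fit_b = group_spectrum([ar1(0.9) for _ in range(8)])
print(fit_a(f[:3]), fit_b(f[:3]))
```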

    Realized Volatility and Correlation in Grain Futures Markets: Testing for Spill-Over Effects

    Fluctuations in commodity prices are a major concern to many market participants. This paper uses realized volatility methods to calculate daily volatility and correlation estimates for three grain futures prices (corn, soybean and wheat). The realized volatility estimates exhibit properties consistent with the stylized facts observed in earlier studies. According to the realized correlations and regression coefficients, the spot returns from the three grain futures are positively related. The realized estimates are then used to evaluate the degree of volatility transmission across grain futures prices. The impulse response analysis is conducted by fitting a vector autoregressive model to the realized volatility and correlation estimates, using the bootstrap method for statistical inference. The results indicate that there are rich dynamic interactions among the volatilities and correlations across the grain futures markets.
    Keywords: Volatility Transmission, Vector Autoregressive Model, Impulse Response Analysis, Bootstrap
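    As a minimal sketch of the realized measures referred to above, the code below computes a daily realized variance as the sum of squared intraday returns and a realized correlation from the realized covariance; the VAR fitting, impulse response analysis and bootstrap inference are not reproduced, and the toy data are illustrative.

```python
# Sketch of the basic realized measures: daily realized variance as the sum
# of squared intraday returns, and realized correlation from the realized
# covariance. The VAR / impulse-response and bootstrap steps are omitted.

import numpy as np

def realized_variance(intraday_returns):
    r = np.asarray(intraday_returns, dtype=float)
    return np.sum(r ** 2)

def realized_correlation(r1, r2):
    cov = np.sum(np.asarray(r1, dtype=float) * np.asarray(r2, dtype=float))
    return cov / np.sqrt(realized_variance(r1) * realized_variance(r2))

# Usage with simulated 5-minute returns for two correlated grain futures
rng = np.random.default_rng(3)
common = rng.normal(scale=0.001, size=78)
corn = common + rng.normal(scale=0.001, size=78)
wheat = common + rng.normal(scale=0.001, size=78)
print(realized_variance(corn), realized_correlation(corn, wheat))
```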

    New Evidence on Interest Rate and Foreign Exchange Rate Modeling

    This dissertation empirically and theoretically investigates three interrelated issues of market anomalies in interest rate derivatives and foreign exchange rates. The first essay models the spot exchange rate as a decomposition of permanent and transitory components. Unlike extant analyses, the transitory component may be stationary or explosive. The second essay examines the market efficiency hypothesis in the foreign exchange markets and relates the rejection of the forward rate unbiasedness hypothesis to the existence of a risk premium rather than to the failure of rational expectations. The third essay examines the behavior of the short-term riskless rate and models the risk-free rate as a nonlinear trend-stationary process. While addressing these issues, the essays account for: (1) finite-sample bias; (2) unit roots and other nonstationary behaviors; (3) the role of a nonlinear trend; and (4) the interrelations between these behaviors. Several new results are gleaned from our analysis. We find that: (1) spot exchange rates display very slow mean aversion, which implies the failure of purchasing power parity; (2) the positive autocorrelation of long-horizon overlapping returns increases with the horizon and then begins to decline at very long horizons; (3) the short-term riskless rate displays a nonlinear trend-stationary process that is close to driftless random walk behavior; (4) modifying the mean-reverting short-term interest rate models to a nonlinear trend-stationary specification yields a marked improvement and outperforms all the suggested models; (5) the traditional tests for rational expectations and market efficiency in the foreign exchange markets are subject to size distortions; and (6) we relate the rejection of market efficiency in the foreign exchange markets documented across most currencies to the existence of a risk premium rather than to the rejection of the rational expectations hypothesis.

    A Bridge between Short-Range and Seasonal Forecasts: Data-Based First Passage Time Prediction in Temperatures

    Current conventional weather forecasts are based on high-dimensional numerical models. They are usually only skillful up to a maximum lead time of around 7 days due to the chaotic nature of the climate dynamics and the related exponential growth of model and data initialisation errors. Even the fully detailed medium-range predictions made, for instance, at the European Centre for Medium-Range Weather Forecasts do not exceed lead times of 14 days, while even longer-range predictions are limited to time-averaged forecast outputs only. Many sectors would profit significantly from accurate forecasts on seasonal time scales without needing the wealth of detail a full dynamical model can deliver. In this thesis, we aim to study the potential of a much cheaper data-based statistical approach to provide predictions of comparable or even better skill up to seasonal lead times, using the time until the next occurrence of frost as an exemplary forecast target. To this end, we first analyse the properties of the temperature anomaly time series obtained from measured data by subtracting a sinusoidal seasonal cycle, as well as the distribution properties of the first passage times to frost. The possibility of generating additional temperature anomaly data with the same properties by using very simple autoregressive model processes, in order to reduce the statistical fluctuations in our analysis, is investigated and ultimately rejected. In the next step, we study the potential for predictability using only conditional first passage time distributions derived from the temperature anomaly time series and confirm a significant dependence of the distributions on the initial conditions. After this preliminary analysis, we issue data-based out-of-sample forecasts for three different prediction targets: the specific date of first frost, the probability of observing frost before summer for forecasts issued in spring, and the full probability distribution of the first passage times to frost. We then study the possibility of improving the forecast quality, first by enhancing the stationarity of the temperature anomaly time series and then by adding the state of the North Atlantic Oscillation on the date the predictions are issued as an additional input variable. We obtain significant forecast skill up to seasonal lead times when comparing our results to an unskilled reference forecast. A first comparison between the data-based forecasts and corresponding predictions obtained from a dynamical weather model, necessarily using a lead time of only up to 15 days, shows that our simple statistical schemes are only outperformed (and then only slightly) if further statistical post-processing is applied to the model output.
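    A minimal sketch of the first two steps described above, using simulated daily temperatures in place of station data: a sinusoidal seasonal cycle is fitted and subtracted to obtain anomalies, and first passage times to frost are computed from each start day. The specific fitting and thresholding choices are assumptions made for illustration.

```python
# Sketch: fit and subtract a sinusoidal seasonal cycle to obtain temperature
# anomalies, then compute first passage times to frost (temperature below
# 0 degrees C) from each day. Simulated data stand in for measurements.

import numpy as np

def anomalies(temps, period=365.25):
    """Remove mean plus one sinusoidal harmonic fitted by least squares."""
    t = np.arange(len(temps))
    design = np.column_stack([np.ones_like(t, dtype=float),
                              np.sin(2 * np.pi * t / period),
                              np.cos(2 * np.pi * t / period)])
    coef, *_ = np.linalg.lstsq(design, temps, rcond=None)
    cycle = design @ coef
    return temps - cycle, cycle

def first_passage_to_frost(temps, threshold=0.0):
    """Days from each start day until the temperature first drops below threshold."""
    fpt = np.full(len(temps), np.nan)
    below = np.where(np.asarray(temps) < threshold)[0]
    for i in range(len(temps)):
        ahead = below[below >= i]
        if ahead.size:
            fpt[i] = ahead[0] - i
    return fpt

# Usage with a toy daily series: seasonal cycle plus noise
rng = np.random.default_rng(4)
t = np.arange(3 * 365)
temps = 9 - 10 * np.cos(2 * np.pi * t / 365.25) + rng.normal(0, 3, t.size)
anom, cycle = anomalies(temps)
print(np.nanmean(first_passage_to_frost(temps)))
```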

    Bootstrapping the log-periodogram estimator of the long-memory parameter: is it worth weighting?

    Estimation of the long-memory parameter from the log-periodogram (LP) regression, due to Geweke and Porter-Hudak (GPH), is a simple and frequently used method of semi-parametric estimation. However, the simple LP estimator suffers from a finite-sample bias that increases with the dependency in the short-run component of the spectral density. In a modification of the GPH estimator, Andrews and Guggenberger, AG (2003), suggested a bias-reduced estimator, but this comes at the cost of inflating the variance. To avoid variance inflation, Guggenberger and Sun (2004, 2006) suggested a weighted LP (WLP) estimator using bands of frequencies, which potentially improves upon the simple LP estimator. In all cases, a key choice in these methods is the frequency bandwidth, m, which confines the chosen frequencies to a ‘neighbourhood’ of zero. GPH suggested a ‘square-root’ rule of thumb that has been widely used but has no optimality characteristics. An alternative, due to Hurvich and Deo (1999), is to derive the root mean square error (rmse) optimising value of m, which depends upon an unknown parameter, although that parameter can be consistently estimated to make the method feasible. More recently, Arteche and Orbe (2009a,b), in the context of the GPH estimator, suggested a promising bootstrap method, based on the frequency domain, for obtaining the rmse-optimising value of m that avoids estimating the unknown parameter. We extend this bootstrap method to the AG and WLP estimators and consider bootstrapping in both the frequency domain (FD) and the time domain (TD), in each case in ‘blind’ and ‘local’ versions. We undertake a comparative simulation analysis of these methods, assessing relative performance in terms of bias, rmse, confidence-interval width and fidelity.
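    For concreteness, the sketch below implements the basic GPH log-periodogram regression with the ‘square-root’ bandwidth rule of thumb mentioned above; the bias-reduced (AG), weighted (WLP) and bootstrap bandwidth-selection variants studied in the paper are not reproduced.

```python
# Minimal sketch of the GPH log-periodogram (LP) estimator of the long-memory
# parameter d, using the 'square-root' rule of thumb m = floor(sqrt(n)).

import numpy as np

def gph_estimate(x, m=None):
    """Log-periodogram regression estimate of the long-memory parameter d."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    if m is None:
        m = int(np.floor(np.sqrt(n)))          # GPH square-root rule of thumb
    j = np.arange(1, m + 1)
    lam = 2 * np.pi * j / n                    # Fourier frequencies near zero
    periodogram = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
    regressor = -2.0 * np.log(2 * np.sin(lam / 2))
    slope = np.polyfit(regressor, np.log(periodogram), 1)[0]
    return slope                               # the slope estimates d

# Usage: plain white noise has d = 0, so the estimate should be near zero;
# for ARFIMA(0, d, 0) data it should be near the true d.
rng = np.random.default_rng(5)
print(gph_estimate(rng.normal(size=2048)))
```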

    Locating the gamma-ray emission site in Fermi/LAT blazars from correlation analysis between 37 GHz radio and gamma-ray light curves

    We address the highly debated issue of constraining the gamma-ray emission region in blazars through cross-correlation analysis, using the discrete correlation function between radio and gamma-ray light curves. The significance of the correlations is evaluated using two different approaches: simulated light curves and mixed-source correlations. The cross-correlation analysis yielded 26 sources with significant correlations. In most of the sources, the gamma-ray peaks lead the radio, with time lags in the range of +20 to +690 days, whereas in the sources 1633+382 and 3C 345 we find the radio emission to lead the gamma rays, with lags of -15 and -40 days, respectively. Apart from the individual source study, we stacked the correlations of all sources and also those based on sub-samples. The time lag from the stacked correlation is +80 days for the whole sample, and the distance travelled by the emission region corresponds to 7 pc. We also compared the start times of activity in radio and gamma rays for the correlated flares using a Bayesian block representation. This shows that most of the flares at both wavebands start at almost the same time, implying a co-spatial origin of the activity. The correlated sources show more flares and are brighter in both bands than the uncorrelated ones. Comment: 15 pages, 8 figures and 4 tables. Published in MNRAS. Online-only Figure 6 is available as an ancillary file with this submission.
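    The sketch below shows a simplified binned discrete correlation function for two unevenly sampled light curves, with toy data in which a gamma-ray flare leads a radio flare; the significance assessment via simulated light curves and mixed-source correlations used in the paper is not included, and the binning choices are assumptions.

```python
# Simplified binned discrete correlation function (DCF) for two unevenly
# sampled light curves; measurement-error corrections and significance
# estimation are omitted.

import numpy as np

def dcf(t1, f1, t2, f2, lag_bins):
    """DCF of series 2 relative to series 1, binned on the given lag edges."""
    f1 = (np.asarray(f1) - np.mean(f1)) / np.std(f1)
    f2 = (np.asarray(f2) - np.mean(f2)) / np.std(f2)
    lags = np.subtract.outer(np.asarray(t2), np.asarray(t1)).ravel()
    udcf = np.outer(f2, f1).ravel()
    out = np.full(len(lag_bins) - 1, np.nan)
    for k in range(len(lag_bins) - 1):
        mask = (lags >= lag_bins[k]) & (lags < lag_bins[k + 1])
        if mask.any():
            out[k] = udcf[mask].mean()
    return out

# Usage: a toy gamma-ray flare leading a radio flare by ~80 days
rng = np.random.default_rng(6)
t_g = np.sort(rng.uniform(0, 1000, 300))     # gamma-ray sampling times (days)
t_r = np.sort(rng.uniform(0, 1000, 150))     # radio sampling times (days)
flare = lambda t, t0: np.exp(-0.5 * ((t - t0) / 30.0) ** 2)
g = flare(t_g, 400.0) + 0.05 * rng.normal(size=t_g.size)
r = flare(t_r, 480.0) + 0.05 * rng.normal(size=t_r.size)
edges = np.arange(-300, 301, 20)
print(edges[:-1][np.nanargmax(dcf(t_g, g, t_r, r, edges))])   # peak lag bin
```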