102 research outputs found

    Descriptive Seasonal Adjustment by Minimizing Perturbations

    Get PDF
    The seasonal adjustment method proposed by Schlicht (1981) can be viewed as a method that minimizes non-stochastic deviations (perturbations). This interpretation gives rise to a critique of the seasonality criterion used there. A new seasonality criterion is proposed that avoids these shortcomings, and the resulting seasonal adjustment method is given.
    Keywords: seasonal adjustment; seasonality; smoothing; spline; descriptive decomposition
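    The general idea of such descriptive decompositions can be illustrated as a penalized least-squares problem: split the series into a smooth trend and a seasonal component whose sums over one period are driven toward zero. The Python sketch below is illustrative only — the penalty weights and the simple moving-sum seasonality criterion are assumptions for exposition, not the criterion proposed in the paper.

```python
import numpy as np

def decompose(y, period=4, alpha=100.0, gamma=10.0):
    """Decompose y into trend g and seasonal s by penalized least squares:
    minimize ||y - g - s||^2 + alpha * ||second differences of g||^2
                             + gamma * ||moving sums of s over one period||^2.
    Penalty weights alpha, gamma are illustrative assumptions."""
    n = len(y)
    I = np.eye(n)
    D2 = np.diff(np.eye(n), n=2, axis=0)      # (n-2) x n second-difference operator
    S = np.zeros((n - period + 1, n))         # moving-sum operator over one period
    for t in range(n - period + 1):
        S[t, t:t + period] = 1.0
    # stacked least-squares system in x = [g; s]
    A = np.block([[I, I],
                  [np.sqrt(alpha) * D2, np.zeros_like(D2)],
                  [np.zeros_like(S), np.sqrt(gamma) * S]])
    b = np.concatenate([y, np.zeros(len(D2) + len(S))])
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:n], x[n:]                       # trend, seasonal component

# usage: a linear trend plus a zero-sum quarterly pattern is recovered exactly
y = 0.1 * np.arange(40) + np.tile([1.0, -1.0, 0.5, -0.5], 10)
trend, seasonal = decompose(y)
```

    Because the true trend has zero second differences and the true seasonal pattern sums to zero over every period, this toy series attains the minimum of the criterion exactly.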

    Small-sample Properties of Estimators in an ARCH(1) and GARCH(1,1) Model with a Generalized Error Distribution: a Robustness Study

    Get PDF
    GARCH models have become a workhorse in volatility forecasting of financial and monetary market time series. In this article, we assess the small-sample properties in estimation and the performance in volatility forecasting of four competing distribution-free methods, including quasi-maximum likelihood and three regression-based methods. The study is carried out by means of Monte Carlo simulations. To guarantee a maximally realistic framework, simulated time series are generated from a mixture of two symmetric generalized error distributions. This data generating process allows us to reproduce the stylized facts of financial time series, in particular peakedness and skewness. The results of the study suggest that regression-based methods can be an asset in volatility forecasting, since model parameters are subject to structural change over time and the efficiency of the quasi-maximum likelihood method is confined to large sample sizes. Furthermore, the good performance of forecasts based on the historical volatility supports the use of the variance targeting method for volatility forecasting.
    Keywords: GARCH, volatility forecasting, Monte Carlo simulation, mixture of generalized error distributions, variance targeting
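    The simulation setup described above — a GARCH(1,1) recursion driven by a mixture of two symmetric generalized error distributions, with variance targeting pinning the intercept to the sample variance — can be sketched as follows. All parameter values, shape parameters, and the mixing weight are illustrative assumptions, not the paper's design.

```python
import numpy as np
from scipy.stats import gennorm

rng = np.random.default_rng(0)

def simulate_garch11(n, omega=0.05, alpha=0.1, beta=0.85,
                     shapes=(1.2, 2.8), weight=0.5):
    """Simulate a GARCH(1,1) series whose innovations are drawn from a
    mixture of two symmetric generalized error distributions (GED).
    GED shape parameters and the mixing weight are illustrative."""
    comp = rng.random(n) < weight                 # mixture component per draw
    z = np.where(comp,
                 gennorm.rvs(shapes[0], size=n, random_state=rng),
                 gennorm.rvs(shapes[1], size=n, random_state=rng))
    z = (z - z.mean()) / z.std()                  # standardized innovations
    h = np.empty(n)
    r = np.empty(n)
    h[0] = omega / (1 - alpha - beta)             # unconditional variance
    for t in range(n):
        if t > 0:
            h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
        r[t] = np.sqrt(h[t]) * z[t]
    return r

def variance_targeting_omega(returns, alpha, beta):
    """Variance targeting: fix omega so the model's unconditional variance
    matches the sample variance, leaving only alpha and beta to estimate."""
    return np.var(returns) * (1 - alpha - beta)

r = simulate_garch11(5000)
omega_vt = variance_targeting_omega(r, 0.1, 0.85)
```

    Variance targeting reduces the estimation problem by one parameter, which is one reason it can remain competitive in small samples where full quasi-maximum likelihood struggles.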

    Proposals for a Needed Adjustment of the VaR-based Market Risk Charge of Basle II

    Get PDF
    We analyze around 200 different financial time series, i.e., components of the Dow Jones, Nasdaq, FTSE and Nikkei, with seven different VaR approaches. We differentiate our analysis according to characteristics that can be observed. Our analysis shows that in high-risk situations, in which the time series show high volatility risk and high fat-tail risk, the current Basle II guidelines fail in the attempt to cushion against large losses by higher capital requirements. One of the factors causing this problem is that the built-in positive incentive of the penalty factor resulting from the Basle II backtesting is set too weak. Therefore, we propose adjustments regarding the Basle II penalty factor that take different risk situations into account and lead to higher capital buffers for forecast models with a systematic risk underestimation.
    Keywords: risk evaluation, Value-at-Risk, Basle II backtesting, GARCH
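    For context, the penalty factor criticized above comes from the standard Basle traffic-light backtesting scheme: the number of 99%-VaR exceptions over the last 250 trading days maps to an add-on to the base multiplier of 3. The sketch below shows that standard scheme, not the adjusted penalty the paper proposes.

```python
def basel_penalty(exceptions: int) -> float:
    """Basle traffic-light add-on to the multiplier of 3, based on the
    number of 99%-VaR exceptions over the last 250 trading days."""
    yellow = {5: 0.40, 6: 0.50, 7: 0.65, 8: 0.75, 9: 0.85}
    if exceptions <= 4:            # green zone: no add-on
        return 0.0
    if exceptions >= 10:           # red zone: maximum add-on
        return 1.0
    return yellow[exceptions]      # yellow zone: graduated add-on

def market_risk_charge(var_today: float, var_avg60: float,
                       exceptions: int) -> float:
    """Charge = max(today's VaR, (3 + penalty) * 60-day average VaR)."""
    return max(var_today, (3.0 + basel_penalty(exceptions)) * var_avg60)
```

    Because the add-on rises only from 0.40 to 1.0 across the yellow and red zones, a model that systematically underestimates risk faces a comparatively mild capital increase — the weakness of the incentive that the proposed adjustments address.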

    Risk evaluation in financial risk management: Prediction limits and backtesting

    Full text link

    Ni/Al-Hybrid Cellular Foams: An Interface Study by Combination of 3D-Phase Morphology Imaging, Microbeam Fracture Mechanics and In Situ Synchrotron Stress Analysis

    Get PDF
    Nickel (Ni)/aluminium (Al) hybrid foams are Al-based foams coated with Ni by electrodeposition. Hybrid foams offer an enhanced energy absorption capacity. To ensure a well-adhering Ni coating, necessary for a shear-resistant interface, the influence of a chemical pre-treatment of the base foam was investigated by a combination of an interface morphology analysis by focused ion beam (FIB) tomography and in situ mechanical testing. The critical energy for interfacial decohesion obtained from these microbending fracture tests in the scanning electron microscope (SEM) was contrasted with, and validated by, depth-resolved measurements of the stresses evolving in the Ni coating during three-point bending tests at the energy-dispersive diffraction (EDDI) beamline of the synchrotron BESSY II. Such a multi-method assessment of the interface decohesion resistance with respect to the interface morphology provides a reliable investigation strategy for further improvement of the interface morphology.

    On the role of data, statistics and decisions in a pandemic

    Get PDF
    A pandemic poses particular challenges to decision-making because of the need to continuously adapt decisions to rapidly changing evidence and available data. For example, which countermeasures are appropriate at a particular stage of the pandemic? How can the severity of the pandemic be measured? What is the effect of vaccination in the population, and which groups should be vaccinated first? The process of decision-making starts with data collection and modeling and continues through the dissemination of results to the subsequent decisions taken. The goal of this paper is to give an overview of this process and to provide recommendations for the different steps from a statistical perspective. In particular, we discuss a range of modeling techniques, including mathematical, statistical and decision-analytic models, along with their applications in the COVID-19 context. With this overview, we aim to foster the understanding of the goals of these modeling approaches and the specific data requirements that are essential for the interpretation of results and for successful interdisciplinary collaborations. A special focus is on the role played by data in these different models, and we incorporate into the discussion the importance of statistical literacy and of effective dissemination and communication of findings.