9 research outputs found

    Overcomplete Mathematical Models with Applications

    Chen, Donoho and Saunders (1998) deal with the problem of sparse representation of vectors (signals) using special overcomplete (redundant) systems of vectors spanning the signal space. Such systems (also called frames) are typically obtained either by refining an existing basis or by merging several bases of various kinds (so-called packets). In contrast to vectors, which belong to a finite-dimensional space, the problem of sparse representation may be formulated within the more general framework of a (possibly infinite-dimensional) separable Hilbert space (Veselý, 2002b; Christensen, 2003). This functional approach allows us to obtain more precise representations of objects from such a space which, unlike vectors, are not discrete by nature. In this thesis, I address the problem of sparse representation in overcomplete time series models using expansions in the Hilbert space of random variables with finite variance. A numerical study demonstrates the benefits and limits of this approach when applied to generalized linear models and to overcomplete VARMA models of multivariate stationary time series. Having run and analyzed a large number of numerical simulations as well as real-data models, we can conclude that the sparse method reliably identifies nearly zero parameters, allowing us to reduce the originally badly conditioned, overparametrized model. It thus significantly reduces the number of estimated parameters. Consequently, there is no need to fix model orders in advance, a common preliminary step in standard techniques.
    For short time series paths (100 samples or fewer), the sparse parameter estimates provide more precise predictions than those based on the standard maximum likelihood estimators in MATLAB's System Identification Toolbox (IDENT). For longer paths (500 samples or more), both techniques yield nearly equally accurate predictions. On the other hand, solving such problems is computationally more demanding; the running time, however, remains acceptable.
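
    As an illustration of the underlying technique, the sketch below sets up an overparametrized AR design with many candidate lags and fits it by iterative soft thresholding (ISTA), a simple solver for the l1-penalized least-squares problem behind Basis Pursuit denoising. The lag count, penalty weight, and simulated data are invented for the example; this is not the thesis's actual BPA implementation.

        import numpy as np

        def soft_threshold(x, t):
            # componentwise shrinkage: the proximal operator of t * ||.||_1
            return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

        def ista(A, y, lam, n_iter=500):
            # iterative soft thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1
            L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
            x = np.zeros(A.shape[1])
            for _ in range(n_iter):
                x = soft_threshold(x - A.T @ (A @ x - y) / L, lam / L)
            return x

        # Overparametrized AR design: regress y_t on p_max candidate lags and
        # let the l1 penalty zero out the irrelevant ones.
        rng = np.random.default_rng(0)
        n, p_max = 100, 20                     # short path; the true order is AR(2)
        y = np.zeros(n)
        for t in range(2, n):
            y[t] = 0.6 * y[t-1] - 0.3 * y[t-2] + rng.normal()
        A = np.column_stack([y[p_max-k-1 : n-k-1] for k in range(p_max)])
        b = y[p_max:]
        print(np.round(ista(A, b, lam=2.0), 2))    # most entries come out near zero

    Identifying and dropping the near-zero coefficients is exactly the model reduction described above: no AR order has to be fixed beforehand.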

    The macro-financial linkages modelling for the Czech economy

    The contribution presents and analyzes a model with financial frictions. It is tailor-made for the Czech economy and thus contains several features capturing Czech stylized facts (a cascade of nominal rigidities, high openness, real exchange rate appreciation in consumer prices, etc.). Linkages between the real and financial sectors are incorporated via state non-contingent debt contracts within the financial accelerator. The model also contains shocks which hit financial variables and propagate through the model into the real sectors. The empirical analysis is presented via the results of Bayesian estimation.
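
    The abstract mentions Bayesian estimation only in passing. As background, here is a minimal sketch of the random-walk Metropolis sampler commonly used for DSGE posteriors; the log-posterior below is a toy stand-in, since evaluating the paper's model would require its full likelihood (typically via the Kalman filter) plus priors.

        import numpy as np

        def rw_metropolis(log_post, theta0, step, n_draws, rng):
            # random-walk Metropolis: propose theta' = theta + step*z,
            # accept with probability min(1, post(theta') / post(theta))
            theta = np.asarray(theta0, dtype=float)
            lp = log_post(theta)
            draws = np.empty((n_draws, theta.size))
            for i in range(n_draws):
                prop = theta + step * rng.standard_normal(theta.size)
                lp_prop = log_post(prop)
                if np.log(rng.uniform()) < lp_prop - lp:
                    theta, lp = prop, lp_prop
                draws[i] = theta
            return draws

        # toy stand-in: standard normal log-posterior in two dimensions
        rng = np.random.default_rng(1)
        draws = rw_metropolis(lambda t: -0.5 * np.sum(t**2),
                              np.zeros(2), step=0.5, n_draws=5000, rng=rng)
        print(draws.mean(axis=0), draws.std(axis=0))   # roughly [0, 0] and [1, 1]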

    Sparse Parameter Estimation in Overcomplete Time Series Models

    We suggest a new approach to parameter estimation in time series models with a large number of parameters. We use a modified version of the Basis Pursuit Algorithm (BPA) by Chen et al. [SIAM Review 43 (2001), No. 1] to verify its applicability to time series modelling. For simplicity we restrict ourselves to ARIMA models of univariate stationary time series. After performing and analyzing a large number of numerical simulations, we can draw the following conclusions: (1) We were able to reliably identify nearly zero parameters in the model, allowing us to reduce the originally badly conditioned, overparametrized model. Among other things, we need not fix model orders in advance, a common preliminary step in standard techniques. For short time series paths (100 samples or fewer), the sparse parameter estimates provide more precise predictions than those based on the standard maximum likelihood estimators in MATLAB's System Identification Toolbox (IDENT). For longer paths (500 samples or more), both techniques yield nearly equal prediction paths. (2) As the model usually depends on the estimated parameters, we tried to improve their accuracy by iterating BPA several times. Keywords: Overcomplete Model, Algorithm.
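
    Conclusion (2), iterating BPA because the regressors depend on the estimated parameters, can be read as a Hannan-Rissanen-style alternation: estimate, recover innovations from the residuals, rebuild the MA regressors, and re-estimate. A hypothetical sketch, reusing ista() from the earlier example (lag bounds, penalty, and round count are illustrative, not the paper's settings):

        import numpy as np
        # reuses ista() from the sparse AR sketch above

        def iterated_sparse_arma(y, p_max, q_max, lam, n_rounds=3):
            # the MA regressors (lagged innovations) depend on the current fit,
            # so rebuild them and re-run the sparse estimator a few times
            n, m = len(y), max(p_max, q_max)
            eps = np.zeros(n)                   # initial innovation guesses
            for _ in range(n_rounds):
                ar = np.column_stack([y[m-k-1 : n-k-1] for k in range(p_max)])
                ma = np.column_stack([eps[m-k-1 : n-k-1] for k in range(q_max)])
                A, b = np.hstack([ar, ma]), y[m:]
                theta = ista(A, b, lam)
                eps[m:] = b - A @ theta         # residuals become new innovations
            return theta

        rng = np.random.default_rng(2)
        e = rng.standard_normal(300)
        y = np.zeros(300)
        for t in range(1, 300):
            y[t] = 0.5 * y[t-1] + e[t] + 0.4 * e[t-1]   # true ARMA(1,1)
        print(np.round(iterated_sparse_arma(y, p_max=5, q_max=5, lam=2.0), 2))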

    The Czech Housing Market Through the Lens of a DSGE Model Containing Collateral-Constrained Households

    We incorporate a housing market with liquidity-constrained households into the Czech National Bank’s core forecasting model (g3) to analyze the relationship between housing market and aggregate fluctuations in a small open economy framework. We discuss the historical shock decomposition of house prices and interpret the results in the light of recent empirical work. For a wide range of model calibrations, the interaction between the housing market and the aggregate economy is weak, and so the monetary policy implications of house price fluctuations for the Czech Republic are not strong. We interpret this, in line with recent empirical evidence, as an indication that the wealth effects stemming from house ownership are not significant in the Czech Republic. Nevertheless, we show that the collateral mechanism significantly improves the forecasting properties of the extended model, especially for private consumption. This indicates the importance of the collateral effect, which can be caused by assets other than houses.
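
    For readers unfamiliar with the mechanism, a collateral constraint on borrowing households is typically written in the Iacoviello form below; this is the standard textbook formulation, assumed here for illustration rather than taken from the paper:

        % debt limited by the expected value of the housing collateral
        b_t \le m \, \mathbb{E}_t\!\left[ \frac{q_{t+1} h_t \pi_{t+1}}{R_t} \right]

    where b_t is real household debt, h_t the housing stock, q_t the real house price, R_t the gross nominal interest rate, \pi_{t+1} inflation, and m the loan-to-value ratio. When house prices rise, the constraint loosens and consumption can respond, which is the collateral effect the abstract refers to.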

    Labour Market Modelling within a DSGE Approach

    The goal of this paper is to find a suitable way of modelling the main labour market variables in the framework of the CNB’s core DSGE model. The model selection criteria are: the predictive ability for unemployment, the change in the overall predictive ability in comparison to the baseline model, and the extent of the required model change. We find that incorporating a modified Galí, Smets and Wouters (2011) labour market specification allows us to predict unemployment with an acceptable forecast error. At the same time, it leads to a moderate improvement in the overall predictive ability of the model and requires only minor adjustments to the model structure. Thus, it should be preferred to more complicated concepts that yield a similar improvement in predictive ability. We also conclude that the concept linking unemployment and the GDP gap is promising. However, its practical application would require (additional) improvement in the accuracy of the consumption prediction. As a practical experiment, we compare the inflation pressures arising from nominal wages and the exchange rate in the baseline model and in alternative specifications. The experiment is motivated by the use of the exchange rate as an additional monetary policy instrument by the CNB since November 2013, in an environment of near-zero interest rates and growing disinflationary pressures. We find that the baseline model tends to forecast higher nominal wage growth and lower exchange rate depreciation than the models with more elaborate labour markets. Therefore, the alternative models would probably have identified an even higher need for exchange rate depreciation than the baseline model did.

    Monetary policy implications of financial frictions in the Czech Republic

    As the global economy seems to be recovering from the 2009 financial crisis, we find it desirable to look back and analyze the Czech economy ex post. We work with a Swedish New Keynesian model of a small open economy which embeds financial frictions in light of the financial accelerator literature.

    A Baseline Model for Monetary Policy Analysis

    The paper deals with a baseline New Keynesian DSGE model for a closed economy. The model follows the concept of the New Open Economy Macroeconomics, based on microeconomic foundations enriched with real and nominal rigidities. It is estimated with a Bayesian technique using quarterly Eurozone data. The estimation results are discussed and compared with related papers. Via impulse responses to unanticipated shocks, we analyze the behaviour of the model without any rigidities in order to understand the essential model mechanisms. Then we add real and nominal rigidities separately to investigate their impacts within the model. Keywords: DSGE, New Keynesian model, Bayesian estimation, impulse responses
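
    To make the impulse-response exercise concrete: once a linearized DSGE model is solved into the state-space form x_t = A x_{t-1} + B e_t, the response to a one-off unanticipated shock at horizon h is simply A^h B. A minimal sketch with an invented two-state transition matrix (not the paper's estimated model):

        import numpy as np

        def impulse_response(A, B, horizon):
            # IRF of x_t = A x_{t-1} + B e_t to a unit shock at t = 0
            x = B.copy()
            path = [x]
            for _ in range(horizon):
                x = A @ x                  # propagate the shock forward
                path.append(x)
            return np.array(path)

        # invented 2-state example; the estimated model would supply A and B
        A = np.array([[0.8, 0.1],
                      [0.2, 0.7]])
        B = np.array([1.0, 0.0])           # unit shock hits the first state only
        print(impulse_response(A, B, horizon=8).round(3))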

    A Macroeconomic Forecasting Model of the fixed exchange rate regime for the oil-rich Kazakh economy

    This paper presents a semi-structural quarterly projection open-economy model for analyzing monetary policy transmission and macroeconomic developments in Kazakhstan during the period of the fixed exchange rate regime. The model captures key stylized facts of the Kazakh economy, especially the important role of oil prices in influencing the economic cycle in Kazakhstan. The application of the model to observed data provides a reasonable interpretation of Kazakh economic history, including the global crisis, through to late 2015, when the National Bank of Kazakhstan introduced a managed float. The dynamic properties of the model are analyzed using impulse response functions for selected country-specific shocks. The model’s shock decomposition and in-sample forecasting properties presented in the paper suggest that the model was an applicable tool for monetary policy analysis and practical forecasting at the National Bank of Kazakhstan. In a general sense, the model can be considered an example of a quarterly projection model for oil-rich countries with a fixed exchange rate.
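
    As context for what "semi-structural quarterly projection model" means here, the canonical building blocks are a handful of behavioural gap equations; the skeleton below is the generic textbook version with an oil term added, not the paper's actual specification or calibration:

        % IS curve: output gap driven by the real rate, the real exchange
        % rate gap and the oil-price gap
        \hat{y}_t = a_1 \hat{y}_{t-1} - a_2 \hat{r}_t + a_3 \hat{z}_t
                    + a_4 \hat{y}^{oil}_t + \varepsilon^y_t
        % hybrid Phillips curve
        \pi_t = b_1 \pi_{t-1} + (1 - b_1) \mathbb{E}_t \pi_{t+1}
                + b_2 \hat{y}_t + \varepsilon^\pi_t
        % UIP; under a credible peg E_t \Delta s_{t+1} = 0, so the domestic
        % rate is pinned down by the foreign rate plus a risk premium
        i_t = i^*_t + \mathbb{E}_t \Delta s_{t+1} + prem_t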

    Incorporating Judgments and Dealing with Data Uncertainty in Forecasting at the Czech National Bank

    This paper focuses on the forecasting process at the Czech National Bank, with an emphasis on incorporating expert judgments into forecasts and addressing data uncertainty. At the beginning, the core model and the forecasting process are described, and it is shown how data and the underlying uncertainty are handled. The core of the paper contains five case studies reflecting policy issues addressed during forecasting rounds since 2008. Each case study first describes a particular forecasting problem, then the way the issue was addressed, and finally briefly summarizes the effect of incorporating off-model information into the forecast. The case studies demonstrate that a careful incorporation of expert information into a structural framework may be useful for generating economically intuitive forecasts even during very turbulent times, and we show that such judgments may have important monetary policy implications.