12 research outputs found
Time series aggregation and disaggregation problems and long-range dependence
Large-scale aggregation problems, and the inverse problem of disaggregation, are important in many fields of study, such as macroeconomics, astronomy, hydrology and sociology. It was shown in Granger (1980) that a certain aggregation of random-coefficient AR(1) models can lead to a long memory output. Dacunha-Castelle and Oppenheim (2001) explored the topic further, answering when and whether a predefined long memory process can be obtained as the result of aggregating a specific class of individual processes. In this paper, the disaggregation scheme of Leipus et al. (2006) is briefly discussed. Disaggregation into AR(1) processes is then analyzed further, resulting in a theorem that, under the corresponding assumptions, helps to construct a mixture density for a given process aggregated by the AR(1) scheme. Finally, the theorem is illustrated by the example of a FARUMA mixture density.
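To make the aggregation mechanism concrete, here is a minimal Python sketch (all parameter choices, including the Beta(2, 1.3) mixture density and the sample sizes, are illustrative assumptions, not values from the paper): it simulates independent random-coefficient AR(1) panels and shows that the sample autocorrelation of their aggregate decays slowly rather than geometrically.

```python
import numpy as np

rng = np.random.default_rng(0)

N, T, burn = 2000, 4000, 500               # panels, sample length, burn-in (all illustrative)
# Mixture density for the AR coefficient: a ~ Beta(2, 1.3).
# Mass near a = 1 with tail (1 - a)^(q-1), 1 < q < 2, is what drives long memory.
a = rng.beta(2.0, 1.3, size=N)

eps = rng.standard_normal((N, T + burn))   # independent idiosyncratic innovations
x = np.zeros((N, T + burn))
for t in range(1, T + burn):
    x[:, t] = a * x[:, t - 1] + eps[:, t]  # one AR(1) recursion per panel

agg = x[:, burn:].mean(axis=0) * np.sqrt(N)  # normalised aggregate

def acf(y, nlags):
    y = y - y.mean()
    c0 = y @ y / len(y)
    return np.array([y[:-k] @ y[k:] / len(y) / c0 for k in range(1, nlags + 1)])

print(acf(agg, 50))  # decays hyperbolically, not geometrically: the long-memory signature
```

For a fixed-coefficient AR(1), autocorrelations at lag 50 would be negligible; the slow decay of the aggregate is the effect Granger (1980) identified.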
Asymptotic normality of the mixture density estimator in a disaggregation scheme
The paper concerns the asymptotic distribution of the mixture density estimator, proposed by Oppenheim et al. (2006), in the aggregation/disaggregation problem of a random-parameter AR(1) process. We prove that, under mild conditions on the (semiparametric) form of the mixture density, the estimator is asymptotically normal. The proof is based on the limit theory for the quadratic form in linear random variables developed by Bhansali et al. (2007). The moving average representation of the aggregated process is investigated. A small simulation study illustrates the result.
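As a complement, the moving average representation mentioned above can be written in closed form under one specific scheme. Assuming a common-innovations aggregation and a Beta(p, q) mixture density (neither assumption is taken from the paper; they are chosen because the MA coefficients then reduce to Beta moments), the limit aggregate is the linear process X_t = sum_j psi_j * eps_{t-j} with psi_j = E[a^j]:

```python
import numpy as np
from scipy.special import betaln

# Illustrative Beta(p, q) mixture density; E[a^j] = B(p + j, q) / B(p, q).
p, q = 2.0, 0.7
j = np.arange(1, 201)
psi = np.exp(betaln(p + j, q) - betaln(p, q))  # MA coefficients psi_j = E[a^j]

# psi_j ~ C * j**(-q): hyperbolic decay, square-summable for q > 1/2
# and non-summable (long memory) for q < 1.
print(psi[[0, 9, 49, 99]])          # coefficients at lags 1, 10, 50, 100
print(psi[99] / 100.0 ** (-q))      # roughly constant for large j
```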
Comparison of autoregressive parameter density estimation methods for aggregated AR(1) processes
The article investigates the properties of two alternative disaggregation methods. The first, proposed in Chong (2006), is based on the assumption of a polynomial autoregressive parameter density. The second, proposed in Leipus et al. (2006), approximates the density by means of Gegenbauer polynomials. An examination of Monte Carlo simulation results shows that neither method outperforms the other. Chong's method is restricted to the class of polynomial densities, and the second method is not effective in the presence of common innovations. Both methods work correctly under the assumptions proposed in the corresponding articles.
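The Gegenbauer route of the second method can be illustrated with a short sketch. The Gegenbauer index alpha = 1, the toy target density and the truncation level below are assumptions made for illustration; only the expansion mechanics mirror the approach:

```python
import numpy as np
from math import factorial
from scipy.special import eval_gegenbauer, gamma
from scipy.integrate import quad

alpha = 1.0                                  # Gegenbauer index (illustrative)
target = lambda x: 0.75 * (1.0 - x**2)       # toy density on (-1, 1)

def h(n, a):
    # Normalising constant of C_n^{(a)} under the weight (1 - x^2)^(a - 1/2)
    return np.pi * 2**(1 - 2*a) * gamma(n + 2*a) / (factorial(n) * (n + a) * gamma(a)**2)

def coef(n):
    # c_n = <target, C_n>_w / h_n, computed by numerical quadrature
    integrand = lambda x: target(x) * eval_gegenbauer(n, alpha, x) * (1 - x**2)**(alpha - 0.5)
    return quad(integrand, -1.0, 1.0)[0] / h(n, alpha)

K = 6                                        # truncation level (illustrative)
c = [coef(n) for n in range(K)]
approx = lambda x: sum(c[n] * eval_gegenbauer(n, alpha, x) for n in range(K))

xs = np.linspace(-0.9, 0.9, 5)
print(approx(xs) - target(xs))               # near zero: the expansion recovers the density
```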
The fiscal and macroeconomic effects of government wages and employment reform
This paper examines the overall macroeconomic impact of reforms of government wages and employment at times of fiscal consolidation. Reform of these two components of the government wage bill appeared necessary for containing the deterioration of public finances in several EU countries in the wake of the financial crisis. Such reforms in some instances, but not always, entailed cost-cutting measures affecting the government wage bill, as part of broader consolidation packages that typically hinged more heavily on other fiscal instruments, such as public investment. While such measures have adverse short-term macroeconomic effects, policies restraining the public wage bill are distinctive in that they can yield medium- to longer-term benefits through possible competitiveness and efficiency gains arising from their impact on labour market dynamics. This paper provides evidence of such medium- to long-run effects, based on a wealth of micro and macro data for the euro area and the EU. It concludes that appropriately designed government wage bill moderation can indeed produce positive dividends for the economy, depending on certain country-specific conditions. These gains can be reinforced by relevant fiscal-structural reforms.
Time series aggregation, disaggregation and long memory.
Large-scale aggregation problems, and the inverse problem of disaggregation, are important in many fields of study, such as macroeconomics, astronomy, hydrology and sociology. It was shown in Granger (1980) that a certain aggregation of random-coefficient AR(1) models can lead to a long memory output. Dacunha-Castelle and Oppenheim (2001) explored the topic further, answering when and whether a predefined long memory process can be obtained as the result of aggregating a specific class of individual processes. In this paper, the disaggregation scheme of Leipus et al. (2006) is briefly discussed. Disaggregation into AR(1) processes is then analyzed further, resulting in a theorem that, under the corresponding assumptions, helps to construct a mixture density for a given process aggregated by the AR(1) scheme. Finally, the theorem is illustrated by the example of a FARUMA mixture density.
The mathematical model of the distribution of the research funds of the Faculty of Mathematics and Informatics.
We evaluate the efficiency of the research output of departments, for the purpose of a proper distribution of research funds, using a dataset of research and publications of the departments of the Faculty of Mathematics and Informatics (MIF) of Vilnius University (VU). The tool is data envelopment analysis (DEA). Research outputs, measured in points, are taken as the output data; inputs are measured by the departments' personnel potential and their salaries. The DEA results show that some of the departments operate under increasing returns to scale.
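For readers unfamiliar with DEA, a minimal input-oriented CCR sketch using scipy.optimize.linprog follows; the data below are synthetic placeholders, not the MIF VU dataset:

```python
import numpy as np
from scipy.optimize import linprog

# Synthetic example: 4 departments, 2 inputs (staff, salaries), 1 output (points)
X = np.array([[12.0, 15.0, 9.0, 20.0],        # staff
              [300.0, 420.0, 260.0, 500.0]])  # salary budget
Y = np.array([[140.0, 150.0, 130.0, 210.0]])  # research points

def ccr_efficiency(j0):
    """Input-oriented CCR score of unit j0: min theta s.t. X@lam <= theta*x0, Y@lam >= y0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                # variables: [theta, lambda_1..lambda_n]
    A_ub = np.block([[-X[:, [j0]], X],         # X @ lam - theta * x0 <= 0
                     [np.zeros((s, 1)), -Y]])  # -Y @ lam <= -y0
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun                             # theta* in (0, 1]; 1 means efficient

print([round(ccr_efficiency(j), 3) for j in range(X.shape[1])])
```

Adding the convexity constraint sum(lambda) = 1 turns this into the variable-returns-to-scale (BCC) model; comparing CCR and BCC scores per unit is the standard way to diagnose increasing returns to scale.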
Evaluation of Value-at-Risk (VaR) using the Gaussian Mixture Models
The normality of the distribution of stock returns is one of the basic assumptions in financial mathematics. Empirical studies, however, undermine the validity of this assumption. In order to flexibly fit complex non-normal distributions, this article applies a Gaussian Mixture Model (GMM) to Value-at-Risk (VaR) estimation. The study compares the forecasting ability of the GMM with other widespread VaR approaches on daily log-returns for a wide range of S&P 500 stocks over two periods: 2006 to 2010 and 2016 to 2021. The statistical and graphical analysis reveals that the GMM adjusts quickly and adequately to significant and rapid stock market changes, whereas the remaining methods react with a delay. The study also finds that the ratio of short-term to long-term standard deviations significantly improves the ability of the GMM and the other methods to predict VaR, reflecting the observed features of the analyzed stock log-returns.
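A hedged sketch of the core computation: fit a Gaussian mixture to log-returns and read VaR off the fitted mixture's quantile. The synthetic returns, the two-component choice and the sign convention (VaR reported as a return quantile) are assumptions, not details from the article:

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from scipy.stats import norm
from scipy.optimize import brentq

rng = np.random.default_rng(1)
# Synthetic heavy-tailed "log-returns" standing in for real S&P 500 data
r = np.concatenate([rng.normal(0.0005, 0.01, 900),
                    rng.normal(-0.002, 0.03, 100)])

gmm = GaussianMixture(n_components=2, random_state=0).fit(r.reshape(-1, 1))
w = gmm.weights_
mu = gmm.means_.ravel()
sd = np.sqrt(gmm.covariances_.ravel())

def mix_cdf(x):
    # CDF of the fitted mixture: weighted sum of component normal CDFs
    return float(np.sum(w * norm.cdf(x, mu, sd)))

def var(level=0.01):
    # VaR at the given level = the level-quantile of the fitted mixture
    return brentq(lambda x: mix_cdf(x) - level, r.min() - 1.0, r.max() + 1.0)

print(f"1% one-day VaR (return quantile): {var(0.01):.4f}")
```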
P* model for inflation dynamics in Lithuania
The article examines the P* model, based on the neoclassical quantity theory of money, describing inflation dynamics in Lithuania. According to the model, inflation is explained by deviations of the prices prevailing in Lithuania and in foreign countries from the long-run equilibrium price level, which is essentially determined by such main factors as the money supply, potential output and the velocity of money. The article analyzes different ways of extending the P* model, formulates and estimates a version of the model suited to the peculiarities of the Lithuanian economy, and examines the model's ability to forecast inflation.
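A stylised sketch of the price-gap mechanism at the heart of the P* model, run on synthetic data (the variable construction, the constant equilibrium velocity and the single-regressor error-correction form are simplifying assumptions, not the article's specification):

```python
import numpy as np

rng = np.random.default_rng(2)
T = 200

# Logs: p*_t = m_t + v*_t - y*_t (money supply + equilibrium velocity - potential output)
m = np.cumsum(rng.normal(0.005, 0.01, T))        # money supply
v_star = np.zeros(T)                             # equilibrium velocity, held constant here
y_star = np.cumsum(rng.normal(0.004, 0.005, T))  # potential output
p_star = m + v_star - y_star                     # equilibrium price level

p = p_star + rng.normal(0.0, 0.02, T)            # actual price level
gap = p_star - p                                 # price gap
pi = np.diff(p)                                  # inflation
dpi = np.diff(pi)                                # change in inflation

# Error-correction form: dpi_t = alpha * gap_{t-1} + u_t
X = gap[1:-1].reshape(-1, 1)                     # lagged gap, aligned with dpi
alpha_hat, *_ = np.linalg.lstsq(X, dpi, rcond=None)
print(f"estimated gap coefficient: {alpha_hat[0]:.3f}")  # positive: the gap pulls inflation up
```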