821 research outputs found
Focused Transfer Targeting against Poverty: Evidence from Tunisia
This paper introduces a new methodology for targeting direct transfers against poverty. Our method is based on estimation methods that focus on the poor. Using data from Tunisia, we estimate 'focused' transfer schemes that substantially improve anti-poverty targeting performance, so that post-transfer poverty can be markedly reduced. In terms of P2, the most popular axiomatically valid poverty indicator, moving from transfer schemes based on OLS estimation to focused transfer schemes yields a 30 percent reduction in poverty, at the cost of only a few hours of computer work using methods available in popular statistical packages. Finally, the resulting levels of under-coverage of the poor are so low that reforms based on 'proxy-means' focused transfer schemes are likely to avoid social unrest.
Keywords: Poverty; Targeting; Transfers
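The proxy-means idea behind the abstract above can be sketched in a few lines: predict household consumption from observable proxies, then transfer to the households with the lowest predicted consumption, and measure under-coverage of the truly poor. The "focused" refit below (re-estimating only on households a first pass predicts to be near or below the line) is one illustrative variant, not the paper's actual estimator, and all data are simulated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated proxy-means data (hypothetical, not the paper's Tunisian survey):
# two observable proxies and noisy log consumption.
n = 5000
X = rng.normal(size=(n, 2))
beta = np.array([0.6, 0.3])
log_cons = 1.0 + X @ beta + rng.normal(scale=0.5, size=n)
poverty_line = np.quantile(log_cons, 0.25)      # bottom quartile is "poor"
poor = log_cons < poverty_line

def fit_ols(X, y):
    Xc = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    return coef

def predict(coef, X):
    return np.column_stack([np.ones(len(X)), X]) @ coef

# Standard OLS fit on all households.
coef_all = fit_ols(X, log_cons)

# One illustrative "focused" variant: refit using only households the first
# pass predicts to be in the lower part of the distribution (the paper's
# exact focused estimator is not specified here).
pred_all = predict(coef_all, X)
focus = pred_all < np.quantile(pred_all, 0.40)
coef_focused = fit_ols(X[focus], log_cons[focus])

def under_coverage(coef, budget_share=0.25):
    """Share of the truly poor missed when transfers go to the budget_share
    of households with the lowest predicted consumption."""
    pred = predict(coef, X)
    targeted = pred < np.quantile(pred, budget_share)
    return 1.0 - (targeted & poor).sum() / poor.sum()

print(under_coverage(coef_all), under_coverage(coef_focused))
```

Under-coverage (poor households missed by the scheme) is the quantity the abstract highlights; a complementary leakage measure (transfers reaching the non-poor) follows the same pattern.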
Least Median of Squares Estimation by Optimization Heuristics with an Application to the CAPM and Multi Factor Models
For estimating the parameters of models for financial market data, the use of robust techniques is of particular interest. Conditional forecasts, based on the capital asset pricing model and a factor model, are considered. It is proposed to consider least median of squares estimators as one possible alternative to ordinary least squares. Given the complexity of the objective function for the least median of squares estimator, the estimates are obtained by means of optimization heuristics. The performance of two heuristics is compared, namely differential evolution and threshold accepting. It is shown that these methods are well suited to obtaining least median of squares estimators for real-world problems. Furthermore, it is analyzed to what extent parameter estimates and conditional forecasts differ between the two estimators. The empirical analysis considers daily and monthly data on some stocks from the Dow Jones Industrial Average Index (DJIA).
Keywords: LMS; CAPM; Multi-Factor Model; Differential Evolution; Threshold Accepting
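The least median of squares objective (minimize the median squared residual) is non-smooth and multimodal, which is why heuristics are needed. Below is a minimal sketch of a threshold-accepting loop on simulated single-factor data with gross outliers; the acceptance schedule and step sizes are illustrative choices, not the paper's tuned settings:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated single-factor data with a few gross outliers
# (hypothetical, standing in for CAPM-style excess returns).
n = 200
x = rng.normal(size=n)
y = 0.1 + 0.9 * x + rng.normal(scale=0.2, size=n)
y[:10] += 5.0                                  # contamination

def lms_objective(params):
    a, b = params
    return np.median((y - a - b * x) ** 2)     # least median of squares

def threshold_accepting(obj, start, steps=5000, step_size=0.1):
    """Minimal threshold-accepting loop: accept any move that worsens the
    objective by less than a linearly shrinking threshold tau."""
    cur = np.asarray(start, dtype=float)
    cur_val = obj(cur)
    best, best_val = cur.copy(), cur_val
    for t in range(steps):
        tau = 0.1 * (1 - t / steps)            # shrinking acceptance threshold
        cand = cur + rng.normal(scale=step_size, size=cur.size)
        cand_val = obj(cand)
        if cand_val < cur_val + tau:
            cur, cur_val = cand, cand_val
            if cand_val < best_val:
                best, best_val = cand.copy(), cand_val
    return best, best_val

ols = np.polyfit(x, y, 1)                      # returns [slope, intercept]
lms, lms_val = threshold_accepting(lms_objective, start=[ols[1], ols[0]])
print("OLS slope:", ols[0], "LMS slope:", lms[1])
```

Starting from the OLS fit is a common warm-start; differential evolution, the other heuristic compared in the paper, would instead evolve a population of candidate parameter vectors.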
Evaluating Value-at-Risk models via Quantile Regression
This paper is concerned with evaluating Value-at-Risk estimates. It is well known that using only binary variables, such as whether or not there was an exception, sacrifices too much information. However, most of the specification tests (also called backtests) available in the literature, such as Christoffersen (1998) and Engle and Manganelli (2004), are based on such variables. In this paper we propose a new backtest that does not rely solely on binary variables. It is shown that the new backtest provides a sufficient condition to assess the finite-sample performance of a quantile model, whereas the existing ones do not. The proposed methodology allows us to identify periods of increased risk exposure based on a quantile regression model (Koenker & Xiao, 2002). Our theoretical findings are corroborated through a Monte Carlo simulation and an empirical exercise with daily S&P500 time series.
Keywords: Value-at-Risk; Backtesting; Quantile Regression
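To see the information loss the abstract refers to, consider the classic exception-counting approach: the Kupiec (1995) proportion-of-failures test, which reduces the whole return history to a binary hit sequence. The sketch below uses simulated returns and a fixed, hypothetical VaR forecast; it illustrates the binary-variable baseline the paper improves upon, not the paper's own quantile-regression backtest:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)

# Simulated daily returns and a fixed 1% VaR forecast (hypothetical values,
# standing in for an estimated model's output).
returns = rng.standard_t(df=5, size=1000) * 0.01
var_1pct = np.full(1000, -0.03)

hits = (returns < var_1pct).astype(int)        # binary exception indicator

def kupiec_pof(hits, alpha=0.01):
    """Kupiec proportion-of-failures LR test: do exceptions occur at rate
    alpha? Uses only the binary hit sequence -- exactly the information
    loss a quantile-based backtest is designed to avoid."""
    n, x = len(hits), int(hits.sum())
    phat = x / n
    if x in (0, n):
        lr = -2 * (x * np.log(alpha) + (n - x) * np.log(1 - alpha))
    else:
        lr = -2 * ((n - x) * np.log((1 - alpha) / (1 - phat))
                   + x * np.log(alpha / phat))
    return lr, chi2.sf(lr, df=1)               # LR statistic, p-value

lr, pval = kupiec_pof(hits)
print(f"LR = {lr:.2f}, p-value = {pval:.3f}")
```

A quantile-regression backtest, by contrast, would regress on the magnitude of returns relative to the estimated quantile, retaining information the hit sequence throws away.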
Econometrics: A bird's eye view
As a unified discipline, econometrics is still relatively young and has been transforming and expanding very rapidly over the past few decades. Major advances have taken place in the analysis of cross-sectional data by means of semi-parametric and non-parametric techniques. Heterogeneity of economic relations across individuals, firms and industries is increasingly acknowledged, and attempts have been made to take it into account either by integrating out its effects or by modelling the sources of heterogeneity when suitable panel data exist. The counterfactual considerations that underlie policy analysis and treatment evaluation have been given a more satisfactory foundation. New time series econometric techniques have been developed and employed extensively in the areas of macroeconometrics and finance. Non-linear econometric techniques are used increasingly in the analysis of cross-section and time series observations. Applications of Bayesian techniques to econometric problems have been given new impetus, largely thanks to advances in computer power and computational techniques. The use of Bayesian techniques has in turn provided investigators with a unifying framework in which the tasks of forecasting, decision making, model evaluation and learning can be considered parts of the same interactive and iterative process, thus paving the way for establishing the foundation of "real-time econometrics". This paper attempts to provide an overview of some of these developments.
Survival Models for the Duration of Bid-Ask Spread Deviations
Many commonly used liquidity measures are based on snapshots of the state of the limit order book (LOB) and can thus only provide information about instantaneous liquidity, and not regarding the local liquidity regime. However, trading in the LOB is characterised by many intra-day liquidity shocks, where the LOB generally recovers after a short period of time. In this paper, we capture this dynamic aspect of liquidity using a survival regression framework, where the variable of interest is the duration of the deviations of the spread from a pre-specified level. We explore a large number of model structures using a branch-and-bound subset selection algorithm and illustrate the explanatory performance of our model.
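The core of a survival regression like the one described above is a hazard rate that depends on covariates. A minimal parametric sketch, assuming an exponential duration model with a single hypothetical covariate and simulated data (the paper's covariates and model family are richer, and real spread-deviation data would typically involve censoring):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Simulated deviation durations with one covariate (hypothetical, e.g. a
# depth measure at the moment the spread widens).
n = 2000
x = rng.normal(size=n)
true_beta = np.array([0.5, -0.8])              # intercept, slope on x
lam = np.exp(true_beta[0] + true_beta[1] * x)  # exponential hazard rate
t = rng.exponential(1.0 / lam)                 # observed durations

def neg_loglik(beta):
    """Exponential survival regression: hazard lambda_i = exp(x_i' beta),
    log-density log(lambda_i) - lambda_i * t_i (no censoring, for brevity)."""
    eta = beta[0] + beta[1] * x
    return -(eta - np.exp(eta) * t).sum()

fit = minimize(neg_loglik, x0=np.zeros(2), method="BFGS")
print("estimated beta:", fit.x)
```

A negative coefficient means the covariate lowers the hazard, i.e. lengthens the expected deviation duration; subset selection over many candidate covariates, as in the paper, would compare such fits across model structures.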
The Impact of Piped Water Provision on Infant Mortality in Brazil: A Quantile Panel Data Approach
We examine the impact of piped water on the under-1 infant mortality rate (IMR) in Brazil using a novel econometric procedure for the estimation of quantile treatment effects with panel data. The provision of piped water in Brazil is highly correlated with other observable and unobservable determinants of IMR -- the latter leading to an important source of bias. Instruments for piped water provision are not readily available, and fixed effects to control for time invariant correlated unobservables are invalid in the simple quantile regression framework. Using the quantile panel data procedure in Chen and Khan (2007), our estimates indicate that the provision of piped water reduces infant mortality by significantly more at the higher conditional quantiles of the IMR distribution than at the lower conditional quantiles (except for cases of extreme underdevelopment). These results imply that targeting piped water intervention toward areas in the upper quantiles of the conditional IMR distribution, when accompanied by other basic public health inputs, can achieve significantly greater reductions in infant mortality.
The trade-growth nexus in the developing countries: a quantile regression approach
This paper applies quantile regression techniques to investigate how the impact of trade openness on the growth rate of per capita income varies with the conditional distribution of growth. Using formal robustness analyses, we first identify robust variables affecting economic growth (investment, government balance, terms of trade, inflation, and population growth) which we then use as controls in the quantile regression estimations. Our findings suggest a heterogeneous trade-growth nexus: for both the short and the long run, the effect of openness on growth is higher in countries with low growth rates compared to those with high growth rates.
The Role of Social Ties in the Job Search of Recent Immigrants
We show that among workers whose network is weaker than formal (nonnetwork) channels, those finding a job through the network should have higher wages than those finding a job through formal channels. Moreover, this wage differential is decreasing in network strength. We test these implications using a survey of recent immigrants into Canada. At least at the lower end of an individual's wage distribution above his reservation wage, finding a network job is associated with higher wages for those with weak networks, and the interaction between network strength and finding a job through the network is negative as predicted.
Keywords: Immigrants; Job Search; Social Networks; Strong Ties