89 research outputs found

    Demand for Internet Access and Use in Spain

    The goal of this paper is to analyze a new phenomenon: Internet demand in Spain. To do so, we use a new high-quality data set and advanced econometric techniques for estimating Internet demand functions, incorporating the socio-demographic characteristics of the individuals. We begin with a graphical analysis of the data, searching for relationships between the different characteristics. Then we specify and estimate two econometric models, one for broadband access at home and another for Internet use intensity. We also find that 25.2% of the Spanish population accesses the Internet at home, but less than half of them use a broadband connection. This demand is positively related to income and other technological attributes and negatively related to socio-demographic attributes such as habitat and age. Our results are compatible with previous literature for other countries, although there is an important difference: broadband Internet connections are still considered a luxury good in Spain.
    Keywords: Broadband, Internet Access, Internet Use, Selection Bias Correction, Multinomial Logit Models, marginal effects, elasticities.
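    To make the access-demand setup concrete, the following is a minimal sketch of a binary logit of home broadband access on income and socio-demographic covariates, with average marginal effects. The data and variable names (income, age, habitat_size) are simulated, hypothetical placeholders, and the paper's multinomial-logit and selection-correction steps are not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "income": rng.lognormal(mean=10, sigma=0.5, size=n),  # hypothetical household income
    "age": rng.integers(18, 80, size=n),                  # age of household head
    "habitat_size": rng.integers(1, 6, size=n),           # 1 = rural ... 5 = large city
})
# Simulate an access decision that rises with income and falls with age.
latent = 0.8 * np.log(df["income"]) - 0.03 * df["age"] + 0.2 * df["habitat_size"] - 7
df["broadband"] = (latent + rng.logistic(size=n) > 0).astype(int)

# Binary logit of broadband access on income and socio-demographic covariates.
logit = smf.logit("broadband ~ np.log(income) + age + habitat_size", data=df).fit()
print(logit.summary())
# Average marginal effects, the quantities usually reported alongside elasticities.
print(logit.get_margeff(at="overall").summary())
```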

    Econometric Modeling of Business Telecommunications Demand using RETINA and Finite Mixtures.

    In this paper we estimate the business telecommunications demands for local, intra-LATA and inter-LATA services, using US data from a Bill Harvesting (R) survey carried out during 1997. We model heterogeneity, which is present among firms due to a variety of different business telecommunication needs, by estimating normal heteroskedastic mixture regressions. The results show that a three-component mixture model fits the demand for local services well, while a two-component structure is used to model intra-LATA and inter-LATA demand. We characterize the groups in terms of the differences among their coefficients, and then use RETINA to perform automatic model selection over an expanded candidate regressor set which includes heterogeneity parameters as well as transformations of the original variables. Our models substantially improve the in-sample fit as well as the out-of-sample predictive ability over alternative candidate models. RETINA suggests that the final demand specification should include telephone equipment variables as relevant regressors. On the other hand, the output of the firm, as well as its physical extension, has second-order yet significant effects on the demand for telecommunication services. Estimated elasticities differ across the three demands but are always positive for the access form (single line or private network).
    Keywords: Telecommunication Demand Models, Local calls, inter-LATA calls, intra-LATA calls, RETINA, Flexible Functional Forms, Heterogeneity, Finite Mixtures.
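    As a rough illustration of the estimation idea, here is a minimal EM sketch for a two-component normal mixture of linear regressions on simulated data. It assumes homoskedastic errors within each component, whereas the paper estimates heteroskedastic mixtures with up to three components, so this is a simplified stand-in rather than the authors' procedure.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n, k = 500, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])    # intercept + one regressor
true_beta = np.array([[1.0, 2.0], [4.0, -1.0]])          # one coefficient row per component
z = rng.integers(0, k, size=n)                           # latent component labels
y = np.einsum("ij,ij->i", X, true_beta[z]) + rng.normal(0, 0.5, size=n)

# Initialise mixture parameters.
beta = rng.normal(size=(k, X.shape[1]))
sigma = np.ones(k)
pi = np.full(k, 1.0 / k)

for _ in range(200):
    # E-step: posterior probability that each observation belongs to each component.
    dens = np.stack([pi[j] * norm.pdf(y, X @ beta[j], sigma[j]) for j in range(k)], axis=1)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: weighted least squares and variance update for each component.
    for j in range(k):
        w = resp[:, j]
        Xw = X * w[:, None]
        beta[j] = np.linalg.solve(Xw.T @ X, Xw.T @ y)
        sigma[j] = np.sqrt(np.average((y - X @ beta[j]) ** 2, weights=w))
    pi = resp.mean(axis=0)

print("mixing proportions:", pi)
print("component coefficients:\n", beta)
```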

    GFC-Robust Risk Management Strategies under the Basel Accord

    A risk management strategy is proposed as being robust to the Global Financial Crisis (GFC) by selecting a Value-at-Risk (VaR) forecast that combines the forecasts of different VaR models. The robust forecast is based on the median of the point VaR forecasts of a set of conditional volatility models. This risk management strategy is GFC-robust in the sense that maintaining the same risk management strategies before, during and after a financial crisis would lead to comparatively low daily capital charges and violation penalties. The new method is illustrated by using the S&P500 index before, during and after the 2008-09 global financial crisis. We investigate the performance of a variety of single and combined VaR forecasts in terms of daily capital requirements and violation penalties under the Basel II Accord, as well as other criteria. The median VaR risk management strategy is GFC-robust as it provides stable results across different periods relative to other VaR forecasting models. The new strategy based on combined forecasts of single models is straightforward to incorporate into existing computer software packages that are used by banks and other financial institutions.
    Keywords: Value-at-Risk (VaR), daily capital charges, robust forecasts, violation penalties, optimizing strategy, aggressive risk management strategy, conservative risk management strategy, Basel II Accord, global financial crisis.
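    A small numpy sketch of the combination idea follows: take the cross-model median of one-day-ahead VaR forecasts and turn it into a daily capital charge. The VaR numbers are simulated placeholders, and the charge formula assumed here (the larger of the previous day's VaR and (3 + penalty) times the trailing 60-day average VaR) follows the usual description of the Basel II market-risk requirement rather than code from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
days, n_models = 250, 4
# Hypothetical one-day-ahead VaR forecasts (negative, in daily return units)
# from, e.g., GARCH, EGARCH, GJR and RiskMetrics-type models.
var_forecasts = -np.abs(rng.normal(0.02, 0.005, size=(days, n_models)))

median_var = np.median(var_forecasts, axis=1)   # the GFC-robust combined forecast

def daily_capital_charge(var_series, penalty=0.0, window=60, base_multiplier=3.0):
    """Charge per day: the larger of yesterday's VaR and the scaled trailing-window
    average VaR, both expressed as positive losses (assumed Basel II form)."""
    charges = []
    for t in range(window, len(var_series)):
        avg_var = -var_series[t - window:t].mean()
        charges.append(max(-var_series[t - 1], (base_multiplier + penalty) * avg_var))
    return np.array(charges)

print("mean daily capital charge (median strategy):",
      daily_capital_charge(median_var).mean())
```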

    Optimal Risk Management Before, During and After the 2008-09 Financial Crisis

    In this paper we advance the idea that optimal risk management under the Basel II Accord will typically require the use of a combination of different models of risk. This idea is illustrated by analyzing the best empirical models of risk for five stock indexes before, during, and after the 2008-09 financial crisis. The data used are the Dow Jones Industrial Average, Financial Times Stock Exchange 100, Nikkei, Hang Seng and Standard and Poor’s 500 Composite Index. The primary goal of the exercise is to identify the best models for risk management in each period according to the minimization of average daily capital requirements under the Basel II Accord. It is found that the best risk models can and do vary before, during and after the 2008-09 financial crisis. Moreover, it is found that an aggressive risk management strategy, namely the supremum strategy that combines different models of risk, can result in significant gains in average daily capital requirements, relative to the strategy of using single models, while staying within the limits of the Basel II Accord.
    Keywords: Financial portfolios, daily capital charges, frequency of violations, magnitude of violations, optimizing strategy, risk forecasts, value-at-risk, green zone, red zone.
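    For contrast with the median combination, a brief sketch of the aggressive supremum combination mentioned above: with VaR forecasts expressed as negative returns (an assumed sign convention), the supremum is the least conservative forecast each day, which tends to lower capital requirements at the cost of more violations. The forecast values are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(3)
var_forecasts = -np.abs(rng.normal(0.02, 0.005, size=(250, 4)))  # days x models, hypothetical

supremum_var = var_forecasts.max(axis=1)        # aggressive: least conservative forecast
median_var = np.median(var_forecasts, axis=1)   # robust median combination, for comparison
print("average forecast loss, supremum vs median:",
      -supremum_var.mean(), -median_var.mean())
```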

    Demand for telephone lines and universal service in Spain

    In this paper we use a model of demand for telephone lines to derive an econometric model of the net demand for new access lines in Spain for the period 1980-1993, using quarterly observations. We use cointegration techniques to obtain long-run and short-run equations, estimated both separately in two steps and jointly. The results show a strong sensitivity of the net demand for new lines to the domestic usage price, with an elasticity greater than one, an income elasticity also greater than one, and an elasticity with respect to the price of access that is less than one in absolute value. We find that a tariff restructuring that lowers international and long-distance rates while raising access rates might have a very small effect on the net demand for new lines. This suggests that the objective of universal service might be compatible with the kinds of tariff restructuring that have recently been considered in Spain.
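    The two-step estimation can be sketched as follows: a long-run OLS relation between (log) lines, usage price and income, followed by a short-run equation in first differences that includes the lagged long-run residual as an error-correction term. The series below are simulated placeholders with assumed I(1) behaviour, not the 1980-1993 Spanish quarterly data, and the joint-estimation variant is omitted.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
T = 120
income = np.cumsum(rng.normal(0.01, 0.02, T))   # simulated I(1) log income
price = np.cumsum(rng.normal(0.0, 0.02, T))     # simulated I(1) log usage price
lines = 1.2 * income - 1.1 * price + rng.normal(0, 0.01, T)  # cointegrated log lines

# Step 1: long-run equation by OLS; the residuals measure disequilibrium.
X_lr = sm.add_constant(np.column_stack([income, price]))
long_run = sm.OLS(lines, X_lr).fit()
ecm_term = long_run.resid

# Step 2: short-run equation in first differences with the lagged
# error-correction term (speed of adjustment on the last column).
dy = np.diff(lines)
dX = sm.add_constant(np.column_stack([np.diff(income), np.diff(price), ecm_term[:-1]]))
short_run = sm.OLS(dy, dX).fit()

print("long-run coefficients (const, income, price):", long_run.params)
print("short-run coefficients:", short_run.params)
```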

    A Flexible Tool for Model Building: The Relevant Transformation of the Inputs Network Approach (RETINA)

    A new method, called the relevant transformation of the inputs network approach (RETINA), is proposed as a tool for model building and selection. It is designed to improve on some of the shortcomings of neural networks. It has the flexibility of neural network models, the concavity of the likelihood in the weights of the usual likelihood models, and the ability to identify a parsimonious set of attributes that are likely to be relevant for predicting out-of-sample outcomes. RETINA expands the range of models by considering transformations of the original inputs; it splits the sample into three disjoint subsamples, sorts the candidate regressors by a saliency feature, chooses the models in subsample 1, uses subsample 2 for parameter estimation and subsample 3 for cross-validation. It is modular, can be used as a data exploration tool and is computationally feasible on personal computers. In tests on simulated data, it achieves high rates of success when the sample size or the R2 is large enough. As our experiments show, it is superior to alternative procedures such as the non-negative garrote and forward and backward stepwise regression.
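    A simplified sketch of that workflow on simulated data: expand the inputs with pairwise products raised to exponents in {-1, 0, 1}, split the sample into three disjoint subsamples, rank candidates by a saliency measure (absolute correlation with the outcome is used here as a stand-in, not necessarily RETINA's exact criterion), then grow the model greedily, estimating on subsample 2 and keeping additions only if they improve prediction on subsample 3. This is an illustrative approximation, not the published algorithm.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 900
X = pd.DataFrame({"x1": rng.uniform(1, 3, n), "x2": rng.uniform(1, 3, n)})
y = 2.0 * X["x1"] ** 2 + 0.5 / X["x2"] + rng.normal(0, 0.3, n)

# 1. Expand the candidate set with transformations of the original inputs,
#    then drop numerically duplicated columns.
candidates = pd.DataFrame({
    f"{a}*{b}^{p}": X[a] * X[b] ** p for a in X for b in X for p in (-1, 0, 1)
})
candidates = candidates.loc[:, ~candidates.T.duplicated()]

# 2. Split the sample into three disjoint subsamples.
s1, s2, s3 = np.array_split(rng.permutation(n), 3)

# 3. Rank candidates on subsample 1 by a saliency measure
#    (here, absolute correlation with the outcome).
saliency = candidates.iloc[s1].corrwith(y.iloc[s1]).abs()
order = saliency.sort_values(ascending=False).index

# 4. Greedy selection: estimate on subsample 2, keep a regressor only if it
#    improves the cross-validated fit on subsample 3.
def fit_and_score(cols):
    model = sm.OLS(y.iloc[s2], sm.add_constant(candidates.iloc[s2][cols])).fit()
    pred = model.predict(sm.add_constant(candidates.iloc[s3][cols]))
    return np.mean((y.iloc[s3] - pred) ** 2)

selected, best_mse = [], np.inf
for col in order:
    mse = fit_and_score(selected + [col])
    if mse < best_mse:
        selected, best_mse = selected + [col], mse

print("selected regressors:", selected)
print("out-of-sample MSE on subsample 3:", best_mse)
```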

    International Evidence on GFC-robust Forecasts for Risk Management under the Basel Accord

    A risk management strategy that is designed to be robust to the Global Financial Crisis (GFC), in the sense of selecting a Value-at-Risk (VaR) forecast that combines the forecasts of different VaR models, was proposed in McAleer et al. (2010c). The robust forecast is based on the median of the point VaR forecasts of a set of conditional volatility models. Such a risk management strategy is robust to the GFC in the sense that, while maintaining the same risk management strategy before, during and after a financial crisis, it will lead to comparatively low daily capital charges and violation penalties for the entire period. This paper presents evidence to support the claim that the median point forecast of VaR is generally GFC-robust. We investigate the performance of a variety of single and combined VaR forecasts in terms of daily capital requirements and violation penalties under the Basel II Accord, as well as other criteria. In the empirical analysis, we choose several major indexes, namely French CAC, German DAX, US Dow Jones, UK FTSE100, Hong Kong Hang Seng, Spanish Ibex35, Japanese Nikkei, Swiss SMI and US S&P500. The GARCH, EGARCH, GJR and Riskmetrics models, as well as several other strategies, are used in the comparison. Backtesting is performed on each of these indexes using the Basel II Accord regulations for 2008-10 to examine the performance of the Median strategy in terms of the number of violations and daily capital charges, among other criteria. The Median is shown to be a profitable and safe strategy for risk management, both in calm and turbulent periods, as it provides a reasonable number of violations and daily capital charges. The Median also performs well when both total losses and the asymmetric linear tick loss function are considered.
    Keywords: Median strategy; Value-at-Risk (VaR); daily capital charges; robust forecasts; violation penalties; optimizing strategy; aggressive risk management; conservative risk management; Basel II Accord; global financial crisis (GFC)
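    As a small illustration of the backtesting criteria mentioned at the end, the sketch below counts violations (days when the realised return falls below the VaR forecast) and evaluates the asymmetric linear tick (check) loss at coverage level alpha. The return and forecast series are simulated placeholders, not the index data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(6)
returns = rng.normal(0, 0.015, 500)          # simulated daily returns
var_forecast = np.full(500, -0.03)           # e.g. a constant 1% one-day-ahead VaR series

violations = int((returns < var_forecast).sum())

def tick_loss(r, var, alpha=0.01):
    """Asymmetric linear (check/pinball) loss for a VaR forecast at level alpha."""
    u = r - var
    return np.mean((alpha - (u < 0)) * u)

print("violations:", violations)
print("tick loss:", tick_loss(returns, var_forecast))
```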