    A formula for the solution of DEA models

    Robust ASR using Support Vector Machines

    The improved theoretical properties of Support Vector Machines with respect to other machine learning alternatives, due to their max-margin training paradigm, have led us to suggest them as a good technique for robust speech recognition. However, important shortcomings have had to be circumvented, the most important being the normalisation of the time duration of different realisations of the acoustic speech units. In this paper, we compare two approaches in noisy environments: first, a hybrid HMM–SVM solution, where a fixed number of frames is selected by means of an HMM segmentation; and second, a normalisation kernel called the Dynamic Time Alignment Kernel (DTAK), first introduced in Shimodaira et al. [Shimodaira, H., Noma, K., Nakai, M., Sagayama, S., 2001. Support vector machine with dynamic time-alignment kernel for speech recognition. In: Proc. Eurospeech, Aalborg, Denmark, pp. 1841–1844] and based on DTW (Dynamic Time Warping). Special attention has been paid to the adaptation of both alternatives to noisy environments, comparing two types of parameterisation and performing suitable feature normalisation operations. The results show that the DTA Kernel provides important advantages over the baseline HMM system in medium to bad noise conditions, also outperforming the hybrid system.
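
    The abstract does not spell out the DTAK computation; as a rough, hedged illustration of a DTW-based alignment kernel of the kind described above, the sketch below aligns two variable-length sequences of acoustic frames with dynamic programming and accumulates frame-level similarities along the best path. The function names, the Gaussian frame kernel and the length normalisation are illustrative assumptions, not the authors' implementation.

        import numpy as np

        def frame_kernel(x, y, gamma=0.1):
            # Assumed Gaussian (RBF) similarity between two feature frames;
            # the frame-level kernel actually used in the paper may differ.
            return np.exp(-gamma * np.sum((x - y) ** 2))

        def dta_kernel(X, Y, gamma=0.1):
            # X: (n, d) and Y: (m, d) sequences of acoustic feature frames.
            # Dynamic-programming table accumulating similarity along a DTW-style path.
            n, m = len(X), len(Y)
            D = np.full((n + 1, m + 1), -np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    k = frame_kernel(X[i - 1], Y[j - 1], gamma)
                    D[i, j] = k + max(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            # Normalise by path length so utterances of different durations are comparable.
            return D[n, m] / (n + m)

    The resulting score can serve as a sequence-level similarity between utterances of different durations, which is the role the DTA Kernel plays relative to the fixed-length hybrid HMM–SVM front end.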

    Evaluating transfer programs within a general equilibrium framework

    The authors set out a general equilibrium model for the evaluation of a domestically financed transfer program, which helps to combine the results from a computable general equilibrium model with disaggregated household data. Using a Mexican cash transfer program as an illustration, they use the approach to show that the substantial welfare gains that result from the switch from universal food subsidies to targeted cash transfers reflect both the improved targeting efficiency of the latter and a relaxation of the trade-off between equity and efficiency objectives when designing tax systems.
    Keywords: FCND; subsidies (Mexico); transfer payments; equilibrium (economics) models

    Market structure and hospital efficiency: Evaluating potential effects of deregulation in a national health service

    In this article we examine the potential effect of market structure on hospital technical efficiency as a measure of performance, controlling for ownership and regulation. The study is relevant for evaluating the potential effects of recommended and initiated deregulation policies intended to promote market reforms in the context of a European National Health Service. Our goal was reached through three main empirical stages. First, using patient origin data from hospitals in the region of Catalonia in 1990, we estimated geographic hospital markets through the Elzinga–Hogarty approach, based on patient flows, and measured each market's level of concentration using the Herfindahl–Hirschman index. Second, technical and scale efficiency scores for each hospital were obtained from a Data Envelopment Analysis. According to the data, nearly two-thirds of the hospitals operate below the production frontier, with an average efficiency score of 0.841. Finally, the determinants of the efficiency scores were investigated using a censored regression model, paying special attention to the hypothesis that efficiency improves in more competitive markets. The results suggest that the number of competitors in the market contributes positively to technical efficiency, and there is some evidence that differences in efficiency scores are attributable to several environmental factors such as ownership, market structure and regulation effects.
    Keywords: geographic markets; market concentration; technical efficiency; data envelopment analysis; censored regression model
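
    The Herfindahl–Hirschman index used in the first stage has a standard definition: the sum of squared market shares. As a minimal sketch, assuming the patient-origin data have already been aggregated into discharge counts per hospital and geographic market, the following computes an HHI per market on the conventional 0–10,000 scale; the data layout and names are illustrative assumptions, not the article's code.

        from collections import defaultdict

        def hhi_by_market(discharges):
            # discharges: iterable of (market_id, hospital_id, patient_count) tuples,
            # e.g. derived from patient-flow (patient origin) data.
            totals = defaultdict(float)
            counts = defaultdict(lambda: defaultdict(float))
            for market, hospital, n in discharges:
                totals[market] += n
                counts[market][hospital] += n
            hhi = {}
            for market, hospitals in counts.items():
                # Market shares in percent; HHI ranges from near 0 (atomistic) to 10,000 (monopoly).
                shares = [100.0 * n / totals[market] for n in hospitals.values()]
                hhi[market] = sum(s * s for s in shares)
            return hhi

        # Example: a duopoly with a 70/30 split of discharges -> HHI = 4900 + 900 = 5800.
        print(hhi_by_market([("A", "h1", 700), ("A", "h2", 300)]))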

    Real-time inflation forecasting in a changing world

    This paper revisits inflation forecasting using reduced-form Phillips curve forecasts, i.e., inflation forecasts based on activity and expectations variables. We propose a Phillips curve-type model that results from averaging across different regression specifications selected from a set of potential predictors. The set of predictors includes lagged values of inflation, a host of real activity data, term structure data, nominal data and surveys. In each of the individual specifications we allow for stochastic breaks in the regression parameters, where the breaks are described as occasional shocks of random magnitude. As such, our framework simultaneously addresses the structural change and model uncertainty that unavoidably affect Phillips curve forecasts. We use this framework to describe PCE deflator and GDP deflator inflation rates for the United States over the post-WWII period. Over the full 1960-2008 sample the framework indicates several structural breaks across different combinations of activity measures. These breaks often coincide with, among other things, policy regime changes and oil price shocks. In contrast to many previous studies, we find less evidence for autonomous variance breaks and inflation gap persistence. Through a real-time out-of-sample forecasting exercise we show that our model specification generally provides superior one-quarter and one-year-ahead forecasts for quarterly inflation relative to a whole range of forecasting models typically used in the literature.
    Keywords: Bayesian model averaging; structural breaks; real-time data; model uncertainty; Phillips correlations; inflation forecasting
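
    The abstract describes averaging forecasts across regression specifications drawn from a pool of predictors. As a minimal sketch of that averaging step only (the paper's full model also allows stochastic breaks in the regression parameters, which are omitted here), the code below weights one-step OLS forecasts from candidate predictor subsets by BIC-approximated posterior model probabilities. The variable names, the subset-size cap and the BIC weighting are assumptions, not the authors' exact procedure.

        import itertools
        import numpy as np

        def bma_forecast(y, X, x_new, max_size=2):
            # y: (T,) quarterly inflation series; X: (T, k) candidate predictors
            # (lagged inflation, real activity, term structure, surveys, ...).
            # x_new: (k,) predictor values for the period being forecast.
            T, k = X.shape
            forecasts, bics = [], []
            for size in range(1, max_size + 1):
                for cols in itertools.combinations(range(k), size):
                    Z = np.column_stack([np.ones(T), X[:, list(cols)]])
                    beta = np.linalg.lstsq(Z, y, rcond=None)[0]
                    rss = float((y - Z @ beta) @ (y - Z @ beta))
                    # BIC of this specification; exp(-BIC/2) approximates its posterior weight.
                    bics.append(T * np.log(rss / T) + Z.shape[1] * np.log(T))
                    forecasts.append(float(np.concatenate(([1.0], x_new[list(cols)])) @ beta))
            bics = np.array(bics)
            w = np.exp(-0.5 * (bics - bics.min()))  # shift by the minimum for numerical stability
            w /= w.sum()
            return float(w @ np.array(forecasts))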