
    The political economy of efficient public good provision: evidence from Flemish libraries using a generalised conditional efficiency framework

Provision of most public goods (e.g., health care, library services, education, utilities) can be characterised by a two-stage ‘production’ process. The first stage translates basic inputs (e.g., labour and capital) into service potential (e.g., opening hours), while the second stage describes how these programmatic inputs are transformed into observed outputs (e.g., school outcomes, library circulation). While the latter stage is best analysed in a supply-demand framework, it is in the former stage that one would like to have efficient public production. Hence, unlike previous work on public sector efficiency (which often conflates both ‘production’ stages), this paper analyses how political economy factors shape efficient public good provision in stage one, using local public libraries as our centre of attention. To do so, we use a specially tailored, fully non-parametric efficiency model. The model is rooted in popular Data Envelopment Analysis models, but allows for both outlying observations and heterogeneity (i.e., a conditional efficiency model). Using an exceptionally rich dataset comprising all 290 Flemish public libraries, our findings suggest that the ideological stance of the local government, the wealth and density of the local population, and the source of library funding (i.e., local funding versus intergovernmental transfers) are crucial determinants of library efficiency.
    Keywords: Nonparametric estimation, Conditional efficiency, Political economy, Public good provision, Libraries.
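    The paper's conditional efficiency estimator extends standard DEA. As a point of reference only, here is a minimal sketch of the classic input-oriented CCR efficiency score computed as a linear programme; the toy library data and the function name are illustrative, and the conditional, outlier-robust model used in the paper is substantially more involved:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of decision-making unit o.

    X is (m, n): m inputs for n DMUs; Y is (s, n): s outputs.
    Solves: min theta  s.t.  X @ lam <= theta * X[:, o],
                             Y @ lam >= Y[:, o],  lam >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)                        # decision vector z = [theta, lam]
    c[0] = 1.0                                 # minimise theta
    A_in = np.hstack([-X[:, [o]], X])          # X @ lam - theta * x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # -Y @ lam <= -y_o
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[:, o]]),
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

# Toy data: 2 inputs (staff, budget) and 1 output (circulation) for 4 libraries.
X = np.array([[4.0, 6.0, 5.0, 8.0],
              [3.0, 2.0, 4.0, 6.0]])
Y = np.array([[60.0, 70.0, 80.0, 75.0]])
for o in range(X.shape[1]):
    print(f"library {o}: efficiency = {ccr_input_efficiency(X, Y, o):.3f}")
```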

    Stochastic non-parametric efficiency measurement and yardstick competition in electricity regulation

Stochastic non-parametric efficiency measurement constructs production or cost frontiers that incorporate both inefficiency and stochastic error. This results in a closer envelopment of the mean performance of the companies in the sample and diminishes the effect of extreme outliers. This paper uses the Land, Lovell and Thore (1993) model, which incorporates information on the covariance structure of inputs and outputs, to study efficiency across a panel of 14 electricity distribution companies in the UK during the 1990s. The purpose is to revisit the 1999 distribution price control review carried out by the UK regulator. The regulator’s benchmarking is contrasted with the stochastic non-parametric efficiency results and with other comparative efficiency models offering close envelopment of the data. Some conclusions are offered about the possible regulated price effects in the UK case.
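    For intuition, a minimal sketch of a chance-constrained efficiency score in the spirit of the Land, Lovell and Thore (1993) model: when a single stochastic output is jointly normal with known mean and covariance, the probabilistic output constraint reduces to a second-order cone constraint. This single-output toy (all names and data illustrative) is a simplification, not the full model used in the paper:

```python
import cvxpy as cp
import numpy as np
from scipy.stats import norm

def chance_constrained_efficiency(X, mu_y, Sigma, o, alpha=0.05):
    """Output-oriented efficiency with a chance constraint on one output.

    Requires P(lam @ y >= phi * y_o) >= 1 - alpha with y ~ N(mu_y, Sigma).
    For jointly normal outputs (and y_o taken at its mean) this becomes
        lam @ mu_y - z_alpha * ||L.T @ lam|| >= phi * mu_y[o],
    where L is a Cholesky factor of Sigma and z_alpha = Phi^{-1}(1 - alpha).
    """
    n = len(mu_y)
    L = np.linalg.cholesky(Sigma)
    z = norm.ppf(1 - alpha)
    lam = cp.Variable(n, nonneg=True)
    phi = cp.Variable()
    cons = [
        X @ lam <= X[:, o],                                   # deterministic inputs
        lam @ mu_y - z * cp.norm(L.T @ lam) >= phi * mu_y[o], # chance constraint
    ]
    cp.Problem(cp.Maximize(phi), cons).solve()
    return phi.value  # > 1 means DMU o's output can be radially expanded

# Toy data: 2 inputs and 1 stochastic output for 4 distribution companies.
X = np.array([[5.0, 4.0, 6.0, 7.0],
              [2.0, 3.0, 2.5, 4.0]])
mu_y = np.array([40.0, 45.0, 50.0, 48.0])
Sigma = np.diag([4.0, 6.0, 5.0, 8.0])
print(chance_constrained_efficiency(X, mu_y, Sigma, o=0))
```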

    A decade of application of the Choquet and Sugeno integrals in multi-criteria decision aid

The main advances regarding the use of the Choquet and Sugeno integrals in multi-criteria decision aid over the last decade are reviewed. They mainly concern a bipolar extension of both the Choquet integral and the Sugeno integral, interesting particular submodels, new learning techniques, a better interpretation of the models, and a better use of the Choquet integral in multi-criteria decision aid. In parallel to these theoretical works, the Choquet integral has been applied to many new fields, and several software packages and libraries dedicated to this model have been developed.
    Keywords: Choquet integral, Sugeno integral, capacity, bipolarity, preferences.
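    As background for the model being reviewed, a minimal sketch of the discrete Choquet integral with respect to a capacity; the toy capacity (encoding redundancy between maths and stats) is illustrative only:

```python
def choquet_integral(x, capacity):
    """Discrete Choquet integral of scores x (dict criterion -> value)
    with respect to a capacity (dict frozenset -> weight in [0, 1],
    monotone, with capacity[empty set] == 0 and capacity[full set] == 1).

    C_mu(x) = sum_i (x_(i) - x_(i-1)) * mu({criteria scoring >= x_(i)}),
    where x_(1) <= ... <= x_(n) and x_(0) = 0.
    """
    order = sorted(x, key=x.get)          # criteria by increasing score
    total, prev = 0.0, 0.0
    for i, c in enumerate(order):
        coalition = frozenset(order[i:])  # criteria scoring at least x[c]
        total += (x[c] - prev) * capacity[coalition]
        prev = x[c]
    return total

# Toy example: 3 criteria, with maths and stats treated as redundant.
capacity = {frozenset(s): w for s, w in [
    ((), 0.0), (("maths",), 0.45), (("stats",), 0.45), (("language",), 0.3),
    (("maths", "stats"), 0.5),            # redundancy: less than 0.45 + 0.45
    (("maths", "language"), 0.9), (("stats", "language"), 0.9),
    (("maths", "stats", "language"), 1.0),
]}
student = {"maths": 18.0, "stats": 16.0, "language": 10.0}
print(choquet_integral(student, capacity))  # 13.9
```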

    ISIPTA'07: Proceedings of the Fifth International Symposium on Imprecise Probability: Theories and Applications

    Defuzzification of groups of fuzzy numbers using data envelopment analysis

Defuzzification is a critical process in the implementation of fuzzy systems that converts fuzzy numbers to crisp representations. Few researchers have focused on cases where the crisp outputs must satisfy a set of relationships dictated by the original crisp data. Such relationships mean that the crisp outputs are mathematically dependent on one another. Furthermore, these fuzzy numbers may exist as a group of fuzzy numbers. Therefore, the primary aim of this thesis is to develop a method to defuzzify groups of fuzzy numbers based on the Charnes, Cooper, and Rhodes (CCR) Data Envelopment Analysis (DEA) model, with a modified Center of Gravity (COG) method as the objective function. The constraints represent the relationships, together with some additional restrictions on the allowable crisp outputs and their dependency property. This leads to crisp values that preserve the relationships and/or properties of the original crisp data. Compared with a Linear Programming (LP)-based model, the proposed CCR-DEA model is more efficient and is also able to defuzzify non-linear fuzzy numbers with accurate solutions. Moreover, the crisp outputs obtained by the proposed method are the nearest points to the fuzzy numbers in the case of independent crisp outputs, and the best nearest points to the fuzzy numbers in the case of dependent crisp outputs. In conclusion, the proposed CCR-DEA defuzzification method can create either dependent crisp outputs with preserved relationships or independent crisp outputs without any relationship. It is, moreover, a general method for defuzzifying groups of fuzzy numbers or individual fuzzy numbers under the assumption of convexity, with linear or non-linear membership functions or relationships.
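    For context, a minimal sketch of the standard Center of Gravity (COG) defuzzification that the thesis adapts as the DEA objective, applied here to a single triangular fuzzy number in isolation; the triangular shape and grid resolution are illustrative assumptions:

```python
import numpy as np

def triangular_membership(x, a, b, c):
    """Membership of points x in a triangular fuzzy number (a, b, c)."""
    left = (x - a) / (b - a)
    right = (c - x) / (c - b)
    return np.clip(np.minimum(left, right), 0.0, 1.0)

def cog_defuzzify(a, b, c, resolution=10_001):
    """Center of Gravity on a uniform grid: sum(x * mu) / sum(mu),
    a discrete approximation of integral(x mu(x)) / integral(mu(x))."""
    x = np.linspace(a, c, resolution)
    mu = triangular_membership(x, a, b, c)
    return np.average(x, weights=mu)

# A triangular fuzzy number "around 5", supported on [2, 9].
print(cog_defuzzify(2.0, 5.0, 9.0))  # ~5.33, i.e. (2 + 5 + 9) / 3
```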

    A possibilistic framework for interpreting ensemble predictions in weather forecasting and aggregate imperfect sources of information

Until now, work in the field of tide routing (i.e., the optimisation of cargo loading and ship scheduling decisions in tidal ports and shallow seas) has omitted the uncertainty of sea level predictions. However, the widely used harmonic tide forecasts are not perfectly reliable. The consequences for the maritime industry are significant: current solutions to tide routing may be made robust through the introduction of arbitrary slack, but they are not optimal. Given the financial implications at stake for every additional centimetre of draft and the catastrophic effects of a grounding, an investigation of tide routing from the perspective of risk analysis is necessary, which we first develop in this PhD thesis. Predicting future sea level errors with respect to tide predictions can be achieved by statistical modelling of these errors, based on historical archives, or by physics-based numerical predictions of these deviations. For the latter option, ensemble forecasting has gained popularity in the field of numerical weather prediction as a way of quantifying the uncertainty of forecasts. Tide-surge ensemble forecasts are thus routinely produced, combining hydrodynamic models with weather ensembles. Such forecasts are commonly interpreted in a probabilistic way. However, this interpretation is regularly criticised for not being reliable, especially for predicting extreme events, because of the chaotic dynamics of the atmosphere-ocean system, model error, and the fact that ensembles of forecasts are not, in reality, produced in a probabilistic manner. In this PhD thesis, we consequently develop an alternative possibilistic framework to interpret and operationally use such ensembles of predictions. In particular, we show by numerical experiments on the Lorenz 96 system that probability theory is not always (e.g., at large lead times and for extreme events) the best way to extract the valuable information contained in ensemble predictions. Besides, such a possibilistic perspective eases the combination of different imperfect sources of information about the future state of the system at hand (e.g., dynamical information based on past time series and the analog method), in addition to making more sense without the need for post-processing. Finally, combining both the scheduling problem and the ensemble interpretation solution, we design a shipping decision model to compute optimal cargo loading and scheduling decisions, given the time series of fuzzy sea levels in these ports that we derive from a possibilistic interpretation of surge ensemble forecasts. The under-keel clearance becomes a possibilistic constraint, and the resulting shipping optimisation problem is solved by means of an optimisation routine adapted to possibilistic variables. Results obtained on a realistic case study with 7-day-ahead tide-surge ensemble predictions are discussed and compared with those given by a probabilistic approach and by standard practices on ships. Together with our numerical case studies on the Lorenz 96 system, they illustrate the potential and limitations of a possibilistic interpretation of weather ensemble forecasts over its probabilistic counterpart in a realistic setting.
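    To make the possibilistic interpretation concrete, a minimal sketch of one standard probability-possibility transformation (the Dubois-Prade "optimal" transform) applied to a histogram of ensemble members; binning the members first is an illustrative choice, not necessarily the construction used in the thesis:

```python
import numpy as np

def probability_to_possibility(p):
    """Dubois-Prade transform: pi_i = sum of all p_j with p_j <= p_i.

    Yields the most specific possibility distribution consistent with p;
    the modal outcome always receives possibility 1.
    """
    p = np.asarray(p, dtype=float)
    # For each outcome, sum the probabilities of all outcomes that are
    # no more probable than it (ties counted on both sides).
    return np.array([p[p <= pi].sum() for pi in p])

# Toy "ensemble": 50 synthetic surge forecasts (metres), binned to a histogram.
rng = np.random.default_rng(0)
members = rng.normal(0.8, 0.3, size=50)
counts, edges = np.histogram(members, bins=8)
probs = counts / counts.sum()
poss = probability_to_possibility(probs)
for lo, hi, pr, po in zip(edges[:-1], edges[1:], probs, poss):
    print(f"[{lo:5.2f}, {hi:5.2f}) m  p={pr:.2f}  pi={po:.2f}")
```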

    Big data-driven multimodal traffic management : trends and challenges
