Born Reciprocity and Cosmic Accelerations
The trans-Planckian theory is a model that concretely realizes the Born
reciprocity idea, the postulate of an absolute equivalence between
coordinates and momenta. This model is intrinsically global, and thus it
is naturally implemented in a cosmological setting. Cosmology and Born
reciprocity are made for each other. Inflation provides the essential mechanism
to suppress the terms coming from the dual part of the action. The
trans-Planckian theory provides an explanation for the present acceleration of
the universe's scale factor. This is achieved with a simple model
containing gravity, one gauge field, and one matter field (to be identified
with dark matter), together with the reciprocity principle.
Multicriteria ranking using weights which minimize the score range
Various schemes have been proposed for generating a set of non-subjective weights when aggregating multiple criteria for the purpose of ranking or selecting alternatives. The maximin approach chooses the weights which maximize the lowest score (assuming there is an upper bound to scores). This is equivalent to finding the weights which minimize the maximum deviation, or range, between the worst and best scores (minimax). At first glance this seems an equitable way of apportioning weight, and the Rawlsian theory of justice has been cited in its support. We draw a distinction between using the maximin rule for the purpose of assessing performance and using it for allocating resources amongst the alternatives. We demonstrate that it has a number of drawbacks which make it inappropriate for the assessment of performance. Specifically, it is tantamount to allowing the worst performers to decide the worth of the criteria so as to maximize their overall score. Furthermore, when making a selection from a list of alternatives, the final choice is highly sensitive to the removal or inclusion of alternatives whose performance is so poor that they are clearly irrelevant to the choice at hand.
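The maximin rule described in this abstract can be cast as a small linear program: maximize a floor t subject to every alternative's weighted score being at least t, with the weights summing to one. The sketch below, using a hypothetical score matrix and SciPy's linprog, illustrates the idea; it is not the authors' implementation.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical score matrix: rows = alternatives, columns = criteria,
# scores normalized to [0, 1].
S = np.array([
    [0.9, 0.2],
    [0.5, 0.6],
    [0.1, 0.95],
])
n, m = S.shape

# Decision variables x = (w_1, ..., w_m, t): maximize t, the lowest
# aggregate score, subject to S @ w >= t, sum(w) = 1, w >= 0.
c = np.zeros(m + 1)
c[-1] = -1.0                                 # linprog minimizes, so minimize -t

A_ub = np.hstack([-S, np.ones((n, 1))])      # t - S @ w <= 0 for each alternative
b_ub = np.zeros(n)
A_eq = np.append(np.ones(m), 0.0).reshape(1, -1)  # weights sum to one
b_eq = np.array([1.0])
bounds = [(0, None)] * m + [(None, None)]    # w >= 0, t free

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
w, t = res.x[:m], res.x[-1]
```

Note how the binding constraints at the optimum come from the worst-scoring alternatives, which is exactly the drawback the abstract highlights: the weights end up chosen to flatter the weakest performers.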
Municipality Size and Efficiency of Local Public Services: Does Size Matter?
Similarly to western Germany in the 1960s and 1970s, the eastern part of Germany has experienced a still ongoing process of numerous amalgamations among counties, towns and municipalities since the mid-1990s. The evidence in the economic literature is mixed with regard to the claimed expenditure reductions and efficiency gains from municipal mergers. We therefore analyze the global efficiency of the municipalities in Saxony-Anhalt, for the first time in this context, using a double-bootstrap procedure combining DEA and truncated regression. This allows the inclusion of environmental variables to control for exogenous determinants of municipal efficiency. Our focus thereby is on institutional and fiscal variables. Moreover, scale efficiency is estimated to find out whether large units are necessary to benefit from scale economies. In contrast to previous studies, we chose the aggregate budget of municipal associations ("Verwaltungsgemeinschaften") as the object of our analysis, since important competences of the member municipalities are settled on a joint administrative level. Furthermore, we use a data set that has been carefully adjusted for bookkeeping items and transfers within the communal level. On the "eve" of a major municipal reform, the majority of the municipalities were found to have an approximately scale-efficient size, and centralized organizational forms ("Einheitsgemeinden") showed no efficiency advantage over municipal associations. Keywords: efficiency, local government, DEA, bootstrap, demographic change, local institutions
Opening the 'black box' of efficiency measurement: input allocation in multi-output settings.
We develop a new Data Envelopment Analysis (DEA)-based methodology for measuring the efficiency of Decision Making Units (DMUs) characterized by multiple inputs and multiple outputs. The distinguishing feature of our method is that it explicitly includes information about output-specific inputs and joint inputs in the efficiency evaluation. This contributes to opening the "black box" of efficiency measurement in two ways. First, including information on the input allocation substantially increases the discriminatory power of the efficiency measurement. Second, it allows the efficiency value of a DMU to be decomposed into output-specific efficiency values, which facilitates the identification of the outputs the manager should focus on to remedy the observed inefficiency. We demonstrate the usefulness and managerial implications of our methodology by means of a unique dataset collected from the Activity Based Costing (ABC) system of a large service company with 290 DMUs.
MEASURING THE PERFORMANCE OF TWO-STAGE PRODUCTION SYSTEMS WITH SHARED INPUTS BY DATA ENVELOPMENT ANALYSIS
As a non-parametric technique in Operations Research and Economics, Data Envelopment Analysis (DEA) evaluates the relative efficiency of peer production systems or decision making units (DMUs) that have multiple inputs and outputs. In recent years, a great number of DEA studies have focused on two-stage production systems in series, where all outputs from the first stage are intermediate products that make up the inputs to the second stage. There are, of course, other types of two-stage processes, in which the inputs of the system can be freely allocated between the two stages. For this type of two-stage production system, the conventional two-stage DEA models have some limitations, e.g. in their efficiency formulation and linearizing transformation. In this paper, we introduce a relational DEA model, considering the series relationship between the two stages, to measure the overall efficiency of two-stage production systems with shared inputs. The linearity of DEA models is preserved in our model. The proposed DEA model not only evaluates the efficiency of the whole process, but also provides the efficiency of each of the two sub-processes. A numerical example of US commercial banks from the literature is used to clarify the model. Keywords: Data envelopment analysis, Decision making unit, Two-stage, Shared input, Efficiency
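For readers unfamiliar with the envelopment form that such two-stage models generalize, the sketch below implements the standard single-stage, input-oriented CCR model as a linear program (the function name and toy data are illustrative, not from the paper): DMU k is efficient (theta = 1) exactly when no non-negative combination of the observed DMUs can produce at least its outputs using proportionally fewer inputs.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency score of DMU k.

    X: (n_dmus, n_inputs) input matrix; Y: (n_dmus, n_outputs) output matrix.
    Solves: min theta  s.t.  X.T @ lam <= theta * x_k,  Y.T @ lam >= y_k,  lam >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    # Variables: (theta, lam_1, ..., lam_n); minimize theta.
    c = np.zeros(1 + n)
    c[0] = 1.0
    # Input constraints:  X.T @ lam - theta * x_k <= 0
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])
    # Output constraints: Y.T @ lam >= y_k, written as -Y.T @ lam <= -y_k
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[k]])
    bounds = [(None, None)] + [(0, None)] * n    # theta free, lam >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[0]

# Toy data: two single-input, single-output DMUs; the second uses twice
# the input for the same output, so it scores 0.5 against the first.
X = np.array([[2.0], [4.0]])
Y = np.array([[1.0], [1.0]])
scores = [ccr_efficiency(X, Y, k) for k in range(len(X))]
```

A relational two-stage model with shared inputs replaces this single technology constraint set with linked constraints for each sub-process, plus variables splitting the shared inputs between stages, while preserving linearity as the abstract emphasizes.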
Classical and Quantum Gravity in 1+1 Dimensions, Part I: A Unifying Approach
We provide a concise approach to generalized dilaton theories with and
without torsion and coupling to Yang-Mills fields. Transformations on the space
of fields are used to trivialize the field equations locally. In this way their
solution becomes accessible within a few lines of calculation only. In this
first of a series of papers we set the stage for a thorough global
investigation of classical and quantum aspects of more or less all available 2D
gravity-Yang-Mills models.
Cognitive strategic groups and long-run efficiency evaluation: the case of Spanish savings banks
In the framework of the Cognitive Approach, this paper proposes a new method to identify
strategic groups (SG) using Data Envelopment Analysis (DEA) methods. Two
assumptions are maintained in the SG literature: first, firms grouped together value
inputs and outputs similarly, and, second, some degree of stability in those valuations
should be identified. Virtual weights obtained from DEA are extremely useful in the
valuation of the strategic variables, but a problem emerges when longitudinal analysis is
performed. This problem is addressed by defining a long-run DEA evaluation. SGs are
determined by means of Cluster Analysis, using virtual outputs and virtual inputs as
variables and Spanish savings banks as observations. The traditional method of
determining SGs by clustering on the original variables is also applied and the results
are compared. It is shown that the long-run DEA weights approach has advantages over
the traditional methodology.