Productivity and Efficiency of US Gas Transmission Companies: A European Regulatory Perspective
On both sides of the Atlantic, the regulation of gas transmission networks has undergone major changes since the early 1990s. Whereas in the US the long-standing regime of cost-plus regulation was complemented by increasing pipe-to-pipe competition, most European countries moved towards incentive regulation complemented by market integration. We study the impact of US regulatory reform using a Malmquist-based productivity analysis for a panel of US interstate companies. Results are presented for changes in productivity, as well as for several convergence tests. Taking productivity and convergence as performance indicators, the results indicate that regulation has been rather successful, in particular during a period when overall demand was flat. Lessons for European regulators are twofold. First, the US analysis shows that benchmarking of European transmission operators would be possible if data were available. Second, our results suggest that, in the long run, market integration and competition are alternatives to the current European model.
Keywords: Natural gas transmission; utility regulation; data envelopment analysis; total factor productivity; convergence
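The DEA machinery that underlies this and several of the abstracts below can be sketched with a minimal input-oriented CCR model solved as a linear program. The data, function name, and dimensions here are illustrative inventions, not taken from any of the papers:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o):
    """Input-oriented CCR efficiency score for decision-making unit o.
    X: (m inputs x n units), Y: (s outputs x n units).
    Minimise theta s.t. X@lam <= theta*x_o, Y@lam >= y_o, lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)
    c[0] = 1.0                              # objective: minimise theta
    # input constraints: X @ lam - theta * x_o <= 0
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(m)
    # output constraints: -Y @ lam <= -y_o
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

# toy data: 3 units, 1 input, 1 output; unit 2 wastes half its input
X = np.array([[2.0, 4.0, 8.0]])
Y = np.array([[2.0, 4.0, 4.0]])
scores = [dea_efficiency(X, Y, o) for o in range(3)]
```

A Malmquist productivity index then compares such scores for the same unit across two periods, separating frontier shifts from catch-up.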
Improving Performance of Iterative Methods by Lossy Checkpointing
Iterative methods are commonly used approaches to solve large, sparse linear
systems, which are fundamental operations for many modern scientific
simulations. When the large-scale iterative methods are running with a large
number of ranks in parallel, they have to checkpoint the dynamic variables
periodically in case of unavoidable fail-stop errors, requiring fast I/O
systems and large storage space. To this end, significantly reducing the
checkpointing overhead is critical to improving the overall performance of
iterative methods. Our contribution is fourfold. (1) We propose a novel lossy
checkpointing scheme that can significantly improve the checkpointing
performance of iterative methods by leveraging lossy compressors. (2) We
formulate a lossy checkpointing performance model and derive theoretically an
upper bound for the extra number of iterations caused by the distortion of data
in lossy checkpoints, in order to guarantee the performance improvement under
the lossy checkpointing scheme. (3) We analyze the impact of lossy
checkpointing (i.e., extra number of iterations caused by lossy checkpointing
files) for multiple types of iterative methods. (4)We evaluate the lossy
checkpointing scheme with optimal checkpointing intervals on a high-performance
computing environment with 2,048 cores, using a well-known scientific
computation package PETSc and a state-of-the-art checkpoint/restart toolkit.
Experiments show that our optimized lossy checkpointing scheme can
significantly reduce the fault tolerance overhead for iterative methods by
23%-70% compared with traditional checkpointing and 20%-58% compared with
lossless-compressed checkpointing, in the presence of system failures.
Comment: 14 pages, 10 figures, HPDC'1
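The core idea, checkpointing a solver's dynamic state through a lossy compressor and tolerating the bounded distortion as a few extra iterations, can be sketched as follows. Uniform quantization stands in for a real error-bounded compressor such as SZ or ZFP, and the matrix, tolerances, and function names are illustrative:

```python
import numpy as np

def quantize(x, err):
    """Stand-in lossy compressor: uniform quantization with a
    pointwise absolute error bound `err` (a real scheme would use
    an error-bounded compressor such as SZ or ZFP)."""
    return np.round(x / (2 * err)) * (2 * err)

def jacobi(A, b, x0, tol=1e-8, ckpt_every=10, err=1e-4):
    """Jacobi iteration with periodic lossy checkpoints of x."""
    D = np.diag(A)
    R = A - np.diag(D)
    x, ckpt = x0.copy(), quantize(x0, err)
    for it in range(10000):
        x = (b - R @ x) / D
        if it % ckpt_every == 0:
            ckpt = quantize(x, err)       # lossy checkpoint of state
        if np.linalg.norm(A @ x - b) < tol:
            return x, it, ckpt
    return x, it, ckpt

# diagonally dominant tridiagonal system, so Jacobi converges
n = 20
A = np.eye(n) * 4 + np.eye(n, k=1) + np.eye(n, k=-1)
b = np.ones(n)
x, iters, ckpt = jacobi(A, b, np.zeros(n))
# a restart from the lossy checkpoint still converges; the
# quantization error only costs a bounded number of extra iterations
x2, extra, _ = jacobi(A, b, ckpt)
```

The restart from the distorted checkpoint reaches the same tolerance; the gap between `extra` and a restart from an exact checkpoint is the overhead the paper's performance model bounds.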
Who leads Research Productivity Change? Guidelines for R&D policy makers
Relying on efficiency analysis, we evaluate the extent to which policy makers have been able to promote the establishment of consolidated and comprehensive research groups that contribute to a successful innovation system for the Spanish food technology sector, oriented towards the production of knowledge based on an application model. Using data envelopment analysis techniques and Malmquist productivity indices, we find pervasive levels of inefficiency and a typology of different research strategies. Among these, in contrast to what has been assumed, established groups do not play the pre-eminent benchmarking role; rather, partially oriented, specialized and "shooting star" groups are the most common patterns. These results correspond to an infant innovation system, where the fostering of higher levels of efficiency and the promotion of the desired research patterns are ongoing.
Keywords: Innovation Policy; Management; Productivity Change; Malmquist Index; Distance Function
Efficiency, Productivity and Environmental Policy: A Case Study of Power Generation in the EU
This study uses the EU public power generating sector as a case study to investigate the environmental efficiency and productivity-enhancing performance of the EU ETS in its pilot phase. Using Data Envelopment Analysis methods, we measure the environmental efficiency and productivity growth registered in public power generation across the EU over the 1996-2007 period. In the second stage of our analysis we attempt to explain changes in productivity and efficiency over time using state-of-the-art econometric techniques. Our analysis suggests two conclusions: on the one hand, carbon pricing led to an increase in environmental efficiency and to an outward shift of the technological frontier; on the other hand, the overly generous allocation of emission permits had a negative impact on both measures. These results are shown to be quite robust to changes in controls and specifications.
Keywords: Emissions Trading; EU ETS; Environmental Efficiency; Productivity Growth; Data Envelopment Analysis
Vietnam between developmental state and neoliberalism: the case of the industrial sector
Since the mid-1980s Vietnam has pursued a thorough programme of economic reforms, transitioning from central planning to a market-based economy. The gradual and pragmatic reform process achieved remarkable results in terms of sustainable economic growth and poverty reduction. With entry into the WTO in 2007, the country has become an important manufacturing hub and is attracting large FDI flows, but at the risk of increased dependence on foreign capital and technology and vulnerability to exogenous shocks. This paper suggests that national authorities have so far (and rather successfully) relied on a large state sector to manage economic development, but the government has not been able to design and implement coherent industrial strategies.
Keywords: Vietnam, economic reform, industrial development, Washington Consensus
Nonparametric approach to evaluation of economic and social development in the EU28 member states by DEA efficiency
Data envelopment analysis (DEA) methodology is used in this study to compare the dynamic efficiency of European countries over the last decade. Moreover, efficiency analysis is used to determine where resources are distributed and/or used efficiently or inefficiently under factors of competitiveness extracted from factor analysis. DEA assigns numerical grades of efficiency to the economic processes of the evaluated countries and is therefore a suitable tool for establishing the efficient or inefficient position of each country. Most importantly, the DEA technique is applied to all 28 European Union (EU) countries to evaluate their technical and technological efficiency within the selected factors of competitiveness, based on the country competitiveness index over the 2000-2017 reference period. The main aim of the paper is to measure efficiency changes over the reference period and to analyze the level of productivity in individual countries based on the Malmquist productivity index (MPI). Empirical results confirm significant disparities among European countries and across the selected periods 2000-2007, 2008-2011, and 2012-2017. Finally, the study offers a comprehensive comparison and discussion of the MPI results, indicating the EU countries in which policy-making authorities should aim to stimulate national development and improve the quality of life of EU citizens.
Assessing load-sharing within optimistic simulation platforms
The advent of multi-core machines has led to the need to revise the architecture of modern simulation platforms. A recent proposal of ours explored the viability of load-sharing for optimistic simulators run on top of these types of machines. In this article, we provide an extensive experimental study assessing the effects on run-time dynamics of a load-sharing architecture implemented within the ROOT-Sim package, an open-source simulation platform adhering to the optimistic synchronization paradigm. The study is aimed at evaluating possible sources of overhead when supporting load-sharing. It is based on differentiated workloads that allow us to generate different execution profiles in terms of, e.g., the granularity/locality of simulation events. © 2012 IEEE
Managing Communication Latency-Hiding at Runtime for Parallel Programming Languages and Libraries
This work introduces a runtime model for managing communication with support
for latency-hiding. The model enables non-computer science researchers to
exploit communication latency-hiding techniques seamlessly. For compiled
languages, it is often possible to create efficient schedules for
communication, but this is not the case for interpreted languages. By
maintaining data dependencies between scheduled operations, it is possible to
aggressively initiate communication and lazily evaluate tasks to allow maximal
time for the communication to finish before entering a wait state. We implement
a heuristic of this model in DistNumPy, an auto-parallelizing version of
numerical Python that allows sequential NumPy programs to run on distributed
memory architectures. Furthermore, we present performance comparisons for eight
benchmarks with and without automatic latency-hiding. The results show that
our model reduces the time spent waiting for communication by as much as 27
times, from a maximum of 54% to only 2% of the total execution time, in a
stencil application.
Comment: PREPRINT
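The scheduling idea described above, start communication eagerly while deferring computation via tracked data dependencies, can be sketched with a toy deferred-task class. The names (`Lazy`, `fetch_halo`) are invented for illustration; DistNumPy's actual scheduler is far more elaborate:

```python
import time
from concurrent.futures import ThreadPoolExecutor

pool = ThreadPoolExecutor()

class Lazy:
    """A deferred operation with explicit dependencies.
    Communication futures passed as deps are already in flight;
    computation runs only when the result is demanded."""
    def __init__(self, fn, *deps):
        self.fn, self.deps = fn, deps
        self._val, self._done = None, False

    def value(self):
        if not self._done:
            # resolve dependencies: force other lazy tasks, wait on
            # communication futures, pass plain values through
            args = [d.value() if isinstance(d, Lazy)
                    else (d.result() if hasattr(d, "result") else d)
                    for d in self.deps]
            self._val = self.fn(*args)
            self._done = True
        return self._val

def fetch_halo(data, delay=0.2):
    """Simulated non-blocking receive of neighbour (halo) data."""
    time.sleep(delay)
    return data

# aggressively initiate the 'communication' ...
halo = pool.submit(fetch_halo, [0.5, 0.5])
# ... record local work lazily; nothing executes yet
local = Lazy(lambda: sum([1.0, 2.0, 3.0]))
# the update depends on both; evaluation is forced only here, after
# the transfer has had time to complete in the background
result = Lazy(lambda h, l: l + sum(h), halo, local)
print(result.value())   # 7.0
```

The latency of `fetch_halo` overlaps with whatever runs between task recording and the final `value()` call, which is the hiding effect the paper measures.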