The IMF and the Liberalization of Capital Flows
We evaluate the claim that the International Monetary Fund precipitated financial crises during the 1990s by pressuring countries to liberalize their capital accounts prematurely. Using data from a panel of developing economies from 1982-98, we examine whether changes in the regime governing capital flows took place during participation in IMF programs. We find evidence that IMF program participation is correlated with capital account liberalization episodes during the 1990s. We verify the robustness of our results using alternative indicators of capital account openness. To determine whether decontrol was premature, we compare the economic and financial characteristics of countries that decontrolled during IMF programs with those of countries that did so independently, and find some evidence of IMF-led premature liberalizations.

Keywords: IMF programs; capital account liberalization
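The panel comparison the abstract describes can be illustrated with a toy example. The sketch below builds a small hypothetical country-year panel (the data, country labels, and column names are invented for illustration, not the paper's dataset) and compares the share of liberalization episodes that occurred during IMF programs with the unconditional rate of program participation:

```python
import pandas as pd

# Hypothetical panel: one row per country-year, with indicators for IMF
# program participation and capital-account liberalization episodes.
panel = pd.DataFrame({
    "country":     ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "year":        [1990, 1991, 1992, 1990, 1991, 1992, 1990, 1991, 1992],
    "imf_program": [0, 1, 1, 0, 0, 0, 1, 1, 0],
    "liberalized": [0, 1, 0, 0, 0, 1, 1, 0, 0],  # 1 = decontrol episode
})

# Share of liberalization episodes occurring during IMF programs,
# versus the unconditional rate of program participation in the panel.
episodes = panel[panel["liberalized"] == 1]
share_during_programs = episodes["imf_program"].mean()
baseline_participation = panel["imf_program"].mean()

print(share_during_programs)    # 2 of 3 episodes fall inside programs
print(baseline_participation)   # 4 of 9 country-years are program years
```

A gap between the two rates is only suggestive of correlation; the paper's premature-liberalization question additionally requires comparing country characteristics, which this sketch does not attempt.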
Dynamic Windows Scheduling with Reallocation
We consider the Windows Scheduling problem. The problem is a restricted version of Unit-Fractions Bin Packing, and it is also called Inventory Replenishment in the context of supply chains. In brief, the problem is to schedule the use of communication channels to clients. Each client c_i is characterized by an active cycle and a window w_i. During the period of time that any given client c_i is active, there must be at least one transmission from c_i scheduled in every w_i consecutive time slots, but at most one transmission can be carried out on each channel per time slot. The goal is to minimize the number of channels used. We extend previous online models, where decisions are permanent, by assuming that clients may be reallocated at some cost. We assume that this cost is a constant amount paid per reallocation; that is, we also aim to minimize the number of reallocations. We present three online reallocation algorithms for Windows Scheduling. We evaluate these protocols experimentally and show that, in practice, all three achieve a constant amortized number of reallocations with close-to-optimal channel usage. Our simulations also expose interesting trade-offs between reallocations and channel usage. We introduce a new objective function for WS with reallocations that can also be applied to models where reallocations are not possible. We analyze this metric for one of the algorithms, which, to the best of our knowledge, is the first online WS protocol with theoretical guarantees that applies to scenarios where clients may leave, and whose analysis is against current load rather than peak load. Using previous results, we also observe bounds on channel usage for one of the algorithms.

Comment: 6 figures
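The Unit-Fractions Bin Packing view of the problem can be sketched with a simple first-fit heuristic: treat each client with window w_i as demanding bandwidth 1/w_i, and open a new channel only when no existing channel has room. This is an illustrative baseline, not one of the paper's three reallocation algorithms, and the bandwidth condition is necessary but not always sufficient for a feasible slot-by-slot schedule:

```python
# First-fit sketch of the bandwidth relaxation of Windows Scheduling.
# A channel can host a set of clients only if their demands 1/w_i sum
# to at most 1 (one transmission per channel per slot).

def first_fit_channels(windows):
    """Assign each client (given by window size) to the first channel with room."""
    channels = []    # total bandwidth already placed on each channel
    assignment = []  # channel index chosen for each client, in input order
    for w in windows:
        demand = 1.0 / w
        for i, load in enumerate(channels):
            if load + demand <= 1.0 + 1e-9:
                channels[i] += demand
                assignment.append(i)
                break
        else:
            channels.append(demand)       # open a new channel
            assignment.append(len(channels) - 1)
    return assignment, len(channels)

assignment, used = first_fit_channels([2, 2, 4, 4, 4, 4])
print(used)        # 2: one channel for {1/2, 1/2}, one for four clients of 1/4
print(assignment)  # [0, 0, 1, 1, 1, 1]
```

The paper's online setting additionally allows moving an already-placed client to another channel at unit cost; a reallocating variant would revisit these assignments as clients arrive and leave.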
Winnowing ontologies based on application use
The requirements of specific applications and services are often overestimated when ontologies are reused or built. This sometimes results in ontologies that are too large for their intended purposes. It is not uncommon that when applications and services are deployed over an ontology, only a few parts of the ontology are queried and used. Identifying which parts of an ontology are being used could help to winnow the ontology, i.e., simplify or shrink it to a smaller, more fit-for-purpose size. Some approaches to this problem have already been suggested in the literature. However, none of that work showed how ontology-based applications can be used in the ontology-resizing process, or how they might be affected by it. This paper presents a study of the use of the AKT Reference Ontology by a number of applications and services, and investigates the possibility of relying on this usage information to winnow that ontology.
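One simple way to act on such usage information is to keep only the classes that application queries actually touch, closed under superclass links so the pruned hierarchy stays consistent. The sketch below is illustrative only; the class names and query log are hypothetical and not taken from the AKT Reference Ontology:

```python
# Usage-based winnowing sketch: retain queried classes plus their ancestors.

ontology_parents = {        # child -> parent in a tiny class hierarchy
    "Professor": "Academic",
    "Academic": "Person",
    "Person": None,
    "Project": None,
    "Publication": None,
}

query_log = ["Professor", "Professor", "Publication"]  # classes hit by queries

used = set(query_log)
# Close the used set under superclass links, so no kept class dangles
# without its parents in the winnowed ontology.
for cls in list(used):
    while ontology_parents.get(cls):
        cls = ontology_parents[cls]
        used.add(cls)

winnowed = sorted(used)
print(winnowed)  # ['Academic', 'Person', 'Professor', 'Publication']
```

Note that `Project`, never queried, is dropped; real winnowing would also need to consider properties, axioms, and the applications' indirect dependencies.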
Evidence of localised gas propagation pathways in a field-scale bentonite engineered barrier system: results from three gas injection tests in the large scale gas injection test (Lasgit)
Three preliminary gas injection tests have been conducted during the large scale gas injection test (Lasgit) performed at the Äspö Hard Rock Laboratory, Sweden. Lasgit is a full-scale experiment based on the Swedish KBS-3 repository concept, examining the processes controlling gas and water flow in highly water-saturated compact buffer bentonite. The first two tests were conducted in the lower array of injection filters (FL903); both showed similar behaviour that corresponded with laboratory observations. The third gas test was conducted in an upper array filter (FU910), which gave a subtly dissimilar response at major gas entry, with an initial pressure drop followed by a secondary gas peak pressure. Lasgit has confirmed the coupling between gas, stress and pore-water pressure for flow before and after major gas entry at the field scale. All observations suggest that mechanisms of pathway propagation and dilatancy predominate. In all three gas tests the propagation was through localised features that tended to exploit the interface between the copper canister and the bentonite buffer. Considerable evidence exists for the development of a highly dynamic, tortuous network of pressure-induced pathways which evolves both temporally and spatially within the clay, opening and closing, probably due to local changes in gas pressure and/or effective stress.
SPEDEN: Reconstructing single particles from their diffraction patterns
SPEDEN is a computer program that reconstructs the electron density of single particles from their X-ray diffraction patterns, using a single-particle adaptation of the Holographic Method in crystallography (Szoke, A., Szoke, H., and Somoza, J.R., 1997. Acta Cryst. A53, 291-313). The method, like its parent, is unique in that it does not rely on "back" transformation from the diffraction pattern into real space or on interpolation within measured data. It is designed to deal successfully with sparse, irregular, incomplete and noisy data. It is also designed to use prior information to ensure sensible results and reliable convergence. This article describes the theoretical basis for the reconstruction algorithm, its implementation and quantitative results of tests on synthetic and experimentally obtained data. The program could be used for determining the structure of radiation-tolerant samples and, eventually, of large biological molecular structures without the need for crystallization.

Comment: 12 pages, 10 figures
Enhancing comprehension of complex data visualizations: Framework and techniques based on signature exploration
This thesis presents a framework and set of readily applicable techniques for enhancing comprehension of complex data visualizations. Central to the work has been the definition and exploration of a new concept, signature exploration.
Visualization is being used increasingly to help make sense of large sets of data and information. Abstractions of complex data can be performed to reduce the dimensions to 2 or 3 for display. Novel or established representations can be used that allow direct mapping of greater numbers of attributes, and of a variety of data structures. There is an ever-expanding set of visualization tools available. Two questions face the user: how to choose appropriate displays, and how to understand the resultant graphic. This thesis examines how to support the user's comprehension in this context.
The work makes the following three main contributions to enhancing comprehension of complex data visualizations: the definition and application of signature exploration, a concept describing the exploration of visualization behaviour using specially constructed data; the proposal of a framework for the design of visualization systems for increased comprehension; the introduction of two new forms of interaction - which are here described as visual data tracking and feature fingerprinting.
The central theme for the exploration presented in this work is the notion that a user wants to take data that is known in some way, put this into the visualization process and assess the resultant visual depiction. This intuitive desire has been captured in the definition of the concept, signature exploration. Signature exploration describes the exploration of the behaviour of visual representations using specially constructed datasets that contain features of interest. The datasets are used to explore the signatures of different visual representations and mathematical transformations. The thesis defines and illustrates signature exploration, with five proposed approaches: generic dataset provision; user-construction of data; querying; insertion of landmarks; elicitation and application of feedback data. These applications of signature exploration, together with analysis of the comprehension challenges presented by different aspects of visualization, and established work to support user comprehension, form the basis of the framework for increased user comprehension.
Example software has been developed within the context of a visualization application that employs a number of visualization algorithms to generate graphics for multivariate or proximity data; the algorithms used are Principal Components Analysis, Principal Coordinates Analysis, and distance metrics of various kinds. An additional interface is given to the user to perform signature exploration. The work has resulted in the specification of a set of techniques that developers can readily apply. Two new interaction forms are described: visual data tracking - bi-directional brushing and linking between representations, also allowing change of position or value; and feature fingerprinting - synthetic additions to real-world datasets to provide the user with calibration of the visual depiction.
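The core idea of signature exploration, feeding a specially constructed dataset with a known feature into a visualization algorithm and inspecting its signature, can be sketched in a few lines. The example below plants a shift in one variable of a synthetic dataset and checks its signature under Principal Components Analysis, one of the algorithms the thesis software uses; the data and threshold are illustrative, not taken from the thesis:

```python
import numpy as np

# Signature exploration sketch: a constructed dataset with one known
# feature (a large shift in variable 0 for half the points), examined
# through PCA to see how the feature manifests in the depiction.

rng = np.random.default_rng(0)
n = 100
data = rng.normal(size=(n, 3))
data[: n // 2, 0] += 5.0          # planted feature: a cluster offset in variable 0

# PCA via SVD of the mean-centred data; rows of vt are principal axes.
centred = data - data.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
pc1 = vt[0]                        # first principal axis

# Signature check: the planted feature should dominate the first component,
# so PC1 should align almost entirely with variable 0.
print(bool(abs(pc1[0]) > 0.9))    # True
```

In the thesis's terms this corresponds to the "user-construction of data" approach: knowing what went in, the user can calibrate what the resulting visual depiction shows.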