Reconsidering economic sanctions reconsidered. A detailed analysis of the Peterson Institute sanction database
This paper analyses two vintages of the key resource for research on economic sanctions: the Peterson Institute database reported in Hufbauer et al. (2nd edition in 1990 and 3rd edition in 2007). The Peterson Institute has not reported transparently on these changes.
We provide detailed tables in order to facilitate comparison between the descriptive statistics and findings of the two editions. One way to interpret our results is as a reporting of the 2nd edition results corrected for changes in methodology and case selection.
Using descriptive statistics, ratio analysis, the first-difference method and probit analysis, we investigate how case selection, (re)coding and new observations affected the characteristics and assumed effectiveness of economic sanctions.
About 17% of the cases common to the 2nd and 3rd editions were modified to some extent. The number of goals assigned to these cases increased from 146 to 155, and the average success score increased from 6.6 to 7.0 for the common cases. Indeed, the mean values for all categories of core variables for the common cases in the 3rd edition exceed those reported in the 2nd edition.
A redefined index value of the ‘sanction contribution’ underlies these changes. The lowest index value is defined as ‘zero or negative contribution’ in the 2nd edition, whereas it is limited to ‘negative contribution’ in the 3rd edition (upgrading all zero contributions by definition). Likewise, ‘modest and significant contribution’ is used in the 3rd edition instead of ‘substantial and decisive contribution’, making it easier to obtain a high score. We provide a probit analysis showing that the 3rd edition’s methodology, compared to that used in the 2nd edition, is biased in favour of finding positive results for modest policy change, regime change, and the use of sanctions to disrupt military adventures and to achieve military impairment.
Comparing and Combining Lexicase Selection and Novelty Search
Lexicase selection and novelty search, two parent selection methods used in
evolutionary computation, emphasize exploring widely in the search space more
than traditional methods such as tournament selection. However, lexicase
selection is not explicitly driven to select for novelty in the population, and
novelty search suffers from lack of direction toward a goal, especially in
unconstrained, high-dimensional spaces. We combine the strengths of lexicase
selection and novelty search by creating a novelty score for each test case,
and adding those novelty scores to the normal error values used in lexicase
selection. We use this new novelty-lexicase selection to solve automatic
program synthesis problems, and find it significantly outperforms both novelty
search and lexicase selection. Additionally, we find that novelty search has
very little success in the problem domain of program synthesis. We explore the
effects of each of these methods on population diversity and long-term problem
solving performance, and give evidence to support the hypothesis that
novelty-lexicase selection resists converging to local optima better than
lexicase selection.
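The combination described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: here the per-case novelty score is assumed to be the number of *other* individuals producing the same output on that case (lower = more novel), which is added to the case error before the usual lexicase filtering loop.

```python
import random

def novelty_lexicase_select(errors, behaviors, rng=random):
    """Select one parent via lexicase selection on combined scores.

    errors[i][c]    -- error of individual i on test case c (lower is better)
    behaviors[i][c] -- output of individual i on test case c

    Assumed novelty score per case: how many other individuals produced
    the same output on that case (an assumption of this sketch).
    """
    n, m = len(errors), len(errors[0])
    # Count identical outputs per test case.
    counts = [{} for _ in range(m)]
    for i in range(n):
        for c in range(m):
            counts[c][behaviors[i][c]] = counts[c].get(behaviors[i][c], 0) + 1
    # Combined score: error plus novelty penalty for unoriginal outputs.
    scores = [[errors[i][c] + counts[c][behaviors[i][c]] - 1 for c in range(m)]
              for i in range(n)]
    # Standard lexicase loop over a shuffled case order.
    cases = list(range(m))
    rng.shuffle(cases)
    pool = list(range(n))
    for c in cases:
        best = min(scores[i][c] for i in pool)
        pool = [i for i in pool if scores[i][c] == best]
        if len(pool) == 1:
            break
    return rng.choice(pool)
```

With errors alone the loop reduces to plain lexicase selection; the novelty term only breaks ties in favour of individuals with rarer behaviours.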
Hysteretic optimization for the Sherrington-Kirkpatrick spin glass
Hysteretic optimization is a heuristic optimization method based on the
observation that magnetic samples are driven into a low energy state when
demagnetized by an oscillating magnetic field of decreasing amplitude. We show
that hysteretic optimization is very good for finding ground states of
Sherrington-Kirkpatrick spin glass systems. With this method it is possible to
get good statistics for ground state energies for large samples of systems
consisting of up to about 2000 spins. The way we estimate error rates may be
useful for some other optimization methods as well. Our results show that both
the average and the width of the ground state energy distribution converge
faster with increasing size than expected from earlier studies. Comment: accepted by Physica A
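The demagnetization analogy can be sketched as follows. This is a simplified illustration of hysteretic optimization for a Sherrington-Kirkpatrick sample, assuming greedy single-spin relaxation after each discrete field step; practical implementations sweep the field continuously and add "shake-up" restarts.

```python
import random

def sk_couplings(n, rng):
    """Symmetric Gaussian couplings J_ij ~ N(0, 1/n) with zero diagonal."""
    J = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            J[i][j] = J[j][i] = rng.gauss(0.0, 1.0) / n ** 0.5
    return J

def relax(s, J, H, xi):
    """Greedily align each spin with its local field until nothing moves."""
    n = len(s)
    changed = True
    while changed:
        changed = False
        for i in range(n):
            h = sum(J[i][j] * s[j] for j in range(n)) + H * xi[i]
            new = 1 if h > 0 else (-1 if h < 0 else s[i])
            if new != s[i]:
                s[i] = new
                changed = True

def hysteretic_optimize(J, gamma=0.9, h0=3.0, rng=random):
    """'Demagnetize': apply an oscillating external field of decreasing
    amplitude along a random direction xi, relaxing after each step."""
    n = len(J)
    xi = [rng.choice((-1, 1)) for _ in range(n)]  # random field direction
    s = xi[:]                                     # start saturated along xi
    H, sign = h0, -1
    while H > 0.01:
        relax(s, J, sign * H, xi)
        H *= gamma
        sign = -sign
    relax(s, J, 0.0, xi)  # final relaxation at zero field
    return s
```

The final zero-field relaxation guarantees the returned state is at least a single-spin-flip local minimum of the SK energy.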
Simplest random K-satisfiability problem
We study a simple and exactly solvable model for the generation of random
satisfiability problems. These consist of random boolean constraints
which are to be satisfied simultaneously by logical variables. In
statistical-mechanics language, the considered model can be seen as a diluted
p-spin model at zero temperature. While such problems become extraordinarily
hard to solve by local search methods in a large region of the parameter space,
still at least one solution may be superimposed by construction. The
statistical properties of the model can be studied exactly by the replica
method and each single instance can be analyzed in polynomial time by a simple
global solution method. The geometrical/topological structures responsible for
dynamic and static phase transitions as well as for the onset of computational
complexity in local search methods are thoroughly analyzed. Numerical analysis
on very large samples allows for a precise characterization of the critical
scaling behaviour. Comment: 14 pages, 5 figures, to appear in Phys. Rev. E (Feb 2001). v2: minor
errors and references corrected
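Superimposing a solution by construction can be illustrated with parity (XOR-type) constraints, in line with the diluted p-spin picture: each constraint's parity bit is chosen so that a hidden "planted" assignment satisfies it. A sketch, not necessarily the paper's exact ensemble:

```python
import random

def planted_xorsat(n_vars, n_clauses, k=3, rng=random):
    """Random k-XOR-SAT with a planted solution: each constraint fixes the
    parity of k variables, with the parity bit chosen so that a hidden
    planted assignment satisfies it by construction."""
    planted = [rng.randrange(2) for _ in range(n_vars)]
    clauses = []
    for _ in range(n_clauses):
        vs = rng.sample(range(n_vars), k)
        b = sum(planted[v] for v in vs) % 2  # parity consistent with the plant
        clauses.append((vs, b))
    return clauses, planted

def satisfies(x, clauses):
    """True iff assignment x satisfies every parity constraint."""
    return all(sum(x[v] for v in vs) % 2 == b for vs, b in clauses)
```

Because every constraint is linear over GF(2), a single instance can indeed be solved in polynomial time by Gaussian elimination, even where local search struggles.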
Oversampling PCM techniques and optimum noise shapers for quantizing a class of nonbandlimited signals
We consider the efficient quantization of a class of nonbandlimited signals, namely, the class of discrete-time signals that can be recovered from their decimated version. The signals are modeled as the output of a single FIR interpolation filter (single band model) or, more generally, as the sum of the outputs of L FIR interpolation filters (multiband model). These nonbandlimited signals are oversampled, and it is therefore reasonable to expect that we can reap the same benefits of well-known efficient A/D techniques that apply only to bandlimited signals. We first show that we can obtain a great reduction in the quantization noise variance due to the oversampled nature of the signals. We can achieve a substantial decrease in bit rate by appropriately decimating the signals and then quantizing them. To further increase the effective quantizer resolution, noise shaping is introduced by optimizing prefilters and postfilters around the quantizer. We start with a scalar time-invariant quantizer and study two important cases of linear time-invariant (LTI) filters, namely, the case where the postfilter is the inverse of the prefilter and the more general case where the postfilter is independent of the prefilter. Closed-form expressions for the optimum filters and average minimum mean square error are derived in each case for both the single band and multiband models. The class of noise shaping filters and quantizers is then enlarged to include linear periodically time-varying (LPTV) filters and periodically time-varying quantizers of period M. We study two special cases in great detail.
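The starting point of any such variance analysis is the standard additive-noise model of a uniform quantizer, whose error variance is step²/12 for step size Δ. A quick empirical check of that model (this only illustrates the noise model, not the paper's filter optimization):

```python
import random

def uniform_quantize(x, step):
    """Mid-tread uniform quantizer with the given step size."""
    return round(x / step) * step

rng = random.Random(0)
step = 0.1
xs = [rng.uniform(-1.0, 1.0) for _ in range(200_000)]
# Empirical quantization-noise variance over many random samples.
noise_var = sum((x - uniform_quantize(x, step)) ** 2 for x in xs) / len(xs)
predicted = step ** 2 / 12  # classic additive-noise prediction
```

Decimating by M and re-quantizing trades M-fold fewer samples for a larger per-sample bit budget at the same rate, which is where the oversampling gain in the paper comes from.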
Statistical Mechanics of maximal independent sets
The graph theoretic concept of maximal independent set arises in several
practical problems in computer science as well as in game theory. A maximal
independent set is defined by the set of occupied nodes that satisfy some
packing and covering constraints. It is known that finding minimum and
maximum-density maximal independent sets are hard optimization problems. In
this paper, we use the cavity method of statistical physics and Monte Carlo
simulations to study the corresponding constraint satisfaction problem on
random graphs. We obtain the entropy of maximal independent sets within the
replica symmetric and one-step replica symmetry breaking frameworks, shedding
light on the metric structure of the landscape of solutions and suggesting a
class of possible algorithms. This is of particular relevance for the
application to the study of strategic interactions in social and economic
networks, where maximal independent sets correspond to pure Nash equilibria of
a graphical game of public goods allocation.
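The packing and covering constraints mentioned above are easy to state in code. A minimal sketch (function names are illustrative), with a greedy routine that always produces a maximal independent set:

```python
def greedy_mis(adj):
    """Greedy maximal independent set on an adjacency dict {node: set(nbrs)}."""
    occupied = set()
    for v in sorted(adj):
        if not (adj[v] & occupied):  # packing: no occupied neighbour
            occupied.add(v)
    return occupied

def is_maximal_independent(adj, occupied):
    """Check both constraints defining a maximal independent set."""
    # Packing: no two occupied nodes are adjacent.
    packing = all(not (adj[v] & occupied) for v in occupied)
    # Covering: every empty node has at least one occupied neighbour.
    covering = all(adj[v] & occupied for v in adj if v not in occupied)
    return packing and covering
```

Greedy construction finds *some* maximal independent set quickly; the hard optimization problems in the abstract ask for the minimum- or maximum-density ones.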
Does Reform Work? An Econometric Examination of the Reform-Growth Puzzle
Why are socially beneficial reforms not implemented? One simple answer to this question (which has received little attention in the literature) is that this may be caused by generalised uncertainty about the effectiveness of reforms. If agents are unsure about whether a proposed reform will work, it will be less likely to be adopted. Despite the numerous benefits economists assign to structural reforms, the empirical literature has thus far failed to establish a positive and significant effect of reforms on economic performance. We collect data from 43 econometric studies (for more than 300 coefficients on the effects of reform on growth) and show that approximately one third of these coefficients are positive and significant, another third are negative and significant, and the final third are not statistically significantly different from zero. In trying to understand this remarkable variation, we find that the measurement of reform and controlling for institutions and initial conditions are the main factors decreasing the probability of reporting a significant and positive effect of reform on growth.
Sampling the ground-state magnetization of d-dimensional p-body Ising models
We demonstrate that a recently introduced heuristic optimization algorithm
[Phys. Rev. E 83, 046709 (2011)] that combines a local search with triadic
crossover genetic updates is capable of sampling nearly uniformly among
ground-state configurations in spin-glass-like Hamiltonians with p-spin
interactions in d space dimensions that have highly degenerate ground states.
Using this algorithm we probe the zero-temperature ferromagnet to spin-glass
transition point q_c of two example models, the disordered version of the
two-dimensional three-spin Baxter-Wu model [q_c = 0.1072(1)] and the
three-dimensional Edwards-Anderson model [q_c = 0.2253(7)], by computing the
Binder ratio of the ground-state magnetization. Comment: 8 pages, 6 figures, 3 tables
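The Binder ratio of the magnetization, used above to locate q_c, is conventionally defined as g = (3 − ⟨m⁴⟩/⟨m²⟩²)/2. A minimal sketch of estimating it from sampled ground-state magnetizations (the actual study additionally averages over disorder samples):

```python
def binder_ratio(mags):
    """Binder ratio g = (3 - <m^4>/<m^2>^2) / 2 from sampled magnetizations."""
    n = len(mags)
    m2 = sum(m * m for m in mags) / n
    m4 = sum(m ** 4 for m in mags) / n
    return 0.5 * (3.0 - m4 / (m2 * m2))
```

For a saturated ferromagnet (m = ±1) the ratio equals 1, while it drops toward 0 as the magnetization distribution becomes Gaussian, which is why its crossing point locates the transition.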