Collision of High Frequency Plane Gravitational and Electromagnetic Waves
We study the head-on collision of linearly polarized, high frequency plane
gravitational waves and their electromagnetic counterparts in the
Einstein-Maxwell theory. The post-collision space-times are obtained by solving
the vacuum Einstein-Maxwell field equations in the geometrical optics
approximation. The head-on collisions of all possible pairs of these systems of
waves is described and the results are then generalised to non-linearly
polarized waves which exhibit the maximum two degrees of freedom of
polarization.
Comment: LaTeX file, 17 pages, accepted for publication in International Journal of Modern Physics
On The Complexity and Completeness of Static Constraints for Breaking Row and Column Symmetry
We consider a common type of symmetry where we have a matrix of decision
variables with interchangeable rows and columns. A simple and efficient method
to deal with such row and column symmetry is to post symmetry breaking
constraints like DOUBLELEX and SNAKELEX. We provide a number of positive and
negative results on posting such symmetry breaking constraints. On the positive
side, we prove that we can compute in polynomial time a unique representative
of an equivalence class in a matrix model with row and column symmetry if the
number of rows (or of columns) is bounded and in a number of other special
cases. On the negative side, we show that whilst DOUBLELEX and SNAKELEX are
often effective in practice, they can leave a large number of symmetric
solutions in the worst case. In addition, we prove that propagating DOUBLELEX
completely is NP-hard. Finally we consider how to break row, column and value
symmetry, correcting a result in the literature about the safeness of combining
different symmetry breaking constraints. We end with the first experimental
study on how much symmetry is left by DOUBLELEX and SNAKELEX on some benchmark
problems.
Comment: To appear in the Proceedings of the 16th International Conference on Principles and Practice of Constraint Programming (CP 2010)
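The DOUBLELEX condition mentioned above can be made concrete with a small sketch. This is an illustrative Python toy under the standard definition (rows and columns each in lexicographic non-decreasing order), not the paper's implementation; the function names are mine.

```python
from itertools import product

def lex_leq(a, b):
    """True if sequence a is lexicographically <= sequence b."""
    return list(a) <= list(b)

def double_lex(matrix):
    """Check the DOUBLELEX condition: consecutive rows and consecutive
    columns are each in lexicographic non-decreasing order."""
    rows = [list(r) for r in matrix]
    cols = [list(c) for c in zip(*matrix)]
    return (all(lex_leq(rows[i], rows[i + 1]) for i in range(len(rows) - 1))
            and all(lex_leq(cols[j], cols[j + 1]) for j in range(len(cols) - 1)))

def count_double_lex(n_rows, n_cols):
    """Count how many 0/1 matrices of the given shape satisfy DOUBLELEX,
    i.e. how many candidate representatives the constraint leaves."""
    count = 0
    for bits in product([0, 1], repeat=n_rows * n_cols):
        m = [bits[i * n_cols:(i + 1) * n_cols] for i in range(n_rows)]
        if double_lex(m):
            count += 1
    return count
```

For example, of the 16 possible 2x2 0/1 matrices, `count_double_lex(2, 2)` reports that 7 satisfy DOUBLELEX; comparing such counts against the number of row/column symmetry classes is one way to measure how many symmetric solutions the constraint leaves, in the spirit of the paper's experimental study.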
The Phase Diagram of 1-in-3 Satisfiability Problem
We study the typical case properties of the 1-in-3 satisfiability problem,
the boolean satisfaction problem where a clause is satisfied by exactly one
literal, in an enlarged random ensemble parametrized by average connectivity
and probability of negation of a variable in a clause. Random 1-in-3
Satisfiability and Exact 3-Cover are special cases of this ensemble. We
interpolate between these cases from a region where satisfiability can be
typically decided for all connectivities in polynomial time to a region where
deciding satisfiability is hard, in some interval of connectivities. We derive
several rigorous results in the first region, and develop the
one-step replica-symmetry-breaking cavity analysis in the second one. We
discuss the prediction for the transition between the almost surely satisfiable
and the almost surely unsatisfiable phase, and other structural properties of
the phase diagram, in light of cavity method results.
Comment: 30 pages, 12 figures
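The defining feature of the ensemble above is that a 1-in-3 clause is satisfied by exactly one of its three literals. A minimal brute-force checker makes this precise; this is an illustrative sketch (DIMACS-style signed-integer literals are my convention), not the cavity-method machinery of the paper.

```python
from itertools import product

def one_in_three_satisfied(clauses, assignment):
    """A 1-in-3 clause is satisfied iff EXACTLY one of its literals is
    true. Literals are nonzero ints: +v means variable v, -v its
    negation; assignment maps variable -> bool."""
    def lit_true(lit):
        return assignment[abs(lit)] if lit > 0 else not assignment[abs(lit)]
    return all(sum(lit_true(l) for l in clause) == 1 for clause in clauses)

def brute_force_1in3(clauses, n_vars):
    """Exhaustive search over all 2^n assignments: returns a satisfying
    assignment, or None if the formula is 1-in-3 unsatisfiable."""
    for bits in product([False, True], repeat=n_vars):
        assignment = {v + 1: bits[v] for v in range(n_vars)}
        if one_in_three_satisfied(clauses, assignment):
            return assignment
    return None
```

With all-positive literals this is exactly the Exact 3-Cover special case mentioned in the abstract; adding negated literals moves through the enlarged ensemble parametrized by the negation probability.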
Income-Related Subsidies for Universal Health Insurance Premia: Exploring Alternatives Using the SWITCH Model. ESRI WP516. November 2015
The Programme for Government indicated that under a Universal Health Insurance system, the State would “pay insurance premia for people on low incomes and subsidise premia for people on middle incomes”. This paper examines issues in the design of such a subsidy scheme, in the context of overall premium costs as estimated by Wren et al. (2015) and the KPMG (2015) study for the Health
Insurance Authority. Subsidy design could involve a step-level system, similar to the medical card and GP visit card in the current system; or a smooth, tapered withdrawal of the subsidy, similar to what obtains for many cash benefits in the welfare system. The trade-offs between the income limit up to which a full subsidy would be payable, the rate of withdrawal of subsidy with respect to extra income, and overall subsidy cost are explored.
Local search for stable marriage problems
The stable marriage (SM) problem has a wide variety of practical
applications, ranging from matching resident doctors to hospitals, to matching
students to schools, or more generally to any two-sided market. In the
classical formulation, n men and n women express their preferences (via a
strict total order) over the members of the other sex. Solving an SM problem
means finding a stable marriage where stability is an envy-free notion: no man
and woman who are not married to each other would both prefer each other to
their partners or to being single. We consider both the classical stable
marriage problem and one of its useful variations (denoted SMTI) where the men
and women express their preferences in the form of an incomplete preference
list with ties over a subset of the members of the other sex. Matchings are
permitted only with people who appear in these lists, and we try to find a
stable matching that marries as many people as possible. Whilst the SM problem
is polynomial to solve, the SMTI problem is NP-hard. We propose to tackle both
problems via a local search approach, which exploits properties of the problems
to reduce the size of the neighborhood and to make local moves efficiently. We
evaluate empirically our algorithm for SM problems by measuring its runtime
behaviour and its ability to sample the lattice of all possible stable
marriages. We evaluate our algorithm for SMTI problems in terms of both its
runtime behaviour and its ability to find a maximum cardinality stable
marriage.
For SM problems, the number of steps of our algorithm grows only as
O(n log(n)), and it samples the set of all stable marriages very well. It is
thus a fair and efficient approach to generating stable marriages. Furthermore,
our approach for SMTI problems is able to solve large problems, quickly
returning stable matchings of large and often optimal size despite the
NP-hardness of this problem.
Comment: 12 pages, Proc. COMSOC 2010 (Third International Workshop on Computational Social Choice)
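The envy-free stability notion described above reduces to checking for blocking pairs, which is also the test a local-search move must respect. Below is a minimal stability check for the classical (complete, strictly ordered) case; it is an illustrative sketch with my own names, not the neighborhood-reduction algorithm of the paper.

```python
def is_stable(matching, men_pref, women_pref):
    """matching: dict man -> woman (complete, classical SM).
    men_pref[m] / women_pref[w] are strict preference lists, most
    preferred first. Returns True iff no blocking pair exists: no man m
    and woman w, not matched to each other, who each prefer the other
    to their current partner."""
    wife = matching
    husband = {w: m for m, w in matching.items()}

    def prefers(pref_list, a, b):
        return pref_list.index(a) < pref_list.index(b)

    for m, w_current in wife.items():
        for w in men_pref[m]:
            if w == w_current:
                break  # every woman after this point is less preferred
            # m prefers w to his current wife; (m, w) blocks iff w
            # also prefers m to her current husband
            if prefers(women_pref[w], m, husband[w]):
                return False
    return True
```

A local-search step in this spirit would move from an unstable matching to a neighbouring one by repairing a blocking pair found by such a scan; the SMTI variant additionally has to handle ties and incomplete lists.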
BUDGET PERSPECTIVES 2016, PAPER 1. Exploring Tax and Welfare Options. June 2015
Budgetary policies on income-related taxes and welfare must find a balance between providing income support to those in need and maintaining a financial incentive to work which supports high employment. This paper focuses principally on the “cash” or “first round” impact of tax and welfare policy changes across the income distribution. Incentive issues are considered in Section 5 of this paper, and in a companion paper to this conference (Savage et al., 2015).
Modelling Eligibility for Medical Cards and GP Visit Cards: Methods and Baseline Results. ESRI WP515. November 2015
The Irish healthcare system includes a complex mix of entitlements – some are universal, others age-related, and some are income-related. In this report, we concentrate on the major income-related entitlements in the current system i.e., the Medical Card and the GP Visit Card. Most medical cards are provided on an income-tested basis, and provide free access to in-patient and out-patient care in public hospitals, to GP care, and to prescription drugs. We examine how the income test for such schemes can be modelled using the detailed income and demographic information in the Survey on Income and Living Conditions. The approach taken applies the rules for income-related cards to each family in this nationally representative sample, using the information they provide on incomes and family composition. This is essential groundwork for
later studies which will examine how the pattern of entitlements might change under different rules, such as those introducing age-related entitlements to GP visit cards, or changes in income limits.
Stochastic theory of large-scale enzyme-reaction networks: Finite copy number corrections to rate equation models
Chemical reactions inside cells occur in compartment volumes in the range of
atto- to femtolitres. Physiological concentrations realized in such small
volumes imply low copy numbers of interacting molecules with the consequence of
considerable fluctuations in the concentrations. In contrast, rate equation
models are based on the implicit assumption of infinitely large numbers of
interacting molecules, or equivalently, that reactions occur in infinite
volumes at constant macroscopic concentrations. In this article we compute the
finite-volume corrections (or equivalently the finite copy number corrections)
to the solutions of the rate equations for chemical reaction networks composed
of arbitrarily large numbers of enzyme-catalyzed reactions which are confined
inside a small sub-cellular compartment. This is achieved by applying a
mesoscopic version of the quasi-steady state assumption to the exact
Fokker-Planck equation associated with the Poisson Representation of the
chemical master equation. The procedure yields impressively simple and compact
expressions for the finite-volume corrections. We prove that the predictions of
the rate equations will always underestimate the actual steady-state substrate
concentrations for an enzyme-reaction network confined in a small volume. In
particular we show that the finite-volume corrections increase with decreasing
sub-cellular volume, decreasing Michaelis-Menten constants and increasing
enzyme saturation. The magnitude of the corrections depends sensitively on the
topology of the network. The predictions of the theory are shown to be in
excellent agreement with stochastic simulations for two types of networks
typically associated with protein methylation and metabolism.
Comment: 13 pages, 4 figures; published in The Journal of Chemical Physics
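The kind of stochastic simulation the theory above is validated against can be sketched with a standard Gillespie algorithm for a single enzyme-catalyzed reaction E + S <-> C -> E + P tracked in copy numbers. This is a generic textbook SSA, not the Poisson-representation method of the paper; the rate constants and function name are illustrative.

```python
import math
import random

def gillespie_mm(e0, s0, k1, km1, k2, t_end, seed=0):
    """Minimal Gillespie simulation of E + S <-> C -> E + P in a
    well-mixed compartment, tracked in molecule copy numbers (the
    regime where finite-copy-number fluctuations matter). Rate
    constants are stochastic per-molecule rates. Returns the final
    (substrate, product) copy numbers."""
    rng = random.Random(seed)
    e, s, c, p = e0, s0, 0, 0
    t = 0.0
    while t < t_end:
        a1 = k1 * e * s   # binding:   E + S -> C
        a2 = km1 * c      # unbinding: C -> E + S
        a3 = k2 * c       # catalysis: C -> E + P
        a0 = a1 + a2 + a3
        if a0 == 0:
            break  # all substrate converted; absorbing state reached
        t += -math.log(1.0 - rng.random()) / a0  # exponential waiting time
        r = rng.random() * a0
        if r < a1:
            e, s, c = e - 1, s - 1, c + 1
        elif r < a1 + a2:
            e, s, c = e + 1, s + 1, c - 1
        else:
            e, c, p = e + 1, c - 1, p + 1
    return s, p
```

Averaging many such trajectories at fixed time and comparing against the deterministic rate-equation solution is the usual way to expose the finite-volume corrections the paper computes analytically.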
Crisis, Response and Distributional Impact: The Case of Ireland. ESRI WP456. May 2013
Ireland is one of the countries most severely affected by the Great Recession. National income fell by more than 10 per cent between 2007 and 2012, as a result of the bursting of a remarkable property bubble, an exceptionally severe banking crisis, and deep fiscal adjustment. This paper examines the income distribution consequences of the recession, and identifies the impact of a broad range of austerity policies on the income distribution. The overall fall in income was just under 8 per cent between 2008 and 2011, but the greatest losses were strongly concentrated on the bottom and top deciles. Tax, welfare and public sector pay changes over the 2008 to 2013 period gave rise to slightly lower than average losses for the bottom decile. Thus, the larger than average losses observed overall are not due to these policy changes; instead, the main driving factors are the direct effects of the recession itself. Policy changes do contribute to the larger than average losses at high income levels
Percolation of satisfiability in finite dimensions
The satisfiability and optimization of finite-dimensional Boolean formulas
are studied using percolation theory, rare region arguments, and boundary
effects. In contrast with mean-field results, there is no satisfiability
transition, though there is a logical connectivity transition. In part of the
disconnected phase, rare regions lead to a divergent running time for
optimization algorithms. The thermodynamic ground state for the NP-hard
two-dimensional maximum-satisfiability problem is typically unique. These
results have implications for the computational study of disordered materials.
Comment: 4 pages, 4 figures
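The connectivity transition invoked above is of the same type as ordinary geometric percolation, which can be illustrated with a standard 2D site-percolation check via union-find. This is a generic sketch of the percolation concept, not the rare-region analysis of the paper; the names are mine.

```python
import random

def percolates(n, p, seed=0):
    """Standard 2D site percolation on an n x n grid: each site is open
    with probability p; returns True iff an open path of nearest
    neighbours connects the top row to the bottom row (checked with a
    simple union-find and two virtual terminal nodes)."""
    rng = random.Random(seed)
    open_site = [[rng.random() < p for _ in range(n)] for _ in range(n)]
    parent = list(range(n * n + 2))
    TOP, BOTTOM = n * n, n * n + 1

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for i in range(n):
        for j in range(n):
            if not open_site[i][j]:
                continue
            idx = i * n + j
            if i == 0:
                union(idx, TOP)
            if i == n - 1:
                union(idx, BOTTOM)
            for di, dj in ((1, 0), (0, 1)):  # link open right/down neighbours
                ni, nj = i + di, j + dj
                if ni < n and nj < n and open_site[ni][nj]:
                    union(idx, ni * n + nj)
    return find(TOP) == find(BOTTOM)
```

Sweeping `p` on large grids exhibits the sharp percolation threshold; the abstract's point is that for finite-dimensional satisfiability it is this kind of connectivity transition, rather than a satisfiability transition, that survives.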