Constant Rank Bimatrix Games are PPAD-hard
The rank of a bimatrix game (A,B) is defined as rank(A+B). Computing a Nash
equilibrium (NE) of a rank-0, i.e., zero-sum, game is equivalent to linear
programming (von Neumann'28, Dantzig'51). In 2005, Kannan and Theobald gave an
FPTAS for constant rank games, and asked if there exists a polynomial time
algorithm to compute an exact NE. Adsul et al. (2011) answered this question
affirmatively for rank-1 games, leaving rank-2 and beyond unresolved.
In this paper we show that NE computation in games with rank ≥ 3 is
PPAD-hard, settling a decade-long open problem. Interestingly, this is the
first instance that a problem with an FPTAS turns out to be PPAD-hard. Our
reduction bypasses graphical games and game gadgets, and provides a simpler
proof of PPAD-hardness for NE computation in bimatrix games. In addition, we
get:
* An equivalence between 2D-Linear-FIXP and PPAD, improving a result by
Etessami and Yannakakis (2007) on equivalence between Linear-FIXP and PPAD.
* NE computation in a bimatrix game with convex set of Nash equilibria is as
hard as solving a simple stochastic game.
* Computing a symmetric NE of a symmetric bimatrix game with rank ≥ 6 is
PPAD-hard.
* Computing a (1/poly(n))-approximate fixed-point of a (Linear-FIXP)
piecewise-linear function is PPAD-hard.
The status of rank-2 games remains unresolved.
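The rank-0 equivalence mentioned above, computing a NE of a zero-sum game by linear programming, can be made concrete. Below is a minimal sketch using scipy, with matching pennies as an illustrative payoff matrix (the example is not from the paper):

```python
# Sketch: a Nash equilibrium of a rank-0 (zero-sum) bimatrix game (A, -A)
# via linear programming, per the von Neumann / Dantzig equivalence.
# The payoff matrix below (matching pennies) is illustrative.
import numpy as np
from scipy.optimize import linprog

def zero_sum_equilibrium(A):
    """Row player's maximin mixed strategy and the game value for payoff matrix A."""
    m, n = A.shape
    # Variables z = (x_1..x_m, v); maximize v  <=>  minimize -v.
    c = np.zeros(m + 1); c[-1] = -1.0
    # For every column j of A:  v - (A^T x)_j <= 0.
    A_ub = np.hstack([-A.T, np.ones((n, 1))])
    b_ub = np.zeros(n)
    # x must be a probability distribution.
    A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])
    b_eq = np.array([1.0])
    bounds = [(0, None)] * m + [(None, None)]  # v is free
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:m], res.x[-1]

x, v = zero_sum_equilibrium(np.array([[1.0, -1.0], [-1.0, 1.0]]))
print(x, v)  # uniform strategy (0.5, 0.5), game value 0
```

The equilibrium strategy of the column player falls out of the dual of the same LP, which is what makes the rank-0 case polynomial.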
Micro-evaporators for kinetic exploration of phase diagrams
We use pervaporation-based microfluidic devices to concentrate species in
aqueous solutions with spatial and temporal control of the process. Using
experiments and modelling, we quantitatively describe the advection-diffusion
behavior of the concentration field of various solutions (electrolytes,
colloids, etc) and demonstrate the potential of these devices as universal
tools for the kinetic exploration of the phases and textures that form upon
concentration.
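The advection-diffusion behaviour referred to above can be illustrated with a minimal one-dimensional sketch: pervaporation through the membrane of a dead-end channel drives a flow toward the tip, where solute accumulates against diffusion. All parameters (grid, evaporation time scale, diffusivity) are illustrative placeholders, not values from the study:

```python
# Minimal 1D sketch of solute concentration in a dead-end pervaporation
# channel: evaporation drives a flow v(x) = -x/tau toward the tip at x = 0,
# and the solute obeys  dc/dt = -d(v c)/dx + D d2c/dx2.
# Parameters are illustrative, not fitted to the devices in the paper.
import numpy as np

N, L = 200, 1.0          # grid points, channel length
dx = L / N
tau, D = 1.0, 1e-3       # evaporation time scale, solute diffusivity
x = (np.arange(N) + 0.5) * dx
v = -x / tau             # pervaporation-induced flow, toward the tip

c = np.ones(N)           # initially uniform concentration; c[-1] acts as the reservoir
dt = 0.2 * min(dx / np.abs(v).max(), dx**2 / (2 * D))  # explicit-scheme stability
for _ in range(2000):
    flux = v * c                              # advective flux at cell centres
    div = np.zeros(N)
    div[:-1] = (flux[1:] - flux[:-1]) / dx    # forward difference = upwind for v < 0
    lap = np.zeros(N)
    lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    lap[0] = (c[1] - c[0]) / dx**2            # no-flux condition at the dead end
    c += dt * (-div + D * lap)

print(c[0] / c[-1])  # solute piles up at the tip: ratio much greater than 1
```

The steady concentration profile near the tip reflects the balance between the evaporation-driven advection and back-diffusion, which is the mechanism the micro-evaporators exploit.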
Budget Feasible Mechanisms for Experimental Design
In the classical experimental design setting, an experimenter E has access to
a population of n potential experiment subjects, each subject i being
associated with a vector of features x_i in R^d. Conducting an experiment
with subject i reveals an unknown value y_i to E. E typically assumes some
hypothetical relationship between the x_i's and the y_i's, e.g., y_i ≈ β·x_i,
and estimates β from experiments, e.g., through linear regression. As a proxy
for various practical constraints, E may select only a subset of subjects on
which to conduct the experiment.
We initiate the study of budgeted mechanisms for experimental design. In this
setting, E has a budget B. Each subject i declares an associated cost c_i to
be part of the experiment, and must be paid at least her cost. In particular,
the Experimental Design Problem (EDP) is to find a set S of subjects for the
experiment that maximizes V(S) = \log\det(I_d+\sum_{i\in S}x_i x_i^\top)
under the constraint \sum_{i\in S} c_i \le B; our objective function
corresponds to the information gain in the parameter β that is learned
through linear regression methods, and is related to the so-called
D-optimality criterion. Further, the subjects are strategic and may lie about
their costs.
We present a deterministic, polynomial-time, budget-feasible mechanism scheme
that is approximately truthful and yields a constant-factor approximation to
EDP. In particular, for any small ε > 0 and δ > 0, we can construct a
(12.98, ε)-approximate mechanism that is δ-truthful and runs in polynomial
time. We also establish that no truthful, budget-feasible mechanism can
achieve an approximation factor better than 2, and show how to generalize our
approach to a wide class of learning problems beyond linear regression.
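The EDP objective above is easy to state in code. The sketch below, a naive cost-scaled greedy under the budget constraint, illustrates V(S) and the selection problem only; the paper's actual budget-feasible mechanism, with payments and approximate truthfulness, is more involved, and the data here is synthetic:

```python
# Sketch of the EDP objective V(S) = log det(I_d + sum_{i in S} x_i x_i^T)
# with a naive greedy selection under the budget sum_{i in S} c_i <= B.
# Illustration only; this is not the paper's mechanism and has no payments.
import numpy as np

def V(X, S):
    """Information-gain objective for subject set S (rows of X are feature vectors)."""
    d = X.shape[1]
    M = np.eye(d)
    for i in S:
        M += np.outer(X[i], X[i])
    _, logdet = np.linalg.slogdet(M)
    return logdet

def greedy_edp(X, costs, B):
    """Repeatedly add the affordable subject with the best marginal gain per cost."""
    S, spent = [], 0.0
    remaining = set(range(len(X)))
    while True:
        best, best_ratio = None, 0.0
        for i in remaining:
            if spent + costs[i] > B:
                continue
            gain = V(X, S + [i]) - V(X, S)
            if gain / costs[i] > best_ratio:
                best, best_ratio = i, gain / costs[i]
        if best is None:
            break
        S.append(best); spent += costs[best]; remaining.remove(best)
    return S

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))           # synthetic feature vectors
costs = rng.uniform(1, 2, size=20)     # synthetic declared costs
S = greedy_edp(X, costs, B=5.0)
print(S, round(V(X, S), 3))
```

Because log det of a PSD-increasing sum is monotone and submodular, greedy heuristics of this form are the natural starting point before truthfulness constraints enter.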
Truthful Multi-unit Procurements with Budgets
We study procurement games where each seller supplies multiple units of his
item, with a cost per unit known only to him. The buyer can purchase any number
of units from each seller, values different combinations of the items
differently, and has a budget for his total payment.
For a special class of procurement games, the {\em bounded knapsack} problem,
we show that no universally truthful budget-feasible mechanism can approximate
the optimal value of the buyer within ln n, where n is the total number of
units of all items available. We then construct a polynomial-time mechanism
that gives a 4(1 + ln n)-approximation for procurement games with {\em concave
additive valuations}, which include bounded knapsack as a special case. Our
mechanism is thus optimal up to a constant factor. Moreover, for the bounded
knapsack problem, given the well-known FPTAS, our results imply there is a
provable gap between the optimization domain and the mechanism design domain.
Finally, for procurement games with {\em sub-additive valuations}, we
construct a universally truthful budget-feasible mechanism that gives a
polylogarithmic approximation in polynomial time with a
demand oracle.
Comment: To appear at WINE 2014.
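Setting truthfulness aside, the underlying optimization in the concave-additive case can be sketched directly: buy units in order of marginal value per unit cost, which is well defined because concavity makes the per-unit marginal values nonincreasing. The instance below is illustrative, not from the paper:

```python
# Sketch of the (non-strategic) optimization behind the procurement game:
# buy integer units q_j of item j at per-unit cost c_j, subject to
# sum_j q_j * c_j <= B, maximizing a concave additive value sum_j f_j(q_j).
# The paper's contribution is the harder problem of doing this truthfully
# when each c_j is the seller's private information.
import heapq

def greedy_procurement(marginals, costs, B):
    """marginals[j][k] = value of the (k+1)-th unit of item j (nonincreasing)."""
    q = [0] * len(costs)
    spent = 0.0
    # Max-heap keyed on marginal value per unit cost (negated for heapq).
    heap = [(-m[0] / c, j) for j, (m, c) in enumerate(zip(marginals, costs)) if m]
    heapq.heapify(heap)
    while heap:
        _, j = heapq.heappop(heap)
        if spent + costs[j] > B:
            continue  # can never afford this item again; drop it
        q[j] += 1; spent += costs[j]
        if q[j] < len(marginals[j]):
            heapq.heappush(heap, (-marginals[j][q[j]] / costs[j], j))
    return q, spent

# Two items with diminishing per-unit values, unit costs 1 and 2, budget 4.
q, spent = greedy_procurement([[3.0, 2.0, 1.0], [5.0, 1.0]], [1.0, 2.0], 4.0)
print(q, spent)  # q = [2, 1], spent = 4.0
```

The gap the abstract highlights is exactly that this kind of near-optimal greedy is available to the optimizer (bounded knapsack even admits an FPTAS), yet no truthful budget-feasible mechanism can match it within ln n.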
High-fidelity readout of trapped-ion qubits
We demonstrate single-shot qubit readout with fidelity sufficient for
fault-tolerant quantum computation, for two types of qubit stored in single
trapped calcium ions. For an optical qubit stored in the (4S_1/2, 3D_5/2)
levels of 40Ca+ we achieve 99.991(1)% average readout fidelity in one million
trials, using time-resolved photon counting. An adaptive measurement technique
allows 99.99% fidelity to be reached in 145us average detection time. For a
hyperfine qubit stored in the long-lived 4S_1/2 (F=3, F=4) sub-levels of 43Ca+
we propose and implement a simple and robust optical pumping scheme to transfer
the hyperfine qubit to the optical qubit, capable of a theoretical fidelity
99.95% in 10us. Experimentally we achieve 99.77(3)% net readout fidelity,
inferring at least 99.87(4)% fidelity for the transfer operation.
Comment: 4 pages, 3 figures; improved readout fidelity (numerical results
changed).
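The threshold variant of fluorescence readout can be illustrated with a toy Poisson model: a bright state scatters photons at a high mean count, a dark state yields only background, and the qubit is declared bright when the count exceeds a threshold. The rates below are illustrative, not the experiment's, and the model ignores the D_5/2 decay and the time-resolved, adaptive detection that let the paper reach the fidelities quoted above:

```python
# Toy model of threshold-based qubit state discrimination by photon counting:
# "bright" gives Poisson counts with mean mu_bright in a fixed window,
# "dark" gives only background with mean mu_dark.  Illustrative numbers only;
# this omits state decay during detection and adaptive window lengths.
from math import exp

def poisson_cdf(k, mu):
    """P(N <= k) for N ~ Poisson(mu)."""
    term, total = exp(-mu), exp(-mu)
    for i in range(1, k + 1):
        term *= mu / i
        total += term
    return total

def readout_error(mu_bright, mu_dark, threshold):
    """Average error when declaring 'bright' iff the count exceeds the threshold."""
    p_miss_bright = poisson_cdf(threshold, mu_bright)       # bright read as dark
    p_false_bright = 1.0 - poisson_cdf(threshold, mu_dark)  # dark read as bright
    return 0.5 * (p_miss_bright + p_false_bright)

mu_b, mu_d = 30.0, 0.5  # mean counts in the detection window (illustrative)
best_t = min(range(31), key=lambda t: readout_error(mu_b, mu_d, t))
print(best_t, readout_error(mu_b, mu_d, best_t))
```

Even this crude model shows why a large bright/dark count separation yields very small errors, and why adaptive schemes that stop counting early can cut the average detection time without sacrificing fidelity.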
Computing Stable Coalitions: Approximation Algorithms for Reward Sharing
Consider a setting where selfish agents are to be assigned to coalitions or
projects from a fixed set P. Each project k is characterized by a valuation
function; v_k(S) is the value generated by a set S of agents working on project
k. We study the following classic problem in this setting: "how should the
agents divide the value that they collectively create?". One traditional
approach in cooperative game theory is to study core stability with the
implicit assumption that there are infinite copies of one project, and agents
can partition themselves into any number of coalitions. In contrast, we
consider a model with a finite number of non-identical projects; this makes
computing both high-welfare solutions and core payments highly non-trivial.
The main contribution of this paper is a black-box mechanism that reduces the
problem of computing a near-optimal core stable solution to the purely
algorithmic problem of welfare maximization; we apply this to compute an
approximately core stable solution that extracts one-fourth of the optimal
social welfare for the class of subadditive valuations. We also show much
stronger results for several popular sub-classes: anonymous, fractionally
subadditive, and submodular valuations, as well as provide new approximation
algorithms for welfare maximization with anonymous functions. Finally, we
establish a connection between our setting and the well-studied simultaneous
auctions with item bidding; we adapt our results to compute approximate pure
Nash equilibria for these auctions.
Comment: Under review.
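The core-stability condition in this model can be made concrete with a brute-force check: a payoff vector is blocked if some set of agents could move to some project and generate more value there than their total current payoff. This exponential-time sketch, with made-up valuations, is an illustration of the stability condition only, not the paper's black-box mechanism:

```python
# Brute-force sketch of core stability in the coalition-formation model above:
# agents are assigned to a fixed set of projects with valuations v_k(S), and a
# payoff vector is (roughly) core stable if no agent set S and project k
# satisfy v_k(S) > total payoff of S.  Valuations below are made up.
from itertools import chain, combinations

def subsets(agents):
    return chain.from_iterable(
        combinations(agents, r) for r in range(1, len(agents) + 1))

def blocking_coalition(v, payoff, agents):
    """Return (S, k) with v[k](S) > sum of payoffs in S, or None if core stable."""
    for S in subsets(agents):
        paid = sum(payoff[i] for i in S)
        for k in range(len(v)):
            if v[k](frozenset(S)) > paid + 1e-9:
                return S, k
    return None

# Two projects with simple anonymous valuations over three agents.
v = [lambda S: 2.0 * len(S),                   # project 0: additive value
     lambda S: 3.0 if len(S) >= 2 else 0.0]    # project 1: needs a pair
agents = [0, 1, 2]
payoff = {0: 2.0, 1: 2.0, 2: 2.0}  # each agent paid her project-0 contribution
print(blocking_coalition(v, payoff, agents))  # None: no profitable deviation
```

With finitely many non-identical projects this check is exponential, which is why reducing core computation to plain welfare maximization, as the paper does, is the interesting step.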
Role of cardiac energetics in aortic stenosis disease progression: identifying the high-risk metabolic phenotype
Background: Severe aortic stenosis (AS) is associated with left ventricular (LV) hypertrophy and cardiac metabolic alterations with evidence of steatosis and impaired myocardial energetics. Despite this common phenotype, there is an unexplained and wide individual heterogeneity in the degree of hypertrophy and progression to myocardial fibrosis and heart failure. We sought to determine whether the cardiac metabolic state may underpin this variability.
Methods: We recruited 74 asymptomatic participants with AS and 13 healthy volunteers. Cardiac energetics were measured using phosphorus spectroscopy to define the myocardial phosphocreatine to adenosine triphosphate ratio. Myocardial lipid content was determined using proton spectroscopy. Cardiac function was assessed by cardiovascular magnetic resonance cine imaging.
Results: Phosphocreatine/adenosine triphosphate was reduced early and significantly across the LV wall thickness quartiles (Q2, 1.50 [1.21–1.71] versus Q1, 1.64 [1.53–1.94]) with a progressive decline with increasing disease severity (Q4, 1.48 [1.18–1.70]; P=0.02). Myocardial triglyceride content levels were overall higher in all the quartiles with a significant increase seen across the AV pressure gradient quartiles (Q2, 1.36 [0.86–1.98] versus Q1, 1.03 [0.81–1.56]; P=0.034). While all AS groups had evidence of subclinical LV dysfunction with impaired strain parameters, impaired systolic longitudinal strain was related to the degree of energetic impairment (r=0.219; P=0.03). Phosphocreatine/adenosine triphosphate was not only an independent predictor of LV wall thickness (r=−0.20; P=0.04) but also strongly associated with myocardial fibrosis (r=−0.24; P=0.03), suggesting that metabolic changes play a role in disease progression. The metabolic and functional parameters showed comparable results when graded by clinical severity of AS.
Conclusions: A gradient of myocardial energetic deficit and steatosis exists across the spectrum of hypertrophied AS hearts, and these metabolic changes precede irreversible LV remodeling and subclinical dysfunction. As such, cardiac metabolism may play an important and potentially causal role in disease progression.