
### Computability of Julia sets

In this paper we settle most of the open questions on algorithmic
computability of Julia sets. In particular, we present an algorithm for
constructing quadratics whose Julia sets are uncomputable. We also show that a
filled Julia set of a polynomial is always computable.

Comment: Revised. To appear in Moscow Math. Journal.
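
The abstract does not spell out the algorithm, but the standard escape-time criterion gives a feel for why filled Julia sets of polynomials admit computable approximations: for $f_c(z) = z^2 + c$ there is an explicit radius beyond which every orbit escapes to infinity, so points whose orbits survive many iterations approximate the filled Julia set. A minimal sketch (my own illustration, not the paper's construction; the function names are hypothetical):

```python
# Escape-time approximation of the filled Julia set K(f_c) for f_c(z) = z^2 + c.
# Any orbit leaving |z| > R = (1 + sqrt(1 + 4|c|)) / 2 escapes to infinity,
# so points that survive many iterations inside that disk approximate K(f_c).

def in_filled_julia(z: complex, c: complex, max_iter: int = 200) -> bool:
    """True if z has not escaped after max_iter iterations of z -> z^2 + c."""
    r = (1 + (1 + 4 * abs(c)) ** 0.5) / 2  # escape radius for z^2 + c
    for _ in range(max_iter):
        if abs(z) > r:
            return False
        z = z * z + c
    return True

def filled_julia_grid(c: complex, n: int = 41, bound: float = 2.0):
    """Sample an n x n grid of [-bound, bound]^2 and mark approximate membership."""
    step = 2 * bound / (n - 1)
    return [[in_filled_julia(complex(-bound + j * step, -bound + i * step), c)
             for j in range(n)] for i in range(n)]

grid = filled_julia_grid(c=-1.0)  # c = -1: the "basilica" Julia set
print(sum(row.count(True) for row in grid), "of", 41 * 41, "sample points survive")
```

Note that this only yields pointwise approximation under a fixed iteration budget; the paper's point is precisely about when such approximations can be made rigorous with guaranteed precision.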

### The Role of Randomness and Noise in Strategic Classification

We investigate the problem of designing optimal classifiers in the strategic
classification setting, where the classification is part of a game in which
players can modify their features to attain a favorable classification outcome
(while incurring some cost). Previously, the problem has been considered from a
learning-theoretic perspective and from the algorithmic fairness perspective.
Our main contributions include:

1. Showing that if the objective is to maximize the efficiency of the
classification process (defined as the accuracy of the outcome minus the sunk
cost incurred by qualified players who manipulate their features to gain a
better outcome), then randomized classifiers (ones where the probability that a
given feature vector is accepted is strictly between 0 and 1) are necessary.

2. Showing that in many natural cases, the optimal solution (in terms of
efficiency) has a structure in which players never change their feature
vectors: the randomized classifier is arranged so that the gain in the
probability of being classified as a 1 does not justify the expense of changing
one's features.

3. Observing that randomized classification is not a stable best response from
the classifier's viewpoint, and that the classifier cannot benefit from
randomization without creating instability in the system.

4. Showing that in some cases, a noisier signal leads to better equilibrium
outcomes, improving both accuracy and fairness when multiple subpopulations
with different feature-adjustment costs are involved. This is interesting from
a policy perspective: it is hard to force institutions to stick to a particular
randomized classification strategy (especially in a market with multiple
classifiers), but it is possible to alter the information environment so that
the feature signals are inherently noisier.

Comment: 22 pages. Appeared in FORC, 202
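
The contrast between deterministic and randomized classifiers in point 1 can be illustrated with a toy best-response computation (my own simplification, not the paper's model; all names and the specific cost function are hypothetical). A player at feature value `x` with unit value for acceptance and a linear cost of inflating its feature best-responds to an acceptance-probability curve `p`:

```python
# Toy strategic-classification sketch: a player at feature x, with unit value
# for acceptance and cost cost(x, x') for moving to x', best-responds to a
# classifier given by an acceptance probability p(x').

def best_response(x, p, cost, grid):
    """Feature value x' on the grid maximizing acceptance probability minus cost."""
    return max(grid, key=lambda x2: p(x2) - cost(x, x2))

grid = [i / 100 for i in range(101)]           # feature space [0, 1]
cost = lambda x, x2: 2.0 * max(0.0, x2 - x)    # cost 2 per unit of feature inflation

# Deterministic threshold at 0.5: players below the threshold inflate to reach it.
det = lambda x2: 1.0 if x2 >= 0.5 else 0.0
print(best_response(0.2, det, cost, grid))     # moves to 0.5 (gain 1 > cost 0.6)

# Randomized, gradually increasing classifier: slope 0.8 < marginal cost 2, so
# the gain in acceptance probability never justifies changing one's features.
rand = lambda x2: min(1.0, 0.8 * x2)
print(best_response(0.2, rand, cost, grid))    # stays at 0.2
```

The second classifier exhibits the structure described in point 2: because the acceptance probability rises more slowly than the cost of movement, no player changes its feature vector in equilibrium.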

### Simulating Noisy Channel Interaction

We show that $T$ rounds of interaction over the binary symmetric channel
$BSC_{1/2-\epsilon}$ with feedback can be simulated with $O(\epsilon^2 T)$
rounds of interaction over a noiseless channel. We also introduce a more
general "energy cost" model of interaction over a noisy channel. We show energy
cost to be equivalent to external information complexity, which implies that
our simulation results are unlikely to carry over to energy complexity. Our
main technical innovation is a self-reduction from simulating a noisy channel
to simulating a slightly-less-noisy channel, which may have other applications
in the area of interactive compression.
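
A quick numerical sanity check of the intuition behind the $O(\epsilon^2 T)$ bound (my own back-of-the-envelope illustration, not part of the paper): each use of $BSC_{1/2-\epsilon}$ has Shannon capacity $1 - h(1/2 - \epsilon)$, which scales as $\Theta(\epsilon^2)$, so $T$ noisy rounds carry only $O(\epsilon^2 T)$ bits of information.

```python
# Capacity of BSC_{1/2 - eps} is 1 - h(1/2 - eps), which behaves like
# (2 / ln 2) * eps^2 as eps -> 0 -- the source of the eps^2 factor.

from math import log2

def binary_entropy(p: float) -> float:
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(eps: float) -> float:
    """Capacity of the binary symmetric channel with crossover probability 1/2 - eps."""
    return 1.0 - binary_entropy(0.5 - eps)

for eps in (0.1, 0.05, 0.025):
    # capacity / eps^2 should approach the constant 2 / ln 2 ~ 2.885 as eps -> 0
    print(eps, bsc_capacity(eps) / eps ** 2)
```

Of course, the simulation result itself is about interactive protocols, not one-way capacity; the calculation above only motivates why $\epsilon^2$ is the natural scale.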

### The Price of Uncertain Priors in Source Coding

We consider the problem of one-way communication when the recipient does not
know exactly the distribution that the messages are drawn from, but has a
"prior" distribution that is known to be close to the source distribution, a
problem first considered by Juba et al. We consider the question of how much
longer the messages need to be in order to cope with the uncertainty about the
receiver's prior and the source distribution, respectively, as compared to the
standard source coding problem. We consider two variants of this uncertain
priors problem: the original setting of Juba et al. in which the receiver is
required to correctly recover the message with probability 1, and a setting
introduced by Haramaty and Sudan, in which the receiver is permitted to fail
with some probability $\epsilon$. In both settings, we obtain lower bounds that
are tight up to logarithmically smaller terms. In the latter setting, we
furthermore present a variant of the coding scheme of Juba et al. with an
overhead of $\log\alpha+\log 1/\epsilon+1$ bits, thus also establishing the
nearly tight upper bound.

Comment: To appear in IEEE Transactions on Information Theory.

### Information complexity is computable

The information complexity of a function $f$ is the minimum amount of
information Alice and Bob need to exchange to compute the function $f$. In this
paper we provide an algorithm for approximating the information complexity of
an arbitrary function $f$ to within any additive error $\alpha > 0$, thus
resolving an open question as to whether information complexity is computable.
In the process, we give the first explicit upper bound on the rate of
convergence of the information complexity of $f$ when restricted to $b$-bit
protocols to the (unrestricted) information complexity of $f$.

Comment: 30 pages.
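
To make the quantity being approximated concrete, here is a small numerical example (my own illustration, not the paper's algorithm): the internal information cost of a protocol $\pi$ on inputs $(X, Y) \sim \mu$ is $I(\Pi; X \mid Y) + I(\Pi; Y \mid X)$. For the trivial one-message protocol "Alice sends $X$", this equals $H(X \mid Y)$:

```python
# Internal information cost of the protocol "Alice sends X" on correlated inputs.

from collections import defaultdict
from math import log2

def cond_mutual_info(joint):
    """I(A; B | C) computed from a dict {(a, b, c): prob}."""
    pc, pac, pbc = defaultdict(float), defaultdict(float), defaultdict(float)
    for (a, b, c), prob in joint.items():
        pc[c] += prob; pac[a, c] += prob; pbc[b, c] += prob
    return sum(prob * log2(prob * pc[c] / (pac[a, c] * pbc[b, c]))
               for (a, b, c), prob in joint.items() if prob > 0)

# Correlated inputs: X = Y with probability 0.8.
mu = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
transcript = {(x, y, x): prob for (x, y), prob in mu.items()}  # (X, Y, Pi=X)

ic = (cond_mutual_info({(pi, x, y): prob for (x, y, pi), prob in transcript.items()})    # I(Pi; X | Y)
      + cond_mutual_info({(pi, y, x): prob for (x, y, pi), prob in transcript.items()}))  # I(Pi; Y | X)
print(round(ic, 4))  # H(X|Y) = h(0.8) ~ 0.7219: Pi = X reveals X to Bob and nothing to Alice
```

The hard part of the computability question is the infimum over all protocols of unbounded length; the paper's contribution is bounding how fast $b$-bit protocols converge to that infimum.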
