Learning DNF Expressions from Fourier Spectrum
Since its introduction by Valiant in 1984, PAC learning of DNF expressions
remains one of the central problems in learning theory. We consider this
problem in the setting where the underlying distribution is uniform, or more
generally, a product distribution. Kalai, Samorodnitsky and Teng (2009) showed
that in this setting a DNF expression can be efficiently approximated from its
"heavy" low-degree Fourier coefficients alone. This is in contrast to previous
approaches where boosting was used and thus Fourier coefficients of the target
function modified by various distributions were needed. This property is
crucial for learning DNF expressions over smoothed product distributions, a
learning model introduced by Kalai et al. (2009) and inspired by the seminal
smoothed analysis model of Spielman and Teng (2001).
We introduce a new approach to learning (or approximating) polynomial
threshold functions, based on creating a function with range [-1,1]
that approximately agrees with the unknown function on low-degree Fourier
coefficients. We then describe conditions under which this is sufficient for
learning polynomial threshold functions. Our approach yields a new, simple
algorithm for approximating any polynomial-size DNF expression from its "heavy"
low-degree Fourier coefficients alone. Our algorithm greatly simplifies the
proof of learnability of DNF expressions over smoothed product distributions.
We also describe an application of our algorithm to learning monotone DNF
expressions over product distributions. Building on the work of Servedio
(2001), we give an algorithm that runs in time \poly((s \cdot
\log{(s/\eps)})^{\log{(s/\eps)}}, n), where s is the size of the target DNF
expression and \eps is the accuracy. This improves on the \poly((s \cdot
\log{(ns/\eps)})^{\log{(s/\eps)} \cdot \log{(1/\eps)}}, n) bound of Servedio
(2001).
Comment: Appears in Conference on Learning Theory (COLT) 201
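The "heavy low-degree Fourier coefficients" central to this abstract can be made concrete with a small sketch. This is not the paper's algorithm, only a minimal illustration under the standard {-1,1} encoding of Boolean functions; the sample size and the threshold `theta` are arbitrary choices for demonstration.

```python
import itertools
import random

# Illustrative sketch (not the paper's algorithm): every Boolean function
# f : {-1,1}^n -> {-1,1} has a Fourier expansion f(x) = sum_S hat{f}(S) chi_S(x),
# and each coefficient hat{f}(S) can be estimated from uniform samples.

def chi(S, x):
    """Character chi_S(x) = product of x_i over i in S."""
    p = 1
    for i in S:
        p *= x[i]
    return p

def estimate_coefficient(f, S, n, samples=20000):
    """Estimate hat{f}(S) = E_x[f(x) * chi_S(x)] under the uniform distribution."""
    total = 0
    for _ in range(samples):
        x = [random.choice((-1, 1)) for _ in range(n)]
        total += f(x) * chi(S, x)
    return total / samples

def heavy_low_degree_coefficients(f, n, degree, theta):
    """Collect estimated hat{f}(S) for |S| <= degree with |hat{f}(S)| >= theta."""
    heavy = {}
    for d in range(degree + 1):
        for S in itertools.combinations(range(n), d):
            c = estimate_coefficient(f, S, n)
            if abs(c) >= theta:
                heavy[S] = c
    return heavy

# Example: a single-term DNF, the AND of x0 and x1 (with -1 encoding True).
# Its exact coefficients are 1/2 on {}, {0}, {1} and -1/2 on {0,1}.
def f_and(x):
    return -1 if x[0] == -1 and x[1] == -1 else 1

coeffs = heavy_low_degree_coefficients(f_and, n=4, degree=2, theta=0.2)
```

For polynomial-size DNF expressions, classical Fourier results place most of the spectral weight on low-degree coefficients, which is why collecting only the heavy low-degree ones can suffice for approximation.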
Almost Automorphic and Almost Periodic Dynamics in Skew-Product Semiflows
AMS(MOS) subject classifications: 34C27, 34D05, 35B15, 35B40, 35K57, 54H20.
The current series of papers, which consists of three parts, is devoted to the study of almost automorphic dynamics in differential equations. By making use of techniques from abstract topological dynamics, we show that almost automorphy, a notion introduced by S. Bochner in 1955, is essential and fundamental in the qualitative study of almost periodic differential equations. Fundamental notions from topological dynamics are introduced in the first part. Harmonic properties of almost automorphic functions, such as Fourier series
and frequency module are studied. A module containment result is provided.
In the second part, we study lifting dynamics of ω-limit sets and minimal sets
of a skew-product semiflow from an almost periodic minimal base flow. Skew-product
semiflows with (strongly) order-preserving or monotone natures on fibers
are given particular attention. It is proved that a linearly stable minimal set
must be almost automorphic, and that it becomes almost periodic if it is also uniformly stable. Other issues, such as flow extensions and the existence of almost periodic global attractors, are also studied.
The third part of the series deals with dynamics of almost periodic differential
equations. In this part, we apply the general theory developed in the previous
two parts to study almost automorphic and almost periodic dynamics which are lifted from certain coefficient structures (e.g., almost automorphic or almost
periodic) of differential equations. It is shown that (harmonic or subharmonic)
almost automorphic solutions exist for a large class of almost periodic ordinary,
parabolic and delay differential equations.
Partially supported by NSF grants DMS-9207069, DMS-9402945 and DMS-9501412.
Learning circuits with few negations
Monotone Boolean functions, and the monotone Boolean circuits that compute
them, have been intensively studied in complexity theory. In this paper we
study the structure of Boolean functions in terms of the minimum number of
negations in any circuit computing them, a complexity measure that interpolates
between monotone functions and the class of all functions. We study this
generalization of monotonicity from the vantage point of learning theory,
giving near-matching upper and lower bounds on the uniform-distribution
learnability of circuits in terms of the number of negations they contain. Our
upper bounds are based on a new structural characterization of negation-limited
circuits that extends a classical result of A. A. Markov. Our lower bounds,
which employ Fourier-analytic tools from hardness amplification, give new
results even for circuits with no negations (i.e., monotone functions).
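The classical result of A. A. Markov mentioned above relates the minimum number of negation gates needed to compute f to its "decrease complexity" d(f): the maximum number of times f drops from 1 to 0 along an increasing chain in {0,1}^n. The following brute-force sketch (feasible only for small n; the example functions are my own) computes d(f) by dynamic programming over the hypercube and applies Markov's bound.

```python
from math import ceil, log2

# Sketch of Markov's theorem: the minimum number of negations needed to
# compute f equals ceil(log2(d(f) + 1)), where d(f) is the maximum number
# of 1 -> 0 drops of f along any increasing chain of inputs.

def decrease_complexity(f, n):
    """d(f): max number of 1 -> 0 drops of f along an increasing chain."""
    best = {}  # bitmask -> max drops over chains ending at that point
    # Visit points in order of popcount so all predecessors are done first.
    for m in sorted(range(1 << n), key=lambda m: bin(m).count("1")):
        d = 0
        for i in range(n):
            if m & (1 << i):
                prev = m ^ (1 << i)  # clear bit i: an immediate predecessor
                drop = 1 if f(prev) == 1 and f(m) == 0 else 0
                d = max(d, best[prev] + drop)
        best[m] = d
    return best[(1 << n) - 1]

def min_negations(f, n):
    """Markov bound: negations needed = ceil(log2(d(f) + 1))."""
    return ceil(log2(decrease_complexity(f, n) + 1))

# Parity on 4 bits drops twice along a maximal chain (values 0,1,0,1,0),
# so d = 2 and two negations are required; majority is monotone, so zero.
def parity4(m):
    return bin(m).count("1") % 2

def maj3(m):
    return 1 if bin(m).count("1") >= 2 else 0

neg_parity = min_negations(parity4, 4)
neg_maj = min_negations(maj3, 3)
```

The structural characterization in the paper extends this picture: functions computable with t negations decompose in a way that makes them learnable roughly as products of 2^t monotone pieces.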
Optimal Bounds on Approximation of Submodular and XOS Functions by Juntas
We investigate the approximability of several classes of real-valued
functions by functions of a small number of variables ({\em juntas}). Our main
results are tight bounds on the number of variables required to approximate a
function f within \ell_2-error \eps over the uniform distribution:
1. If f is submodular, then it is \eps-close to a function of
O(\frac{1}{\eps^2} \log \frac{1}{\eps}) variables. This is an exponential
improvement over previously known results. We note that \Omega(1/\eps^2)
variables are necessary even for linear functions.
2. If f is fractionally subadditive (XOS), it is \eps-close to a function of
2^{O(1/\eps^2)} variables. This result holds for all functions with low total
\ell_1-influence and is a real-valued analogue of Friedgut's theorem for
boolean functions. We show that 2^{\Omega(1/\eps)} variables are necessary
even for XOS functions.
As applications of these results, we provide learning algorithms over the
uniform distribution. For XOS functions, we give a PAC learning algorithm that
runs in time 2^{poly(1/\eps)} poly(n). For submodular functions we give
an algorithm in the more demanding PMAC learning model (Balcan and Harvey,
2011) which requires a multiplicative (1+\gamma) factor approximation with
probability at least 1-\eps over the target distribution. Our uniform
distribution algorithm runs in time 2^{poly(1/(\gamma\eps))} poly(n).
This is the first algorithm in the PMAC model that over the uniform
distribution can achieve a constant approximation factor arbitrarily close to 1
for all submodular functions. As follows from the lower bounds in (Feldman et
al., 2013) both of these algorithms are close to optimal. We also give
applications for proper learning, testing and agnostic learning with value
queries of these classes.
Comment: Extended abstract appears in proceedings of FOCS 201
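One standard way to make the junta idea concrete (not the algorithm analyzed in the paper) is to estimate each variable's influence by sampling and restrict attention to the most influential variables, averaging the function over the rest. The function `f`, the sample sizes, and the choice of `k` below are all hypothetical illustration values.

```python
import random

# Illustrative junta-approximation sketch for a real-valued f : {0,1}^n -> [0,1]:
# (1) estimate each variable's L2-influence from uniform samples,
# (2) keep the k most influential variables and average f over the others.

def estimate_influence(f, i, n, samples=5000):
    """Estimate Inf_i(f) = E_x[(f(x) - f(x with bit i flipped))^2] / 4."""
    total = 0.0
    for _ in range(samples):
        x = [random.randint(0, 1) for _ in range(n)]
        y = list(x)
        y[i] ^= 1
        total += (f(x) - f(y)) ** 2
    return total / (4 * samples)

def junta_approximation(f, n, k, samples=2000):
    """Keep the k most influential variables; average f over the rest."""
    infl = [(estimate_influence(f, i, n), i) for i in range(n)]
    keep = sorted(i for _, i in sorted(infl, reverse=True)[:k])

    def g(x):
        # Monte Carlo average of f with the non-kept coordinates resampled.
        total = 0.0
        for _ in range(samples):
            z = [random.randint(0, 1) for _ in range(n)]
            for i in keep:
                z[i] = x[i]
            total += f(z)
        return total / samples

    return keep, g

# Example: f depends strongly on x0 and x1 and only weakly on x5.
def f(x):
    return 0.45 * x[0] + 0.45 * x[1] + 0.01 * x[5]

keep, g = junta_approximation(f, n=8, k=2)
```

The paper's results say how large k must be as a function of \eps for submodular and XOS targets; this sketch only shows the restrict-to-a-junta step itself.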
Bounds on the Average Sensitivity of Nested Canalizing Functions
Nested canalizing Boolean functions (NCFs) play an important role in
biologically motivated regulatory networks and in signal processing, in
particular in describing stack filters. It has been conjectured that NCFs have a
stabilizing effect on the network dynamics. It is well known that the average
sensitivity plays a central role for the stability of (random) Boolean
networks. Here we provide a tight upper bound on the average sensitivity for
NCFs as a function of the number of relevant input variables. As conjectured
in the literature, this bound is smaller than 4/3. This shows that a large
number of functions appearing in biological networks belong to a class that
has very low average sensitivity, which is even close to a tight lower bound.
Comment: revised submission to PLOS ONE
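For small numbers of variables, the average sensitivity discussed above can be computed exactly by exhaustive enumeration. The sketch below builds an NCF from its canalizing input values and canalized outputs (the specific values are an arbitrary example of mine, not taken from the paper) and checks the quantity directly.

```python
from itertools import product

# Illustrative sketch: construct a nested canalizing function (NCF) and compute
# its exact average sensitivity by enumerating the whole cube (small n only).

def make_ncf(a, b, default):
    """f(x) = b[i] for the first i with x[i] == a[i]; otherwise `default`."""
    def f(x):
        for ai, bi, xi in zip(a, b, x):
            if xi == ai:
                return bi
        return default
    return f

def average_sensitivity(f, n):
    """s(f) = E_x[ #{i : f(x) != f(x with bit i flipped)} ], x uniform."""
    total = 0
    for x in product((0, 1), repeat=n):
        fx = f(x)
        for i in range(n):
            y = list(x)
            y[i] ^= 1
            if f(y) != fx:
                total += 1
    return total / 2 ** n

# A 4-variable NCF; the last canalized output differs from `default`,
# which ensures all four listed variables are actually relevant.
f = make_ncf(a=(1, 0, 1, 0), b=(1, 0, 0, 1), default=0)
s = average_sensitivity(f, n=4)  # 1.25 for this example, below the 4/3 bound
```

The bound in the abstract says every NCF, regardless of which canalizing values are chosen, stays below 4/3 in this measure.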