No Internal Regret via Neighborhood Watch
We present an algorithm that attains O(\sqrt{T}) internal (and thus
external) regret for finite games with partial monitoring under the local
observability condition. Recently, this condition was shown by (Bartók,
Pál, and Szepesvári, 2011) to imply the O(\sqrt{T}) rate for partial-monitoring
games against an i.i.d. opponent, and the authors conjectured that the same
holds for non-stochastic adversaries. We answer this conjecture in the
affirmative, completing the characterization of possible rates for finite
partial-monitoring games, an open question posed by (Cesa-Bianchi, Lugosi, and
Stoltz, 2006). Our regret guarantees also hold for the more general model of
partial monitoring with random signals.
Fast Ridge Regression with Randomized Principal Component Analysis and Gradient Descent
We propose a new two-stage algorithm, LING, for large-scale regression
problems. LING has the same risk as the well-known Ridge Regression under the
fixed design setting and can be computed much faster. Our experiments show
that LING performs well in both prediction accuracy and computational
efficiency compared with other large-scale regression algorithms such as
Gradient Descent, Stochastic Gradient Descent, and Principal Component
Regression, on both simulated and real datasets.
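The abstract does not spell out LING's two stages, so the following is only a minimal sketch of the general idea it names, combining randomized principal component analysis with gradient descent for ridge regression: fit a ridge estimator restricted to an approximate top-k principal subspace (found with a randomized range finder), then refine the full-dimensional solution by gradient descent on the ridge objective. The function name `ling_sketch` and all parameter choices are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def ling_sketch(X, y, k=10, lam=1.0, n_iters=2000, lr=None, seed=0):
    """Hypothetical two-stage large-scale ridge regression sketch.

    Stage 1: ridge regression in the top-k principal subspace,
             approximated with a randomized range finder.
    Stage 2: gradient descent on the full ridge objective
             0.5*||Xw - y||^2 + 0.5*lam*||w||^2, warm-started from stage 1.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape

    # Stage 1: randomized approximation of the top-k right singular subspace.
    Omega = rng.standard_normal((d, k))
    Q, _ = np.linalg.qr(X @ Omega)           # orthonormal basis for range(X @ Omega)
    B = Q.T @ X                              # small (k x d) sketch of X
    _, _, Vt = np.linalg.svd(B, full_matrices=False)
    V = Vt.T                                 # d x k approximate principal directions

    Z = X @ V                                # projected design matrix, n x k
    w_pc = np.linalg.solve(Z.T @ Z + lam * np.eye(k), Z.T @ y)
    w = V @ w_pc                             # lift back to d dimensions

    # Stage 2: gradient descent on the full ridge objective.
    if lr is None:
        # Step size 1/L with L estimated by power iteration on X^T X.
        v = rng.standard_normal(d)
        for _ in range(20):
            v = X.T @ (X @ v)
            v /= np.linalg.norm(v)
        L = np.linalg.norm(X.T @ (X @ v)) + lam
        lr = 1.0 / L
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) + lam * w
        w -= lr * grad
    return w
```

With enough gradient steps the stage-2 iterate converges to the exact ridge solution, while stage 1 supplies a cheap warm start; the split is what makes the two-stage approach attractive when d is large.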
Calibration: Respice, Adspice, Prospice
“Those who claim for themselves to judge the truth are bound to possess a criterion of truth.” JEL Codes: C18, C53, D89. Keywords: calibration, prediction.