A Proximal Stochastic Gradient Method with Progressive Variance Reduction
We consider the problem of minimizing the sum of two convex functions: one is
the average of a large number of smooth component functions, and the other is a
general convex function that admits a simple proximal mapping. We assume the
whole objective function is strongly convex. Such problems often arise in
machine learning, known as regularized empirical risk minimization. We propose
and analyze a new proximal stochastic gradient method, which uses a multi-stage
scheme to progressively reduce the variance of the stochastic gradient. While
each iteration of this algorithm has a cost similar to that of the classical stochastic
gradient method (or incremental gradient method), we show that the expected
objective value converges to the optimum at a geometric rate. The overall
complexity of this method is much lower than that of both the proximal full gradient
method and the standard proximal stochastic gradient method.
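The multi-stage scheme described above can be sketched as follows: each stage computes a full gradient at a snapshot point, and the inner loop corrects stochastic gradients against that snapshot before applying the proximal step. This is a minimal illustrative sketch only, shown on L1-regularized least squares; all function names, step sizes, and loop lengths are assumptions, not taken from the paper.

```python
import numpy as np

def prox_l1(x, t):
    """Proximal mapping of t * ||x||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_svrg(A, b, lam=0.1, eta=0.01, n_stages=30, m=None, seed=0):
    """Multi-stage proximal stochastic gradient sketch for
    min_x  (1/2n) ||Ax - b||^2 + lam * ||x||_1."""
    n, d = A.shape
    m = m or 2 * n                      # inner-loop length per stage (assumed)
    rng = np.random.default_rng(seed)
    x_tilde = np.zeros(d)
    for _ in range(n_stages):
        # Full gradient at the stage snapshot point.
        mu = A.T @ (A @ x_tilde - b) / n
        x = x_tilde.copy()
        acc = np.zeros(d)
        for _ in range(m):
            i = rng.integers(n)
            a = A[i]
            # Variance-reduced stochastic gradient: component gradient at x,
            # corrected by the same component at the snapshot plus mu.
            v = a * (a @ x - b[i]) - a * (a @ x_tilde - b[i]) + mu
            x = prox_l1(x - eta * v, eta * lam)
            acc += x
        x_tilde = acc / m               # next snapshot: average of inner iterates
    return x_tilde
```

As the abstract notes, the correction term drives the variance of the stochastic gradient to zero as the snapshot approaches the optimum, which is what allows a constant step size and a geometric rate.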
The Exhaustion of Unemployment Benefits in Belgium. Does it Enhance the Probability of Employment?
In Belgium, unemployment insurance benefits can expire for only one category of workers: partners of workers with (replacement) labour income (mostly women) may lose their entitlement after an unemployment duration ranging from two to eight years, depending on individual characteristics. We contrast three propensity score matching estimators of the impact of benefit exhaustion on the probability of employment: a standard, a before-after, and an IV matching estimator. We conclude that benefit expiration is anticipated from the moment the worker is notified, three months in advance, and that it gradually increases the employment rate by up to 25 percentage points 14 months after benefit withdrawal.
Keywords: unemployment insurance, benefit exhaustion, programme evaluation, before-after estimator, nonparametric methods
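The standard estimator contrasted above pairs each treated unit with a control unit of similar estimated propensity score. A minimal sketch of that idea, assuming propensity scores have already been estimated (the estimator names, matching rule, and function signature here are illustrative assumptions, not the authors' specification):

```python
import numpy as np

def att_psm(scores, treated, outcome):
    """One-to-one nearest-neighbour propensity score matching estimate of
    the average treatment effect on the treated (ATT).

    scores  : estimated propensity scores, shape (n,)
    treated : boolean treatment indicator, shape (n,)
    outcome : observed outcome, shape (n,)
    """
    t_idx = np.where(treated)[0]
    c_idx = np.where(~treated)[0]
    effects = []
    for i in t_idx:
        # Match each treated unit to the control with the closest score.
        j = c_idx[np.argmin(np.abs(scores[c_idx] - scores[i]))]
        effects.append(outcome[i] - outcome[j])
    return float(np.mean(effects))
```

The before-after and IV variants mentioned in the abstract would replace the simple outcome difference with a difference of outcome changes, or instrument the treatment, respectively; those extensions are not sketched here.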
Selective sampling importance resampling particle filter tracking with multibag subspace restoration