
    Beyond Disagreement-based Agnostic Active Learning

    We study agnostic active learning, where the goal is to learn a classifier in a pre-specified hypothesis class interactively with as few label queries as possible, while making no assumptions on the true function generating the labels. The main algorithms for this problem are disagreement-based active learning, which has a high label requirement, and margin-based active learning, which only applies to fairly restricted settings. A major challenge is to find an algorithm which achieves better label complexity, is consistent in an agnostic setting, and applies to general classification problems. In this paper, we provide such an algorithm. Our solution is based on two novel contributions: a reduction from consistent active learning to confidence-rated prediction with guaranteed error, and a novel confidence-rated predictor.
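    To make the disagreement-based strategy that this abstract contrasts with concrete, here is a minimal CAL-style sketch for one-dimensional threshold classifiers in the realizable case. The threshold class, function names, and budget are illustrative assumptions, not the paper's construction; the point is only that labels are queried solely inside the region where consistent hypotheses disagree.

    ```python
    import random

    def cal_threshold_learner(xs, oracle, budget):
        """Disagreement-based (CAL-style) sketch for 1-D thresholds
        h_t(x) = 1[x >= t], realizable case.  The version space of
        consistent thresholds is an interval (lo, hi]; a label is
        queried only when x lands in the disagreement region."""
        lo, hi = 0.0, 1.0
        queries = 0
        for x in xs:
            if lo <= x < hi and queries < budget:
                queries += 1
                if oracle(x):          # label 1 => true threshold t <= x
                    hi = min(hi, x)
                else:                  # label 0 => true threshold t > x
                    lo = max(lo, x)
        return (lo + hi) / 2, queries

    random.seed(0)
    true_t = 0.37                      # hypothetical ground-truth threshold
    pool = [random.random() for _ in range(2000)]
    t_hat, used = cal_threshold_learner(pool, lambda x: x >= true_t, budget=50)
    ```

    Because the disagreement interval shrinks geometrically, the number of queried labels grows only logarithmically in the pool size here; the abstract's point is that this favorable behavior degrades badly once agnostic noise is allowed.
    
    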

    Dynamic behavior of flexible rectangular fluid containers with time varying fluid

    As fuel is consumed, a fuel-container system vibrates with decreasing mass, a typical variable-mass system. This paper investigates the dynamic characteristics of flexible rectangular fluid containers with decreasing liquid. The dynamic equations of a container with time-varying liquid are derived by combining the finite element method (FEM) and the virtual mass method (VMM). Free-vibration states of the variable-mass system are the main focus. The vibration signals are decomposed using the Choi-Williams distribution, and the energy density spectrum is obtained by time-frequency analysis. Results show that the decrease of liquid increases the vibration frequencies of the system and generates an additional negative damping that slows the vibration decay. The additional damping is found to be proportional to the rate of mass change. This negative damping can cause the system to vibrate with increasing amplitude when it dominates the structural damping.
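    The negative-damping mechanism described above can be illustrated on a single-degree-of-freedom variable-mass oscillator, a deliberately simplified stand-in for the paper's FEM/VMM model. Writing the momentum balance d/dt(m x') = -c x' - k x gives m x'' + (c + m') x' + k x = 0, so a mass-loss rate m' < 0 contributes a damping term proportional to the rate of mass change; when |m'| exceeds the structural damping c, the net damping is negative and the amplitude grows. All parameter values below are illustrative assumptions.

    ```python
    def simulate(m0, mdot, c, k, x0, v0, dt=1e-4, t_end=5.0):
        """Symplectic-Euler simulation of a single-DOF oscillator losing
        mass at constant rate mdot.  Effective damping is c + mdot, so
        mdot < -c makes the net damping negative (growing amplitude)."""
        m, x, v, t = m0, x0, v0, 0.0
        peak = abs(x0)
        while t < t_end and m > 0.2 * m0:
            a = -((c + mdot) * v + k * x) / m   # from d/dt(m v) = -c v - k x
            v += a * dt
            x += v * dt                          # update x with the new v
            m += mdot * dt
            t += dt
            peak = max(peak, abs(x))
        return peak

    # Mass loss faster than structural damping: amplitude grows.
    grow = simulate(m0=1.0, mdot=-0.05, c=0.01, k=100.0, x0=0.01, v0=0.0)
    # Constant mass: the same structural damping makes the motion decay.
    decay = simulate(m0=1.0, mdot=0.0, c=0.01, k=100.0, x0=0.01, v0=0.0)
    ```

    Comparing the two runs shows the peak displacement exceeding the initial amplitude only in the variable-mass case, matching the abstract's observation that the mass-loss-induced damping can overpower the structural damping.
    
    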

    Efficient Active Learning Halfspaces with Tsybakov Noise: A Non-convex Optimization Approach

    We study the problem of computationally and label-efficient PAC active learning of d-dimensional halfspaces with Tsybakov noise (Tsybakov, 2004) under structured unlabeled data distributions. Inspired by Diakonikolas et al. (2020), we prove that any approximate first-order stationary point of a smooth nonconvex loss function yields a halfspace with a low excess-error guarantee. In light of this structural result, we design a nonconvex-optimization-based algorithm with a label complexity of $\tilde{O}(d (1/\epsilon)^{(8-6\alpha)/(3\alpha-1)})$, where $\tilde{O}(\cdot)$ and $\tilde{\Theta}(\cdot)$ hide factors polylogarithmic in $d$, $1/\epsilon$, and $1/\delta$, under the assumption that the Tsybakov noise parameter $\alpha \in (1/3, 1]$. This narrows the gap between the label complexities of previously known efficient passive or active algorithms (Diakonikolas et al., 2020; Zhang and Li, 2021) and the information-theoretic lower bound in this setting. Comment: 29 pages.
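    A quick way to read the quoted bound is to evaluate the exponent of $1/\epsilon$ across the admissible noise range. The helper below is a small sketch (the function name is ours, not the paper's); it shows the exponent falling to 1 in the low-noise extreme $\alpha = 1$ and blowing up as $\alpha$ approaches $1/3$, where the algorithm's guarantee ceases to apply.

    ```python
    def label_exponent(alpha):
        """Exponent of 1/epsilon in the quoted label-complexity bound
        tilde-O(d * (1/eps)^((8-6*alpha)/(3*alpha-1))),
        valid for Tsybakov noise parameter alpha in (1/3, 1]."""
        assert 1 / 3 < alpha <= 1
        return (8 - 6 * alpha) / (3 * alpha - 1)

    for a in (0.4, 0.5, 0.75, 1.0):
        print(f"alpha={a}: label complexity ~ d * (1/eps)^{label_exponent(a):.2f}")
    ```
    
    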