
    Expanded Parts Model for Semantic Description of Humans in Still Images

    We introduce an Expanded Parts Model (EPM) for recognizing human attributes (e.g. young, short hair, wearing suit) and actions (e.g. running, jumping) in still images. An EPM is a collection of part templates which are learnt discriminatively to explain specific scale-space regions in the images (in human-centric coordinates). This is in contrast to current models which consist of relatively few (i.e. a mixture of) 'average' templates. EPM uses only a subset of the parts to score an image and scores the image sparsely in space, i.e. it ignores redundant and random background in an image. To learn our model, we propose an algorithm which automatically mines parts and learns the corresponding discriminative templates together with their respective locations from a large number of candidate parts. We validate our method on three recent challenging datasets of human attributes and actions. We obtain convincing qualitative and state-of-the-art quantitative results on the three datasets. Comment: Accepted for publication in IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI).
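    The sparse scoring idea above can be sketched in a few lines: evaluate every part template on the image, then let only the strongest subset contribute to the final score. This is a minimal illustration, assuming per-part responses are already computed; the function name and the fixed top-k selection are illustrative assumptions, not details from the paper.

    ```python
    import numpy as np

    def epm_score(part_responses, k):
        """Score an image using only its k strongest part responses.

        part_responses: 1-D array of per-part template scores for one image.
        k: size of the sparse subset of parts allowed to contribute.
        """
        top_k = np.sort(np.asarray(part_responses))[-k:]  # keep the k best parts
        return float(np.mean(top_k))  # weak/background parts are ignored

    print(epm_score([0.1, 0.9, -0.3, 0.7, 0.05], k=2))  # mean of 0.9 and 0.7
    ```

    Because the score depends only on the selected subset, regions whose templates respond weakly (typically background clutter) have no influence on the result.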

    Representation Learning by Learning to Count

    We introduce a novel method for representation learning that uses an artificial supervision signal based on counting visual primitives. This supervision signal is obtained from an equivariance relation, which does not require any manual annotation. We relate transformations of images to transformations of the representations. More specifically, we look for the representation that satisfies such a relation rather than the transformations that match a given representation. In this paper, we use two image transformations in the context of counting: scaling and tiling. The first transformation exploits the fact that the number of visual primitives should be invariant to scale. The second transformation allows us to equate the total number of visual primitives in each tile to that in the whole image. These two transformations are combined in one constraint and used to train a neural network with a contrastive loss. The proposed task produces representations that perform on par with or exceed the state of the art in transfer learning benchmarks. Comment: ICCV 2017 (oral).
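    The counting constraint described above can be written as a small loss function: a feature map should assign the whole image the same "primitive counts" as the sum over its tiles, while staying far from the counts of an unrelated image. This is a hedged sketch; the feature map `phi`, the squared-distance form, and the margin value are illustrative assumptions, not the paper's exact formulation.

    ```python
    import numpy as np

    def counting_loss(phi, whole, tiles, other, margin=1.0):
        """Counting equivariance term plus a contrastive term.

        phi: maps an image (array) to a vector of non-negative primitive counts.
        whole: the full image; tiles: a partition of it; other: an unrelated image.
        """
        c_whole = phi(whole)                      # counts for the whole image
        c_tiles = sum(phi(t) for t in tiles)      # summed counts over the tiles
        match = np.sum((c_whole - c_tiles) ** 2)  # equivariance (counting) term
        push = max(0.0, margin - np.sum((phi(other) - c_tiles) ** 2))  # contrastive
        return match + push
    ```

    With a trivially "counting" feature such as the per-image pixel sum, the loss is exactly zero when the tiles partition the image and the unrelated image has very different counts, which is the behavior the constraint rewards.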

    A Novel Adaptive Spectrum Noise Cancellation Approach for Enhancing Heartbeat Rate Monitoring in a Wearable Device

    This paper presents a novel approach, Adaptive Spectrum Noise Cancellation (ASNC), for motion artifact removal in Photoplethysmography (PPG) signals measured by an optical biosensor, to obtain clean PPG waveforms for heartbeat rate calculation. One challenge faced by this optical sensing method is the inevitable noise induced by movement when the user is in motion, especially when the motion frequency is very close to the target heartbeat rate. The proposed ASNC utilizes the onboard accelerometer and gyroscope sensors to detect and remove the artifacts adaptively, thus obtaining accurate heartbeat rate measurements while in motion. The ASNC algorithm makes use of a commonly accepted spectrum analysis approach in medical digital signal processing, the discrete cosine transform, to carry out frequency-domain analysis. Results obtained by the proposed ASNC have been compared to two classic algorithms, adaptive threshold peak detection and adaptive noise cancellation. The mean (standard deviation) absolute error and mean relative error of the heartbeat rate calculated by ASNC are 0.33 (0.57) beats·min⁻¹ and 0.65%, by the adaptive threshold peak detection algorithm 2.29 (2.21) beats·min⁻¹ and 8.38%, and by the adaptive noise cancellation algorithm 1.70 (1.50) beats·min⁻¹ and 2.02%. While all algorithms performed well with both simulated PPG data and clean PPG data collected from our Verity device in situations free of motion artifacts, ASNC provided better accuracy as motion artifacts increased, especially when the motion frequency was very close to the heartbeat rate.
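    The core idea can be sketched compactly: compute a frequency spectrum of the PPG signal (here a direct O(N²) DCT-II, matching the abstract's choice of transform), zero out the band around the motion frequency reported by the inertial sensors, and read the heartbeat rate off the remaining spectral peak. The function names, notch width, and 0.7 Hz physiological floor are assumptions for illustration, not the paper's parameters.

    ```python
    import numpy as np

    def dct_ii(x):
        """Direct DCT-II (unnormalized); fine for a short sketch."""
        n = np.arange(len(x))
        return np.array([np.sum(x * np.cos(np.pi * k * (2 * n + 1) / (2 * len(x))))
                         for k in range(len(x))])

    def asnc_heart_rate(ppg, motion_hz, fs, notch_hz=0.2):
        ppg = np.asarray(ppg, dtype=float)
        spec = np.abs(dct_ii(ppg - ppg.mean()))
        freqs = np.arange(len(ppg)) * fs / (2.0 * len(ppg))  # DCT bin frequencies
        spec[np.abs(freqs - motion_hz) < notch_hz] = 0.0  # suppress motion band
        spec[freqs < 0.7] = 0.0  # ignore sub-physiological frequencies (< 42 bpm)
        return freqs[np.argmax(spec)] * 60.0  # dominant frequency in beats/min

    # 20 s of synthetic PPG at 25 Hz: a 1.5 Hz heartbeat plus stronger 2 Hz motion.
    t = np.arange(500) / 25.0
    ppg = np.cos(2 * np.pi * 1.5 * t) + 2.0 * np.cos(2 * np.pi * 2.0 * t)
    print(asnc_heart_rate(ppg, motion_hz=2.0, fs=25.0))  # ≈ 90 beats/min
    ```

    Without the notch, the stronger 2 Hz motion component would dominate the spectrum and the estimate would be pulled toward 120 bpm; this is exactly the near-heartbeat-frequency failure mode the abstract describes.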

    Projection operator formalism and entropy

    The entropy definition is deduced by (re)deriving the generalized non-linear Langevin equation using the Zwanzig projection operator formalism. It is shown to be necessarily related to an invariant measure which, in classical mechanics, can always be taken to be the Liouville measure. It is not true that one is free to choose a "relevant" probability density independently, as is done in other flavors of projection operator formalism. This observation induces an entropy expression which is valid also outside the thermodynamic limit and in far-from-equilibrium situations. The Zwanzig projection operator formalism therefore gives a deductive derivation of non-equilibrium, and equilibrium, thermodynamics. The entropy definition found is closely related to the (generalized) microcanonical Boltzmann-Planck definition, but with some subtle differences. No "shell thickness" arguments are needed, nor desirable, for a rigorous definition. The entropy expression depends on the choice of macroscopic variables and does not exactly transform as a scalar quantity. The relation with expressions used in the GENERIC formalism is discussed.
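    For orientation, the generalized microcanonical Boltzmann-Planck form that the abstract compares against can be written as follows; the symbols here (macroscopic variables $A$, values $a$, Liouville measure $d\Gamma$) are a standard textbook choice, not the paper's notation, and the abstract's point is that the rigorous definition needs no shell-thickness regularization of the delta function.

    ```latex
    S(a) = k_B \ln \Omega(a), \qquad
    \Omega(a) = \int \delta\bigl(A(z) - a\bigr)\, d\Gamma(z)
    ```

    The dependence on the chosen set of macroscopic variables $A$ is visible directly in this expression, consistent with the abstract's remark that the entropy does not transform exactly as a scalar.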