
    Visual Chunking: A List Prediction Framework for Region-Based Object Detection

    We consider detecting objects in an image by iteratively selecting from a set of arbitrarily shaped candidate regions. Our generic approach, which we term visual chunking, reasons about the locations of multiple object instances in an image while expressively describing object boundaries. We design an optimization criterion for measuring the performance of a list of such detections as a natural extension to a common per-instance metric. We present an efficient algorithm with provable performance for building a high-quality list of detections from any candidate set of region-based proposals. We also develop a simple class-specific algorithm to generate a candidate region instance in near-linear time in the number of low-level superpixels that outperforms other region generating methods. In order to make predictions on novel images at testing time without access to ground truth, we develop learning approaches to emulate these algorithms' behaviors. We demonstrate that our new approach outperforms sophisticated baselines on benchmark datasets. Comment: to appear at ICRA 201
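    The list-building step described above can be read as a greedy procedure: repeatedly append the candidate region that most improves a list-wide quality criterion. The sketch below illustrates that idea under assumed toy definitions (regions as sets of superpixel ids, intersection-over-union as the per-instance overlap, and a hypothetical score_list criterion); it is not the authors' algorithm, which additionally learns to emulate this behavior when ground truth is unavailable at test time.

```python
# Minimal sketch (assumptions, not the paper's implementation): greedy construction
# of a detection list from arbitrary candidate regions, where each step picks the
# candidate with the largest marginal gain under a list-quality criterion.

def score_list(selected, ground_truth_regions, overlap):
    """Toy list criterion: each ground-truth region is credited once, by its
    best-matching selected region (a per-instance metric extended to a list)."""
    total = 0.0
    for gt in ground_truth_regions:
        total += max((overlap(r, gt) for r in selected), default=0.0)
    return total

def greedy_detection_list(candidates, ground_truth_regions, overlap, budget):
    """Iteratively append the candidate region with the largest marginal gain."""
    selected = []
    remaining = list(candidates)
    for _ in range(budget):
        base = score_list(selected, ground_truth_regions, overlap)
        best, best_gain = None, 0.0
        for cand in remaining:
            gain = score_list(selected + [cand], ground_truth_regions, overlap) - base
            if gain > best_gain:
                best, best_gain = cand, gain
        if best is None:  # no candidate improves the list any further
            break
        selected.append(best)
        remaining.remove(best)
    return selected

if __name__ == "__main__":
    # Regions represented as sets of superpixel ids; overlap = intersection-over-union.
    iou = lambda a, b: len(a & b) / len(a | b)
    candidates = [frozenset({1, 2}), frozenset({2, 3, 4}), frozenset({5, 6}), frozenset({6, 7})]
    ground_truth = [frozenset({1, 2, 3}), frozenset({5, 6, 7})]
    print(greedy_detection_list(candidates, ground_truth, iou, budget=3))
```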

    Learning Anytime Predictions in Neural Networks via Adaptive Loss Balancing

    This work considers the trade-off between accuracy and test-time computational cost of deep neural networks (DNNs) via \emph{anytime} predictions from auxiliary predictors. Specifically, we optimize auxiliary losses jointly in an \emph{adaptive} weighted sum, where the weights are inversely proportional to the average of each loss. Intuitively, this balances the losses to have the same scale. We present theoretical considerations that motivate this approach from multiple viewpoints, including connecting it to optimizing the geometric mean of the expectation of each loss, an objective that ignores the scale of losses. Experimentally, the adaptive weights induce more competitive anytime predictions on multiple recognition datasets and models than non-adaptive approaches, including weighting all losses equally. In particular, anytime neural networks (ANNs) can achieve the same accuracy faster using adaptive weights on a small network than using static constant weights on a large one. For problems with high performance saturation, we also show that a sequence of exponentially deepening ANNs can achieve near-optimal anytime results at any budget, at the cost of a constant fraction of extra computation.
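    A minimal sketch of the adaptive weighting idea described above, assuming an exponential moving average as the running estimate of each loss (the decay and epsilon values are illustrative, not taken from the paper): each auxiliary loss is weighted by the inverse of its own average, so losses at very different scales contribute comparably to the joint objective.

```python
# Hedged sketch of adaptive loss balancing: weights inversely proportional to a
# running average of each auxiliary loss. The moving-average decay and epsilon
# are assumptions for illustration only.

class AdaptiveLossWeights:
    def __init__(self, num_losses, decay=0.99, eps=1e-8):
        self.avg = [1.0] * num_losses   # running average of each auxiliary loss
        self.decay = decay
        self.eps = eps

    def combine(self, losses):
        """Update the running averages and return the adaptively weighted sum."""
        for i, loss in enumerate(losses):
            self.avg[i] = self.decay * self.avg[i] + (1.0 - self.decay) * loss
        weights = [1.0 / (a + self.eps) for a in self.avg]
        return sum(w * l for w, l in zip(weights, losses))

if __name__ == "__main__":
    balancer = AdaptiveLossWeights(num_losses=3)
    # Toy auxiliary losses at very different scales; the adaptive weights keep
    # the small-scale loss from being drowned out by the large one.
    print(balancer.combine([2.5, 0.4, 0.05]))
```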

    Assessing Excited State Energy Gaps with Time-Dependent Density Functional Theory on Ru(II) Complexes

    A set of density functionals coming from different rungs on Jacob's ladder are employed to evaluate the electronic excited states of three Ru(II) complexes. While most studies on the performance of density functionals compare vertical excitation energies, in this work we focus on the energy gaps between electronic excited states of the same and of different multiplicity. Excited state energy gaps are important, for example, for determining radiationless transition probabilities. Besides energies, a functional should deliver the correct state character and state ordering. Therefore, wavefunction overlaps are introduced to systematically evaluate the effect of different functionals on the character of the excited states. As a reference, the energies and state characters from multi-state complete-active-space second-order perturbation theory (MS-CASPT2) are used. In comparison to MS-CASPT2, it is found that while hybrid functionals provide better vertical excitation energies, pure functionals typically give more accurate excited state energy gaps. Pure functionals are also found to reproduce the state character and ordering in closer agreement with MS-CASPT2 than the hybrid functionals.
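    To make the distinction between vertical excitation energies and excited state energy gaps concrete, the sketch below compares both quantities for a hypothetical functional whose vertical energies are uniformly shifted relative to a reference; such a systematic shift cancels in the gaps. The numbers are illustrative only and are not results from the paper.

```python
# Illustrative comparison (made-up numbers, not data from the paper): a uniform
# shift in vertical excitation energies leaves the gaps between excited states
# unchanged, so gap errors can be much smaller than vertical-energy errors.
import numpy as np

def energy_gaps(excitation_energies):
    """Pairwise gaps E_i - E_j between excited states."""
    e = np.asarray(excitation_energies)
    return e[:, None] - e[None, :]

reference = np.array([2.10, 2.45, 2.90])   # reference vertical energies (eV), illustrative
functional = np.array([2.40, 2.74, 3.21])  # hypothetical functional, roughly blue-shifted

print("max error in vertical energies:",
      np.max(np.abs(functional - reference)))
print("max error in excited-state energy gaps:",
      np.max(np.abs(energy_gaps(functional) - energy_gaps(reference))))
```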