    Gauge invariance induced relations and the equivalence between distinct approaches to NLSM amplitudes

    In this paper, we derive generalized Bern-Carrasco-Johansson (BCJ) relations for color-ordered Yang-Mills amplitudes by appropriately imposing gauge invariance conditions and dimensional reduction on the newly discovered graphic expansion of Einstein-Yang-Mills amplitudes. These relations are also satisfied by color-ordered amplitudes in other theories, such as color-scalar theory, bi-scalar theory, and the nonlinear sigma model (NLSM). As an application of the gauge invariance induced relations, we further prove that the three types of BCJ numerators in the NLSM, derived respectively from Feynman rules, Abelian Z-theory, and the Cachazo-He-Yuan formula, produce the same total amplitudes. In other words, the three distinct approaches to NLSM amplitudes are equivalent to each other.
    Comment: 40 pages, 2 figures
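    For orientation, the generalized relations in this abstract extend the standard fundamental BCJ relation among color-ordered amplitudes. A sketch of that standard relation (not the paper's generalized form, and with momenta $k_1,\dots,k_n$ assumed massless) is:

    ```latex
    % Fundamental BCJ relation for n-point color-ordered amplitudes A_n,
    % with massless external momenta k_1, ..., k_n:
    \sum_{i=2}^{n-1} \Big( \sum_{j=2}^{i} k_1 \cdot k_j \Big) \,
        A_n(2, \dots, i, 1, i+1, \dots, n) = 0
    ```

    Imposing gauge invariance on an expansion of Einstein-Yang-Mills amplitudes, as the paper does, yields further relations of this general type.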

    DroTrack: High-speed Drone-based Object Tracking Under Uncertainty

    We present DroTrack, a high-speed visual single-object tracking framework for drone-captured video sequences. Most existing object tracking methods are designed to tackle well-known challenges such as occlusion and cluttered backgrounds. The complex motion of drones, i.e., multiple degrees of freedom in three-dimensional space, causes high uncertainty, which leads to inaccurate location predictions and fuzziness in scale estimations. DroTrack addresses these issues by discovering the dependency between object representation and motion geometry. We implement an effective object segmentation based on Fuzzy C-Means (FCM), incorporating spatial information into the membership function to cluster the most discriminative segments. We then enhance the object segmentation using a pre-trained Convolutional Neural Network (CNN) model. DroTrack also leverages the geometrical angular motion to estimate a reliable object scale. We discuss the experimental results and performance evaluation using two datasets totalling 51,462 drone-captured frames. The combination of FCM segmentation and angular scaling increased DroTrack's precision by up to 9% and decreased the centre location error by 162 pixels on average. DroTrack outperforms all the high-speed trackers and achieves results comparable to deep learning trackers. DroTrack offers high frame rates of up to 1,000 frames per second (fps) with better location precision than a set of state-of-the-art real-time trackers.
    Comment: 10 pages, 12 figures, FUZZ-IEEE 202
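    The abstract's segmentation step builds on Fuzzy C-Means clustering. The following is a minimal sketch of plain FCM (the standard algorithm only; it does not include DroTrack's spatial membership term or CNN refinement, and the function name and parameters are illustrative):

    ```python
    import numpy as np

    def fuzzy_c_means(X, n_clusters=2, m=2.0, max_iter=100, tol=1e-5, seed=0):
        """Standard Fuzzy C-Means: alternate centroid and membership updates.

        X: (n_samples, n_features) array; m > 1 is the fuzzifier.
        Returns (centers, U) where U[i, k] is the membership of point i
        in cluster k, with rows of U summing to 1.
        """
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        # Random initial membership matrix with rows normalized to sum to 1.
        U = rng.random((n, n_clusters))
        U /= U.sum(axis=1, keepdims=True)
        for _ in range(max_iter):
            Um = U ** m
            # Centroids: fuzzy-weighted mean of the points.
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
            # Euclidean distance from every point to every centroid.
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            d = np.fmax(d, 1e-10)  # guard against division by zero
            # Membership update: u_ik proportional to d_ik^(-2/(m-1)).
            inv = d ** (-2.0 / (m - 1.0))
            U_new = inv / inv.sum(axis=1, keepdims=True)
            if np.linalg.norm(U_new - U) < tol:
                U = U_new
                break
            U = U_new
        return centers, U
    ```

    Spatial variants of FCM (as used here) additionally weight each point's membership by the memberships of its neighbours, which suppresses isolated noisy pixels in the segmentation.
    
    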