
    Geodesic Distance Histogram Feature for Video Segmentation

    This paper proposes a geodesic-distance-based feature that encodes global information for improved video segmentation algorithms. The feature is a joint histogram of intensity and geodesic distances, where the geodesic distances are computed as the shortest paths between superpixels via their boundaries. We also incorporate adaptive voting weights and spatial pyramid configurations to include spatial information in the geodesic histogram feature and show that this further improves results. The feature is generic and can be used as part of various algorithms. In experiments, we test the geodesic histogram feature by incorporating it into two existing video segmentation frameworks. This leads to significantly better performance in 3D video segmentation benchmarks on two datasets.
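The geodesic step described above can be sketched as Dijkstra's algorithm over a superpixel adjacency graph, followed by a 2D binning of (intensity, distance) pairs. This is a minimal sketch under stated assumptions: the adjacency list, edge weights (standing in for the paper's boundary-strength costs), and the function names `geodesic_distances` and `joint_histogram` are all illustrative, not the authors' implementation.

```python
import heapq

def geodesic_distances(adjacency, source):
    """Dijkstra shortest paths over a superpixel adjacency graph.

    adjacency: {node: [(neighbor, boundary_cost), ...]} -- boundary_cost
    is a stand-in for the boundary-based edge weight the paper uses.
    Returns {node: geodesic distance from source}.
    """
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in adjacency[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def joint_histogram(intensities, distances, n_ibins=4, n_dbins=4, d_max=1.0):
    """Joint (intensity, geodesic distance) histogram for one superpixel."""
    hist = [[0] * n_dbins for _ in range(n_ibins)]
    for node, d in distances.items():
        i_bin = min(int(intensities[node] * n_ibins), n_ibins - 1)
        d_bin = min(int(min(d, d_max) / d_max * n_dbins), n_dbins - 1)
        hist[i_bin][d_bin] += 1
    return hist

# Toy 4-superpixel graph; edge weights mimic boundary strength.
adj = {0: [(1, 0.2), (2, 0.9)],
       1: [(0, 0.2), (3, 0.3)],
       2: [(0, 0.9), (3, 0.1)],
       3: [(1, 0.3), (2, 0.1)]}
intensity = {0: 0.1, 1: 0.3, 2: 0.8, 3: 0.6}
d = geodesic_distances(adj, source=0)
h = joint_histogram(intensity, d)
```

Note that the shortest path from superpixel 0 to 2 goes around the strong boundary (0→1→3→2, cost 0.6) rather than across it (cost 0.9), which is exactly the property that lets geodesic distances carry global structure.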

    On the use of edge orientation and distance for content-based image retrieval

    Recently, various features for content-based image retrieval (CBIR) have been proposed, such as texture, color, shape, and spatial features. In this paper we propose a new feature, called the orientation-distance histogram, for CBIR. Firstly, we transform the RGB color model of a given image to the HSI color model and detect edge points using the H-vector information. Secondly, we compute the orientation-distance histogram from the edge points to form a feature vector. After feature normalization, our proposed method can cope with most common image variations. Finally, we present query results on real-life images, using precision and recall rates to measure performance. The experimental results show that the proposed retrieval method is efficient and effective. (EI-indexed international conference, 13–15 October 2005, Beijing, China; print proceedings.)
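The first step above uses the standard RGB-to-HSI conversion, which can be sketched as below; the abstract does not spell out the exact binning of the orientation-distance histogram, so the `orientation_distance_histogram` function here is a hypothetical illustration (binning edge-point gradient orientation against distance to a reference point), not the paper's definition.

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert one RGB pixel (components in [0, 1]) to the HSI model.

    Standard textbook conversion; the paper detects edge points from the
    resulting H (hue) channel before building its histogram feature.
    """
    i = (r + g + b) / 3.0
    m = min(r, g, b)
    s = 0.0 if i == 0 else 1.0 - m / i
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    if den == 0:
        h = 0.0                      # achromatic pixel: hue undefined, use 0
    else:
        h = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
        if b > g:
            h = 360.0 - h
    return h, s, i

def orientation_distance_histogram(edge_points, ref, n_obins=8, n_dbins=4,
                                   d_max=1.0):
    """Hypothetical binning: edge points as (x, y, theta) with theta the
    edge orientation in degrees, binned jointly with distance to `ref`."""
    hist = [[0] * n_dbins for _ in range(n_obins)]
    rx, ry = ref
    for (x, y, theta) in edge_points:
        o_bin = int(theta / 360.0 * n_obins) % n_obins
        d = math.hypot(x - rx, y - ry)
        d_bin = min(int(min(d, d_max) / d_max * n_dbins), n_dbins - 1)
        hist[o_bin][d_bin] += 1
    return hist
```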

    Model-free measurement of the pair potential in colloidal fluids using optical microscopy

    We report a straightforward, model-free approach for measuring pair potentials from particle-coordinate data, based on enforcing consistency between the pair distribution function measured separately by the distance-histogram and test-particle insertion routes. We demonstrate the method's accuracy and versatility in simulations of simple fluids, before applying it to an experimental system composed of superparamagnetic colloidal particles. The method will enable experimental investigations into many-body interactions and allow for effective coarse-graining of interactions from simulations. Comment: 8 pages, 3 figures including supplemental material
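The distance-histogram route mentioned above estimates the pair distribution function g(r) by counting interparticle separations and normalizing by the ideal-gas expectation. A minimal 2D periodic-box sketch (superparamagnetic colloids are typically confined to quasi-2D); the function name and parameters are illustrative, not the authors' code:

```python
import math

def pair_distribution_2d(points, box, n_bins=50, r_max=None):
    """Distance-histogram estimate of g(r) in a 2D periodic box.

    points: list of (x, y) particle coordinates; box: (Lx, Ly).
    This is one of the two routes the paper makes consistent (the
    other being test-particle insertion).
    """
    lx, ly = box
    if r_max is None:
        r_max = min(lx, ly) / 2.0
    dr = r_max / n_bins
    hist = [0] * n_bins
    n = len(points)
    for a in range(n):
        xa, ya = points[a]
        for b in range(a + 1, n):
            dx = points[b][0] - xa
            dy = points[b][1] - ya
            dx -= lx * round(dx / lx)        # minimum-image convention
            dy -= ly * round(dy / ly)
            r = math.hypot(dx, dy)
            if r < r_max:
                hist[int(r / dr)] += 2       # count the pair for both particles
    rho = n / (lx * ly)                       # number density
    g = []
    for k in range(n_bins):
        shell = math.pi * ((k + 1) ** 2 - k ** 2) * dr ** 2  # annulus area
        g.append(hist[k] / (n * rho * shell))
    return g

# Perfect square lattice: g(r) should vanish below the lattice spacing
# and peak at the nearest-neighbor distance.
pts = [(x, y) for x in range(4) for y in range(4)]
g = pair_distribution_2d(pts, (4.0, 4.0), n_bins=4)
```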

    General method for constructing local-hidden-variable models for entangled quantum states

    Entanglement allows for the nonlocality of quantum theory, which is the resource behind device-independent quantum information protocols. However, not all entangled quantum states display nonlocality, and a central question is to determine the precise relation between entanglement and nonlocality. Here we present the first general test to decide whether a quantum state is local; the test can be implemented by semidefinite programming. This method can be applied to any given state and used to construct new examples of states with local-hidden-variable models for both projective and general measurements. As applications, we provide a lower-bound estimate of the fraction of two-qubit local entangled states and present new explicit examples of such states, including states arising from physical noise models, Bell-diagonal states, and noisy GHZ and W states. Comment: Published version with new title and abstract, improved presentation, and new examples of LHV states. Code is available at https://github.com/paulskrzypczyk/localhiddenstatemodels (please cite this paper if you use it). See also the related work by F. Hirsch et al., arXiv:1512.0026
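The gap between entanglement and nonlocality is concretely visible in the two-qubit Werner state, the textbook example behind such local entangled states: it is entangled for p > 1/3 (by the Peres-Horodecki/PPT criterion, which is necessary and sufficient for two qubits), yet Werner's construction gives it a local-hidden-variable model for projective measurements up to p = 1/2. The paper's actual test requires an SDP solver; the sketch below only checks the entanglement side with NumPy, and all function names are illustrative.

```python
import numpy as np

def werner_state(p):
    """Two-qubit Werner state: p |psi-><psi-| + (1 - p) I/4."""
    psi_minus = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
    return p * np.outer(psi_minus, psi_minus) + (1 - p) * np.eye(4) / 4

def partial_transpose(rho):
    """Partial transpose over the second qubit of a 4x4 density matrix."""
    r = rho.reshape(2, 2, 2, 2)         # indices (a, b, a', b')
    return r.transpose(0, 3, 2, 1).reshape(4, 4)  # swap b <-> b'

def is_ppt(rho):
    """Peres-Horodecki criterion: rho^{T_B} positive semidefinite.
    For two qubits, PPT holds iff the state is separable."""
    return float(np.min(np.linalg.eigvalsh(partial_transpose(rho)))) >= -1e-12
```

For the Werner state the partially transposed eigenvalues are (1+p)/4 (threefold) and (1-3p)/4, so `is_ppt` flips from True to False exactly at p = 1/3; states with 1/3 < p <= 1/2 are then entangled yet local, the phenomenon the paper's general test probes.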

    Beyond Reuse Distance Analysis: Dynamic Analysis for Characterization of Data Locality Potential

    Emerging computer architectures will feature drastically decreased flops/byte (ratio of peak processing rate to memory bandwidth) as highlighted by recent studies on Exascale architectural trends. Further, flops are getting cheaper while the energy cost of data movement is increasingly dominant. The understanding and characterization of data locality properties of computations is critical in order to guide efforts to enhance data locality. Reuse distance analysis of memory address traces is a valuable tool to perform data locality characterization of programs. A single reuse distance analysis can be used to estimate the number of cache misses in a fully associative LRU cache of any size, thereby providing estimates on the minimum bandwidth requirements at different levels of the memory hierarchy to avoid being bandwidth bound. However, such an analysis only holds for the particular execution order that produced the trace. It cannot estimate potential improvement in data locality through dependence preserving transformations that change the execution schedule of the operations in the computation. In this article, we develop a novel dynamic analysis approach to characterize the inherent locality properties of a computation and thereby assess the potential for data locality enhancement via dependence preserving transformations. The execution trace of a code is analyzed to extract a computational directed acyclic graph (CDAG) of the data dependences. The CDAG is then partitioned into convex subsets, and the convex partitioning is used to reorder the operations in the execution trace to enhance data locality. The approach enables us to go beyond reuse distance analysis of a single specific order of execution of the operations of a computation in characterization of its data locality properties. 
It can play a valuable role in identifying promising code regions for manual transformation, as well as in assessing the effectiveness of compiler transformations for data locality enhancement. We demonstrate the effectiveness of the approach using a number of benchmarks, including case studies where the potential shown by the analysis is exploited to achieve lower data movement costs and better performance. Comment: Transactions on Architecture and Code Optimization (2014)
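The reuse distance analysis that the article builds on can be sketched compactly: the reuse distance of an access is the number of distinct addresses touched since the previous access to the same address, and an access misses in a fully associative LRU cache of size C exactly when its reuse distance is at least C. A minimal sketch with illustrative names (not the article's tooling):

```python
from collections import OrderedDict

def reuse_distances(trace):
    """Reuse (stack) distance of each access in a memory address trace.

    Distance = number of DISTINCT addresses accessed since the previous
    access to the same address; first-time accesses get infinity (a cold
    miss at any cache size). O(n^2) for clarity, not speed.
    """
    stack = OrderedDict()                 # LRU stack, most recent last
    out = []
    for addr in trace:
        if addr in stack:
            keys = list(stack)
            out.append(len(keys) - 1 - keys.index(addr))
            stack.move_to_end(addr)
        else:
            out.append(float("inf"))
            stack[addr] = None
    return out

def lru_misses(trace, cache_size):
    """Misses in a fully associative LRU cache of the given size, read
    directly off the reuse distances: miss iff distance >= cache_size."""
    return sum(1 for d in reuse_distances(trace) if d >= cache_size)
```

As the article notes, this characterizes only the one execution order that produced the trace; reordering the trace (e.g. via the CDAG convex partitioning it proposes) can shrink the reuse distances and hence the miss counts.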