15,619 research outputs found

    Correcting curvature-density effects in the Hamilton-Jacobi skeleton

    The Hamilton-Jacobi approach has proven to be a powerful and elegant method for extracting the skeleton of two-dimensional (2-D) shapes. The approach is based on the observation that the normalized flux associated with the inward evolution of the object boundary tends to zero at nonskeletal points as the size of the integration area tends to zero, while the flux is negative at the locations of skeletal points. Nonetheless, the error in calculating the flux on the image lattice is both limited by the pixel resolution and proportional to the curvature of the boundary evolution front and, hence, unbounded near endpoints. This makes the exact localization of endpoints difficult and renders the performance of the skeleton extraction algorithm dependent on a threshold parameter. This problem can be overcome by using interpolation techniques to calculate the flux with subpixel precision. Here, however, we develop a method for 2-D skeleton extraction that circumvents the problem by eliminating the curvature contribution to the error. This is done by taking into account variations of density due to boundary curvature. This yields a skeletonization algorithm that gives both better localization and less susceptibility to boundary noise and parameter choice than the Hamilton-Jacobi method.
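    As a point of reference, the average outward flux that underpins the Hamilton-Jacobi skeleton can be sketched as follows. This is the uncorrected flux computation (the curvature-density correction described above is not shown), and the binary mask, 8-neighbourhood and threshold are illustrative assumptions rather than the paper's implementation.

        # Minimal sketch of the average outward flux of the distance-transform
        # gradient; strongly negative values flag candidate skeletal points.
        import numpy as np
        from scipy.ndimage import distance_transform_edt

        def average_outward_flux(mask):
            # Distance transform inside the object drives the inward boundary evolution.
            dist = distance_transform_edt(mask)
            gy, gx = np.gradient(dist)

            # Average the dot product of the gradient at each 8-neighbour with the
            # outward unit direction from the centre pixel to that neighbour.
            offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                       (0, 1), (1, -1), (1, 0), (1, 1)]
            flux = np.zeros_like(dist)
            for dy, dx in offsets:
                norm = np.hypot(dy, dx)
                ny, nx = dy / norm, dx / norm
                flux += (np.roll(gy, (-dy, -dx), axis=(0, 1)) * ny +
                         np.roll(gx, (-dy, -dx), axis=(0, 1)) * nx)
            return flux / len(offsets)

        # skeleton_candidates = average_outward_flux(mask) < -0.25  # threshold is ad hoc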

    Graph edit distance from spectral seriation

    This paper is concerned with computing graph edit distance. One of the criticisms that can be leveled at existing methods for computing graph edit distance is that they lack some of the formality and rigor of the computation of string edit distance. Hence, our aim is to convert graphs to string sequences so that string matching techniques can be used. To do this, we use a graph spectral seriation method to convert the adjacency matrix into a string or sequence order. We show how the serial ordering can be established using the leading eigenvector of the graph adjacency matrix. We pose the problem of graph matching as a maximum a posteriori probability (MAP) alignment of the seriation sequences for pairs of graphs. This treatment leads to an expression in which the edit cost is the negative logarithm of the a posteriori sequence alignment probability. We compute the edit distance by finding the sequence of string edit operations which minimizes the cost of the path traversing the edit lattice. The edit costs are determined by the components of the leading eigenvectors of the adjacency matrices and by the edge densities of the graphs being matched. We demonstrate the utility of the edit distance on a number of graph clustering problems.
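    The two stages the abstract describes, spectral seriation followed by a string-edit alignment, can be sketched as below. The degree-based symbols and uniform edit costs are simplifications for illustration; the paper derives its costs from the leading-eigenvector components and the edge densities of the graphs.

        # Illustrative sketch: seriate each graph from the leading eigenvector of its
        # adjacency matrix, then align the resulting symbol sequences by dynamic
        # programming over the edit lattice.
        import numpy as np

        def seriation_string(adjacency):
            vals, vecs = np.linalg.eigh(adjacency)          # symmetric adjacency assumed
            leading = vecs[:, np.argmax(vals)]
            order = np.argsort(-np.abs(leading))            # serial ordering of the nodes
            return adjacency.sum(axis=0)[order].tolist()    # node degrees as symbols

        def string_edit_distance(a, b, cost=1.0):
            dp = np.zeros((len(a) + 1, len(b) + 1))
            dp[:, 0] = np.arange(len(a) + 1) * cost
            dp[0, :] = np.arange(len(b) + 1) * cost
            for i in range(1, len(a) + 1):
                for j in range(1, len(b) + 1):
                    match = 0.0 if a[i - 1] == b[j - 1] else cost
                    dp[i, j] = min(dp[i - 1, j - 1] + match,  # substitute / match
                                   dp[i - 1, j] + cost,       # delete
                                   dp[i, j - 1] + cost)       # insert
            return dp[-1, -1]

        # distance = string_edit_distance(seriation_string(A1), seriation_string(A2))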

    A graph-spectral approach to shape-from-shading

    In this paper, we explore how graph-spectral methods can be used to develop a new shape-from-shading algorithm. We characterize the field of surface normals using a weight matrix whose elements are computed from the sectional curvature between different image locations and penalize large changes in surface normal direction. Modeling the blocks of the weight matrix as distinct surface patches, we use a graph seriation method to find a surface integration path that maximizes the sum of curvature-dependent weights and that can be used for the purposes of height reconstruction. To smooth the reconstructed surface, we fit quadrics to the height data for each patch. The smoothed surface normal directions are updated to ensure compliance with Lambert's law. The processes of height recovery and surface normal adjustment are interleaved and iterated until a stable surface is obtained. We provide results on synthetic and real-world imagery.
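    The quadric fitting used to smooth the recovered heights over each patch admits a compact least-squares sketch; the function names and per-patch handling below are illustrative assumptions, not the paper's implementation.

        # Fit z = a*x^2 + b*x*y + c*y^2 + d*x + e*y + f to the height samples of one
        # patch, then evaluate the fitted (smoothed) heights.
        import numpy as np

        def fit_quadric(x, y, z):
            A = np.column_stack([x**2, x*y, y**2, x, y, np.ones_like(x)])
            coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
            return coeffs

        def smoothed_heights(x, y, z):
            a, b, c, d, e, f = fit_quadric(x, y, z)
            return a*x**2 + b*x*y + c*y**2 + d*x + e*y + f

        # Updated surface normals follow from the analytic gradient of the quadric:
        # n is proportional to (-(2*a*x + b*y + d), -(b*x + 2*c*y + e), 1), renormalized.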

    X-ray time lags in AGN: inverse-Compton scattering and spherical corona model

    We develop a physically motivated spherical corona model to investigate frequency-dependent time lags in AGN. The model includes the effects of Compton up-scattering of the disc UV photons by coronal electrons, and the subsequent X-ray reverberation from the disc. The time lags are associated with the time required for multiple scatterings to boost UV photons up to soft and hard X-ray energies, and with the light-crossing time the photons take to reach the observer. This model can reproduce not only low-frequency hard and high-frequency soft lags, but also the clear bumps and wiggles in reverberation profiles that should explain the wavy residuals currently observed in some AGN. Our model supports an anti-correlation between the optical depth and the coronal temperature. In the case of an optically thin corona, time delays due to propagating fluctuations may be required to reproduce the observed time lags. We fit the model to the lag-frequency data of 1H0707-495, Ark 564, NGC 4051 and IRAS 13224-3809, estimated using the minimal-bias technique so that the observed lags are of the highest possible quality. We find corona sizes of ~7-15 r_g, with constrained optical depths of ~2-10 and coronal temperatures of ~150-300 keV. Finally, we note that the reverberation wiggles may be signatures of repeated scatterings inside the corona that control the distribution of X-ray sources. Comment: 15 pages, 10 figures, accepted for publication in MNRAS
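    For context, a plain cross-spectral estimate of the lag-frequency relation between two energy-band light curves is sketched below. It is not the minimal-bias estimator used for the fits above; the segment count and the sign convention (tied to numpy's forward FFT) are assumptions.

        # Frequency-dependent time lag of the hard band relative to the soft band,
        # averaged over equal-length segments of the light curves.
        import numpy as np

        def lag_frequency(soft, hard, dt, nseg=8):
            n = len(soft) // nseg                       # samples per segment
            cross = 0.0
            for k in range(nseg):
                s = np.fft.rfft(soft[k*n:(k+1)*n] - np.mean(soft[k*n:(k+1)*n]))
                h = np.fft.rfft(hard[k*n:(k+1)*n] - np.mean(hard[k*n:(k+1)*n]))
                cross = cross + s * np.conj(h)          # segment-averaged cross spectrum
            freqs = np.fft.rfftfreq(n, d=dt)[1:]        # drop the zero frequency
            lags = np.angle(cross[1:]) / (2.0 * np.pi * freqs)
            return freqs, lags                          # positive lag: hard lags soft

        # freqs, lags = lag_frequency(soft_lc, hard_lc, dt=10.0)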

    Recovering facial shape using a statistical model of surface normal direction

    In this paper, we show how a statistical model of facial shape can be embedded within a shape-from-shading algorithm. We describe how facial shape can be captured using a statistical model of variations in surface normal direction. To construct this model, we make use of the azimuthal equidistant projection to map the distribution of surface normals from the polar representation on a unit sphere to Cartesian points on a local tangent plane. The distribution of surface normal directions is captured using the covariance matrix for the projected point positions. The eigenvectors of the covariance matrix define the modes of shape variation in the fields of transformed surface normals. We show how this model can be trained using surface normal data acquired from range images and how to fit the model to intensity images of faces using constraints on the surface normal direction provided by Lambert's law. We demonstrate that the combination of a global statistical constraint and a local irradiance constraint yields an efficient and accurate approach to facial shape recovery that is capable of recovering fine local surface details. We assess the accuracy of the technique on a variety of images with ground truth and on real-world images.
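    The azimuthal equidistant projection of surface normals onto a local tangent plane, the first step in building the statistical model, can be sketched as follows; the choice of tangent-plane basis and the batch PCA in the trailing comment are illustrative assumptions.

        # Map unit surface normals from the sphere onto the tangent plane at a
        # reference direction, preserving angular distance from the reference.
        import numpy as np

        def project_normals(normals, centre):
            centre = centre / np.linalg.norm(centre)
            # Orthonormal basis (e1, e2) spanning the tangent plane at `centre`.
            helper = np.array([1.0, 0.0, 0.0])
            if abs(centre @ helper) > 0.9:
                helper = np.array([0.0, 1.0, 0.0])
            e1 = np.cross(centre, helper); e1 /= np.linalg.norm(e1)
            e2 = np.cross(centre, e1)

            u = normals / np.linalg.norm(normals, axis=1, keepdims=True)
            cosang = np.clip(u @ centre, -1.0, 1.0)
            ang = np.arccos(cosang)                         # great-circle distance
            tang = u - cosang[:, None] * centre             # tangent-plane component
            norms = np.linalg.norm(tang, axis=1)
            norms[norms == 0] = 1.0                         # the reference maps to the origin
            direction = tang / norms[:, None]
            return np.column_stack([ang * (direction @ e1), ang * (direction @ e2)])

        # Modes of variation: eigenvectors of the covariance of the projected points.
        # pts = project_normals(training_normals, mean_normal)
        # eigvals, modes = np.linalg.eigh(np.cov(pts.T))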

    Non-invasive single-bunch matching and emittance monitor

    On-line monitoring of beam quality for high-brightness beams is only possible using non-invasive instruments. For matching measurements, very few such instruments are available. One candidate is a quadrupole pick-up. Therefore, a new type of quadrupole pick-up has been developed for the 26 GeV Proton Synchrotron (PS) at CERN, and a measurement system consisting of two such pick-ups is now installed in this accelerator. Using the information from these pick-ups, it is possible to determine both injection matching and emittance in the horizontal and vertical planes, for each bunch separately. This paper presents the measurement method and some of the results from the first year of use, as well as comparisons with other measurement methods. Comment: 10 pages, 10 figures; added figure, minor textual additions; to be resubmitted to Phys. Rev. ST-AB

    Learning probability spaces for classification and recognition of patterns with or without supervision


    Terrain analysis using radar shape-from-shading

    This paper develops a maximum a posteriori (MAP) probability estimation framework for shape-from-shading (SFS) from synthetic aperture radar (SAR) images. The aim is to use this method to reconstruct surface topography from a single radar image of relatively complex terrain. Our MAP framework makes explicit how the recovery of local surface orientation depends on the whereabouts of terrain edge features and the available radar reflectance information. To apply the resulting process to real-world radar data, we require probabilistic models for the appearance of terrain features and the relationship between the orientation of surface normals and the radar reflectance. We show that the SAR data can be modeled using a Rayleigh-Bessel distribution and use this distribution to develop a maximum likelihood algorithm for detecting and labeling terrain edge features. Moreover, we show how robust statistics can be used to estimate the characteristic parameters of this distribution. We also develop an empirical model for the SAR reflectance function. Using the reflectance model, we perform Lambertian correction so that a conventional SFS algorithm can be applied to the radar data. The initial surface normal direction is constrained to point in the direction of the nearest ridge or ravine feature. Each surface normal must fall within a conical envelope whose axis is in the direction of the radar illuminant. The extent of the envelope depends on the corrected radar reflectance and the variance of the radar signal statistics. We explore various ways of smoothing the field of surface normals using robust statistics. Finally, we show how to reconstruct the terrain surface from the smoothed field of surface normal vectors. The proposed algorithm is applied to various SAR data sets containing relatively complex terrain structure.
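    The conical-envelope constraint on the surface normals admits a short sketch: a normal that falls outside the cone about the illuminant direction is rotated back onto the cone boundary. The half-angle is left as a free parameter here, whereas the paper derives the envelope's extent from the corrected reflectance and the variance of the radar signal statistics.

        # Clamp an estimated surface normal to a cone of given half-angle (radians)
        # about the radar illuminant direction.
        import numpy as np

        def clamp_to_cone(normal, illuminant, half_angle):
            n = normal / np.linalg.norm(normal)
            s = illuminant / np.linalg.norm(illuminant)
            angle = np.arccos(np.clip(n @ s, -1.0, 1.0))
            if angle <= half_angle:
                return n                                   # already inside the envelope
            perp = n - (n @ s) * s                         # in-plane direction from s towards n
            norm = np.linalg.norm(perp)
            if norm < 1e-12:                               # n anti-parallel to s: any direction works
                perp = np.cross(s, [1.0, 0.0, 0.0])
                if np.linalg.norm(perp) < 1e-12:
                    perp = np.cross(s, [0.0, 1.0, 0.0])
                norm = np.linalg.norm(perp)
            perp = perp / norm
            return np.cos(half_angle) * s + np.sin(half_angle) * perp

        # constrained = clamp_to_cone(normal_estimate, radar_direction, np.deg2rad(20.0))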

    An Analysis of Kinetic Response Variability

    Studies evaluating the variability of force as a function of the absolute force generated are synthesized. Inconsistencies in reported estimates of this relationship are viewed as a function of the experimental constraints imposed. Typically, within-subject force variability increases at a negatively accelerating rate with equal increments in the force produced. Current pulse-step and impulse-variability models are unable to accommodate this description, although the notion of efficiency is suggested as a useful construct to explain the description outlined.