
    A Generative Model of Natural Texture Surrogates

    Natural images can be viewed as patchworks of different textures, where the local image statistics are roughly stationary within a small neighborhood but otherwise vary from region to region. In order to model this variability, we first applied the parametric texture algorithm of Portilla and Simoncelli to image patches of 64×64 pixels in a large database of natural images, so that each image patch is described by 655 texture parameters which specify certain statistics, such as variances and covariances of wavelet coefficients or coefficient magnitudes within that patch. To model the statistics of these texture parameters, we then developed suitable nonlinear transformations of the parameters that allowed us to fit their joint statistics with a multivariate Gaussian distribution. We find that the first 200 principal components contain more than 99% of the variance and are sufficient to generate textures that are perceptually extremely close to those generated with all 655 components. We demonstrate the usefulness of the model in several ways: (1) We sample ensembles of texture patches that can be directly compared to samples of patches from the natural image database and that reproduce their perceptual appearance to a high degree. (2) We further developed an image compression algorithm which generates surprisingly accurate images at bit rates as low as 0.14 bits/pixel. Finally, (3) we demonstrate how our approach can be used for an efficient and objective evaluation of samples generated with probabilistic models of natural images. Comment: 34 pages, 9 figures
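
    As a rough illustration of the modeling pipeline described above (PCA reduction of the texture parameters, a multivariate Gaussian fit in the reduced space, and sampling), here is a minimal sketch. The array `params` is a synthetic placeholder for the transformed Portilla-Simoncelli parameter vectors; the actual nonlinear transformations and the texture synthesizer are not reproduced.

```python
# Minimal sketch (not the authors' code): PCA-reduce transformed texture
# parameters, fit a multivariate Gaussian in the reduced space, and sample
# new parameter vectors. `params` is a synthetic placeholder for the
# (n_patches, 655) matrix of transformed Portilla-Simoncelli parameters.
import numpy as np

rng = np.random.default_rng(0)
params = rng.standard_normal((5000, 655))        # placeholder data

mean = params.mean(axis=0)
centered = params - mean

# PCA via SVD; keep the leading 200 components (>99% of the variance
# according to the abstract).
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
k = 200
components = Vt[:k]                              # (k, 655)
scores = centered @ components.T                 # (n_patches, k)

# Gaussian model of the reduced representation.
cov = np.cov(scores, rowvar=False)

# Draw new samples and map them back to full 655-dimensional parameter
# vectors, which would then be fed to a texture synthesizer.
new_scores = rng.multivariate_normal(np.zeros(k), cov, size=10)
new_params = new_scores @ components + mean
print(new_params.shape)                          # (10, 655)
```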

    Dynamics of trimming the content of face representations for categorization in the brain

    To understand visual cognition, it is imperative to determine when, how and with what information the human brain categorizes the visual input. Visual categorization consistently involves at least an early and a late stage: the occipito-temporal N170 event-related potential, related to stimulus encoding, and the parietal P300, involved in perceptual decisions. Here we sought to understand how the brain globally transforms its representations of face categories from their early encoding to the later decision stage over the 400 ms time window encompassing the N170 and P300 brain events. We applied classification image techniques to the behavioral and electroencephalographic data of three observers who categorized seven facial expressions of emotion and report two main findings: (1) Over the 400 ms time course, processing of facial features initially spreads bilaterally across the left and right occipito-temporal regions before dynamically converging onto the centro-parietal region; (2) Concurrently, information processing gradually shifts from encoding common face features across all spatial scales (e.g. the eyes) to representing only the finer scales of the diagnostic features that are richer in useful information for behavior (e.g. the wide-open eyes in 'fear'; the detailed mouth in 'happy'). Our findings suggest that the brain refines its diagnostic representations of visual categories over the first 400 ms of processing by trimming a thorough encoding of features over the N170, leaving only the detailed information important for perceptual decisions over the P300.
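
    The "classification image" analysis mentioned above is, at its core, a reverse-correlation estimate. The sketch below is a generic illustration of that idea with synthetic placeholders, not the authors' EEG pipeline.

```python
# Generic reverse-correlation sketch with synthetic placeholders; it only
# illustrates the classification-image idea, not the authors' EEG analysis.
import numpy as np

rng = np.random.default_rng(1)
n_trials, h, w = 2000, 32, 32
noise = rng.standard_normal((n_trials, h, w))    # per-trial stimulus noise
responses = rng.integers(0, 2, size=n_trials)    # 0/1 category decisions

# Classification image: mean noise on "category 1" trials minus mean noise
# on "category 0" trials; pixels with large magnitude are diagnostic.
ci = noise[responses == 1].mean(axis=0) - noise[responses == 0].mean(axis=0)
print(ci.shape)                                  # (32, 32)
```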

    Quasar Microlensing: when compact masses mimic smooth matter

    The magnification induced by gravitational microlensing is sensitive to the size of a source relative to the Einstein radius, the natural microlensing scale length. This paper investigates the effect of source size in the case where the microlensing masses are distributed with a bimodal mass function, with solar-mass stars representing the normal stellar masses and smaller masses (down to $8.5\times 10^{-5}\,M_\odot$) representing a dark matter component. It is found that there exists a critical regime where the dark matter is initially seen as individual compact masses, but with an increasing source size the compact dark matter acts as a smooth mass component. This study reveals that the interpretation of microlensing light curves, especially claims of small-mass dark matter lenses embedded in an overall stellar population, must consider the important influence of the size of the source. Comment: 6 pages, to appear in ApJ. As ever, quality of figures reduced
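
    For intuition about the "natural microlensing scale length", the sketch below evaluates the point-lens Einstein radius projected into the source plane for a solar-mass star and for an $8.5\times 10^{-5}\,M_\odot$ lens. The distances are illustrative assumptions and $D_{LS}$ is approximated as $D_S - D_L$.

```python
# Illustrative only: source-plane Einstein radius of a point lens,
# R_E = sqrt(4 G M / c^2 * D_S * D_LS / D_L), with D_LS approximated as
# D_S - D_L and distances chosen purely for illustration.
import numpy as np

G, c = 6.674e-11, 2.998e8        # SI units
M_sun = 1.989e30                 # kg
pc = 3.086e16                    # m

def einstein_radius(mass_kg, d_lens, d_source):
    d_ls = d_source - d_lens
    return np.sqrt(4 * G * mass_kg / c**2 * d_source * d_ls / d_lens)

d_l, d_s = 1.0e9 * pc, 2.0e9 * pc
for m in (1.0, 8.5e-5):          # solar-mass star vs. smallest dark-matter lens
    print(f"M = {m:.1e} M_sun -> R_E ~ {einstein_radius(m * M_sun, d_l, d_s):.2e} m")
```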

    Curve Reconstruction via the Global Statistics of Natural Curves

    Reconstructing the missing parts of a curve has been the subject of much computational research, with applications in image inpainting, object synthesis, etc. Different approaches for solving this problem are typically based on processes that seek visually pleasing or perceptually plausible completions. In this work we focus on reconstructing the underlying physically likely shape by utilizing the global statistics of natural curves. More specifically, we develop a reconstruction model that seeks the mean physical curve for a given inducer configuration. This simple model is both straightforward to compute and receptive to diverse additional information, but it requires enough samples for all curve configurations, a practical requirement that limits its effective utilization. To address this practical issue we explore and exploit statistical geometrical properties of natural curves; in particular, we show that in many cases the mean curve is scale invariant and oftentimes it is extensible. This, in turn, allows us to boost the number of examples and thus the robustness of the statistics and its applicability. The reconstruction results are not only more physically plausible but they also lead to important insights on the reconstruction problem, including an elegant explanation of why certain inducer configurations are more likely to yield consistent perceptual completions than others. Comment: CVPR version
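
    A minimal sketch of the "mean physical curve" idea under the scale-invariance observation used to pool samples: curves sharing an inducer configuration are rescaled to a common inducer gap, resampled by arc length, and averaged. This is an illustration only; rotation alignment and the extensibility property are omitted.

```python
# Illustration of the mean-curve idea (not the authors' implementation):
# samples sharing one inducer configuration are translated to a common
# origin, rescaled by the inducer gap (scale invariance), resampled by
# arc length, and averaged pointwise.
import numpy as np

def resample(curve, n=50):
    """Resample a polyline (m, 2) to n points equally spaced in arc length."""
    seg = np.linalg.norm(np.diff(curve, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    t = np.linspace(0.0, s[-1], n)
    return np.column_stack([np.interp(t, s, curve[:, 0]),
                            np.interp(t, s, curve[:, 1])])

def mean_curve(samples):
    """Average a list of (m_i, 2) sample curves for one inducer configuration."""
    normalized = []
    for c in samples:
        c = np.asarray(c, dtype=float)
        gap = np.linalg.norm(c[-1] - c[0])   # endpoint-to-endpoint distance
        normalized.append(resample((c - c[0]) / gap))
    return np.mean(normalized, axis=0)
```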

    Nonlinear vibration absorber optimal design via asymptotic approach

    This paper tackles the classical problem of Vibration Absorbers (VAs) operating in the nonlinear dynamic regime. Since traditional linear VAs suffer from the drawback of a narrow bandwidth and numerous structures exhibit nonlinear behavior, nonlinear absorbers are of practical interest. The resonant dynamic behavior of a nonlinear hysteretic VA attached to a damped nonlinear structure is investigated analytically via asymptotics and numerically via path following. The response of the reduced-order model, obtained by projecting the dynamics of the primary structure onto the mode to control, is evaluated using the method of multiple scales up to the first nonlinear order beyond the resonance. Here, the asymptotic response of the two-degree-of-freedom system with a 1:1 internal resonance is shown to be in very close agreement with the results of path-following analyses. The asymptotic solution lends itself to a versatile optimization based on differential evolution.
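
    To illustrate the kind of optimization workflow the abstract refers to, the sketch below tunes absorber parameters with SciPy's differential evolution. A plain linear two-degree-of-freedom tuned-mass-damper frequency response stands in for the paper's asymptotic nonlinear response, and all parameter values are assumptions.

```python
# Illustrative workflow only: tune absorber stiffness and damping with
# differential evolution. A linear 2-DOF tuned-mass-damper frequency response
# stands in for the paper's asymptotic nonlinear response; all values are
# assumptions.
import numpy as np
from scipy.optimize import differential_evolution

m1, k1, c1 = 1.0, 1.0, 0.02      # primary structure (nondimensional)
m2 = 0.05                        # absorber mass (5% mass ratio)

def peak_response(params, freqs=np.linspace(0.5, 1.5, 600)):
    k2, c2 = params
    amps = []
    for w in freqs:
        M = np.array([[m1, 0.0], [0.0, m2]])
        C = np.array([[c1 + c2, -c2], [-c2, c2]])
        K = np.array([[k1 + k2, -k2], [-k2, k2]])
        X = np.linalg.solve(-w**2 * M + 1j * w * C + K, np.array([1.0, 0.0]))
        amps.append(abs(X[0]))
    return max(amps)             # worst-case primary-structure amplitude

result = differential_evolution(peak_response,
                                bounds=[(0.01, 0.2), (0.001, 0.1)], seed=0)
print("optimal (k2, c2):", result.x, "peak amplitude:", result.fun)
```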

    Alpha, Betti and the Megaparsec Universe: on the Topology of the Cosmic Web

    We study the topology of the Megaparsec Cosmic Web in terms of the scale-dependent Betti numbers, which formalize the topological information content of the cosmic mass distribution. While the Betti numbers do not fully quantify topology, they extend the information beyond conventional cosmological studies of topology in terms of genus and Euler characteristic. The richer information content of Betti numbers goes along with the availability of fast algorithms to compute them. For continuous density fields, we determine the scale dependence of Betti numbers by invoking the cosmologically familiar filtration of sublevel or superlevel sets defined by density thresholds. For the discrete galaxy distribution, however, the analysis is based on the alpha shapes of the particles. These simplicial complexes constitute an ordered sequence of nested subsets of the Delaunay tessellation, a filtration defined by the scale parameter $\alpha$. As they are homotopy equivalent to the sublevel sets of the distance field, they are an excellent tool for assessing the topological structure of a discrete point distribution. In order to develop an intuitive understanding of the behavior of Betti numbers as a function of $\alpha$, and their relation to the morphological patterns in the Cosmic Web, we first study them within the context of simple heuristic Voronoi clustering models. Subsequently, we address the topology of structures emerging in the standard LCDM scenario and in cosmological scenarios with alternative dark energy content. The evolution and scale dependence of the Betti numbers are shown to reflect the hierarchical evolution of the Cosmic Web and yield a promising measure of cosmological parameters. We also discuss the expected Betti numbers as a function of the density threshold for superlevel sets of a Gaussian random field. Comment: 42 pages, 14 figures
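
    The alpha-shape filtration described above can be explored with off-the-shelf computational topology tools. The sketch below, which assumes the GUDHI library is available, builds the alpha complex of a random point sample and reads off Betti numbers at a chosen scale; it is an illustration, not the analysis pipeline of the paper.

```python
# Sketch assuming the GUDHI library (`pip install gudhi`); not the paper's
# pipeline. Build the alpha-complex filtration of a point sample and read
# off the Betti numbers of the subcomplex at a chosen scale alpha.
import numpy as np
import gudhi

rng = np.random.default_rng(0)
points = rng.random((500, 3))                 # placeholder point distribution

st = gudhi.AlphaComplex(points=points).create_simplex_tree()
# GUDHI stores alpha-complex filtration values as alpha^2.
alpha = 0.05
st.prune_above_filtration(alpha**2)           # subcomplex at this scale
st.persistence()                              # required before betti_numbers()
print("Betti numbers (b0, b1, b2):", st.betti_numbers())
```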

    Towards understanding $b\bar{b}$ production in $\gamma\gamma$ collisions

    Understanding the data on the total cross section $\sigma_{tot}(e^+e^-\to e^+e^- b\bar{b})$ measured at LEP2 represents a serious challenge for perturbative QCD. In order to unravel the origins of the discrepancy between data and theory, we investigate the dependence of four contributions to this cross section on the $\gamma\gamma$ collision energy. As the reliability of the existing calculations of $\sigma_{tot}(e^+e^-\to e^+e^- b\bar{b})$ depends, among other things, on the stability of calculations of the cross section $\sigma_{tot}(\gamma\gamma\to b\bar{b})$ with respect to variations of the renormalization and factorization scales, we investigate this aspect in detail. We show that in most of the region relevant for the LEP2 data the existing QCD calculations of $\sigma_{tot}(\gamma\gamma\to b\bar{b})$ do not exhibit a region of local stability and should thus be taken with caution. The source of this instability is suggested and its phenomenological implications for the LEP2 data are discussed. Comment: 16 pages, LaTeX with 10 eps figures; misprints corrected, some formulations amended
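
    The notion of "local stability" amounts to looking for a region where the predicted cross section is insensitive to the renormalization/factorization scale, i.e. where $d\sigma/d\mu \approx 0$. The toy scan below illustrates that criterion with a schematic NLO-like expression; the functional form and constants are assumptions made purely to demonstrate the scan and bear no relation to the calculations discussed in the paper.

```python
# Toy scan only: a schematic NLO-like expression is evaluated over the
# renormalization scale mu to look for a region of local stability
# (d sigma / d mu ~ 0). The functional form and constants are assumptions
# made purely to demonstrate the scan, not the paper's calculation.
import numpy as np

def alpha_s(mu, lam=0.2, nf=5):
    """One-loop running coupling."""
    b0 = (33 - 2 * nf) / (12 * np.pi)
    return 1.0 / (b0 * np.log(mu**2 / lam**2))

def sigma_toy(mu, Q=10.0):
    a = alpha_s(mu)
    # LO piece plus a schematic scale-compensating NLO-like term.
    return a**2 * (1.0 + a * (2.0 * np.log(Q / mu) + 1.5))

mus = np.linspace(2.0, 40.0, 400)
dsig = np.gradient(sigma_toy(mus), mus)
print(f"most scale-stable point of the toy expression: "
      f"mu ~ {mus[np.abs(dsig).argmin()]:.1f} GeV")
```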

    Long time stress relaxation of amorphous networks under uniaxial tension: The Dynamic Constrained Junction Model

    Polyisoprene networks with different degrees of cross-linking and filler amount are studied under uniaxial stress relaxation. The time decay of stress obeys a stretched-exponential form with a stretching parameter of 0.4 that is the same for all independent variables, i.e., extension, cross-link density and filler amount. The relaxation time τ increases with increasing strain, and decreases with both cross-link and filler content; the dependence of τ on filler content is less sensitive than on cross-link density. The isochronous Mooney-Rivlin plots show that the phenomenological constant 2C_1 is time independent, and all time dependence results from that of 2C_2, which is associated with relaxation of intermolecular interactions at and above the length scales of network chain dimensions. The relatively low value of the stretching parameter is interpreted in terms of a molecular model where entanglements contribute to relaxation over a wide spectrum of time scales.
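
    The stretched-exponential (KWW) fit referred to above takes the form $\sigma(t) = \sigma_\infty + A\,e^{-(t/\tau)^\beta}$. A minimal fitting sketch with synthetic data ($\beta = 0.4$, matching the reported value) is given below; it is not the authors' analysis code.

```python
# Minimal fitting sketch with synthetic data (not the authors' analysis):
# sigma(t) = sigma_inf + A * exp(-(t / tau)**beta), with beta = 0.4 as in
# the abstract.
import numpy as np
from scipy.optimize import curve_fit

def kww(t, sigma_inf, amp, tau, beta):
    return sigma_inf + amp * np.exp(-(t / tau) ** beta)

rng = np.random.default_rng(2)
t = np.logspace(-1, 4, 200)                                   # time (a.u.)
data = kww(t, 0.3, 0.7, 50.0, 0.4) + 0.005 * rng.standard_normal(t.size)

popt, _ = curve_fit(kww, t, data, p0=[0.2, 1.0, 10.0, 0.5],
                    bounds=([0, 0, 1e-3, 0.05], [2, 2, 1e5, 1.0]))
print("sigma_inf, A, tau, beta =", popt)
```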

    Progressive Simplification of Polygonal Curves

    Simplifying polygonal curves at different levels of detail is an important problem with many applications. Existing geometric optimization algorithms are only capable of minimizing the complexity of a simplified curve for a single level of detail. We present an $O(n^3 m)$-time algorithm that takes a polygonal curve of n vertices and produces a set of consistent simplifications for m scales while minimizing the cumulative simplification complexity. This algorithm is compatible with distance measures such as the Hausdorff, the Fréchet and area-based distances, and enables simplification for continuous scaling in $O(n^5)$ time. To speed up this algorithm in practice, we present new techniques for constructing and representing so-called shortcut graphs. Experimental evaluation of these techniques on trajectory data reveals a significant improvement from using shortcut graphs for progressive and non-progressive curve simplification, both in terms of running time and memory usage. Comment: 20 pages, 20 figures
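
    For context, the shortcut graph underlying such algorithms has an edge (i, j) whenever the subcurve between vertices i and j can be replaced by the single segment i-j within the error tolerance. The sketch below shows a basic single-scale, Imai-Iri-style min-vertex simplification built on that graph; the paper's progressive, multi-scale dynamic program is not reproduced.

```python
# Single-scale, Imai-Iri-style sketch over a shortcut graph (illustration of
# the building block, not the paper's O(n^3 m) progressive algorithm).
import numpy as np

def point_segment_dist(p, a, b):
    ab, ap = b - a, p - a
    t = np.clip(np.dot(ap, ab) / max(np.dot(ab, ab), 1e-12), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def simplify(curve, eps):
    """Min-vertex simplification: shortcut (i, j) is valid iff every vertex
    between i and j lies within eps of segment i-j; BFS then finds a
    simplification using the fewest shortcuts."""
    pts = np.asarray(curve, dtype=float)
    n = len(pts)
    adj = [[j for j in range(i + 1, n)
            if all(point_segment_dist(pts[k], pts[i], pts[j]) <= eps
                   for k in range(i + 1, j))]
           for i in range(n)]
    prev = {0: None}
    frontier = [0]
    while n - 1 not in prev:
        frontier = [j for i in frontier for j in adj[i]
                    if prev.setdefault(j, i) == i]
    path, v = [], n - 1
    while v is not None:
        path.append(v)
        v = prev[v]
    return pts[path[::-1]]

# Example: a sampled arc simplified within tolerance 0.05.
theta = np.linspace(0, np.pi, 60)
curve = np.column_stack([np.cos(theta), np.sin(theta)])
print(len(simplify(curve, 0.05)), "of", len(curve), "vertices kept")
```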