    A Multiresolution Census Algorithm for Calculating Vortex Statistics in Turbulent Flows

    The fundamental equations that model turbulent flow do not provide much insight into the size and shape of observed turbulent structures. We investigate the efficient and accurate representation of structures in two-dimensional turbulence by applying statistical models directly to the simulated vorticity field. Rather than extracting the coherent portion of the image from the background variation, as in the classical signal-plus-noise model, we present a model for individual vortices using the non-decimated discrete wavelet transform. A template image, supplied by the user, provides the features to be extracted from the vorticity field. Transforming the vortex template into the wavelet domain breaks specific characteristics of the template, such as size and symmetry, into components associated with spatial frequencies. Multivariate multiple linear regression is used to fit the vortex template to the vorticity field in the wavelet domain; since all levels of the template decomposition may be used to model each level of the field decomposition, the resulting model need not be identical to the template. The model is applied within a vortex census algorithm that records quantities of interest (such as size, peak amplitude, and circulation) as the vorticity field evolves. The multiresolution census algorithm extracts coherent structures of all shapes and sizes in simulated vorticity fields and reproduces known physical scaling laws when processing a set of vorticity fields that evolve over time.
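
    A minimal sketch of the wavelet-domain template fit described above, assuming PyWavelets (pywt) provides the non-decimated (stationary) 2-D transform. The Gaussian template, toy field, decomposition depth, and variable names are illustrative placeholders, not the paper's implementation; a census step would then scan the fitted field for local extrema and record size, amplitude, and circulation.

        # Fit a vortex template to a vorticity field in the wavelet domain.
        import numpy as np
        import pywt

        def swt_bands(image, wavelet="haar", levels=3):
            """Stack the detail sub-bands of a non-decimated 2-D wavelet transform."""
            coeffs = pywt.swt2(image, wavelet, level=levels)
            return np.column_stack([band.ravel()
                                    for _, details in coeffs
                                    for band in details])

        rng = np.random.default_rng(0)
        y, x = np.mgrid[-1:1:64j, -1:1:64j]
        template = np.exp(-(x**2 + y**2) / 0.05)                  # idealized vortex
        field = 0.8 * template + 0.1 * rng.standard_normal((64, 64))

        X = swt_bands(template)      # predictors: template sub-bands
        Y = swt_bands(field)         # responses:  field sub-bands

        # Multivariate multiple linear regression: every template level may
        # contribute to every field level, so the fit need not match the template.
        B, *_ = np.linalg.lstsq(X, Y, rcond=None)
        fitted = X @ B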

    Numerical and experimental assessment of the modal curvature method for damage detection in plate structures

    Use of modal curvatures obtained from modal displacement data for damage detection in isotropic and composite laminated plates is addressed through numerical examples and experimental tests. Numerical simulations are carried out employing COMSOL Multiphysics as the finite element solver of the equations governing the Mindlin-Reissner plate model. Damage is introduced as localized non-smooth variations of the bending stiffness of the baseline (healthy) configuration. Experiments are also performed on steel and aluminum plates using scanning laser vibrometry. The results confirm that using the central difference method to compute modal curvatures greatly amplifies the measurement errors and leads to unreliable damage predictions, even after denoising. Specialized numerical techniques must therefore be implemented to enable structural health monitoring via modal curvature changes. In this study, the Savitzky-Golay filter (also referred to as a least-squares smoothing filter) is considered for the numerical differentiation of noisy data. Numerical and experimental results show that this filter enables reliable computation of modal curvature changes in plate structures due to defects and/or damage.
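
    A minimal sketch of the comparison described above, using scipy.signal.savgol_filter to differentiate a noisy mode shape twice. The sine mode, noise level, and window settings are illustrative assumptions, not the paper's data.

        # Second derivative (curvature) of a noisy mode shape: central
        # differences versus Savitzky-Golay least-squares differentiation.
        import numpy as np
        from scipy.signal import savgol_filter

        n, L = 200, 1.0
        x = np.linspace(0.0, L, n)
        dx = x[1] - x[0]
        mode = np.sin(np.pi * x / L)                   # first bending mode
        noisy = mode + 1e-4 * np.random.default_rng(1).standard_normal(n)

        # Central differences amplify the noise roughly as noise / dx**2.
        curv_cd = np.gradient(np.gradient(noisy, dx), dx)

        # Savitzky-Golay fits a local cubic and differentiates it analytically.
        curv_sg = savgol_filter(noisy, window_length=31, polyorder=3,
                                deriv=2, delta=dx)

        exact = -(np.pi / L) ** 2 * mode
        for name, curv in [("central difference", curv_cd),
                           ("Savitzky-Golay", curv_sg)]:
            print(name, "RMS error:", np.sqrt(np.mean((curv - exact) ** 2)))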

    Topological structures in the equities market network

    We present a new method for articulating scale-dependent topological descriptions of the network structure inherent in many complex systems. The technique is based on "Partition Decoupled Null Models," a new class of null models that incorporate the interaction of clustered partitions into a random model and generalize the Gaussian ensemble. As an application, we analyze a correlation matrix derived from four years of close prices of equities in the NYSE and NASDAQ. In this example we expose (1) a natural structure composed of two interacting partitions of the market that both agrees with and generalizes standard notions of scale (e.g., sector and industry) and (2) structure in the first partition that is a topological manifestation of a well-known pattern of capital flow called "sector rotation." Our approach gives rise to a natural form of multiresolution analysis of the underlying time series, decomposing the basic data in terms of the effects of the different scales at which it clusters. The equities market is a prototypical complex system, and we expect that our approach will be of use in understanding a broad class of complex systems in which correlation structures are resident.
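
    A minimal sketch of the starting point of such an analysis: a correlation matrix built from close prices and turned into a weighted graph. The partition-decoupled null model itself is not reproduced here; the synthetic prices, the 0.4 edge threshold, and the use of networkx are illustrative assumptions.

        # From close prices to a correlation network.
        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(2)
        n_days, n_stocks = 1000, 20

        # Synthetic log-prices: a common market factor plus idiosyncratic noise.
        market = rng.standard_normal((n_days, 1)).cumsum(axis=0)
        noise = rng.standard_normal((n_days, n_stocks)).cumsum(axis=0)
        close = np.exp(0.01 * (market + noise))

        returns = np.diff(np.log(close), axis=0)       # daily log-returns
        C = np.corrcoef(returns, rowvar=False)         # stocks x stocks

        # Keep only strongly correlated pairs as weighted edges.
        G = nx.Graph()
        for i in range(n_stocks):
            for j in range(i + 1, n_stocks):
                if C[i, j] > 0.4:
                    G.add_edge(i, j, weight=C[i, j])
        print(G.number_of_nodes(), "nodes,", G.number_of_edges(), "edges")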

    Community detection for correlation matrices

    A challenging problem in the study of complex systems is that of resolving, without prior information, the emergent mesoscopic organization determined by groups of units whose dynamical activity is more strongly correlated internally than with the rest of the system. Existing techniques to filter correlations are not explicitly oriented towards identifying such modules and can suffer from an unavoidable information loss. A promising alternative is to employ community detection techniques developed in network theory. Unfortunately, this approach has focused predominantly on replacing network data with correlation matrices, a procedure that tends to be intrinsically biased due to its inconsistency with the null hypotheses underlying the existing algorithms. Here we introduce, via a consistent redefinition of null models based on random matrix theory, the appropriate correlation-based counterparts of the most popular community detection techniques. Our methods can filter out both unit-specific noise and system-wide dependencies, and the resulting communities are internally correlated and mutually anti-correlated. We also implement multiresolution and multifrequency approaches revealing hierarchically nested sub-communities with "hard" cores and "soft" peripheries. We apply our techniques to several financial time series and identify mesoscopic groups of stocks that are irreducible to a standard sectorial taxonomy, detect "soft stocks" that alternate between communities, and discuss implications for portfolio optimization and risk management.
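
    A minimal sketch in the spirit of the random-matrix-theory filtering described above, assuming T observations of N standardized series: eigenvalues inside the Marchenko-Pastur bulk (unit-specific noise) and the single largest "market" eigenvalue (system-wide dependency) are discarded, and the sign of the leading eigenvector of the filtered matrix gives a first bipartition. The planted two-community data and the single spectral split are simplifying assumptions; the paper's full method redefines the community-detection null models themselves.

        # RMT-filtered correlation matrix and a spectral bipartition.
        import numpy as np

        rng = np.random.default_rng(3)
        T, N = 2000, 50
        returns = rng.standard_normal((T, N)) + 0.4 * rng.standard_normal((T, 1))
        returns[:, :25] += 0.4 * rng.standard_normal((T, 1))   # community 1
        returns[:, 25:] += 0.4 * rng.standard_normal((T, 1))   # community 2

        C = np.corrcoef(returns, rowvar=False)
        lam, V = np.linalg.eigh(C)

        # Marchenko-Pastur upper edge for a pure-noise correlation matrix.
        lam_max = (1.0 + np.sqrt(N / T)) ** 2

        # Discard the noise bulk (unit-specific) and the largest eigenvalue
        # (system-wide "market" mode), keeping only group-level structure.
        keep = lam > lam_max
        keep[np.argmax(lam)] = False
        C_group = (V[:, keep] * lam[keep]) @ V[:, keep].T

        # The leading eigenvector of the filtered matrix splits the units into
        # two internally correlated, mutually anti-correlated communities.
        w, U = np.linalg.eigh(C_group)
        labels = (U[:, -1] > 0).astype(int)
        print("community sizes:", np.bincount(labels))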

    A perceptual comparison of empirical and predictive region-of-interest video

    When viewing multimedia presentations, a user attends to only a relatively small part of the video display at any one point in time. By shifting the allocation of bandwidth from peripheral areas to those locations where a user’s gaze is more likely to rest, attentive displays can be produced. Attentive displays aim to reduce resource requirements while minimizing negative user perception, understood in this paper as not only a user’s ability to assimilate and understand information but also his or her subjective satisfaction with the video content. This paper introduces and discusses a perceptual comparison between two region-of-interest display (RoID) adaptation techniques. A RoID is an attentive display in which bandwidth has been preallocated around measured or highly probable areas of user gaze. In this paper, video content was manipulated using two sources of data: empirical measured data (captured using eye-tracking technology) and predictive data (calculated from the physical characteristics of the video). Results show that display adaptation causes significant variation in users’ understanding of specific multimedia content. Interestingly, both RoID adaptation and the type of video being presented affect user perception of video quality. Moreover, the use of frame rates below 15 frames per second, for any video adaptation technique, caused a significant reduction in user-perceived quality, suggesting that users not only notice the quality reduction but also assimilate and understand less of the information. Results also highlight that a user’s level of enjoyment is significantly affected by the type of video yet is not as affected by the quality or type of video adaptation, an interesting implication for the field of entertainment.
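
    A minimal sketch of the peripheral degradation underlying a RoID, applied to a single grayscale frame: pixels far from an assumed gaze point are blended toward a low-pass copy, mimicking the reallocation of bandwidth toward the region of interest. The gaze point, radius, and blur strength are placeholders; the empirical variant would take the gaze from eye-tracking data, the predictive variant from characteristics of the video itself.

        # Blend a sharp frame with a blurred copy outside the region of interest.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        def roi_adapt(frame, gaze_xy, radius=60.0, sigma=6.0):
            """Keep detail near the gaze point; low-pass filter the periphery."""
            h, w = frame.shape
            yy, xx = np.mgrid[0:h, 0:w]
            dist = np.hypot(xx - gaze_xy[0], yy - gaze_xy[1])
            mask = np.clip((dist - radius) / radius, 0.0, 1.0)  # 0 in RoI, 1 far out
            blurred = gaussian_filter(frame, sigma=sigma)
            return (1.0 - mask) * frame + mask * blurred

        frame = np.random.default_rng(4).random((240, 320))     # stand-in luminance frame
        adapted = roi_adapt(frame, gaze_xy=(160, 120))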