
    Fast Numerical simulations of 2D turbulence using a dynamic model for Subgrid Motions

    We present numerical simulations of 2D turbulent flow using a new model for the subgrid scales, which are computed using a dynamic equation linking the subgrid scales to the resolved velocity. This equation is not postulated but derived from the constitutive equations, under the assumption that the non-linear interactions of the subgrid scales among themselves are equivalent to a turbulent viscosity. The performance of our model is compared with Direct Numerical Simulations of decaying and forced turbulence. At the same resolution, numerical simulations using our model reduce the computational time significantly (by a factor of the order of 100 in the case we consider) and reach significantly larger Reynolds numbers than the direct method.
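
    The turbulent-viscosity assumption at the heart of such closures can be illustrated with a classical Smagorinsky-type eddy viscosity computed from the resolved field. The Python sketch below is a generic illustration under that assumption, not the paper's dynamic subgrid equation; the constant cs and the finite-difference strain rate are choices made here for the example.

    import numpy as np

    def smagorinsky_viscosity(u, v, dx, cs=0.17):
        # Eddy viscosity nu_t = (cs * dx)**2 * |S| from a resolved 2D
        # velocity field (u, v) on a uniform grid of spacing dx. The
        # Smagorinsky constant cs and the central differences are
        # assumptions of this sketch, not the paper's dynamic model.
        dudx = np.gradient(u, dx, axis=1)
        dudy = np.gradient(u, dx, axis=0)
        dvdx = np.gradient(v, dx, axis=1)
        dvdy = np.gradient(v, dx, axis=0)
        # Strain-rate magnitude |S| = sqrt(2 S_ij S_ij) in 2D.
        s_mag = np.sqrt(2.0 * (dudx**2 + dvdy**2) + (dudy + dvdx)**2)
        return (cs * dx) ** 2 * s_mag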

    Non-locality and Intermittency in 3D Turbulence

    Numerical simulations are used to determine the influence of non-local and local interactions on the intermittency corrections to the scaling properties of 3D turbulence. We show that neglecting the local interactions leads to an enhanced small-scale energy spectrum, a significantly larger number of very intense vortices (tornadoes), and stronger intermittency. Neglecting the non-local interactions, on the other hand, results in an even stronger small-scale spectrum but significantly weaker intermittency. Based on these observations, a new model of turbulence is proposed in which the non-local (RDT-like) interactions couple large and small scales via a multiplicative process with additive noise, while the local interactions are modeled by a turbulent viscosity. This model is used to derive a simple toy version of the Langevin equations for small-scale velocity increments. A Gaussian approximation for the large-scale fields yields a Fokker-Planck equation for the probability distribution function of the velocity increments. Steady-state solutions of this equation allow a qualitative explanation of the anomalous corrections and of the generation of skewness across scales.
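
    As a rough illustration of what such a toy Langevin model looks like, the Python sketch below evolves a single velocity increment with a multiplicative Gaussian term standing in for the non-local (RDT-like) coupling, a linear damping standing in for the local interactions modeled as a turbulent viscosity, and additive noise. The Euler-Maruyama discretization and all coefficient values are assumptions of this sketch, not the paper's equations.

    import numpy as np

    rng = np.random.default_rng(0)

    def toy_increment(n_steps=200_000, dt=1e-3, nu_t=1.0,
                      sigma_mult=0.5, sigma_add=0.1):
        # Euler-Maruyama integration of the generic linear Langevin model
        #   d(du) = -nu_t * du dt + sigma_mult * du dW1 + sigma_add dW2,
        # with illustrative (assumed) coefficients.
        du = np.empty(n_steps)
        du[0] = 0.0
        sq = np.sqrt(dt)
        for i in range(1, n_steps):
            du[i] = (du[i - 1]
                     - nu_t * du[i - 1] * dt                                # local damping
                     + sigma_mult * du[i - 1] * sq * rng.standard_normal()  # non-local, multiplicative
                     + sigma_add * sq * rng.standard_normal())              # additive noise
        return du

    x = toy_increment()[100_000:]  # discard the initial transient
    print("excess kurtosis:", np.mean(x**4) / np.mean(x**2) ** 2 - 3.0)

    It is the multiplicative term that produces the fat-tailed, intermittent statistics (the printed excess kurtosis is typically positive); with purely additive noise the stationary distribution of the increments would remain Gaussian.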

    Legacy Software Restructuring: Analyzing a Concrete Case

    Software re-modularization is a long-standing preoccupation of reverse-engineering research. The advantages of a well-structured, well-modularized system are well known. Yet after so much time and effort, the field seems unable to come up with solutions that make a clear difference in practice. Recently, some researchers have started to question whether some basic assumptions of the field were overrated. The main one consists in evaluating the high-cohesion/low-coupling dogma with metrics of unknown relevance. In this paper, we study a real restructuring case (on the Eclipse platform) to better understand whether (some) existing metrics would have helped the software engineers in this task. The results show that the cohesion and coupling metrics used in the experiment did not behave as expected and would probably not have helped the maintainers reach their goal. We also measured another possible restructuring, namely decreasing the number of cyclic dependencies between modules. Again, the results did not meet expectations.
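
    As a side note on the last metric, cyclic dependencies can be counted directly on the module dependency graph; one possible criterion (an assumption of this sketch, not necessarily the paper's definition) counts the edges whose endpoints fall in the same strongly connected component. A minimal Python sketch with a made-up dependency list:

    import networkx as nx

    def cyclic_dependency_edges(edges):
        # Count dependency edges that participate in at least one cycle,
        # i.e. edges whose endpoints lie in the same strongly connected
        # component of the directed dependency graph.
        g = nx.DiGraph(edges)
        component = {}
        for i, scc in enumerate(nx.strongly_connected_components(g)):
            for node in scc:
                component[node] = i
        return sum(1 for u, v in g.edges if component[u] == component[v])

    # Hypothetical module dependencies (not from the Eclipse case study).
    deps = [("ui", "core"), ("core", "util"), ("util", "core"), ("core", "ui")]
    print(cyclic_dependency_edges(deps))  # 4: every edge here lies on a cycle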

    A Bayesian fusion model for space-time reconstruction of finely resolved velocities in turbulent flows from low resolution measurements

    The study of turbulent flows calls for measurements with high resolution in both space and time. We propose a new approach to reconstruct high-temporal, high-spatial-resolution velocity fields by combining two sources of information that are well resolved either in space or in time: Low-Temporal-High-Spatial (LTHS) and High-Temporal-Low-Spatial (HTLS) resolution measurements. In the framework of the co-design of sensing and data post-processing, this work extensively investigates a Bayesian reconstruction approach using a simulated database. A Bayesian fusion model is developed to solve the inverse problem of data reconstruction. The model uses a Maximum A Posteriori estimate, which yields the most probable field given the measurements. A DNS of a wall-bounded turbulent flow at moderate Reynolds number is used to validate the approach and assess its performance. Low-resolution measurements are subsampled in time and space from the fully resolved data, and the reconstructed velocities are compared to the reference DNS to estimate the reconstruction errors. The model is compared to conventional methods such as Linear Stochastic Estimation and cubic spline interpolation; the results show the superior accuracy of the proposed method in all configurations. Further investigation of the model's performance over various ranges of scales demonstrates its robustness. The numerical experiments also permit an estimate of the expected maximum information level corresponding to the limitations of experimental instruments.
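
    The core of such a MAP fusion is easiest to see in the simplest Gaussian setting: with independent, unbiased Gaussian measurement errors and a flat prior, the MAP estimate reduces to a precision-weighted average of the two (interpolated) measurements. The Python sketch below shows only this simplified case, with variances assumed known, and glosses over the full space-time model of the paper.

    import numpy as np

    def map_fusion(y_spatial, y_temporal, var_spatial, var_temporal):
        # MAP fusion of two unbiased Gaussian measurements of one field:
        # with independent errors and a flat prior, the estimate is the
        # precision-weighted average (variances assumed known here).
        w_s, w_t = 1.0 / var_spatial, 1.0 / var_temporal
        return (w_s * y_spatial + w_t * y_temporal) / (w_s + w_t)

    # Hypothetical 1D example: a low-noise (LTHS-like) estimate fused
    # with a noisier (HTLS-like) estimate of the same snapshot.
    rng = np.random.default_rng(1)
    truth = np.sin(np.linspace(0.0, 2.0 * np.pi, 64))
    y_s = truth + 0.1 * rng.standard_normal(64)
    y_t = truth + 0.3 * rng.standard_normal(64)
    x_hat = map_fusion(y_s, y_t, 0.1**2, 0.3**2)
    print("LTHS-only MSE:", np.mean((y_s - truth) ** 2))
    print("fused MSE:    ", np.mean((x_hat - truth) ** 2))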

    BUR, Jacques, Le péché originel. Ce que l'Église a vraiment dit [Original Sin: What the Church Has Really Said]


    Does understanding of the other's intention fall under the influence of a negative bias?

    This paper aims to determine the strategies adults use to attribute a psychological state to a speaker in communication situations where several cues carrying emotional valence are in opposition: do they rely on the cues themselves (context vs. prosody) or on the emotional valence of the cues (positive vs. negative)? Fifty adults performed a computerized judgment task in which they were asked to complete stories. The stories varied according to context (happy or sad) and prosody (happy or sad). The results showed a strategy based on the emotional valence of the cues, and the existence of a negative bias.