
    Set-based corral control in stochastic dynamical systems: Making almost invariant sets more invariant

    We consider the problem of stochastic prediction and control in a time-dependent stochastic environment, such as the ocean, where escape from an almost invariant region occurs due to random fluctuations. We determine high-probability control-actuation sets by computing regions of uncertainty, almost invariant sets, and Lagrangian Coherent Structures. The combination of geometric and probabilistic methods allows us to design regions of control that increase loitering time while minimizing the amount of control actuation. We show that the loitering time in almost invariant sets scales exponentially with the control actuation, so that small changes in actuation force produce exponential increases in loitering time. The result is that the control actuation makes almost invariant sets more invariant.
    Comment: 28 pages, 12 figures; final revision to appear in Chaos
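    The exponential scaling of loitering time described above is the classical Kramers-type escape behavior, and it can be illustrated with a minimal one-dimensional toy sketch (an illustrative assumption, not the paper's ocean model): a particle in a double well V(x) = x^4/4 - a*x^2/2 under additive noise, where deepening the well (a stand-in for control actuation) raises the barrier and the mean escape time grows roughly exponentially.

```python
import numpy as np

def mean_escape_time(a, sigma=0.6, dt=0.01, n_traj=200, max_steps=100_000, seed=0):
    """Mean first time to cross the barrier at x = 0 for the SDE
    dx = -(x**3 - a*x) dt + sigma dW, started at the right-hand well minimum.
    All parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    x = np.full(n_traj, np.sqrt(a))        # well minimum of V(x) = x^4/4 - a x^2/2
    t_escape = np.full(n_traj, np.nan)
    alive = np.ones(n_traj, dtype=bool)
    for step in range(1, max_steps + 1):
        drift = -(x[alive]**3 - a * x[alive])
        noise = sigma * np.sqrt(dt) * rng.standard_normal(alive.sum())
        x[alive] = x[alive] + drift * dt + noise   # Euler-Maruyama step
        crossed = alive & (x < 0.0)                # barrier top sits at x = 0
        t_escape[crossed] = step * dt
        alive &= ~crossed
        if not alive.any():
            break
    return np.nanmean(t_escape)
```

    Raising `a` from 1.0 to 1.4 roughly doubles the barrier height `a**2 / 4`, and the sampled mean escape time grows by a factor of several, consistent with an exponential law in barrier height over noise strength.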

    Group Tracking Algorithm for Crowded Scene


    Detection of Coherent Structures in Flows

    In this work, we have developed an experimental flow tank that can produce realistic ocean-like flows, including multi-gyre flows. By generating controllable ocean-like flow fields, we can study the flows to gain a better understanding of ocean dynamics. In particular, we use particle image velocimetry and finite-time Lyapunov exponents to determine the location of the Lagrangian Coherent Structures that determine transport in complex fluid flows. This understanding is useful for designing control algorithms and for optimizing the use of autonomous vehicles operating in the stochastic and time-dependent ocean environment.
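    The FTLE pipeline referred to above (integrate a flow map on a grid, form the Cauchy-Green tensor, take the largest eigenvalue) can be sketched as follows. A steady double-gyre velocity field stands in for the tank's PIV data; the field, grid sizes, and forward-Euler integrator are illustrative assumptions, not the experimental setup.

```python
import numpy as np

def velocity(x, y, A=0.1):
    # Steady double-gyre: stream function psi = A sin(pi x) sin(pi y) on [0,2]x[0,1]
    u = -np.pi * A * np.sin(np.pi * x) * np.cos(np.pi * y)
    v =  np.pi * A * np.cos(np.pi * x) * np.sin(np.pi * y)
    return u, v

def flow_map(x0, y0, T=10.0, dt=0.05):
    """Advect a grid of initial conditions for time T (forward Euler for brevity)."""
    x, y = x0.copy(), y0.copy()
    for _ in range(int(T / dt)):
        u, v = velocity(x, y)
        x = x + u * dt
        y = y + v * dt
    return x, y

def ftle_field(nx=60, ny=30, T=10.0):
    xs = np.linspace(0, 2, nx)
    ys = np.linspace(0, 1, ny)
    X, Y = np.meshgrid(xs, ys)
    FX, FY = flow_map(X, Y, T)
    dx, dy = xs[1] - xs[0], ys[1] - ys[0]
    # Jacobian of the flow map by finite differences
    dFXdx = np.gradient(FX, dx, axis=1); dFXdy = np.gradient(FX, dy, axis=0)
    dFYdx = np.gradient(FY, dx, axis=1); dFYdy = np.gradient(FY, dy, axis=0)
    # Largest eigenvalue of the Cauchy-Green tensor C = J^T J
    a = dFXdx**2 + dFYdx**2
    b = dFXdx * dFXdy + dFYdx * dFYdy
    c = dFXdy**2 + dFYdy**2
    lam_max = 0.5 * (a + c + np.sqrt((a - c)**2 + 4 * b**2))
    return np.log(np.sqrt(lam_max)) / T
```

    Ridges of the returned field approximate the Lagrangian Coherent Structures; in practice an RK4 integrator would be preferred over forward Euler.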

    Spectral, Combinatorial, and Probabilistic Methods in Analyzing and Visualizing Vector Fields and Their Associated Flows

    In this thesis, we introduce several tools, each coming from a different branch of mathematics, for analyzing real vector fields and their associated flows. Beginning with a discussion of generalized vector field decompositions, most of which derive from the classical Helmholtz-Hodge decomposition, we decompose a field, with respect to an arbitrary vector-valued linear differential operator, into a kernel component and a remainder. This allows us to construct decompositions of toroidal flows, or of flows obeying differential equations of second (or even fractional) order, together with a remainder. The algorithm is based on the fast Fourier transform, which guarantees rapid processing and an implementation that follows directly from the spectral simplification of differentiation.
    Moreover, we present two combinatorial methods for processing 3D steady vector fields, both of which use graph algorithms to extract features from the underlying vector field. Combinatorial approaches are known to be less sensitive to noise than the extraction of individual trajectories. Both methods extend an existing 2D technique to 3D fields. We observed that the first technique can generate overly coarse results, so we present a second method that works from the same concepts but produces more detailed results. Finally, we discuss several possibilities for categorizing the invariant sets with respect to the flow.
    Existing methods for analyzing the separation of streamlines are often restricted to a finite time or a local area. Within this work, we introduce a new method that complements them by allowing an infinite-time evaluation of steady planar vector fields. Our algorithm unifies combinatorial and probabilistic methods and introduces the concept of separation in time-discrete Markov chains. We compute particle distributions instead of the streamlines of single particles: we encode the flow into a map and then into a transition matrix for each time direction. We compare the results of our grid-independent algorithm to the popular finite-time Lyapunov exponents and discuss the discrepancies.
    Gauss' theorem, which relates the flow through a surface to the vector field inside the surface, is an important tool in flow visualization. We exploit the fact that the theorem can be further refined on polygonal cells, and construct a process that encodes the particle movement through the boundary facets of these cells using transition matrices. By pure power iteration of transition matrices, various topological features, such as separation and invariant sets, can be extracted without relying on classical techniques such as interpolation, differentiation, and numerical streamline integration.
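    The encode-the-flow-into-a-transition-matrix step can be sketched with a minimal Ulam-type discretization of a one-dimensional map (a toy stand-in for the thesis's polygonal-cell construction; box counts and sample sizes are arbitrary illustrative choices):

```python
import numpy as np

def ulam_matrix(step, n_boxes=50, samples_per_box=200, seed=0):
    """Ulam-type transition matrix for a map `step` acting on [0, 1):
    P[i, j] estimates the fraction of box i that lands in box j."""
    rng = np.random.default_rng(seed)
    P = np.zeros((n_boxes, n_boxes))
    edges = np.linspace(0.0, 1.0, n_boxes + 1)
    for i in range(n_boxes):
        pts = rng.uniform(edges[i], edges[i + 1], samples_per_box)
        dest = ((step(pts) % 1.0) * n_boxes).astype(int)
        dest = np.minimum(dest, n_boxes - 1)
        for j in dest:
            P[i, j] += 1.0 / samples_per_box
    return P

def stationary_density(P, iters=500):
    """Approximate the invariant density by pure power iteration
    of the left eigenproblem rho P = rho."""
    rho = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):
        rho = rho @ P
        rho /= rho.sum()
    return rho
```

    For the doubling map `x -> 2x mod 1` the recovered stationary density is approximately uniform; for systems with almost-invariant sets, the slowly decaying modes of `P` (e.g. the second eigenvector) reveal them.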

    Uncertainty in finite-time Lyapunov exponent computations

    The Finite-Time Lyapunov Exponent (FTLE) is a well-established numerical tool for assessing stretching rates of initial parcels of fluid, which are advected according to a given time-varying velocity field (often available only as data). When viewed as a field over initial conditions, the FTLE's spatial structure is often used to infer the nonhomogeneous transport. Given the measurement and resolution errors inevitably present in the unsteady velocity data, the computed FTLE field should in reality be treated only as an approximation. A method which, for the first time, is able to attribute spatially varying errors to the FTLE field is developed. The formulation is, however, confined to two-dimensional flows. Knowledge of the errors prevents reaching erroneous conclusions based only on the FTLE field. Moreover, it is established that increasing the spatial resolution does not improve the accuracy of the FTLE field in the presence of velocity uncertainties, and indeed has the opposite effect. Stochastic simulations are used to validate and exemplify these results, and demonstrate the computability of the error field.
    Sanjeeva Balasuriya
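    The counterintuitive result that finer spatial resolution worsens FTLE accuracy under velocity uncertainty can be seen in a one-dimensional caricature (all parameters here are illustrative, not taken from the paper): the finite-difference gradient of a noisy flow map carries an error of order noise/dx, so shrinking dx amplifies it.

```python
import numpy as np

def ftle_error(dx, lam=0.5, T=5.0, eps=1e-3, n=2000, seed=1):
    """Spread (std) of 1D FTLE estimates when the true flow map
    F(x) = exp(lam*T) * x is observed with additive noise of size eps,
    and the stretching is estimated by a central difference over 2*dx."""
    rng = np.random.default_rng(seed)
    F_right = np.exp(lam * T) * dx + eps * rng.standard_normal(n)
    F_left = -np.exp(lam * T) * dx + eps * rng.standard_normal(n)
    grad = (F_right - F_left) / (2.0 * dx)   # noisy gradient of the flow map
    ftle = np.log(np.abs(grad)) / T
    return ftle.std()
```

    Halving `dx` doubles the noise contribution to the gradient, so the FTLE spread blows up as the grid is refined while the measurement noise `eps` stays fixed.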

    Visuelle Analyse großer Partikeldaten (Visual Analysis of Large Particle Data)

    Particle simulations are an established and widely used numerical method in research and engineering. For example, particle simulations are employed to study fuel atomization in aircraft turbines, and the formation of the universe is investigated by simulating dark-matter particles. The amounts of data produced are immense: current simulations contain trillions of particles that move and interact with one another over time. Visualization offers great potential for the exploration, validation, and analysis of scientific data sets and their underlying models. However, its focus usually lies on structured data with a regular topology. Particles, by contrast, move freely through space and time; in physics this viewpoint is known as the Lagrangian frame of reference. Particles can be converted from the Lagrangian frame into a regular Eulerian frame of reference, for example onto a uniform grid, but for large numbers of particles this entails considerable cost, and the conversion usually causes a loss of precision together with increased memory consumption. In this dissertation I investigate new visualization techniques that build directly on the Lagrangian viewpoint, enabling an efficient and effective visual analysis of large particle data.

    Stochastic sensitivity: a computable Lagrangian uncertainty measure for unsteady flows

    Uncertainties in velocity data are often ignored when computing Lagrangian particle trajectories of fluids. Modeling these uncertainties as noise in the velocity field leads to a random deviation from each trajectory. This deviation is examined within the context of small (multiplicative) stochasticity applying to a two-dimensional unsteady flow operating over a finite time. These assumptions are motivated precisely by standard availability expectations of realistic velocity data. Explicit expressions for the deviation's expected size and anisotropy are obtained using an Itô calculus approach, thereby characterizing the uncertainty in the Lagrangian trajectory's final location with respect to lengthscale and direction. These provide a practical methodology for ascribing spatially nonuniform uncertainties to predictions of flows, and also provide new tools for extracting fluid regions that remain robust under velocity fluctuations.
    Sanjeeva Balasuriya
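    Closed-form uncertainty expressions of this kind can be checked against brute-force stochastic simulation. As a generic sketch (using a simple steady shear flow rather than the paper's general unsteady setting, with all parameters chosen for illustration), an Euler-Maruyama ensemble makes the anisotropy of the final-position uncertainty directly visible:

```python
import numpy as np

def final_spread(T=2.0, dt=0.01, eps=0.05, n=5000, seed=0):
    """Euler-Maruyama ensemble for dx = y dt + eps dW1, dy = eps dW2
    (a shear flow with small additive velocity noise), all trajectories
    started at the origin. Returns the 2x2 covariance of final positions."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    y = np.zeros(n)
    for _ in range(int(T / dt)):
        dW = np.sqrt(dt) * rng.standard_normal((2, n))
        x = x + y * dt + eps * dW[0]
        y = y + eps * dW[1]
    return np.cov(np.vstack([x, y]))
```

    For this flow the along-shear variance picks up an extra eps**2 * T**3 / 3 term on top of the eps**2 * T diffusive term, so the covariance satisfies C[0, 0] > C[1, 1]: the uncertainty ellipse is anisotropic, which is precisely the kind of directional information a Lagrangian uncertainty measure encodes.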

    Lagrangian Descriptors with Uncertainty

    Lagrangian descriptors provide a global dynamical picture of the geometric structures for arbitrarily time-dependent flows, with broad applications. This paper develops a mathematical framework for computing Lagrangian descriptors when uncertainty appears. The uncertainty originates from estimating the underlying flow field, as a natural consequence of data assimilation or statistical forecast, and it also appears in the resulting Lagrangian trajectories. The uncertainty in the flow field directly affects the path integration of the crucial nonlinear positive scalar function in computing the Lagrangian descriptor, making it fundamentally different from many other diagnostic methods. Despite being highly nonlinear and non-Gaussian, closed analytic formulae are developed to efficiently compute the expectation of such a scalar function under the uncertain velocity field by exploiting suitable approximations. A rapid and accurate sampling algorithm is then built to assist the forecast of the probability density function (PDF) of the Lagrangian trajectories. Such a PDF provides the weight to combine the Lagrangian descriptors along different paths. Simple but illustrative examples are designed to show the distinguished behavior of using Lagrangian descriptors in revealing the flow field when uncertainty appears. Uncertainty can either completely erode the coherent structure or barely affect the underlying geometry of the flow field. The method is also applied to eddy identification, indicating that uncertainty has distinct impacts on detecting eddies at different time scales. Finally, when uncertainty is incorporated into the Lagrangian descriptor for inferring the source target, the likelihood criterion provides a very different conclusion from the deterministic methods.
    Comment: 51 pages, 17 figures
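    The arc-length variant of a Lagrangian descriptor, with the expectation over uncertain velocities approximated here by plain sampling rather than the paper's closed-form approach, can be sketched as follows (the saddle flow and all parameters are illustrative assumptions):

```python
import numpy as np

def lagrangian_descriptor(x0, y0, T=5.0, dt=0.01, eps=0.0, n_samples=1, seed=0):
    """Arc-length descriptor M = E[ integral of |dx/dt| dt ] along trajectories
    of the saddle flow (u, v) = (x, -y), with optional additive velocity noise
    of size eps, averaged over n_samples realizations (forward Euler)."""
    rng = np.random.default_rng(seed)
    M = 0.0
    for _ in range(n_samples):
        x, y = x0, y0
        m = 0.0
        for _ in range(int(T / dt)):
            u = x + eps * rng.standard_normal()    # uncertain velocity sample
            v = -y + eps * rng.standard_normal()
            m += np.hypot(u, v) * dt               # accumulate arc length
            x += u * dt
            y += v * dt
        M += m / n_samples
    return M
```

    Initial conditions off the stable manifold of the saddle (x0 != 0) escape along the unstable direction and accumulate far more arc length, so the descriptor field is sharply larger there; raising `eps` smears this contrast, mirroring the observation above that uncertainty can erode coherent structures.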