
    Efficient CPU-Optimized Parameter Estimation for Modeling Fish Schooling Behavior in Large Particle Systems

    The schooling behavior of fish can be studied through simulations involving a large number of interacting particles. In such systems, each individual particle is guided by behavior rules, which include aggregation towards a centroid, collision avoidance, and direction alignment. The movement vector of each particle may be expressed as a linear combination of behaviors, with unknown parameters that define a trade-off among several behavioral constraints. A fitness function for collective schooling behavior encompasses all individual particle parameters. For a large number of interacting particles in a complex environment, heuristic methods, such as evolutionary algorithms, are used to optimize the fitness function, ensuring that the resulting decision rule preserves collective behavior. However, these algorithms exhibit slow convergence, making them inefficient in terms of CPU time cost. This paper proposes a CPU-efficient iterative (Cluster, Partition, Refine -- CPR) algorithm for estimating decision rule parameters for a large number of interacting particles. In the first step, we employ the K-Means (unsupervised learning) algorithm to cluster candidate solutions. Then, we partition the search space using Voronoi tessellation over the defined clusters. We assess the quality of each cluster based on the fitness function, with each cluster represented by the centroid of its Voronoi cell. Subsequently, we refine the search space by introducing new cells into the identified well-fitting Voronoi cells. This process is repeated until convergence. A comparison of the performance of the CPR algorithm with a standard Genetic Algorithm reveals that the former converges faster than the latter. We also demonstrate that the application of the CPR algorithm results in a schooling behavior consistent with empirical observations. Comment: 10 pages.
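    As a rough illustration of the Cluster-Partition-Refine loop described above, here is a minimal NumPy sketch on a toy 3-parameter problem. The fitness function, population sizes, and resampling scale are all hypothetical stand-ins, not the paper's actual schooling model:

```python
import numpy as np

# Hypothetical stand-in for the schooling fitness function: it rewards
# parameter vectors close to an assumed optimal trade-off among the
# aggregation, avoidance, and alignment weights.
TARGET = np.array([0.5, 0.3, 0.2])

def fitness(theta):
    return -np.sum((theta - TARGET) ** 2)

def kmeans(points, k, iters=20, seed=0):
    # Plain Lloyd's algorithm; any K-Means implementation would do here.
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centroids) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids

def cpr(n_candidates=200, k=5, rounds=10, keep=2, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.random((n_candidates, 3))        # candidate parameter vectors
    for _ in range(rounds):
        centroids = kmeans(pop, k)             # Cluster step
        # Partition step: each centroid represents the Voronoi cell of the
        # candidates nearest to it; cells are ranked by centroid fitness.
        scores = np.array([fitness(c) for c in centroids])
        best = centroids[np.argsort(scores)[-keep:]]
        # Refine step: resample new candidates inside the best-fitting cells.
        pop = np.concatenate(
            [b + 0.1 * rng.standard_normal((n_candidates // keep, 3)) for b in best]
        )
    return pop[np.argmax([fitness(p) for p in pop])]

theta = cpr()
```

    The loop concentrates candidates around the highest-fitness Voronoi cells, which is where the claimed CPU advantage over a genetic algorithm would come from.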

    Fast B-spline Curve Fitting by L-BFGS

    We propose a novel method for fitting planar B-spline curves to unorganized data points. In traditional methods, optimization of the control points and foot points is performed in two very time-consuming steps in each iteration: 1) the control points are updated by setting up and solving a linear system of equations; and 2) the foot points are computed by projecting each data point onto the B-spline curve. Our method uses the L-BFGS optimization method to optimize the control points and foot points simultaneously, so it needs neither matrix computation nor foot-point projection in each iteration. As a result, our method is much faster than existing methods.
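    The joint optimization over control points and foot points can be sketched with SciPy's L-BFGS-B solver on a synthetic sine arc. The dataset, knot vector, and parameter counts below are illustrative; the paper's own formulation may differ in details such as regularization or analytic gradients:

```python
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import minimize

# Illustrative data: 50 planar points sampled from a sine arc.
m = 50
u = np.linspace(0.0, 1.0, m)
data = np.column_stack([u, np.sin(2 * np.pi * u)])

degree, n_ctrl = 3, 8
# Clamped knot vector: len(knots) = n_ctrl + degree + 1.
knots = np.concatenate([np.zeros(degree),
                        np.linspace(0.0, 1.0, n_ctrl - degree + 1),
                        np.ones(degree)])

def unpack(x):
    ctrl = x[:2 * n_ctrl].reshape(n_ctrl, 2)   # control points
    t = np.clip(x[2 * n_ctrl:], 0.0, 1.0)      # foot-point parameters
    return ctrl, t

def objective(x):
    # Sum of squared distances between data points and their foot points
    # on the curve; control points and foot points vary together.
    ctrl, t = unpack(x)
    return np.sum((BSpline(knots, ctrl, degree)(t) - data) ** 2)

# Initialize control points on the data and foot points uniformly.
idx = np.linspace(0, m - 1, n_ctrl).astype(int)
x0 = np.concatenate([data[idx].ravel(), u])

res = minimize(objective, x0, method="L-BFGS-B")
ctrl_opt, t_opt = unpack(res.x)
```

    Because both variable groups sit in one unknown vector, each L-BFGS iteration avoids the linear solve and the point-projection step of the traditional alternation.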

    Fitting Jump Models

    We describe a new framework for fitting jump models to a sequence of data. The key idea is to alternate between minimizing a loss function to fit multiple model parameters, and minimizing a discrete loss function to determine which set of model parameters is active at each data point. The framework is quite general and encompasses popular classes of models, such as hidden Markov models and piecewise affine models. The shape of the chosen loss functions determines the shape of the resulting jump model. Comment: Accepted for publication in Automatica.
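    The alternation described above can be sketched for the simplest instance, a piecewise-constant model with a fixed per-jump penalty, where the discrete step is solved exactly by dynamic programming. The quadratic loss and the penalty value lam are illustrative choices, not the paper's only options:

```python
import numpy as np

def fit_jump_model(y, K=2, lam=0.5, iters=10):
    # Spread the initial per-mode means over the data range.
    theta = np.quantile(y, np.linspace(0.0, 1.0, K))
    T = len(y)
    s = np.zeros(T, dtype=int)
    for _ in range(iters):
        # Step 1: choose the mode sequence by dynamic programming,
        # charging lam for every jump between consecutive modes.
        loss = (y[:, None] - theta[None, :]) ** 2        # (T, K) pointwise loss
        V = loss[0].copy()
        back = np.zeros((T, K), dtype=int)
        for t in range(1, T):
            trans = V[:, None] + lam * (1.0 - np.eye(K)) # staying is free
            back[t] = np.argmin(trans, axis=0)
            V = loss[t] + trans[back[t], np.arange(K)]
        s[-1] = np.argmin(V)
        for t in range(T - 1, 0, -1):                    # backtrack
            s[t - 1] = back[t, s[t]]
        # Step 2: refit each mode's parameter on the points assigned to it.
        for k in range(K):
            if np.any(s == k):
                theta[k] = y[s == k].mean()
    return theta, s

# Toy sequence: 30 samples near 0, then 30 near 1.
rng = np.random.default_rng(2)
y = np.concatenate([np.zeros(30), np.ones(30)]) + 0.05 * rng.standard_normal(60)
theta, s = fit_jump_model(y)
```

    With a Gaussian loss and a per-switch penalty, the discrete step reduces to a Viterbi-style recursion, which is why hidden Markov models fall out as a special case.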

    The use of the bimodal production decline curve for the analysis of hydraulically fractured shale/tight gas reservoirs

    The capability to conduct a rapid, near real-time model-based analysis of production data from tight/shale (TS) gas fields is important in determining fracture and matrix properties. Model-based analysis of production can range from simple analytical solutions to complex numerical models. The objective of this study is to develop a simple, Excel-based tool for the analysis of the complex problem of gas production from a fractured TS gas reservoir that is based on a robust model that is faithful to the underlying physics and can provide rapid estimates of the important system parameters. The scientifically robust model used as the basis for this tool is a significant modification and expansion of the bimodal production decline curve of Silin and Kneafsey (2012). The production period is divided into two regimes: an early-time regime before the extent of the stimulated reservoir volume (SRV) is felt, where an analytical similarity solution for gas production rate is obtained, and a late-time regime where the rate can be approximated with an exponential decline or, more accurately, represented by numerical integration. Our basic model follows Silin and Kneafsey (2012) and produces the widely observed -½ slope on a log-log plot of early-time production decline curves, while our expanded model generalizes this slope to -n, where 0 < n < 1, to represent non-ideal flow geometries. The expanded model was programmed into an Excel spreadsheet to develop an interactive, user-friendly application for curve matching of well production data to the bimodal curve, from which matrix and fracture properties can be extracted. This tool allows significant insight into the model parameters that control the reservoir behavior and production: the geometry of the hydraulically induced fracture network, its flow and transport properties, and the optimal operational parameters.
This information enables informed choices about future operations, and is valuable in several different ways: (a) to estimate reserves and to predict future production, including expected ultimate recovery and the useful lifetime of the stage or the well; (b) if curve-matching is unsuccessful, to indicate the inadequacy of the mathematical model and the need for a more complex numerical model to analyze the system; (c) to verify/validate numerical models, and to identify anomalous behavior or measurement errors in the data. The present approach can be adapted to gas-flow problems in dual-permeability media (hydraulically or naturally fractured) or highly heterogeneous sedimentary rock, as well as to retrograde condensation.
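    The two-regime structure can be sketched numerically. The power-law and exponential forms below, and the continuous matching at the crossover time, are schematic stand-ins for illustration, not the Silin and Kneafsey (2012) similarity solution itself:

```python
import numpy as np

def bimodal_rate(t, q_star, t_star, n=0.5, tau=2.0):
    # Early time (t < t_star, before the SRV boundary is felt): a power-law
    # decline with log-log slope -n (n = 1/2 for the ideal geometry).
    # Late time: an exponential tail, matched continuously at t_star.
    early = q_star * (t / t_star) ** (-n)
    late = q_star * np.exp(-(t - t_star) / tau)
    return np.where(t < t_star, early, late)

t = np.logspace(-2, 1, 300)                  # dimensionless time grid
q = bimodal_rate(t, q_star=100.0, t_star=1.0)
slope = np.gradient(np.log(q), np.log(t))    # local log-log slope
```

    Plotting log q against log t for t < t_star gives a straight line of slope -n, which is exactly the feature the tool matches against field production data.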

    Time-dependent mode structure for Lyapunov vectors as a collective movement in quasi-one-dimensional systems

    Time-dependent mode structure for the Lyapunov vectors associated with the stepwise structure of the Lyapunov spectra and its relation to the momentum auto-correlation function are discussed in quasi-one-dimensional many-hard-disk systems. We demonstrate mode structures (Lyapunov modes) for all components of the Lyapunov vectors, which include the longitudinal and transverse components of their spatial and momentum parts, and their phase relations are specified. These mode structures are suggested from the form of the Lyapunov vectors corresponding to the zero-Lyapunov exponents. Spatial node structures of these modes are explained by the reflection properties of the hard walls used in the models. Our main interest is the time-oscillating behavior of Lyapunov modes. It is shown that the largest time-oscillating period of the Lyapunov modes is twice as long as the time-oscillating period of the longitudinal momentum auto-correlation function. This relation is satisfied irrespective of the particle number and boundary conditions. A simple explanation for this relation is given based on the form of the Lyapunov vector. Comment: 39 pages, 21 figures; a manuscript including figures of better quality is available from http://www.phys.unsw.edu.au/~gary/Research.htm

    Numerical Fitting-based Likelihood Calculation to Speed up the Particle Filter

    The likelihood calculation of a vast number of particles is the computational bottleneck for the particle filter in applications where the observation information is rich. To compute particle likelihoods quickly, a numerical fitting approach is proposed that constructs the Likelihood Probability Density Function (Li-PDF) using a comparatively small number of so-called fulcrums. The likelihood of particles is thereby analytically inferred, explicitly or implicitly, from the Li-PDF rather than computed directly from the observation, which significantly reduces the computation and enables real-time filtering. The proposed approach guarantees the estimation quality when an appropriate fitting function and properly distributed fulcrums are used. The construction of the fitting function and of the fulcrums is addressed in detail. In particular, to deal with multivariate fitting, a nonparametric kernel density estimator is presented, which is flexible and convenient for implicit Li-PDF implementation. Simulation comparison with a variety of existing approaches on a benchmark 1-dimensional model and on multi-dimensional robot localization and visual tracking demonstrates the validity of our approach. Comment: 42 pages, 17 figures, 4 tables and 1 appendix. This paper is a draft/preprint of one paper submitted to the IEEE Transaction
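    The core idea, evaluating an expensive likelihood only at a few fulcrums and fitting it for all particles, can be sketched in one dimension. The Gaussian observation model, the fulcrum placement, and the linear interpolation used as the fitting function below are all illustrative assumptions, not the paper's specific construction:

```python
import numpy as np

def expensive_likelihood(x, z, sigma=1.0):
    # Stand-in for a costly observation model p(z | x).
    return np.exp(-0.5 * ((x - z) / sigma) ** 2)

def lipdf_weights(particles, z, n_fulcrums=25):
    # Evaluate the costly likelihood only at a few fulcrums spanning the
    # particle cloud, then interpolate cheaply for every particle.
    fulcrums = np.linspace(particles.min(), particles.max(), n_fulcrums)
    vals = expensive_likelihood(fulcrums, z)   # n_fulcrums costly calls
    w = np.interp(particles, fulcrums, vals)   # one cheap lookup per particle
    return w / w.sum()

rng = np.random.default_rng(0)
particles = rng.normal(0.0, 2.0, 5000)         # prior particle cloud
w = lipdf_weights(particles, z=0.5)            # observation z = 0.5
```

    Here 5000 likelihood evaluations are replaced by 25, at the cost of a small interpolation error in the weights, which is the trade-off the paper's fitting-function and fulcrum-placement analysis is meant to control.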