    From Classical to Quantum and Back: Hamiltonian Adaptive Resolution Path Integral, Ring Polymer, and Centroid Molecular Dynamics

    Path integral-based simulation methodologies play a crucial role in the investigation of nuclear quantum effects by means of computer simulations. However, these techniques are significantly more demanding than corresponding classical simulations. To reduce this numerical effort, we recently proposed a method, based on a rigorous Hamiltonian formulation, that restricts the quantum modeling to a small but relevant spatial region within a larger reservoir where particles are treated classically. In this work, we extend this idea and show how it can be implemented along with state-of-the-art path integral simulation techniques, such as ring polymer and centroid molecular dynamics, which allow the approximate calculation of both quantum statistical and quantum dynamical properties. To this end, we derive a new integration algorithm that also makes use of multiple time-stepping. The scheme is validated via adaptive classical/path-integral simulations of liquid water. Potential applications of the proposed multiresolution method are diverse and include efficient quantum simulations of interfaces as well as of complex biomolecular systems such as membranes and proteins.
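
    As a concrete illustration of the ingredients named above (a minimal sketch, not the authors' Hamiltonian adaptive-resolution integrator), here is a one-dimensional ring-polymer MD step with a RESPA-style multiple-time-step split: the stiff intra-polymer spring forces take several small inner steps per outer step of the physical force. The harmonic external potential and all parameters are illustrative choices.

```python
import numpy as np

# Minimal 1D ring-polymer MD sketch with a RESPA-style multiple-time-step
# split: the cheap, stiff intra-polymer spring forces take several small
# inner steps per outer step of the physical force. Units with hbar = 1;
# the harmonic external potential and all parameters are illustrative.

n_beads = 16
m = 1.0
beta = 8.0                    # inverse temperature
beta_n = beta / n_beads
omega_n = 1.0 / beta_n        # ring-polymer spring frequency (hbar = 1)

def physical_force(q):
    """Slow force: external potential V(q) = q**2 / 2 acting on each bead."""
    return -q

def spring_force(q):
    """Fast force: harmonic nearest-neighbour coupling around the ring."""
    return m * omega_n**2 * (np.roll(q, 1) + np.roll(q, -1) - 2.0 * q)

def mts_step(q, p, dt_outer, n_inner=8):
    """Half kick with the slow force, n_inner velocity-Verlet steps under
    the fast spring force, then the closing half kick (RESPA splitting)."""
    dt_inner = dt_outer / n_inner
    p = p + 0.5 * dt_outer * physical_force(q)
    for _ in range(n_inner):
        p = p + 0.5 * dt_inner * spring_force(q)
        q = q + dt_inner * p / m
        p = p + 0.5 * dt_inner * spring_force(q)
    p = p + 0.5 * dt_outer * physical_force(q)
    return q, p

rng = np.random.default_rng(0)
q = rng.normal(scale=0.1, size=n_beads)
p = rng.normal(scale=np.sqrt(m / beta_n), size=n_beads)  # bead momenta at T_n
for _ in range(1000):
    q, p = mts_step(q, p, dt_outer=0.05)
print("centroid position:", q.mean())
```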

    Particle-by-Particle Reconstruction of Ultrafiltration Cakes in 3D from Binarized TEM Images

    Transmission electron microscopy (TEM) imaging is one of the few techniques available for direct observation of the microstructure of ultrafiltration cakes. TEM images yield local microstructural information in the form of two-dimensional grayscale images of slices a few particle diameters in thickness. This work presents an innovative particle-by-particle reconstruction scheme for simulating ultrafiltration cake microstructure in three dimensions from TEM images. The scheme uses binarized TEM images, thereby permitting the use of lower-quality images. It is able to account for short- and long-range order within the ultrafiltration cake structure by matching the morphology of simulated and measured microstructures at a number of resolutions and scales identifiable within the observed microstructure. In the end, the simulated microstructures are intended to improve our understanding of the relationships between cake morphology, ultrafiltration performance, and operating conditions.
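
    A hedged sketch of one building block such morphology matching can rest on: the radially averaged two-point correlation function S2(r) of a binarized micrograph, computed by FFT autocorrelation. Comparing descriptors like S2 between simulated and measured images is a standard way to match microstructures (Yeong-Torquato style); the paper's particle-by-particle scheme is more elaborate, and the toy disc image below is purely illustrative.

```python
import numpy as np

# Radially averaged two-point correlation S2(r) of a binarized (0/1) image,
# computed via FFT autocorrelation with periodic boundaries. S2(r) is the
# probability that two points a distance r apart both lie in the particle
# phase; the r = 0 value equals the phase volume fraction.

def two_point_correlation(img):
    f = np.fft.fft2(img)
    corr = np.fft.ifft2(f * np.conj(f)).real / img.size  # periodic autocorrelation
    corr = np.fft.fftshift(corr)
    cy, cx = np.array(corr.shape) // 2
    y, x = np.indices(corr.shape)
    r = np.hypot(y - cy, x - cx).astype(int)             # integer radial bins
    return np.bincount(r.ravel(), weights=corr.ravel()) / np.bincount(r.ravel())

# Toy binary image: random discs standing in for particles in a cake slice.
rng = np.random.default_rng(1)
img = np.zeros((256, 256))
yy, xx = np.indices(img.shape)
for cy, cx in rng.integers(0, 256, size=(80, 2)):
    img[(yy - cy) ** 2 + (xx - cx) ** 2 < 6 ** 2] = 1.0

print(two_point_correlation(img)[:10])   # decays away from the volume fraction
```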

    The Beylkin-Cramer Summation Rule and A New Fast Algorithm of Cosmic Statistics for Large Data Sets

    Based on the Beylkin-Cramer summation rule, we introduce a new fast algorithm that enables us to explore high-order statistics efficiently in large data sets. Central to this technique is the decomposition of both fields and operators within the framework of multiresolution analysis (MRA), and the realization of their discrete representations. Accordingly, a homogeneous point process can be equivalently described by the operation of a Toeplitz matrix on a vector, which is accomplished using the fast Fourier transform. The algorithm can be applied widely in cosmic statistics to tackle large data sets. In particular, we demonstrate this technique using spherical, cubic, and cylindrical counts in cells. Numerical tests show that the algorithm produces excellent agreement with the expected results. Moreover, the algorithm naturally introduces a sharp filter, which is capable of suppressing shot noise in weak signals. In its numerical procedures, the algorithm is somewhat similar to particle-mesh (PM) methods in N-body simulations. Scaling as O(N log N), it is significantly faster than current particle-based methods, and its computational cost does not depend on the shape or size of the sampling cells. In addition, based on this technique, we propose a simple fast scheme to compute second-order statistics of cosmic density fields and validate it using simulation samples. We hope the technique developed here will allow a comprehensive study of the non-Gaussianity of cosmic fields in high-precision cosmology. A specific implementation of the algorithm is publicly available upon request to the author.
    Comment: 27 pages, 9 figures. Revised version; changes include (a) a new fast algorithm for second-order statistics, (b) more numerical tests, including counts in asymmetric cells, two-point correlation functions, and second-order variances, and (c) more discussion of the technique.
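
    A minimal sketch of the O(N log N) counts-in-cells idea, assuming a plain particle-mesh discretization rather than the paper's multiresolution (Beylkin-Cramer) decomposition: grid the particles, then obtain the count inside a top-hat cell around every mesh point at once via a single FFT convolution. All sizes below are illustrative.

```python
import numpy as np

# Counts-in-cells for *every* cell centre at once, as an FFT convolution of
# a gridded density with a top-hat window. A particle-mesh stand-in for the
# paper's scheme; it shows why the cost is independent of the size or shape
# of the sampling cell.

ngrid, npart, radius = 256, 50_000, 8      # mesh size, particles, cell radius (pixels)
rng = np.random.default_rng(2)
pos = rng.random((npart, 2)) * ngrid       # uniform (unclustered) toy sample

# Nearest-grid-point assignment of particles to the mesh.
mesh = np.zeros((ngrid, ngrid))
ij = pos.astype(int) % ngrid
np.add.at(mesh, (ij[:, 0], ij[:, 1]), 1.0)

# Disc-shaped (top-hat) window centred at the origin, periodic distances.
y, x = np.indices((ngrid, ngrid))
dy = np.minimum(y, ngrid - y)
dx = np.minimum(x, ngrid - x)
window = (np.hypot(dy, dx) <= radius).astype(float)

# One pair of FFTs yields the count within the window around every mesh point.
counts = np.fft.ifft2(np.fft.fft2(mesh) * np.fft.fft2(window)).real
print("mean count:", counts.mean())
print("expected  :", npart * window.sum() / ngrid**2)
print("count variance (a second-order statistic):", counts.var())
```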

    Detection of hidden structures on all scales in amorphous materials and complex physical systems: basic notions and applications to networks, lattice systems, and glasses

    Recent decades have seen the discovery of numerous complex materials. At the root of the complexity underlying many of these materials lies a large number of possible contending atomic- and larger-scale configurations and the intricate correlations between their constituents. For a detailed understanding, there is a need for tools that enable the detection of pertinent structures on all spatial and temporal scales. Towards this end, we suggest a new method that invokes ideas from network analysis and information theory. Our method efficiently identifies basic unit cells and topological defects in systems with low disorder, and it may be applied to general amorphous structures to identify candidate natural structures where a clear definition of order is lacking. This general, unbiased detection of physical structure does not require a guess as to which of the system properties should be deemed important, and it may constitute a natural point of departure for further analysis. The method applies to both static and dynamic systems.
    Comment: 23 pages, 9 figures.
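
    A rough sketch of the first steps such a method involves, under the assumption that the network is built from interparticle distances: turn a particle configuration into a weighted graph and partition it into communities at a chosen resolution, here with networkx's greedy modularity routine. The published approach goes further, comparing partitions from many replicas across a range of resolutions with information-theoretic overlap measures to single out natural scales.

```python
import numpy as np
import networkx as nx
from networkx.algorithms import community

# Build a weighted graph from a toy particle configuration (two perturbed
# square-lattice grains) and partition it into communities at a single
# resolution. This is only the graph-construction and partitioning step,
# not the full multiresolution replica analysis.

rng = np.random.default_rng(3)
grain1 = np.stack(np.meshgrid(np.arange(6), np.arange(6)), -1).reshape(-1, 2).astype(float)
grain2 = grain1 + np.array([10.0, 0.0])          # second grain, shifted away
pts = np.vstack([grain1, grain2]) + rng.normal(scale=0.05, size=(72, 2))

G = nx.Graph()
cutoff = 1.5    # neighbour cutoff: catches 1st and 2nd lattice neighbours
for i in range(len(pts)):
    for j in range(i + 1, len(pts)):
        d = np.linalg.norm(pts[i] - pts[j])
        if d < cutoff:
            G.add_edge(i, j, weight=1.0 / d)     # closer pairs couple more strongly

parts = community.greedy_modularity_communities(G, weight="weight", resolution=1.0)
# The two grains fall into separate communities (each grain may be further
# subdivided, depending on the resolution parameter).
print(sorted(len(p) for p in parts))
```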

    Locally adaptive image denoising by a statistical multiresolution criterion

    We demonstrate how one can choose the smoothing parameter in image denoising by a statistical multiresolution criterion, both globally and locally. Using inhomogeneous diffusion and total variation regularization as examples of localized regularization schemes, we present an efficient method for locally adaptive image denoising. As expected, the smoothing parameter serves as an edge detector in this framework. Numerical examples illustrate the usefulness of our approach. We also present an application in confocal microscopy.
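
    A simplified, global version of the idea, assuming the noise level sigma is known: pick the strongest total-variation smoothing whose residual still looks like pure noise on every dyadic block, i.e. whose standardized block means stay below a universal threshold. The paper selects the parameter locally; the sketch below selects one global weight for brevity, and the image and scan range are illustrative.

```python
import numpy as np
from skimage import data
from skimage.restoration import denoise_tv_chambolle

# Multiresolution "looks like noise" test for the residual (noisy minus
# denoised): on every dyadic block, the block mean of the residual,
# standardized by sigma / block_side, must stay below sqrt(2 log n).
# The strongest TV weight passing the test is selected; sigma is assumed known.

sigma = 0.1
clean = data.camera()[::2, ::2] / 255.0          # 256x256 test image
rng = np.random.default_rng(4)
noisy = clean + rng.normal(scale=sigma, size=clean.shape)

def residual_ok(res, sigma, scales=(8, 16, 32, 64)):
    thr = np.sqrt(2.0 * np.log(res.size))
    for s in scales:
        h, w = (res.shape[0] // s) * s, (res.shape[1] // s) * s
        blocks = res[:h, :w].reshape(h // s, s, w // s, s).mean(axis=(1, 3))
        if np.max(np.abs(blocks)) * s / sigma > thr:
            return False                          # detectable structure left
    return True

weight = None
for w in np.linspace(0.02, 0.5, 25):              # increasing smoothing strength
    if not residual_ok(noisy - denoise_tv_chambolle(noisy, weight=w), sigma):
        break
    weight = w
print("selected TV weight:", weight)
```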

    Particle Methods in Bluff Body Aerodynamics


    Wavelets and Fast Numerical Algorithms

    Wavelet-based algorithms in numerical analysis are similar to other transform methods in that vectors and operators are expanded into a basis and the computations take place in this new system of coordinates. However, due to the recursive definition of wavelets, their controllable localization in both space and wave number (time and frequency) domains, and the vanishing-moments property, wavelet-based algorithms exhibit new and important properties. For example, the multiresolution structure of the wavelet expansions brings about an efficient organization of transformations on a given scale and of interactions between different neighbouring scales. Moreover, wide classes of operators which naively would require a full (dense) matrix for their numerical description have sparse representations in wavelet bases. For these operators, sparse representations lead to fast numerical algorithms and thus address a critical numerical issue. We note that wavelet-based algorithms provide a systematic generalization of the Fast Multipole Method (FMM) and its descendants. These topics will be the subject of the lecture. Starting from the notion of multiresolution analysis, we will consider the so-called non-standard form (which achieves decoupling among the scales) and the associated fast numerical algorithms. Examples of non-standard forms of several basic operators (e.g. derivatives) will be computed explicitly.
    Comment: 32 pages, uuencoded tar-compressed LaTeX file. Uses epsf.sty (see `macros').
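
    A small numerical sketch of the sparsification claim, using the standard form W A W^T rather than the lecture's non-standard form: a dense matrix sampled from the Calderon-Zygmund-type kernel 1/(x - y) compresses to relatively few significant entries in an orthonormal Haar basis.

```python
import numpy as np

# Sparsification of a dense operator in a wavelet basis via the standard
# form B = W A W^T (the non-standard form decouples scales further but
# needs more bookkeeping). A is sampled from the kernel 1/(x - y), so it is
# completely dense; in an orthonormal Haar basis most entries fall below a
# small relative threshold.

def haar_matrix(n):
    """Orthonormal Haar transform matrix, n a power of two."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    avg = np.kron(h, [1.0, 1.0]) / np.sqrt(2.0)                # coarse averages
    det = np.kron(np.eye(n // 2), [1.0, -1.0]) / np.sqrt(2.0)  # details
    return np.vstack([avg, det])

n = 256
i, j = np.indices((n, n))
A = np.zeros((n, n))
off = i != j
A[off] = 1.0 / (i[off] - j[off])      # dense, Hilbert-transform-like matrix

W = haar_matrix(n)
B = W @ A @ W.T                       # standard form in the Haar basis
for eps in (1e-2, 1e-3):
    kept = np.sum(np.abs(B) > eps * np.abs(B).max())
    print(f"relative threshold {eps:g}: keep {kept} of {n * n} entries "
          f"({100.0 * kept / n**2:.1f}%)")
```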