
    Some Critical Thoughts on Computational Materials Science

    1. A Model is a Model is a Model is a Model. The title of this report is of course meant to provoke. Why? Because there always exists a menace of confusing models with reality. Does someone now refer to “first principles simulations”? This point is well taken. However, practically all of the current predictions in this domain are based on simulating electron dynamics using local density functional theory. These simulations, though providing deep insight into materials ground states, are not exact but approximate solutions of the Schrödinger equation, which - not to forget - is a model itself [1]. Does someone now refer to “finite element simulations”? This point is also well taken. However, also in this case one has to admit that an approximate solution to a large set of non-linear differential equations, formulated for a (non-existing) continuum under idealized boundary conditions, is what it is: a model of nature but not reality. But let us calm down and render the discussion a bit more serious: current methods of ground-state calculation are definitely among the cutting-edge disciplines in computational materials science, and the community has learned much from them in recent years. Similar remarks apply to some continuum-based finite element simulations. After all, this report is meant to attract readers into this exciting field, not to repel them. For this reason I feel obliged to first underscore that any interpretation of a research result obtained by computer simulation should be accompanied by scrutinizing the model ingredients and boundary conditions of that calculation in the same critical way as an experimentalist would check his experimental set-up.
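    For readers less familiar with what "local density functional theory" approximates, the standard Kohn-Sham equations in the local density approximation make the chain of approximations concrete; the form below is textbook material shown only for orientation, not an equation taken from the report.

```latex
% Textbook Kohn-Sham equations in the local density approximation (LDA).
% Shown only to illustrate the approximation chain the report alludes to:
% the interacting many-electron Schroedinger problem is replaced by
% self-consistent single-particle equations with an approximate
% exchange-correlation potential evaluated at the local density.
\[
\Bigl[-\tfrac{\hbar^{2}}{2m}\nabla^{2}
      + v_{\mathrm{ext}}(\mathbf r)
      + \int \frac{e^{2}\,n(\mathbf r')}{|\mathbf r-\mathbf r'|}\,\mathrm d\mathbf r'
      + v_{\mathrm{xc}}^{\mathrm{LDA}}\bigl(n(\mathbf r)\bigr)\Bigr]\psi_{i}(\mathbf r)
  = \varepsilon_{i}\,\psi_{i}(\mathbf r),
\qquad
n(\mathbf r)=\sum_{i\,\in\,\mathrm{occ}}|\psi_{i}(\mathbf r)|^{2}.
\]
```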

    Generative Modeling and Inference in Directed and Undirected Neural Networks

    Generative modeling and inference are two broad categories in unsupervised learning whose goals are to answer the following questions, respectively: 1. Given a dataset, how do we (either implicitly or explicitly) model the underlying probability distribution from which the data came and draw samples from that distribution? 2. How can we learn an underlying abstract representation of the data? In this dissertation we present three studies, each of which improves upon specific generative modeling and inference techniques in a different way. First, we develop a state-of-the-art estimator of a generic probability distribution's partition function, or normalizing constant, during simulated tempering. We then apply our estimator to the specific case of training undirected probabilistic graphical models and find that our method can track log-likelihoods during training at essentially no extra computational cost. We then shift our focus to variational inference in directed probabilistic graphical models (Bayesian networks) for generative modeling and inference. First, we generalize the aggregate prior distribution to decouple the variational and generative models, giving the model greater flexibility, and find improvements in the model's log-likelihood on test data as well as a better latent representation. Finally, we study the variational loss function and argue that, under a typical architecture, the data-dependent term of the gradient decays to zero as the latent-space dimensionality increases. We use this result to propose a simple modification to random weight initialization and show that, in certain models, the modification substantially improves training convergence time. Together, these results improve the quantitative performance of popular generative modeling and inference models in addition to furthering our understanding of them.
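    The variational loss discussed above has two pieces whose balance matters: a data-dependent reconstruction term and a KL regularizer. The numpy sketch below computes a single-sample ELBO estimate for a VAE-style directed model purely to make those two terms concrete; the linear encoder/decoder, layer sizes, and standard-normal prior are illustrative assumptions, not the dissertation's architectures or its proposed initialization.

```python
# Minimal single-sample ELBO sketch for a VAE-style directed model.
# All sizes and the linear encoder/decoder are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def elbo(x, W_enc, W_dec, n_latent):
    """One-sample ELBO estimate for a binary data vector x."""
    # Encoder: linear map to mean and log-variance of q(z|x).
    h = W_enc @ x
    mu, logvar = h[:n_latent], h[n_latent:]
    # Reparameterized sample z ~ q(z|x).
    z = mu + np.exp(0.5 * logvar) * rng.standard_normal(n_latent)
    # Decoder: Bernoulli likelihood p(x|z) with logits W_dec @ z.
    logits = W_dec @ z
    log_px_given_z = np.sum(x * logits - np.logaddexp(0.0, logits))
    # Analytic KL( q(z|x) || N(0, I) ).
    kl = 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)
    # Data-dependent reconstruction term minus the KL regularizer.
    return log_px_given_z - kl

# Usage on random binary data, just to show the shapes involved.
n_vis, n_latent = 784, 20
x = (rng.random(n_vis) < 0.5).astype(float)
W_enc = 0.01 * rng.standard_normal((2 * n_latent, n_vis))
W_dec = 0.01 * rng.standard_normal((n_vis, n_latent))
print(elbo(x, W_enc, W_dec, n_latent))
```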

    A Phase Field Model for Continuous Clustering on Vector Fields

    A new method for the simplification of flow fields is presented. It is based on continuous clustering. A well-known physical clustering model, the Cahn-Hilliard model, which describes phase separation, is modified to reflect the properties of the data to be visualized. Clusters are defined implicitly as connected components of the positivity set of a density function. An evolution equation for this function is obtained as a suitable gradient flow of an underlying anisotropic energy functional. Here, time serves as the scale parameter. The evolution is characterized by a successive coarsening of patterns (the actual clustering), during which the underlying simulation data specify preferable pattern boundaries. We introduce specific physical quantities in the simulation to control the shape, orientation and distribution of the clusters as a function of the underlying flow field. In addition, the model is extended to include elastic effects. In the early stages of the evolution, a shear-layer-type representation of the flow field can thereby be generated, whereas, for later stages, the distribution of clusters can be influenced. Furthermore, we incorporate upwind ideas to give the clusters an oriented, drop-shaped appearance. Here, we discuss the applicability of this new type of approach mainly for flow fields, where the cluster energy penalizes cross-streamline boundaries. However, the method also carries provisions for other fields. The clusters can be displayed directly as a flow texture. Alternatively, the clusters can be visualized by iconic representations, which are positioned by using a skeletonization algorithm.
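    To make the gradient-flow evolution concrete, the sketch below runs a few explicit steps of the plain, isotropic Cahn-Hilliard equation on a periodic grid, under which random initial data coarsen into connected positive regions of the density function. The anisotropic, flow-dependent energy, the elastic extension, and the upwind modification described above are not included; the grid size, interface parameter, and time step are illustrative assumptions.

```python
# Minimal isotropic Cahn-Hilliard coarsening on a periodic grid.
# Illustrative only: the paper's anisotropic, flow-field-dependent energy,
# elastic effects, and upwind terms are not modeled here.
import numpy as np

def laplacian(u, h):
    """5-point Laplacian with periodic boundary conditions."""
    return (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
            np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u) / h**2

def cahn_hilliard_step(u, h, dt, eps):
    """Explicit Euler step of u_t = Lap(Psi'(u) - eps^2 Lap(u)),
    with double-well potential Psi(u) = (u^2 - 1)^2 / 4."""
    mu = u**3 - u - eps**2 * laplacian(u, h)   # chemical potential
    return u + dt * laplacian(mu, h)

# Usage: a small random perturbation coarsens into clusters (positivity set of u).
rng = np.random.default_rng(0)
n, h, dt, eps = 128, 1.0, 0.01, 1.0            # grid units; assumed parameters
u = 0.1 * rng.standard_normal((n, n))
for _ in range(500):
    u = cahn_hilliard_step(u, h, dt, eps)
clusters = u > 0.0                              # connected components = clusters
```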