    Review of the Synergies Between Computational Modeling and Experimental Characterization of Materials Across Length Scales

    With the increasing interplay between experimental and computational approaches at multiple length scales, new research directions are emerging in materials science and computational mechanics. Such cooperative interactions find many applications in the development, characterization and design of complex material systems. This manuscript provides a broad and comprehensive overview of recent trends in which predictive modeling capabilities are developed in conjunction with experiments and advanced characterization to gain greater insight into structure-property relationships and to study various physical phenomena and mechanisms. The focus of this review is on the intersections of multiscale materials experiments and modeling relevant to the materials mechanics community. After a general discussion of the perspectives of the various communities, the article focuses on the latest experimental and theoretical opportunities. Emphasis is given to the role of experiments in multiscale models, including insights into how computations can be used as discovery tools for materials engineering, rather than "simply" supporting experimental work. This is illustrated by examples from several application areas involving structural materials. The manuscript ends with a discussion of open problems and scientific questions that are being explored in order to advance this relatively new field of research.
    Comment: 25 pages, 11 figures; review article accepted for publication in J. Mater. Sc

    The Data Big Bang and the Expanding Digital Universe: High-Dimensional, Complex and Massive Data Sets in an Inflationary Epoch

    Recent and forthcoming advances in instrumentation, and giant new surveys, are creating astronomical data sets that are not amenable to the methods of analysis familiar to astronomers. Traditional methods are often inadequate not merely because of the size in bytes of the data sets, but also because of the complexity of modern data sets. Mathematical limitations of familiar algorithms and techniques in dealing with such data sets create a critical need for new paradigms for the representation, analysis and scientific visualization (as opposed to illustrative visualization) of heterogeneous, multiresolution data across application domains. Some of the problems presented by the new data sets have already been addressed by other disciplines such as applied mathematics, statistics and machine learning, and their solutions have been adopted by other sciences such as the space-based geosciences. Unfortunately, valuable results pertaining to these problems are mostly to be found only in publications outside of astronomy. Here we offer brief overviews of a number of concepts, techniques and developments, some "old" and some new. These are generally unknown to most of the astronomical community, but are vital to the analysis and visualization of complex datasets and images. In order for astronomers to take advantage of the richness and complexity of the new era of data, and to be able to identify, adopt, and apply new solutions, the astronomical community needs a certain degree of awareness and understanding of the new concepts. One of the goals of this paper is to help bridge the gap between applied mathematics, artificial intelligence and computer science on the one side and astronomy on the other.
    Comment: 24 pages, 8 figures, 1 table; accepted for publication in Advances in Astronomy, special issue "Robotic Astronomy"

    History of art paintings through the lens of entropy and complexity

    Art is the ultimate expression of human creativity and is deeply influenced by the philosophy and culture of the corresponding historical epoch. The quantitative analysis of art is therefore essential for a better understanding of human cultural evolution. Here we present a large-scale quantitative analysis of nearly 140,000 paintings, spanning almost a millennium of art history. Based on the local spatial patterns in the images of these paintings, we estimate the permutation entropy and the statistical complexity of each painting. These measures map the degree of visual order of artworks onto a scale of order-disorder and simplicity-complexity that locally reflects qualitative categories proposed by art historians. The dynamical behavior of these measures reveals a clear temporal evolution of art, marked by transitions that agree with the main historical periods of art. Our research shows that different artistic styles have distinct average degrees of entropy and complexity, allowing a hierarchical organization and clustering of styles according to these metrics. We further verify that the identified groups correspond well with the textual content used to qualitatively describe the styles, and that the employed complexity-entropy measures can be used for an effective classification of artworks.
    Comment: 10 two-column pages, 5 figures; accepted for publication in PNAS (supplementary information available at http://www.pnas.org/highwire/filestream/824089/field_highwire_adjunct_files/0/pnas.1800083115.sapp.pdf)
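    The abstract above does not give the authors' implementation, but the two measures it names are standard. A minimal sketch of how permutation entropy and statistical complexity can be computed for a 2D image, assuming 2x2 ordinal (Bandt-Pompe) patterns and the usual Jensen-Shannon definition of complexity; all function names here are illustrative:

    ```python
    import itertools
    import math

    def ordinal_distribution(image, dx=2, dy=2):
        """Probability distribution of ordinal (rank-order) patterns over
        dx-by-dy sliding blocks of a 2D grayscale image (list of lists)."""
        n = dx * dy
        counts = {p: 0 for p in itertools.permutations(range(n))}
        rows, cols = len(image), len(image[0])
        total = 0
        for i in range(rows - dy + 1):
            for j in range(cols - dx + 1):
                block = [image[i + a][j + b] for a in range(dy) for b in range(dx)]
                # the pattern is the argsort of the block values (ties broken stably)
                pattern = tuple(sorted(range(n), key=lambda k: block[k]))
                counts[pattern] += 1
                total += 1
        return [c / total for c in counts.values()]

    def shannon(p):
        return -sum(x * math.log(x) for x in p if x > 0)

    def entropy_complexity(p):
        """Normalized permutation entropy H and statistical complexity C = H * D,
        where D is the Jensen-Shannon divergence to the uniform distribution,
        normalized by its maximum possible value."""
        n = len(p)
        h = shannon(p) / math.log(n)
        u = [1 / n] * n
        m = [(pi + ui) / 2 for pi, ui in zip(p, u)]
        js = shannon(m) - shannon(p) / 2 - shannon(u) / 2
        # maximum JS divergence, attained by a delta distribution vs the uniform one
        jsmax = -0.5 * ((n + 1) / n * math.log(n + 1)
                        - 2 * math.log(2 * n) + math.log(n))
        return h, h * (js / jsmax)
    ```

    A perfectly ordered image (all pixels equal) yields a single ordinal pattern, hence H = 0 and C = 0; textured images land somewhere in the interior of the entropy-complexity plane, which is what the paper uses to place artworks on its order-disorder scale.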

    On-the-fly Data Assessment for High Throughput X-ray Diffraction Measurement

    Investment in brighter sources and in larger and faster detectors has accelerated the speed of data acquisition at national user facilities. The accelerated data acquisition offers many opportunities for the discovery of new materials, but it also presents a daunting challenge: the rate of data acquisition far exceeds the current speed of data quality assessment, resulting in less than optimal data and data coverage, which in extreme cases forces recollection of data. Herein, we show how this challenge can be addressed through the development of an approach that makes routine data assessment automatic and instantaneous. By extracting and visualizing customized attributes in real time, data quality and coverage, as well as other scientifically relevant information contained in large datasets, are highlighted. Deployment of such an approach not only improves the quality of data but also helps optimize the usage of expensive characterization resources by prioritizing measurements of highest scientific impact. We anticipate that our approach will become a starting point for a sophisticated decision tree that optimizes data quality and maximizes scientific content in real time through automation. With these efforts to integrate more automation into data collection and analysis, we can truly take advantage of the accelerating speed of data acquisition.
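    The abstract describes extracting customized attributes from each measurement in real time to flag low-quality frames. The paper's actual attributes are not given here, so as a purely illustrative sketch, the following scores a 1D diffraction pattern by a crude signal-to-noise estimate and peak count (the thresholds and helper names are assumptions, not the authors' pipeline):

    ```python
    def assess_pattern(intensity, snr_min=10.0, peak_min=3):
        """Toy on-the-fly quality check for a 1D diffraction pattern
        given as a list of detector counts. Thresholds are illustrative."""
        n = len(intensity)
        mean = sum(intensity) / n
        var = sum((x - mean) ** 2 for x in intensity) / n
        # crude background level: the median count
        bg = sorted(intensity)[n // 2]
        # crude noise scale: overall standard deviation (floor of 1 count)
        noise = max(var ** 0.5, 1.0)
        # count local maxima rising clearly above the background
        peaks = sum(
            1 for i in range(1, n - 1)
            if intensity[i] > intensity[i - 1]
            and intensity[i] >= intensity[i + 1]
            and intensity[i] - bg > 3 * noise
        )
        snr = (max(intensity) - bg) / noise
        ok = snr >= snr_min and peaks >= peak_min
        return {"snr": snr, "peaks": peaks, "ok": ok}
    ```

    In a real deployment each frame would be scored as it arrives, and frames with `ok == False` would trigger recollection or re-prioritization before beam time is wasted; that feedback loop is the decision-tree automation the abstract anticipates.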

    Characterizing neuromorphologic alterations with additive shape functionals

    The complexity of a neuronal cell shape is known to be related to its function. Specifically, among other indicators, decreased complexity in the dendritic trees of cortical pyramidal neurons has been associated with mental retardation. In this paper we develop a procedure to characterize the morphological changes induced in cultured neurons by over-expressing a gene involved in mental retardation. Measures associated with multiscale connectivity, an additive image functional, are found to give a reasonable separation criterion between two categories of cells: one category consists of a control group and two transfected groups of neurons, and the other of a class of cat ganglion cells. The reported framework also identified a trend towards lower complexity in one of the transfected groups. Such results establish the suggested measures as effective descriptors of cell shape.