10,444 research outputs found

    Modeling Heterogeneous Materials via Two-Point Correlation Functions: II. Algorithmic Details and Applications

    In the first part of this series of two papers, we proposed a theoretical formalism that enables one to model and categorize heterogeneous materials (media) via two-point correlation functions S2 and introduced an efficient heterogeneous-medium (re)construction algorithm called the "lattice-point" algorithm. Here we discuss the algorithmic details of the lattice-point procedure and a modification of the algorithm that uses surface optimization to further speed up the (re)construction process. The importance of the error tolerance, which indicates to what accuracy the media are (re)constructed, is also emphasized and discussed. We apply the algorithm to generate three-dimensional digitized realizations of a Fontainebleau sandstone and a boron carbide/aluminum composite from two-dimensional tomographic images of slices through the materials. To ascertain whether the information contained in S2 is sufficient to capture the salient structural features, we compute the two-point cluster functions of the media, which are superior signatures of the microstructure because they incorporate connectedness information. We also study the reconstruction of a binary laser-speckle pattern in two dimensions, in which the algorithm fails to reproduce the pattern accurately. We conclude that, in general, reconstructions using S2 alone work well for heterogeneous materials with single-scale structures; two-point information via S2 is not sufficient to accurately model multi-scale media. Moreover, we construct realizations of hypothetical materials with desired structural characteristics obtained by manipulating their two-point correlation functions. Comment: 35 pages, 19 figures
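To illustrate the central quantity of this abstract, here is a minimal sketch of estimating S2(r) for a digitized binary medium along one lattice axis. This is an illustrative example only, not the authors' lattice-point algorithm; S2(r) is the probability that two points a distance r apart both fall in the phase of interest, and at r = 0 it reduces to the phase's volume fraction.

```python
def s2_axial(grid, phase=1):
    """Estimate the two-point correlation function S2(r) of `grid`
    (a list of rows of 0/1 values) along the x axis, assuming
    periodic boundary conditions."""
    ny = len(grid)
    nx = len(grid[0])
    s2 = []
    for r in range(nx // 2 + 1):
        hits = 0
        for y in range(ny):
            row = grid[y]
            for x in range(nx):
                # Count pairs (x, x+r) that both lie in `phase`.
                if row[x] == phase and row[(x + r) % nx] == phase:
                    hits += 1
        s2.append(hits / (nx * ny))
    return s2
```

A (re)construction scheme such as the one described above would then evolve a trial microstructure to minimize the discrepancy between its estimated S2 and the target S2.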

    An evaluation of planarity of the spatial QRS loop by three dimensional vectorcardiography: its emergence and loss

    Aims: To objectively characterize and mathematically justify the observation that vectorcardiographic QRS loops in normal individuals are more planar than those from patients with ST elevation myocardial infarction (STEMI). Methods: Vectorcardiograms (VCGs) were constructed from three simultaneously recorded quasi-orthogonal leads, I, aVF and V2 (sampled at 1000 samples/s). The planarity of these QRS loops was determined by fitting a surface to each loop, with goodness of fit expressed in numerical terms. Results: 15 healthy individuals aged 35–65 years (73% male) and 15 patients aged 45–70 years (80% male) with diagnosed acute STEMI were recruited. The spatial QRS loop was found to lie in a plane in normal controls. In STEMI patients, this planarity was lost. Calculation of goodness of fit supported these visual observations. Conclusions: The degree of planarity of the VCG loop can differentiate healthy individuals from patients with STEMI. This observation is compatible with our basic understanding of the electrophysiology of the human heart.
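The planarity measure described above can be sketched as a least-squares plane fit. This is a hypothetical illustration, not the paper's actual fitting procedure or metric: we fit z = a*x + b*y + c to the sampled loop and report the RMS residual, where small values indicate a nearly planar loop.

```python
def plane_rms_residual(points):
    """points: list of (x, y, z) samples of a spatial QRS loop.
    Fits z = a*x + b*y + c by least squares and returns the RMS
    residual distance (in the z direction) from the fitted plane."""
    # Accumulate the 3x3 normal equations A^T A m = A^T z, m = (a, b, c).
    sxx = sxy = sx = syy = sy = n = 0.0
    sxz = syz = sz = 0.0
    for x, y, z in points:
        sxx += x * x; sxy += x * y; sx += x
        syy += y * y; sy += y; n += 1
        sxz += x * z; syz += y * z; sz += z
    M = [[sxx, sxy, sx, sxz],
         [sxy, syy, sy, syz],
         [sx,  sy,  n,  sz]]
    # Gaussian elimination with partial pivoting.
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    m = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        m[i] = (M[i][3] - sum(M[i][j] * m[j] for j in range(i + 1, 3))) / M[i][i]
    a, b, c = m
    ss = sum((z - (a * x + b * y + c)) ** 2 for x, y, z in points)
    return (ss / len(points)) ** 0.5
```

Note that parameterizing the plane as z = a*x + b*y + c cannot represent planes parallel to the z axis; a fit via the smallest principal component of the centered loop avoids that limitation.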

    Using the Global Positioning System (GPS) in household surveys for better economics and better policy

    Distance and location are important determinants of many choices that economists study. While these variables can sometimes be obtained from secondary data, economists often rely on information that is self-reported by respondents in surveys. These self-reports are used especially for the distance from households or community centers to various features such as roads, markets, schools, clinics and other public services. There is growing evidence that self-reported distance is measured with error and that these errors are correlated with outcomes of interest. In contrast to self-reports, the Global Positioning System (GPS) can determine almost exact location (typically within 15 meters). The falling cost of GPS receivers (typically below US$100) makes it increasingly feasible for field surveys to use GPS as a better method of measuring location and distance. In this paper we review four ways that GPS can lead to better economics and better policy: (i) through constructing instrumental variables that can be used to understand the causal impact of policies, (ii) by helping to understand policy externalities and spillovers, (iii) through better understanding of the access to services, and (iv) by improving the collection of household survey data. We also discuss several pitfalls and unresolved problems with using GPS in household surveys.
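Once a survey records GPS fixes, household-to-facility distances can be computed directly rather than self-reported. A minimal sketch, using the standard haversine great-circle formula (an illustration; the paper does not prescribe a particular distance formula):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude
    fixes (decimal degrees), treating the Earth as a sphere of mean
    radius 6371 km."""
    r = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))
```

Straight-line distance is of course a lower bound on travel distance; road-network distance or travel time may matter more for access-to-services questions.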

    Histopathological image analysis : a review

    Over the past decade, dramatic increases in computational power and improvements in image analysis algorithms have allowed the development of powerful computer-assisted analytical approaches to radiological data. With the recent advent of whole slide digital scanners, tissue histopathology slides can now be digitized and stored in digital image form. Consequently, digitized tissue histopathology has become amenable to the application of computerized image analysis and machine learning techniques. Analogous to the role of computer-assisted diagnosis (CAD) algorithms in medical imaging, which complement the opinion of a radiologist, CAD algorithms have begun to be developed for disease detection, diagnosis, and prognosis prediction to complement the opinion of the pathologist. In this paper, we review the recent state-of-the-art CAD technology for digitized histopathology. This paper also briefly describes the development and application of novel image analysis technology for a few specific histopathology-related problems being pursued in the United States and Europe.

    Some comments on particle image displacement velocimetry

    Laser speckle velocimetry (LSV), also known as particle image displacement velocimetry, is introduced. This technique provides simultaneous visualization of the two-dimensional streamline pattern in unsteady flows as well as quantification of the velocity field over an entire plane. Its chief advantage is that the velocity field can be measured over an entire plane of the flow field simultaneously, with good accuracy and spatial resolution. From this the instantaneous vorticity field can be easily obtained, which constitutes a great asset for the study of a variety of flows that evolve stochastically in both space and time. The basic concept of LSV, methods of data acquisition and reduction, examples of its use, and parameters that affect its utilization are described.
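The core data-reduction step in this family of techniques is cross-correlating an interrogation window between two exposures: the location of the correlation peak gives the most probable particle displacement, and dividing by the inter-exposure time yields velocity. A toy sketch on small 2D intensity arrays (an illustration of the principle, not a production PIV code; real implementations use FFT-based correlation and sub-pixel peak fitting):

```python
def displacement(win1, win2, max_shift=3):
    """Return the (dy, dx) shift that maximizes the direct
    cross-correlation of window `win2` against `win1`, searching
    shifts up to +/- max_shift with periodic wrap-around.
    win1 and win2 are equal-sized lists of rows of intensities."""
    ny, nx = len(win1), len(win1[0])
    best, best_shift = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            corr = sum(win1[y][x] * win2[(y + dy) % ny][(x + dx) % nx]
                       for y in range(ny) for x in range(nx))
            if best is None or corr > best:
                best, best_shift = corr, (dy, dx)
    return best_shift
```

Repeating this over a grid of interrogation windows yields the planar velocity field, from which vorticity follows by finite differences.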

    Pilot study of the potential contributions of LANDSAT data in the construction of area sampling frames

    There are no author-identified significant results in this report.
