
    Enhancing urban analysis through lacunarity multiscale measurement

    Urban spatial configurations in most developing countries show particular urban forms associated with the more informal urban development of these areas. Latin American cities are prime examples, but investigations of these urban forms using up-to-date computational and analytical techniques are still scarce. The purpose of this paper is to examine and extend the methodology of multiscale analysis for the evaluation of urban spatial patterns. We explain and explore the use of lacunarity-based measurements, following a line of research that could make greater use of new satellite imagery in urban planning contexts. A set of binary classifications is performed at different thresholds on selected neighbourhoods of a small Brazilian town. The classifications are appraised, and the lacunarity measurements are compared against the different georeferenced information for the same neighbourhood areas. It was found that, even with this simple image classification procedure, a substantial amount of spatial configuration information could be extracted with the analytical procedure, which in turn may be used for planning and other urban studies purposes.
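The lacunarity measurement at the core of this abstract is commonly computed with the gliding-box algorithm. The sketch below is a minimal NumPy illustration of that generic algorithm, not the authors' implementation; the function name and edge handling are assumptions.

```python
import numpy as np

def gliding_box_lacunarity(binary, box_size):
    """Gliding-box lacunarity of a 2-D binary image at one box size.

    Slide a box_size x box_size window over every position, record the
    box "mass" S (number of occupied pixels), and return
    Lambda = E[S^2] / E[S]^2.  Lambda == 1 for a translation-invariant
    pattern; larger values indicate gappier (more lacunar) textures.
    """
    n, m = binary.shape
    masses = []
    for i in range(n - box_size + 1):
        for j in range(m - box_size + 1):
            masses.append(binary[i:i + box_size, j:j + box_size].sum())
    masses = np.asarray(masses, dtype=float)
    mean = masses.mean()
    if mean == 0:          # empty image: lacunarity undefined
        return np.nan
    return (masses ** 2).mean() / mean ** 2
```

In a multiscale analysis such as the one in the paper, this function would be evaluated over a range of box sizes on each thresholded (binary) neighbourhood image, and the resulting lacunarity curves compared across neighbourhoods.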

    Toward reduction of artifacts in fused images

    Most satellite image fusion methodologies at the pixel level introduce false spatial details, i.e. artifacts, into the resulting fused images. In many cases these artifacts appear because image fusion methods do not consider the differences in roughness or textural characteristics between different land covers; they only consider the digital values associated with single pixels. This effect increases as the spatial resolution of the image increases. To minimize this problem, we propose a new paradigm based on local measurements of the fractal dimension (FD). Fractal dimension maps (FDMs) are generated for each of the source images (the panchromatic image and each band of the multispectral image) with the box-counting algorithm and a windowing process. The average of the source-image FDMs, previously indexed between 0 and 1, is used to discriminate the different land covers present in the satellite images. This paradigm has been applied through the fusion methodology based on the discrete wavelet transform (DWT), using the à trous algorithm (WAT). Two scenes registered by optical sensors on board the FORMOSAT-2 and IKONOS satellites were used to study the behaviour of the proposed methodology. Implementing this approach within the WAT method allows the fusion process to adapt to the roughness and shape of the regions present in the images to be fused. This improves the quality of the fused images and their classification results when compared with the original WAT method.
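The fractal dimension maps described above rest on the standard box-counting estimator applied inside a sliding window. The following is a generic sketch of that idea (function names, window size, and the local-mean thresholding are assumptions, not the paper's exact procedure):

```python
import numpy as np

def box_counting_dimension(binary, sizes=(1, 2, 4, 8)):
    """Box-counting dimension of a 2-D binary array: count the number
    N(s) of s x s boxes containing at least one occupied pixel, then
    regress log N(s) against log(1/s); the slope estimates the FD."""
    counts = []
    for s in sizes:
        n = 0
        for i in range(0, binary.shape[0], s):
            for j in range(0, binary.shape[1], s):
                if binary[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)),
                          np.log(counts), 1)
    return slope

def fd_map(image, window=16):
    """Fractal dimension map: estimate the FD of each sliding window,
    binarising the window at its local mean (an assumed threshold)."""
    h, w = image.shape
    out = np.zeros((h - window + 1, w - window + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + window, j:j + window]
            binary = patch > patch.mean()
            out[i, j] = (box_counting_dimension(binary)
                         if binary.any() else 0.0)
    return out
```

As in the abstract, such maps would then be rescaled to [0, 1] (e.g. by min-max normalization) and averaged across source images before being used to discriminate land covers.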

    The Data Big Bang and the Expanding Digital Universe: High-Dimensional, Complex and Massive Data Sets in an Inflationary Epoch

    Recent and forthcoming advances in instrumentation, and giant new surveys, are creating astronomical data sets that are not amenable to the methods of analysis familiar to astronomers. Traditional methods are often inadequate not merely because of the size in bytes of the data sets, but also because of the complexity of modern data sets. Mathematical limitations of familiar algorithms and techniques in dealing with such data sets create a critical need for new paradigms for the representation, analysis and scientific visualization (as opposed to illustrative visualization) of heterogeneous, multiresolution data across application domains. Some of the problems presented by the new data sets have been addressed by other disciplines such as applied mathematics, statistics and machine learning, and have been taken up by other sciences such as the space-based geosciences. Unfortunately, valuable results pertaining to these problems are mostly to be found only in publications outside of astronomy. Here we offer brief overviews of a number of concepts, techniques and developments, some "old" and some new, that are generally unknown to most of the astronomical community but are vital to the analysis and visualization of complex data sets and images. For astronomers to take advantage of the richness and complexity of the new era of data, and to be able to identify, adopt, and apply new solutions, the astronomical community needs a certain degree of awareness and understanding of the new concepts. One of the goals of this paper is to help bridge the gap between applied mathematics, artificial intelligence and computer science on the one side and astronomy on the other. Comment: 24 pages, 8 figures, 1 table. Accepted for publication in Advances in Astronomy, special issue "Robotic Astronomy".

    A fractal fragmentation model for rockfalls

    The final publication is available at Springer via http://dx.doi.org/10.1007/s10346-016-0773-8. The impact-induced rock mass fragmentation in a rockfall is analyzed by comparing the in situ block size distribution (IBSD) of the rock mass detached from the cliff face and the resultant rockfall block size distribution (RBSD) of the rockfall fragments on the slope. The analysis of several inventoried rockfall events suggests that the volumes of the rockfall fragments can be characterized by a power law distribution. We propose the application of a three-parameter rockfall fractal fragmentation model (RFFM) for the transformation of the IBSD into the RBSD. A discrete fracture network model is used to simulate the discontinuity pattern of the detached rock mass and to generate the IBSD. Each block of the IBSD of the detached rock mass is an initiator. A survival rate is included to express the proportion of unbroken blocks after the impact on the ground surface. The model was calibrated using the volume distribution of a rockfall event in Vilanova de Banat in the Cadí Sierra, Eastern Pyrenees, Spain. The RBSD was obtained directly in the field by measuring the rock block fragments deposited on the slope. The IBSD and the RBSD were fitted by exponential and power law functions, respectively. The results show that the proposed fractal model can successfully generate the RBSD from the IBSD, and they indicate the model parameter values for the case study. Peer reviewed. Postprint (author's final draft).
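Characterizing fragment volumes by a power law, as the abstract does for the RBSD, is often done with a maximum-likelihood (Hill-type) estimate of the exponent above a volume cutoff. The sketch below is a generic estimator of this kind, not the authors' calibration procedure; the function name and the cutoff `v_min` are assumptions.

```python
import numpy as np

def powerlaw_exponent_mle(volumes, v_min):
    """Continuous-case maximum-likelihood (Hill) estimate of the
    exponent b of a power-law survivor function N(V >= v) ~ v**(-b),
    using only volumes >= v_min.

    For a density p(v) ~ v**(-alpha) the MLE is
    alpha = 1 + n / sum(log(v / v_min)); the survivor-function
    exponent is then b = alpha - 1.
    """
    v = np.asarray(volumes, dtype=float)
    v = v[v >= v_min]
    alpha = 1.0 + v.size / np.log(v / v_min).sum()
    return alpha - 1.0
```

A log-log regression on the empirical cumulative distribution gives a quick visual check, but the likelihood estimate above is the more robust choice for the exponent itself.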

    Combining local regularity estimation and total variation optimization for scale-free texture segmentation

    Texture segmentation is a standard image processing task, crucial to many applications. The present contribution focuses on the particular subset of scale-free textures, and its originality resides in the combination of three key ingredients: first, texture characterization relies on the concept of local regularity; second, local regularity is estimated using new multiscale quantities referred to as wavelet leaders; third, segmentation from local regularity faces a fundamental bias-variance trade-off: by nature, local regularity estimation shows high variability that impairs the detection of changes, while a posteriori smoothing of the regularity estimates prevents changes from being located accurately. Instead, the present contribution proposes several variational problem formulations based on total variation and proximal resolutions that effectively circumvent this trade-off. Estimation and segmentation performance of the proposed procedures are quantified and compared on synthetic as well as real-world textures.
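The total-variation idea behind the abstract can be illustrated on a 1-D signal of (noisy) regularity estimates. The paper uses proximal algorithms; the sketch below instead uses plain gradient descent on a smoothed TV penalty, which is a cruder but self-contained stand-in for the same variational formulation. Function name, parameters, and the smoothing `eps` are all assumptions.

```python
import numpy as np

def tv_smooth(y, lam=0.5, eps=1e-3, step=0.01, n_iter=5000):
    """Approximately solve  min_x 0.5*||x - y||^2 + lam * TV(x)
    by gradient descent on a smoothed total-variation penalty
    sum sqrt((x[i+1]-x[i])**2 + eps).

    The minimizer is near piecewise-constant, so its remaining jumps
    mark candidate segment boundaries in the regularity estimates.
    """
    x = np.asarray(y, dtype=float).copy()
    for _ in range(n_iter):
        d = np.diff(x)
        g = d / np.sqrt(d * d + eps)      # derivative of smoothed |.|
        grad_tv = np.zeros_like(x)
        grad_tv[:-1] -= g                 # chain rule through np.diff
        grad_tv[1:] += g
        x -= step * ((x - y) + lam * grad_tv)
    return x
```

On a noisy step signal this flattens each segment while preserving the jump between them, which is exactly the behaviour that lets TV-based formulations locate regularity changes without the blurring of plain smoothing.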

    Nonlinear time-series analysis revisited

    In 1980 and 1981, two pioneering papers laid the foundation for what became known as nonlinear time-series analysis: the analysis of observed data, typically univariate, via dynamical systems theory. Based on the concept of state-space reconstruction, this set of methods allows us to compute characteristic quantities such as Lyapunov exponents and fractal dimensions, to predict the future course of the time series, and even to reconstruct the equations of motion in some cases. In practice, however, there are a number of issues that restrict the power of this approach: whether the signal accurately and thoroughly samples the dynamics, for instance, and whether it contains noise. Moreover, the numerical algorithms that we use to instantiate these ideas are not perfect; they involve approximations, scale parameters, and finite-precision arithmetic, among other things. Even so, nonlinear time-series analysis has been used to great advantage on thousands of real and synthetic data sets from a wide variety of systems, ranging from roulette wheels to lasers to the human heart. Even in cases where the data do not meet the mathematical or algorithmic requirements to assure full topological conjugacy, the results of nonlinear time-series analysis can be helpful in understanding, characterizing, and predicting dynamical systems.
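The state-space reconstruction the abstract refers to is usually carried out by Takens-style delay embedding. A minimal sketch (the function name and argument conventions are assumptions):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens delay-coordinate reconstruction: map a scalar series x
    into vectors (x[t], x[t+tau], ..., x[t+(dim-1)*tau]), one row per
    admissible starting index t."""
    x = np.asarray(x)
    n = x.size - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this (dim, tau)")
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
```

Quantities such as Lyapunov exponents and fractal dimensions are then estimated from the geometry of the embedded point cloud; choosing the embedding dimension `dim` and delay `tau` (e.g. by false-nearest-neighbour and mutual-information heuristics) is one of the practical issues the abstract alludes to.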