
    Advanced of Mathematics-Statistics Methods to Radar Calibration for Rainfall Estimation; A Review

    Ground-based radar is one of the most important systems for measuring precipitation at high spatial and temporal resolution. Radar data are recorded digitally and are readily ingested into statistical analyses. These measurements are subjected to calibration to eliminate systematic errors and to minimize random errors. Because statistical methods are grounded in mathematics, they offer precise results and straightforward interpretation even with limited data detail; although their mathematical structure can make them challenging to apply, the accuracy of their conclusions and the interpretability of their output are good. This article reviews advanced methods for calibrating ground-based radar for forecasting meteorological events, covering two aspects: statistical techniques and data mining. Statistical techniques refer to empirical analyses such as regression, while data mining includes Artificial Neural Networks (ANN), Kriging, Nearest Neighbour (NN), Decision Trees (DT) and fuzzy logic. The results show that Kriging is the most applicable for interpolation, regression methods are simple to use, and data mining based on Artificial Intelligence is very precise. This review thus explores the characteristics of statistical parameters in the field of radar applications and shows which parameters give the best results for undefined cases. DOI: 10.17762/ijritcc2321-8169.15012
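The regression-based calibration that the review groups under empirical statistical techniques can be sketched as a simple gauge adjustment: fit radar estimates against co-located rain-gauge observations and invert the fit to correct the radar field. Everything below (the bias model, the variable names, the data) is a synthetic assumption for illustration, not a method taken from the reviewed papers:

```python
import numpy as np

# Hypothetical example: calibrate radar rainfall against rain-gauge
# observations with least-squares regression. "gauge", "radar" and the
# bias model are synthetic assumptions, not data from the reviewed studies.
rng = np.random.default_rng(0)
gauge = rng.gamma(shape=2.0, scale=3.0, size=200)        # gauge rainfall (mm)
radar = 0.7 * gauge + 3.0 + rng.normal(0.0, 0.5, 200)    # biased radar estimate

# Fit radar = a * gauge + b by least squares, then invert the fit to
# correct radar readings against the gauge reference.
a, b = np.polyfit(gauge, radar, deg=1)
corrected = (radar - b) / a

bias_before = float(np.mean(radar - gauge))
bias_after = float(np.mean(corrected - gauge))
print(f"mean bias before: {bias_before:.2f} mm, after: {bias_after:.2f} mm")
```

Kriging, by contrast, would interpolate the gauge-radar residuals in space rather than apply one global correction.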

    Motion Segmentation Aided Super Resolution Image Reconstruction

    This dissertation addresses Super Resolution (SR) Image Reconstruction with a focus on motion segmentation. The main thrust is Information Complexity guided Gaussian Mixture Models (GMMs) for Statistical Background Modeling. In developing our framework we also focus on two other topics: motion trajectory estimation toward global and local scene change detection, and image reconstruction to obtain high resolution (HR) representations of the moving regions. Such a framework is used for dynamic scene understanding and for the recognition of individuals and threats from image sequences recorded with either stationary or non-stationary camera systems. We introduce a new technique called Information Complexity guided Statistical Background Modeling, in which we employ GMMs that are optimal with respect to information complexity criteria. Moving objects are segmented out through background subtraction using the computed background model. This technique produces results superior to competing background modeling strategies. State-of-the-art SR Image Reconstruction studies combine the information from a set of only slightly different low resolution (LR) images of a static scene to construct an HR representation. The crucial challenge not handled in these studies is accumulating the corresponding information from highly displaced moving objects. To this end, a framework for SR Image Reconstruction of moving objects with such high levels of displacement is developed. Our assumption is that LR images differ from each other due to the local motion of the objects and the global motion of the scene imposed by a non-stationary imaging system. Contrary to traditional SR approaches, we employed several steps.
These steps are: suppression of the global motion; motion segmentation accompanied by background subtraction to extract moving objects; suppression of the local motion of the segmented-out regions; and super-resolving the accumulated information coming from the moving objects rather than the whole scene. This results in a reliable offline SR Image Reconstruction tool which handles several types of dynamic scene changes, compensates for the impacts of camera systems, and provides data redundancy by removing the background. The framework proved superior to state-of-the-art algorithms, which put no significant effort toward dynamic scene representation for non-stationary camera systems.
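As a hedged illustration of the statistical background modelling step, the sketch below keeps a single Gaussian per pixel instead of the information-complexity-guided GMMs the dissertation actually uses; all frames are synthetic:

```python
import numpy as np

# Much-simplified background subtraction: one Gaussian per pixel (the
# dissertation selects full GMMs by an information-complexity criterion;
# everything here is synthetic and illustrative).
rng = np.random.default_rng(1)
T, H, W = 50, 32, 32
frames = rng.normal(100.0, 2.0, size=(T, H, W))   # static background + noise
frames[-1, 10:20, 10:20] += 50.0                  # a "moving object" in the last frame

mean = frames[:-1].mean(axis=0)                   # per-pixel background mean
std = frames[:-1].std(axis=0) + 1e-6              # per-pixel background spread

# Foreground = pixels further than k standard deviations from the model.
k = 4.0
foreground = np.abs(frames[-1] - mean) > k * std
print("foreground pixels:", int(foreground.sum()))
```

A mixture model generalises this by keeping several weighted Gaussians per pixel, which tolerates multi-modal backgrounds such as swaying vegetation.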

    Image Fuzzy Enhancement Based on Self-Adaptive Bee Colony Algorithm

    During image acquisition or transmission, an image may be damaged or distorted for various reasons; therefore, to satisfy people’s visual expectations, images of degraded quality must be processed to meet practical needs. Integrating the artificial bee colony algorithm with fuzzy set theory, this paper introduces fuzzy entropy into self-adaptive fuzzy image enhancement so as to realize self-adaptive parameter selection. Meanwhile, based on the exponential properties of information increase, it proposes a new definition of fuzzy entropy and uses the artificial bee colony algorithm to realize self-adaptive contrast enhancement under the maximum-entropy criterion. The experimental results show that the proposed method can increase the dynamic range compression of the image, enhance its visual effect and its details, retain some color fidelity, and effectively overcome the deficiencies of traditional image enhancement methods.
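A minimal sketch of the classical fuzzy intensification (INT) operator underlying this kind of contrast enhancement is shown below; the paper's contribution, tuning the fuzzification parameters by an artificial bee colony search under a maximum-fuzzy-entropy criterion, is replaced here by fixed parameters on a synthetic image:

```python
import numpy as np

# Classical fuzzy INT-operator enhancement with fixed (not ABC-optimised)
# parameters; the image is synthetic.
rng = np.random.default_rng(2)
img = rng.integers(80, 176, size=(64, 64)).astype(float)   # low-contrast image

# Fuzzification: map grey levels onto [0, 1].
mu = (img - img.min()) / (img.max() - img.min())

# Intensification: push memberships away from 0.5, increasing contrast.
enhanced_mu = np.where(mu <= 0.5, 2.0 * mu**2, 1.0 - 2.0 * (1.0 - mu)**2)

# Defuzzification: map back to the original grey-level range.
enhanced = img.min() + enhanced_mu * (img.max() - img.min())
print("std before:", round(img.std(), 1), "after:", round(enhanced.std(), 1))
```

In the paper's scheme, the crossover point and fuzzifier parameters would be the quantities searched over by the bee colony rather than fixed constants.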

    Identification of pore spaces in 3D CT soil images using a PFCM partitional clustering

    Recent advances in non-destructive imaging techniques, such as X-ray computed tomography (CT), make it possible to analyse pore space features from direct visualisation of soil structures. A quantitative characterisation of the three-dimensional solid-pore architecture is important for understanding soil mechanics, as it relates to the control of biological, chemical, and physical processes across scales. This analysis technique therefore offers an opportunity to better interpret soil strata, as new and relevant information can be obtained. In this work, we propose an approach to automatically identify the pore structure of a set of 200 2D images that represent slices of an original 3D CT image of a soil sample, accomplished through non-linear enhancement of the pixel grey levels and image segmentation based on a PFCM (Possibilistic Fuzzy C-Means) algorithm. Once the solids and pore spaces have been identified, the set of 200 2D images is used to reconstruct an approximation of the soil sample by projecting only the pore spaces. This reconstruction shows the structure of the soil and its pores, which become more bounded, less bounded, or unbounded with changes in depth. If the soil sample image quality is sufficiently favourable in terms of contrast, noise and sharpness, pore identification is less complicated and the PFCM clustering algorithm can be used without additional processing; otherwise, images require pre-processing before using this algorithm. Promising results were obtained with four soil samples: the first was used to show the validity of the algorithm and the other three to demonstrate the robustness of our proposal. The methodology we present here can better detect the solid soil and pore spaces in CT images, enabling the generation of better 2D/3D representations of pore structures from segmented 2D images.
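Standard fuzzy c-means on pixel grey levels gives a feel for the clustering step; PFCM additionally maintains possibilistic typicality values, which this hedged sketch omits, and the grey-level samples are synthetic:

```python
import numpy as np

# Fuzzy c-means on 1-D grey levels as a stand-in for PFCM segmentation
# (possibilistic typicalities omitted; data are synthetic).
rng = np.random.default_rng(3)
pores = rng.normal(40.0, 5.0, 500)       # dark pore-space pixels
solids = rng.normal(180.0, 10.0, 1500)   # bright solid-phase pixels
x = np.concatenate([pores, solids])

m = 2.0                                   # fuzzifier
centers = np.array([0.0, 255.0])          # initial cluster centres
for _ in range(50):
    d = np.abs(x[:, None] - centers[None, :]) + 1e-9   # distances to centres
    w = d ** (-2.0 / (m - 1.0))
    u = w / w.sum(axis=1, keepdims=True)               # fuzzy memberships
    centers = ((u**m).T @ x) / (u**m).sum(axis=0)      # weighted centre update

labels = u.argmax(axis=1)                 # 0 = pores, 1 = solids here
print("centres:", centers.round(1))
```

On a real CT slice, `x` would be the flattened grey levels of one 2D image, and the membership maps (rather than the hard `labels`) express the partial-volume ambiguity at pore boundaries.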

    Benthic algal communities of shallow reefs in the Eastern Cape: availability of abalone habitat

    Marine ranching has been identified as an alternative to traditional aquaculture for rearing and growing organisms for consumption. In the Eastern Cape, abalone ranching is a new and experimental industry. The aims of the research were: first, to develop a GIS model to assist management in site selection for abalone seeding; and second, to develop and standardize a sampling methodology in order to ground-truth the sites and assist in the monitoring and habitat identification of abalone. The GIS model developed in Chapter 3 was created using an unsupervised classification and fuzzy logic approach. Both vector and raster datasets were utilized to represent 7 different layers. Satellite imagery was predominantly used to classify the different substrate groups according to pixel colour signatures. The basic process was to apply a fuzzy rule set (membership function) to the rasters, giving an output raster (fuzzification). The membership output rasters were then overlaid to create a single model output. It was found that model accuracy increased significantly as more layers were overlaid, due to the high variability within each of the individual layers. Model ground-truthing showed a strong and significant correlation (r2 = 0.91; p < 0.001) between the model outputs and actual site suitability based on in situ evaluation. Chapter 4 describes the investigation of optimal sampling methods for abalone ranching habitat assessments. Both destructive sampling methods and imagery methods were considered for data collection. The study also evaluated whether quadrats and transects would be suitable methods to assess sites, and what size or length, respectively, they should be to collect the appropriate data. Transect length showed great variation according to the factor assessed. A transect of 15 metres was found to be optimal.
Abalone counts showed no significant (p = 0.1) change in the Coefficient of Variation (CV) for transect lengths greater than 15 m, with a mean of 0.2 abalone per metre. Quadrat size showed a significant difference in functional group richness between quadrat sizes of 0.0625 m² and 0.25 m², but no difference between 0.25 m² and 1 m² quadrats for both scrape and photographic quadrats. It was also found that between 5 and 10 replicates (p = 0.08) represent the functional groups appropriately using quadrats, and that a 0.25 m² quadrat is most suitable for sampling. Chapter 5 describes the benthic community structure of Cape Recife shallow water reefs. Using the standardized methodology previously mentioned, 45 sites were assessed to identify the community structure. These sites were grouped into 5 different groups influenced by depth and substrate, as well as functional group composition, according to a Ward's classification. The community structure showed that depth and substrate play a significant role (p < 0.05) in the community type. There is also a significant relationship (p < 0.05) between complexity, rugosity, abalone presence and substrate. During this study the basic protocols for site selection and benthic community monitoring were developed to support the abalone ranching initiative in the Cape Recife area. The study has also provided a baseline of the benthic community in the ranching concession area, which will be used as a benchmark for future monitoring efforts. The site selection, sampling, and monitoring methods developed during the course of this work have now been rolled out as Standard Operating Procedures for the ranching programme in this area.
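The fuzzification-and-overlay idea behind the GIS model can be sketched roughly as follows; the layers, membership shapes and thresholds below are invented for illustration and do not reproduce the thesis's actual rule set:

```python
import numpy as np

# Invented layers and membership functions, purely illustrative of
# fuzzification followed by a minimum overlay.
depth = np.array([[2.0, 6.0], [10.0, 18.0]])         # water depth (m)
rock_cover = np.array([[0.9, 0.7], [0.4, 0.1]])       # rocky-substrate fraction

# Fuzzification: a membership that peaks for 5-10 m depth and tapers
# to 0 at 0 m and 20 m.
mu_depth = np.clip(np.minimum(depth / 5.0, (20.0 - depth) / 10.0), 0.0, 1.0)
mu_rock = rock_cover                                  # already on [0, 1]

# Overlay: combine layer memberships cell by cell with a fuzzy AND (minimum).
suitability = np.minimum(mu_depth, mu_rock)
print(suitability.round(2))
```

Adding more layers tightens the minimum overlay, which is consistent with the finding that accuracy increased as more layers were combined.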

    Practicable methodologies for delivering comprehensive spatial soils information

    This thesis is concerned with practicable methodologies for delivering comprehensive spatial soil information to end-users. There is a need for relevant spatial soil information to complement objective decision-making for addressing current problems associated with soil degradation; for modelling, monitoring and measurement of particular soil services; and for the general management of soil resources. These are real-world situations, which operate at spatial scales ranging from field to global scales. As such, comprehensive spatial soil information is tailored to meet the spatial scale specifications of the end user, and is of a nature that fully characterises the whole-soil profile with associated prediction uncertainties, where possible with both the predictions and uncertainties independently validated. ‘Practicable’ is an idealistic pursuit but nonetheless necessary, because of the need to equip land-holders, private-sector and non-governmental stakeholders, and governmental departments including soil mapping agencies with the tools to ensure wide application of the methodologies and to match the demand for relevant spatial soil information. Practicable methodologies are general and computationally efficient; can be applied to a wide range of soil attributes; can handle variable qualities of data; and are effective when working with very large datasets. In this thesis, delivering comprehensive spatial soil information relies on coupling legacy soil information (principally site observations made in the field) with Digital Soil Mapping (DSM), which comprises quantitative, state-of-the-art technologies for soil mapping. After the General Introduction, a review of the literature is given in Chapter 1, which describes the research context of the thesis.
The review describes soil mapping first from a historical perspective, covering rudimentary efforts at mapping soils, and then tracks the succession of advances that have been made towards the realisation of populated, digital spatial soil information databases in which measures of prediction certainty are also expressed. From the findings of the review, in order to deliver comprehensive spatial soil information to end-users, new research was required to investigate: 1) a general method for digital soil mapping of the whole-profile (effectively pseudo-3D) distribution of soil properties; 2) a general method for quantifying the total prediction uncertainties of the digital soil maps that describe the whole-profile distribution of soil properties; 3) a method for validating the whole-profile predictions of soil properties and the quantifications of their uncertainties; 4) a systematic framework for scale manipulations, or upscaling and downscaling techniques, for digital soil mapping as a means of generating soil information products tailored to the needs of soil information users. Chapters 2 to 6 set about investigating how we might do these with a succession of practicable methodologies. Chapter 2 addressed the need for whole-profile mapping of soil property distribution. Equal-area spline depth functions coupled with DSM facilitated continuous mapping of the lateral and vertical distribution of soil properties. The spline function is a useful tool for deriving the continuous variation of soil properties from soil profile and core observations and is suitable for a number of different soil properties. Generally, mapping the continuous depth function of soil properties reveals that the accuracy of the models is highest at the soil surface and progressively decreases with increasing soil depth. Chapter 3 complements the investigations made in Chapter 2: an empirical method of quantifying prediction uncertainties from DSM was devised.
This method was applied for quantifying the uncertainties of whole-profile digital soil maps. Prediction uncertainty with the devised empirical method is expressed as a prediction interval of the underlying model errors. The method is practicable in the sense that it accounts for all sources of uncertainty and is computationally efficient. Furthermore the method is amenable in situations where complex spatial soil prediction functions such as regression kriging approaches are used. Proper evaluation of digital soil maps requires testing the predictions and the quantification of the prediction uncertainties. Chapter 4 devised two new criteria in which to properly evaluate digital soil maps when additional soil samples collected by probability sampling are used for validation. The first criterion addresses the accuracy of the predictions in the presence of uncertainties and is the spatial average of the statistical expectation of the Mean Square Error of a simulated random value (MSES). The second criterion addresses the quality of the uncertainties which is estimated as the total proportion of the study area where the (1-α)-prediction interval (PI) covers the true value (APCP). Ideally these criteria will be coupled with conventional measures of map quality so that objective decisions can be made about the reliability and subsequent suitability of a map for a given purpose. It was revealed in Chapter 4, that the quantifications of uncertainty are susceptible to bias as a result of using legacy soil data to construct spatial soil prediction functions. As a consequence, in addition to an increasing uncertainty with soil depth, there is increasing misspecification of the prediction uncertainties. 
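The coverage side of the second criterion can be illustrated as the empirical proportion of validation points whose (1-α) prediction interval contains the measured value; the data and the error standard deviation below are synthetic assumptions, not quantities from the thesis:

```python
import numpy as np

# Synthetic illustration of prediction-interval coverage at validation
# sites; the error model and values are invented.
rng = np.random.default_rng(4)
n = 400
pred = rng.normal(6.0, 1.0, n)              # map predictions (e.g. pH)
true = pred + rng.normal(0.0, 0.5, n)       # field measurements

alpha = 0.10
half_width = 1.645 * 0.5                    # 90% PI from the assumed error sd
lower, upper = pred - half_width, pred + half_width

# Proportion of sites where the interval covers the measured value.
coverage = float(np.mean((true >= lower) & (true <= upper)))
print(f"empirical {100 * (1 - alpha):.0f}% PI coverage: {coverage:.2f}")
```

Coverage well below 1-α signals over-confident (too narrow) prediction intervals, the kind of misspecification the chapter reports increasing with depth.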
Chapters 2, 3 and 4 thus represent a framework for delivering whole-soil profile predictions of soil properties and their uncertainties, where both have been assessed or validated across mapping domains at a range of spatial scales for addressing field, farm, regional, catchment, national, continental or global soil-related problems. The direction of Chapters 5 and 6, however, addresses issues specifically related to tailoring spatial soil information to the scale specifications of the end-user through scale manipulations on existing digital soil maps. What is proposed in Chapter 5 is a scaling framework that takes into account the scaling triplet of digital soil maps (extent, resolution, and support) and recommends pedometric methodologies for scale manipulation based on the scale entities of the source and destination maps. Upscaling and downscaling are descriptors for moving up to coarser or down to finer scales respectively, but may be too general for DSM. Accordingly, fine-gridding and coarse-gridding are operations where the grid spacing changes but the support remains unchanged; deconvolution and convolution are operations where the support always changes, which may or may not involve changing the grid spacing; and disseveration and conflation operations occur when the support and grid size are equal and both are changed equally and simultaneously. There is an increasing richness of data sources describing the physical distribution of the Earth’s resources with improved qualities and resolutions. To take advantage of this, Chapter 6 devises a novel procedure for downscaling, involving disseveration. The method attempts to maintain the mass balance of the fine-scaled predictions with the available coarse-scaled information, through an iterative algorithm which reconstructs the variation of a property at a prescribed fine scale through an empirical function using environmental or covariate information.
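A toy, mass-preserving downscaling step in the spirit of disseveration might look like the following; the real procedure iterates this rescaling together with re-fitting of an empirical covariate function, which is omitted here, and all numbers are invented:

```python
import numpy as np

# Invented numbers; the actual algorithm alternates this mass-balance
# rescaling with re-fitting an empirical function of the covariates.
coarse = np.array([10.0, 20.0])                  # one value per coarse cell
covariate = np.array([[1.0, 3.0, 2.0, 2.0],      # fine-scale covariate, cell 0
                      [4.0, 6.0, 5.0, 5.0]])     # fine-scale covariate, cell 1

fine = covariate.copy()                          # initial fine-scale estimate
for _ in range(20):
    # Rescale within each coarse cell so that the cell mean matches the
    # coarse observation (the mass balance the chapter refers to).
    fine *= (coarse / fine.mean(axis=1))[:, None]

print(fine.mean(axis=1))                         # reproduces the coarse values
```

The rescaling guarantees that averaging the fine map back up reproduces the coarse map exactly, whatever empirical function drives the within-cell pattern.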
One of the advantages of the devised method is that soil property uncertainties at the coarse scale can be incorporated into the downscaling algorithm. Finally, Chapter 7 presents a synthesis of the investigations made in Chapters 2 to 6 and summarises the pertinent findings. Directly from the investigations carried out during this project there are opportunities for further work, both in terms of addressing shortcomings that were highlighted but not investigated in the thesis, and more generally for advancing digital soil mapping to an operational status and beyond.

    Modeling flood estimation using fuzzy logic & artificial neural network

    Estimates of flood discharge with various risks of exceedance are needed for a wide range of engineering problems: examples are culvert and bridge design and construction floods in major projects. At a site with a long record of measured floods, these estimates may be derived by statistical analysis of the flow series. Alternatively, a storm magnitude of appropriate duration, areal coverage and return period may be estimated and converted into the flood of a given return period using a rainfall/runoff model such as the unit hydrograph. However, where adequate rainfall or river flow records are not available at or near the site of interest, it is difficult for hydrologists and engineers to derive reliable flood estimates directly, and regional studies can be useful. This is particularly true in semi-arid areas, where flow records are generally scarce. The problem of assigning a flood risk to a particular flow value has received considerable attention in the literature. The estimation of flood risk through the evaluation of a flood frequency distribution is complicated, however, by the lack of a sufficient temporal characterization of the underlying distribution of flood events. Inadequate data availability necessitates estimating the flood risk associated with events whose return period exceeds the length of the historical record. Regional flood frequency analysis can be effective in substituting an increased spatial characterization of the data for an insufficient temporal characterization, although problems exist with the implementation of regional flood frequency analysis techniques.
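As a hedged sketch of the at-site statistical analysis mentioned above, the snippet below fits a Gumbel (EV1) distribution to a synthetic annual-maximum flow series by the method of moments and evaluates a T-year quantile; regional analysis, the abstract's focus for data-scarce sites, would instead pool such series across gauged catchments:

```python
import numpy as np

# Synthetic annual-maximum series drawn from a Gumbel distribution;
# all parameter values are invented for illustration.
rng = np.random.default_rng(5)
beta_true, mu_true = 40.0, 150.0
annual_max = mu_true - beta_true * np.log(-np.log(rng.uniform(size=60)))  # m^3/s

# Method-of-moments parameter estimates for the Gumbel distribution.
beta = annual_max.std(ddof=1) * np.sqrt(6.0) / np.pi
mu = annual_max.mean() - 0.5772 * beta

def flood_quantile(T):
    """Discharge with return period T years: F = 1 - 1/T."""
    return mu - beta * np.log(-np.log(1.0 - 1.0 / T))

print("100-year flood estimate:", round(flood_quantile(100.0), 1))
```

Note how an estimate for T = 100 extrapolates well beyond a 60-year record, which is exactly the temporal-characterization problem the abstract describes.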