
    Laser-scanning based tomato plant modeling for virtual greenhouse environment.


    The impact of natural organic matter on floc structure

    The removal of natural organic matter (NOM) at water treatment works (WTW) is essential in order to prevent toxic compounds forming during subsequent disinfection. Coagulation and flocculation processes remain the most common way of removing NOM. The properties of the resulting flocs are fundamental to the efficient removal of organic material. Periods of elevated NOM loads at WTW can lead to operational problems as a result of deterioration in floc structural quality. Assessment of floc physical characteristics can therefore be a crucial tool for determining and predicting solid-liquid removal performance at WTW. Here the growth, size, breakage, strength, re-growth, fractal dimension and settling velocity were measured for flocs formed from a NOM-rich water source. NOM floc structural characteristics were measured and evaluated over a one-year period in order to monitor the seasonal variation in floc structure. The results showed a significant improvement in floc size and strength during autumn and summer months. It was subsequently shown that as the organic fraction in the floc increases, the floc size, settling velocity and fractal dimension all decrease. A model was proposed showing how these changes were dependent upon the adsorption of NOM onto primary particle surfaces. A range of different chemical coagulant treatment options were applied for NOM removal and the resulting floc structures compared. Considering both floc structure and optimum NOM removal, the treatment systems ranked as follows (best to worst): MIEX® + Fe > Fe > Fe + polymer > Al > polyDADMAC. NOM floc re-growth was shown to be limited for all the treatment systems investigated. The practical implications of the results were: (1) the requirement for careful coagulant dosing in order to achieve optimum floc characteristics; (2) the use of a pre-treatment anionic ion-exchange stage prior to coagulation; (3) a comparison of alum- and ferric-based coagulants, which suggested the ferric coagulants gave better floc structure and improved NOM removal rates.
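The reported link between fractal dimension and settling velocity can be illustrated with a generic modified Stokes law for porous aggregates, in which the effective excess density of a floc of size d scales as (d/d0)**(Df - 3). This is a textbook-style sketch, not the model proposed in the thesis; the primary-particle size, densities and viscosity used here are illustrative assumptions.

```python
# Illustrative sketch (not the thesis model): Stokes settling of a porous
# fractal floc. Effective excess density scales as (d/d0)**(Df - 3), so the
# settling velocity scales as d**(Df - 1); a lower Df means slower settling.

def settling_velocity(d, Df, d0=1e-6, rho_p=2500.0, rho_w=1000.0,
                      mu=1e-3, g=9.81):
    """Settling velocity (m/s) of a fractal floc of diameter d (m).

    d0: primary-particle diameter (m); rho_p, rho_w: primary-particle and
    water densities (kg/m^3); mu: dynamic viscosity (Pa s). All defaults
    are illustrative assumptions, not measured values.
    """
    delta_rho = (rho_p - rho_w) * (d / d0) ** (Df - 3.0)  # porous aggregate
    return g * delta_rho * d ** 2 / (18.0 * mu)           # Stokes' law

# For the same 100 um floc, a more compact (higher-Df) structure settles faster
v_open = settling_velocity(100e-6, Df=1.8)
v_compact = settling_velocity(100e-6, Df=2.4)
assert v_compact > v_open
```

Under this kind of scaling, the observed simultaneous decrease of floc size, fractal dimension and settling velocity is mutually consistent, since velocity grows with both d and Df.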

    A fractal approach to mixing-microstructure-property relationship for rubber compounds

    The research is concerned with exploration of the utility of fractal methods for characterising the mixing treatment applied to a rubber compound and also for characterising the microstructure developed during mixing (filler dispersion). Fractal analysis is also used for characterisation of the fracture surfaces generated during tensile testing of vulcanised samples. For these purposes, the Maximum Entropy Method and the Box Counting Method are developed and applied to analyse the mixing treatment and the filler dispersion, respectively. Using these methods, it is found that the fractal dimensions of mixer power traces and of fracture surfaces of vulcanised rubber decrease with mixing time, while the fractal dimension of the state-of-mix (filler dispersion) also decreases. The relationships of the fractal dimensions thus determined with conventional properties, such as viscosity, tensile strength and heat transfer coefficient, are then explored. For example, a series of thermal measurements is carried out during the vulcanisation process and the data are analysed to determine the heat transfer coefficient. Nuclear Magnetic Resonance is used to obtain the properties of bound rubber, a quantitative analysis is also carried out, and possible mechanisms for the relationships between the parameters are discussed based on existing interpretations. Finally, the utility of the fractal methods for establishing mixing-microstructure-property relationships is compared with more conventional and well-established methods. For this purpose, the fractal dimension of the state-of-mix is compared to conventional measures such as the Payne Effect, electrical conductivity and carbon black dispersion (ASTM D2663 Method C). It is found that the characterisation by the fractal concept agrees with the conclusions from these conventional methods. In addition, it becomes possible to interpret the relationships between these conventional methods with the help of the fractal concept.
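The Box Counting Method named above can be sketched generically: cover a binary image (for example a thresholded micrograph of filler dispersion) with boxes of decreasing side s, count the boxes N(s) that contain occupied pixels, and take the slope of log N(s) against log(1/s). This is a minimal generic implementation, not the thesis's exact procedure.

```python
import numpy as np

def box_counting_dimension(img):
    """Estimate the box-counting dimension of the True pixels in a square
    binary array whose side length is a power of two."""
    n = img.shape[0]
    sizes, counts = [], []
    s = n
    while s >= 1:
        # Count boxes of side s that contain at least one occupied pixel
        blocks = img.reshape(n // s, s, n // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
        sizes.append(s)
        s //= 2
    # Slope of log N(s) versus log(1/s) estimates the fractal dimension
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)),
                          np.log(np.array(counts)), 1)
    return slope

# Sanity check: a completely filled image has dimension 2
filled = np.ones((64, 64), dtype=bool)
print(round(box_counting_dimension(filled), 2))  # 2.0
```

In practice the fit is restricted to a range of box sizes where the log-log plot is linear; for real micrographs the smallest boxes are dominated by pixel noise and the largest by image extent.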

    Geospatial big data and cartography : research challenges and opportunities for making maps that matter

    Geospatial big data present a new set of challenges and opportunities for cartographic researchers in technical, methodological, and artistic realms. New computational and technical paradigms for cartography are accompanying the rise of geospatial big data. Additionally, the art and science of cartography needs to focus its contemporary efforts on work that connects to outside disciplines and is grounded in problems that are important to humankind and its sustainability. Following the development of position papers and a collaborative workshop to craft consensus around key topics, this article presents a new cartographic research agenda focused on making maps that matter using geospatial big data. This agenda provides both long-term challenges that require significant attention as well as short-term opportunities that we believe could be addressed in more concentrated studies.

    Quantification of flow impairment in faulted sandstone reservoirs.

    Abstract unavailable; please refer to the PDF.

    A knowledge-based design system for a housing project for altzheimer's [sic] disease patients and their caregivers

    Since the beginning of the 20th century, architecture has gone through several different developments. Throughout these developments a large number of movements and design models emerged, based on different design knowledge. The motivation for design projects relates mainly to historical, functional, or site-related issues. Today's changes in social relationships, and therefore in living arrangements, question the validity of existing traditional design solutions, especially in the case of disabled clients. A new or modified specific design knowledge base has to be considered in order to meet the specific client's needs. To satisfy the wide range of changing non-traditional criteria that face today's architects, the new design procedure has to result in a simple yet flexible model. This thesis proposes one such solution as it is applied to Alzheimer's disease patients and their caregivers.

    Modeling of probe-surface interactions in nanotopographic measurements

    Contact stylus methods remain important tools in surface roughness measurement, but as metrological capability increases there is a growing need for better understanding of the complex interactions between a stylus tip and a surface. For example, questions arise about the smallest scales of topographic features that can be described with acceptable uncertainty, or about how to compare results taken with different types of probe. This thesis uses simulation methods to address some aspects of this challenge. A new modelling and simulation program has been developed and used to examine the measurement of the fine structure of real and simulated surfaces by the stylus method. Although able to scan any arbitrary surface with any arbitrary stylus shape, the majority of the results given here use idealized stylus shapes and 'real' ground steel surfaces. The simulation is used not only to measure the roughness of the surface but also to show the distribution of contacts on the tip when scanning a surface. Surface maps of the fine structure of ground steel surfaces were measured by Atomic Force Microscopy (AFM) to ensure high lateral resolution compared to the capability of the target profilometry instruments. The data collected by the AFM were checked for missing data and interpolated by the Scanning Probe Image Processor (SPIP) software. Three basic computer-generated stylus tips with different shapes have been used: conical, pyramidal and spherical. This work proposes and explores in detail the novel concept of "thresholding" as an adjunct to kinematic contact modelling; the tip is incremented downwards 'into' the surface and the resulting contact regions (or islands) are compared to the position of the initial kinematic contact. Essentially the research questions have been inquiring into the effectiveness of so-called kinematic contact models by modifying them in various ways and judging whether significantly different results arise. Initial evidence shows that examination of the contact patterns as the threshold increases can identify the intensity with which different asperity regions interact with the stylus. In the context of sections of the ground surface with a total height variation in the order of 500 nm to 1 μm, for example, a 5 nm threshold caused little change in contact sizes from the kinematic point, but 50 nm caused them to grow asymmetrically, eventually picking out the major structures of the surface. The simulations have naturally confirmed that the stylus geometry and size can have a significant effect on most roughness parameters of the measured surface in 3D. The major contribution is therefore an investigation of the inherent (finite-probe) distortions during topographic analysis using a stylus-based instrument. The surprising finding, which is worthy of greater investigation, is how insensitive to major changes in stylus condition some of the popular parameters are, even when dealing with very fine structure within localized areas of a ground surface. For these reasons, it is concluded that thresholding is not likely to become a major tool in analysis, although it can certainly be argued that it retains some practical value as a diagnostic of the measurement process. This research will ultimately allow better inter-comparison between measurements from different instruments by allowing a 'software translator' between them. Short of fully realizing this ambitious aim, the study also contributes to improving uncertainty models for stylus instruments.
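The thresholding idea described above can be sketched in one dimension: place an ideal stylus at its kinematic (first-touch) height over a profile, lower it by a threshold, and count the separate contact regions. The profile (two Gaussian asperities) and the parabolic tip used here are synthetic assumptions for illustration, not the thesis's ground-steel data or exact algorithm.

```python
import numpy as np

def contact_islands(z, t, thr):
    """Count contiguous contact regions ("islands") when an ideal tip with
    lower-surface profile t (apex at height 0) is lowered thr below its
    kinematic contact height over surface z. Units are arbitrary."""
    d = z - t                      # admissible apex height at each sample
    contact = d >= d.max() - thr   # samples touched once lowered by thr
    padded = np.concatenate(([False], contact))
    return int(np.sum(padded[1:] & ~padded[:-1]))  # count rising edges

x = np.linspace(-1.0, 1.0, 401)
# Two asperities of different height, plus a wide parabolic tip at x = 0
z = np.exp(-((x + 0.3) / 0.1) ** 2) + 0.8 * np.exp(-((x - 0.3) / 0.1) ** 2)
tip = x ** 2 / (2 * 0.5)           # tip radius 0.5 in these units

print(contact_islands(z, tip, 0.05))  # 1: only the taller asperity touches
print(contact_islands(z, tip, 0.25))  # 2: the second asperity now contacts
```

As the threshold grows, each island widens and new asperities join, which mirrors the way the contact pattern "picks out the major structures of the surface" in the passage above.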

    Regular Hierarchical Surface Models: A conceptual model of scale variation in a GIS and its application to hydrological geomorphometry

    Environmental and geographical process models inevitably involve parameters that vary spatially. One example is hydrological modelling, where parameters derived from the shape of the ground, such as flow direction and flow accumulation, are used to describe the spatial complexity of drainage networks. One way of handling such parameters is by using a Digital Elevation Model (DEM); such modelling is the basis of the science of geomorphometry. A frequently ignored but inescapable challenge when modellers work with DEMs is the effect of scale and geometry on the model outputs. Many parameters vary with scale as much as they vary with position. Modelling variability with scale is necessary to simplify and generalise surfaces, and desirable to accurately reconcile model components that are measured at different scales. This thesis develops a surface model that is optimised to represent scale in environmental models. A Regular Hierarchical Surface Model (RHSM) is developed that employs a regular tessellation of space and scale that forms a self-similar regular hierarchy, and incorporates Level Of Detail (LOD) ideas from computer graphics. Following convention from systems science, the proposed model is described in its conceptual, mathematical, and computational forms. The RHSM development was informed by a categorisation of Geographical Information Science (GISc) surfaces within a cohesive framework of geometry, structure, interpolation, and data model. The positioning of the RHSM within this broader framework made it easier to adapt algorithms designed for other surface models to conform to the new model. The RHSM has an implicit data model that utilises a variation of Middleton and Sivaswamy's (2001) intrinsically hierarchical Hexagonal Image Processing referencing system, which is here generalised for rectangular and triangular geometries. The RHSM provides a simple framework to form a pyramid of coarser values in a process characterised as a scaling function. In addition, variable-density realisations of the hierarchical representation can be generated by defining an error value and decision rule to select the coarsest appropriate scale for a given region to satisfy the modeller's intentions. The RHSM is assessed using adaptations of the geomorphometric algorithms flow direction and flow accumulation. The effects of scale and geometry on the anisotropy and accuracy of model results are analysed on dispersive and concentrative cones, and on Light Detection And Ranging (LiDAR) derived surfaces of the urban area of Dunedin, New Zealand. The RHSM modelling process revealed aspects of the algorithms not obvious within a single geometry, such as the influence of node geometry on flow direction results, and a conceptual weakness of flow accumulation algorithms on dispersive surfaces that causes asymmetrical results. In addition, comparison of algorithm behaviour between geometries undermined the hypothesis that variance of cell cross-section with direction is important for conversion of cell accumulations to point values. The ability to analyse algorithms for scale and geometry and adapt algorithms within a cohesive conceptual framework offers deeper insight into algorithm behaviour than previously achieved. The deconstruction of algorithms into geometry-neutral forms and the application of scaling functions are important contributions to the understanding of spatial parameters within GISc.
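The "pyramid of coarser values" produced by a scaling function can be sketched for the rectangular geometry as repeated 2x2 aggregation of cell values. This is a minimal illustration under stated assumptions: the mean is used as the scaling function (the RHSM admits others), the DEM is a square power-of-two grid, and the hexagonal and triangular geometries described in the abstract are not handled.

```python
import numpy as np

def build_pyramid(dem, scaling=np.mean):
    """Return [level0, level1, ...] where each successive level halves the
    resolution of a square DEM whose side length is a power of two, by
    applying `scaling` to each 2x2 block of child cells."""
    levels = [dem]
    while levels[-1].shape[0] > 1:
        a = levels[-1]
        n = a.shape[0] // 2
        # Group the grid into 2x2 child blocks and reduce each to one value
        levels.append(scaling(a.reshape(n, 2, n, 2), axis=(1, 3)))
    return levels

dem = np.arange(16, dtype=float).reshape(4, 4)   # toy elevation grid
pyr = build_pyramid(dem)
print([lvl.shape for lvl in pyr])  # [(4, 4), (2, 2), (1, 1)]
print(pyr[-1][0, 0])               # 7.5, the mean elevation of the surface
```

A variable-density realisation could then, for each region, walk down this pyramid and stop at the coarsest level whose children deviate from the parent value by less than a chosen error tolerance.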

    Applied Fracture Mechanics

    The book "Applied Fracture Mechanics" presents a collection of articles on the application of fracture mechanics methods to materials science, medicine, and engineering. In thirteen chapters, a wide range of topics is discussed, including the strength of biological tissues, the safety of nuclear reactor components, fatigue effects in pipelines, and environmental effects on fracture, among others. In addition, the book presents mathematical and computational methods underlying the fracture mechanics applications, as well as developments in statistical modeling of fatigue. The work presented in this book will be useful, effective, and beneficial to mechanical engineers, civil engineers, and materials scientists from industry, research, and education.