    GIS in Malaysia

    Well-point techniques and the shallow water table in boulder clay

    The reliability of water-table measurements in clay soil is currently under review (Twocock, 1971; Bonell, 1971; Visvalingam, 1972). This paper summarizes some of the experimental results from a boulder clay catchment in East Yorkshire. The experiments investigated the functioning characteristics of cased auger holes and piezometers in clay soil and compared the results with observations made with a neutron moisture probe. It appears that well-point techniques, especially piezometers, are extremely unreliable in clay soil. The measured water level is shown to be influenced not only by the position of the ‘water table’ but also by the permeability of the soil, in which context the type, diameter, and length of tubing, as well as the time of installation, become important considerations.
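
    The tubing and permeability effects noted above are essentially response-lag effects. As a rough illustration, the sketch below estimates how slowly a cased standpipe equilibrates in clay; it uses Hvorslev's basic time lag and an assumed clay permeability, both of which are my assumptions rather than methods taken from the paper:

        import math

        def basic_time_lag(pipe_diameter_m, intake_diameter_m, k_m_per_s):
            # Hvorslev basic time lag T = V / (F * k) for a cased hole whose
            # bottom is flush with the soil; F = 2.75 * D is the usual shape
            # factor for that case (assumed here).
            standpipe_area = math.pi * pipe_diameter_m ** 2 / 4.0
            shape_factor = 2.75 * intake_diameter_m
            return standpipe_area / (shape_factor * k_m_per_s)

        def time_to_respond(T, fraction=0.99):
            # Head deficit decays as h/h0 = exp(-t/T); solve for t.
            return -T * math.log(1.0 - fraction)

        # Illustrative values: 50 mm tubing and k ~ 1e-9 m/s (assumed for clay).
        T = basic_time_lag(0.05, 0.05, 1e-9)
        print(f"basic time lag: {T / 86400:.0f} days; "
              f"99% response: {time_to_respond(T) / 86400:.0f} days")

    With response times of this order, a reading taken weeks after installation may still reflect the disturbance of installation rather than the current water table, which is consistent with the cautions above.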

    A computer science perspective on the bendsimplification algorithm

    The primary aim of this study was to evaluate whether the use of bends provides a better basis than point elimination for research on line structuring. These investigations were undertaken using Arc/Info 7.1.1. Comparative experimental results suggest that the algorithm may not be as widely applicable as the much simpler geometric filters, such as the Douglas-Peucker or Visvalingam algorithms. The paper therefore provides a brief review of these three algorithms. A more detailed conceptual and empirical evaluation of the bendsimplification system follows, highlighting some problems with implementing the system in Arc/Info. The paper then questions the value of over-coupling model- and image-oriented generalization processes within the black-box bendsimplification system. It suggests the types of parameters that could enhance the utility and usability of the Bendsimplify option within the Arc/Info (and perhaps also the ArcView) environment and provides some pointers for further research. With respect to the main aim of the research, the evidence suggests that bendsimplification is less useful for line segmentation than Visvalingam's algorithm. Further research is needed to assess the value of the iterative bend elimination operator within bendsimplification.
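
    For orientation, the "simpler geometric filters" mentioned above reduce to a few lines of code. The following is a minimal sketch of the Douglas-Peucker recursion, an illustrative textbook-style reimplementation rather than the Arc/Info 7.1.1 code evaluated in the paper:

        import math

        def offset_distance(p, a, b):
            # Perpendicular distance from p to the line through a and b.
            dx, dy = b[0] - a[0], b[1] - a[1]
            length = math.hypot(dx, dy)
            if length == 0.0:                       # degenerate anchor pair
                return math.hypot(p[0] - a[0], p[1] - a[1])
            return abs(dx * (a[1] - p[1]) - dy * (a[0] - p[0])) / length

        def douglas_peucker(points, tolerance):
            # Keep both anchors; recurse on the farthest intermediate point
            # if it exceeds the tolerance, otherwise drop everything between.
            if len(points) < 3:
                return list(points)
            idx, dmax = 0, -1.0
            for i in range(1, len(points) - 1):
                d = offset_distance(points[i], points[0], points[-1])
                if d > dmax:
                    idx, dmax = i, d
            if dmax <= tolerance:
                return [points[0], points[-1]]
            left = douglas_peucker(points[:idx + 1], tolerance)
            right = douglas_peucker(points[idx:], tolerance)
            return left[:-1] + right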

    Line generalisation by repeated elimination of points

    This paper presents a new approach to line generalisation which uses the concept of ‘effective area’ for progressive simplification of a line by point elimination. Two coastlines are used to compare the performance of this algorithm with that of the widely used Douglas-Peucker algorithm. The results from the area-based algorithm compare favourably with manual generalisation of the same lines. It is capable of achieving both imperceptible minimal simplifications and caricatural generalisations. By careful selection of cutoff values, it is possible to use the same algorithm for scale-dependent and scale-independent generalisations. More importantly, it offers scope for modelling cartographic lines as consisting of features within features, so that their geometric manipulation may be modified by application- and/or user-defined rules and weights. The paper examines the merits and limitations of the algorithm and the opportunities it offers for further research and progress in the field of line generalisation. © 1993 Maney Publishing
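
    A minimal sketch of the repeated-elimination idea described above, assuming an unweighted area metric; the published algorithm also recomputes neighbouring areas incrementally and supports user-defined weights, which this toy version omits:

        def effective_area(a, b, c):
            # Area of triangle a-b-c: the displacement caused by removing b.
            return abs((b[0] - a[0]) * (c[1] - a[1]) -
                       (c[0] - a[0]) * (b[1] - a[1])) / 2.0

        def simplify_by_area(points, min_area):
            # Repeatedly eliminate the interior point with the smallest
            # effective area until all survivors exceed the cutoff.
            pts = list(points)
            while len(pts) > 2:
                areas = [effective_area(pts[i - 1], pts[i], pts[i + 1])
                         for i in range(1, len(pts) - 1)]
                k = min(range(len(areas)), key=areas.__getitem__)
                if areas[k] >= min_area:
                    break
                del pts[k + 1]        # areas[k] belongs to interior point k+1
            return pts

    Raising min_area takes the same routine from imperceptible minimal simplification towards caricatural generalisation, which is how a single cutoff parameter can serve both scale-dependent and scale-independent use.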

    Deconstruction of fractals and its implications for cartographic education

    The research reported here was designed with two aims: firstly, to invite anyone with an interest in cartographic visualization to participate in eliciting cartographic knowledge and to give them the opportunity to contribute their practical knowledge and opinions; and secondly, to inform the design of algorithms for line generalization. In the past, there has been some resistance to such mining and codification of expert knowledge. However, many cartographers now welcome highly interactive computer graphics, computer mapping, and virtual reality systems as offering new opportunities for launching cartography into a new creative age. Despite nearly thirty years of research on line generalization algorithms, the available algorithms remain somewhat simplistic. This research, undertaken under the auspices of the BCS Design Group, explored the behavioural tendencies of cartographers engaged in line filtering. The results show that a carefully contrived, even if obviously artificial, exercise on the deconstruction of lines into meaningless forms can prompt cartographers to observe, record, and discuss their own cognitive processing.

    Cartographic Algorithms: Problems of Implementation and Evaluation and the Impact of Digitising Errors

    Cartographic generalisation remains one of the outstanding challenges in digital cartography and Geographical Information Systems (GIS). It is generally assumed that computerisation will remove the spurious variability introduced by the subjective decisions of individual cartographers. This paper demonstrates, through an in-depth study of a line simplification algorithm, that computerisation introduces its own sources of variability. The algorithm, referred to as the Douglas-Peucker algorithm in the cartographic literature, has been widely used in image processing, pattern recognition and GIS for some 20 years. An analysis of this algorithm, and a study of some implementations in wide use, identifies variability resulting from the subjective decisions of software implementors. Spurious variability in software complicates the evaluation and comparison of alternative algorithms for cartographic tasks. No doubt, variability in implementation could be removed by rigorous study and specification of algorithms. Such future work must address the presence of digitising error in cartographic data. Our analysis suggests that it would be difficult to adapt the Douglas-Peucker algorithm to cope with digitising error without altering the method. Copyright © 1991, Wiley Blackwell. All rights reserved.
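
    To make the implementor-variability point concrete, consider one decision a published description can leave open: whether offsets are measured to the infinite line through the anchor points or to the finite segment between them. The sketch below is a hypothetical illustration of how the two readings disagree, not an excerpt from the implementations surveyed in the paper:

        import math

        def line_distance(p, a, b):
            # Offset of p from the infinite line through a and b.
            dx, dy = b[0] - a[0], b[1] - a[1]
            return abs(dx * (a[1] - p[1]) - dy * (a[0] - p[0])) / math.hypot(dx, dy)

        def segment_distance(p, a, b):
            # Offset of p from the finite segment a-b (clamped to endpoints).
            dx, dy = b[0] - a[0], b[1] - a[1]
            t = ((p[0] - a[0]) * dx + (p[1] - a[1]) * dy) / (dx * dx + dy * dy)
            t = max(0.0, min(1.0, t))
            return math.hypot(p[0] - (a[0] + t * dx), p[1] - (a[1] + t * dy))

        # A point lying beyond the segment's end: the two readings disagree,
        # so implementations built on them can retain different points.
        p, a, b = (12.0, 1.0), (0.0, 0.0), (10.0, 0.0)
        print(line_distance(p, a, b), segment_distance(p, a, b))   # 1.0 vs ~2.24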

    Simplification and generalization of large scale data for roads: a comparison of two filtering algorithms

    This paper reports the results of an in-depth study of two algorithms for line simplification and caricatural generalization (namely, those developed by Douglas and Peucker, and by Visvalingam), undertaken in the context of a wider program of research on scale-free mapping. The use of large-scale data for man-designed objects, such as roads, has led to a better understanding of the properties of these algorithms and of their value within the spectrum of scale-free mapping. The Douglas-Peucker algorithm is better at minimal simplification. The large-scale road data make it apparent that Visvalingam's technique is not only capable of removing entire scale-related features, but that it does so in a manner which preserves the shape of retained features. This technique offers some prospects for the construction of scale-free databases, since it offers some scope for achieving balanced generalizations of an entire map consisting of several complex lines. The results also suggest that it may be easier to formulate concepts and strategies for automatic segmentation of in-line features using large-scale road data and Visvalingam's algorithm. In addition, the abstraction of center lines may be facilitated by the inclusion of additional filtering rules with Visvalingam's algorithm.
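
    The "additional filtering rules" suggested above could, for instance, be expressed as weights on the effective-area metric. The rule below is purely hypothetical and serves only to show the shape such a rule might take: it inflates the area of sharp corners so that the right-angled bends typical of man-designed roads survive elimination longer.

        import math

        def weighted_effective_area(a, b, c):
            # Triangle area at b, inflated by the turn angle at b so that
            # sharp (road-like) corners rank as more significant. The
            # 1 + turn/pi weighting is an assumed, illustrative choice.
            area = abs((b[0] - a[0]) * (c[1] - a[1]) -
                       (c[0] - a[0]) * (b[1] - a[1])) / 2.0
            turn = abs(math.atan2(c[1] - b[1], c[0] - b[0]) -
                       math.atan2(b[1] - a[1], b[0] - a[0]))
            turn = min(turn, 2.0 * math.pi - turn)   # fold into [0, pi]
            return area * (1.0 + turn / math.pi)

    Substituting such a weighted metric for the plain triangle area leaves the elimination loop untouched, which is what makes rule-based extensions of the algorithm attractive.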

    Testing implementations of Visvalingam's algorithm for line generalisation

    There are a growing number of open source and commercial implementations of the Visvalingam algorithm for line generalisation. The algorithm leaves scope for implementation-specific interpretations, with different outcomes. Such differences are inevitable, sometimes necessary, and do not by themselves imply that an implementation is flawed. The only restriction is that the output must not be so inconsistent with the intent of the algorithm that it becomes unusable. This paper provides some ideas, data and sample output to help users compare the output of their implementations with that produced by Visvalingam. This may help them ascertain whether problems they encounter are specific to their implementation or are a general feature of the algorithm. The paper assesses the utility and limitations of the Mapshaper options for Visvalingam's algorithm. Visvalingam's implementation and Mapshaper produce similar, but not identical, depictions of coastlines. However, the programs produce very dissimilar output for the rectangular Koch island, also known as the quadratic Koch island: Mapshaper's output is unbalanced for both its Visvalingam and Douglas-Peucker options. This suggests that the problem, which is not immediately obvious, lies in some function inherited by both options. The two programs produce near-identical output for coastlines when Mapshaper's Visvalingam/weighted-area option is used. This suggests that the problem arises from Mapshaper's treatment of equal-valued metrics; this can be changed. Implementers and users may wish to use the data and methods given in this paper to test their own implementations if and when necessary.
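
    A test harness along these lines might look as follows. This is a sketch under stated assumptions: the L-system rule used is the standard Minkowski generator for the quadratic Koch island, and simplify() stands in for whichever implementation is under test; neither is taken from the paper.

        def koch_island(order=2):
            # Quadratic Koch island: apply the standard Minkowski generator
            # F -> F+F-F-FF+F+F-F to each side of a square; '+'/'-' turn 90°.
            prog = "F+F+F+F"
            for _ in range(order):
                prog = prog.replace("F", "F+F-F-FF+F+F-F")
            x, y, dx, dy = 0.0, 0.0, 1.0, 0.0
            pts = [(x, y)]
            for ch in prog:
                if ch == "F":
                    x, y = x + dx, y + dy
                    pts.append((x, y))
                elif ch == "+":
                    dx, dy = -dy, dx      # turn left
                else:
                    dx, dy = dy, -dx      # turn right
            return pts

        def is_fourfold_symmetric(pts, ndigits=6):
            # The island is unchanged by a 90° rotation about its centroid;
            # unbalanced simplifier output breaks this invariant.
            uniq = {(round(x, ndigits), round(y, ndigits)) for x, y in pts}
            cx = sum(x for x, _ in uniq) / len(uniq)
            cy = sum(y for _, y in uniq) / len(uniq)
            rot = {(round(cx - (y - cy), ndigits), round(cy + (x - cx), ndigits))
                   for x, y in uniq}
            return rot == uniq

        pts = koch_island(order=2)
        print(len(pts), "points; input symmetric:", is_fourfold_symmetric(pts))
        # A balanced implementation should keep the invariant true:
        # is_fourfold_symmetric(simplify(pts))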

    Indexing with coded deltas—a data compaction technique

    The paper describes the coded delta scheme, one of the methods used by the Census Research Unit, University of Durham, for compacting the 1971 U.K. census data. It evaluates the merits and limitations of the technique in relation to the characteristics of the data set and to other techniques available for the compact encoding of numeric and string data.
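
    The abstract does not spell out the coding itself, so the sketch below shows only the generic delta idea such schemes build on (my illustration, not the Census Research Unit's actual format): store the first value in full and encode each subsequent value as its difference from the previous one, which for slowly varying census counts is small enough to pack into short codes.

        def delta_encode(values):
            # Keep the first value in full; store successive differences.
            return [values[0]] + [b - a for a, b in zip(values, values[1:])]

        def delta_decode(deltas):
            # Invert delta_encode by cumulative summation.
            out = [deltas[0]]
            for d in deltas[1:]:
                out.append(out[-1] + d)
            return out

        raw = [10432, 10436, 10441, 10440, 10455]     # slowly varying counts
        enc = delta_encode(raw)                       # [10432, 4, 5, -1, 15]
        assert delta_decode(enc) == raw               # lossless round trip

    The small deltas can then be packed into short fixed- or variable-length codes, which is where the space saving comes from.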