
    Line generalisation by repeated elimination of points

    This paper presents a new approach to line generalisation which uses the concept of ‘effective area’ for progressive simplification of a line by point elimination. Two coastlines are used to compare the performance of this algorithm with that of the widely used Douglas-Peucker algorithm. The results from the area-based algorithm compare favourably with manual generalisation of the same lines. It is capable of achieving both imperceptible minimal simplifications and caricatural generalisations. By careful selection of cutoff values, it is possible to use the same algorithm for scale-dependent and scale-independent generalisations. More importantly, it offers scope for modelling cartographic lines as consisting of features within features so that their geometric manipulation may be modified by application- and/or user-defined rules and weights. The paper examines the merits and limitations of the algorithm and the opportunities it offers for further research and progress in the field of line generalisation. © 1993 Maney Publishing
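
    A minimal Python sketch of the effective-area idea (repeatedly dropping the point whose triangle with its two neighbours has the smallest area); the (x, y) tuple format, the min_area cutoff, and the naive O(n^2) scan are illustrative assumptions, not the paper's implementation:

        def triangle_area(a, b, c):
            # Area of the triangle formed by point b and its neighbours a and c
            # (the "effective area" of b).
            return abs((b[0] - a[0]) * (c[1] - a[1]) -
                       (c[0] - a[0]) * (b[1] - a[1])) / 2.0

        def simplify_by_effective_area(points, min_area):
            # Progressively eliminate the interior point with the smallest
            # effective area until every remaining point meets the cutoff.
            pts = list(points)
            while len(pts) > 2:
                areas = [triangle_area(pts[i - 1], pts[i], pts[i + 1])
                         for i in range(1, len(pts) - 1)]
                smallest = min(range(len(areas)), key=areas.__getitem__)
                if areas[smallest] >= min_area:
                    break
                del pts[smallest + 1]  # drop the least "effective" point
            return pts

        # A shallow bump is removed before the more prominent spike.
        print(simplify_by_effective_area(
            [(0, 0), (1, 0.1), (2, 0), (3, 2), (4, 0)], min_area=0.5))

    Raising the cutoff moves the output from near-imperceptible simplification towards caricatural generalisation, which is the scale-dependent behaviour described in the abstract.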

    Cartographic Algorithms: Problems of Implementation and Evaluation and the Impact of Digitising Errors

    Cartographic generalisation remains one of the outstanding challenges in digital cartography and Geographical Information Systems (GIS). It is generally assumed that computerisation will lead to the removal of spurious variability introduced by the subjective decisions of individual cartographers. This paper demonstrates through an in‐depth study of a line simplification algorithm that computerisation introduces its own sources of variability. The algorithm, referred to as the Douglas‐Peucker algorithm in cartographic literature, has been widely used in image processing, pattern recognition and GIS for some 20 years. An analysis of this algorithm and study of some implementations in wide use identify the presence of variability resulting from the subjective decisions of software implementors. Spurious variability in software complicates the processes of evaluation and comparison of alternative algorithms for cartographic tasks. No doubt, variability in implementation could be removed by rigorous study and specification of algorithms. Such future work must address the presence of digitising error in cartographic data. Our analysis suggests that it would be difficult to adapt the Douglas‐Peucker algorithm to cope with digitising error without altering the method. Copyright © 1991, Wiley Blackwell. All rights reserved.
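
    For reference, a hedged Python sketch of one common formulation of the recursive procedure discussed above: keep the interior point farthest from the chord joining the endpoints if its offset exceeds a tolerance, then recurse on the two halves. The point format and tolerance value are assumptions for illustration, not taken from any implementation examined in the paper:

        import math

        def offset_from_chord(p, start, end):
            # Perpendicular distance from p to the line through start and end.
            if start == end:
                return math.dist(p, start)
            (x1, y1), (x2, y2), (x0, y0) = start, end, p
            num = abs((y2 - y1) * x0 - (x2 - x1) * y0 + x2 * y1 - y2 * x1)
            return num / math.hypot(x2 - x1, y2 - y1)

        def douglas_peucker(points, tolerance):
            # Retain the farthest interior point if it lies further than the
            # tolerance from the current chord, then recurse on both halves.
            if len(points) < 3:
                return list(points)
            dists = [offset_from_chord(p, points[0], points[-1])
                     for p in points[1:-1]]
            idx = max(range(len(dists)), key=dists.__getitem__) + 1
            if dists[idx - 1] <= tolerance:
                return [points[0], points[-1]]
            left = douglas_peucker(points[:idx + 1], tolerance)
            right = douglas_peucker(points[idx:], tolerance)
            return left[:-1] + right

        print(douglas_peucker([(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 0)],
                              tolerance=0.5))

    Details such as tie-breaking, the interpretation of the tolerance, and the handling of coincident endpoints are the kind of implementor decisions that give rise to the variability the paper analyses.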

    Simplification and generalization of large scale data for roads: a comparison of two filtering algorithms

    This paper reports the results of an in-depth study which investigated two algorithms for line simplification and caricatural generalization (namely, those developed by Douglas and Peucker, and Visvalingam, respectively) in the context of a wider program of research on scale-free mapping. The use of large-scale data for man-designed objects, such as roads, has led to a better understanding of the properties of these algorithms and of their value within the spectrum of scale-free mapping. The Douglas-Peucker algorithm is better at minimal simplification. The large-scale data for roads makes it apparent that Visvalingam's technique is not only capable of removing entire scale-related features, but that it does so in a manner which preserves the shape of retained features. This technique offers some prospects for the construction of scale-free databases since it provides scope for achieving balanced generalizations of an entire map consisting of several complex lines. The results also suggest that it may be easier to formulate concepts and strategies for automatic segmentation of in-line features using large-scale road data and Visvalingam's algorithm. In addition, the abstraction of center lines may be facilitated by the inclusion of additional filtering rules with Visvalingam's algorithm.

    The Douglas-Peucker algorithm for line simplification: Re-evaluation through visualization

    The primary aim of this paper is to illustrate the value of visualization in cartography and to indicate that tools for the generation and manipulation of realistic images are of limited value within this application. This paper demonstrates the value of visualization within one problem in cartography, namely the generalisation of lines. It reports on the evaluation of the Douglas-Peucker algorithm for line simplification. Visualization of the simplification process and of the results suggests that the mathematical measures of performance proposed by some other researchers are inappropriate, misleading and questionable.

    The neutron method for measuring soil moisture content - a review

    The various methods of measuring the soil moisture content and its variations in space and time have been reviewed by many authors (Taylor, 1955; Marshall, 1959; Todd, 1960; Ballard and Gardner, 1965; Cope and Trickett, 1965). The moisture content is either measured directly, as in the gravimetric method, or it is estimated by determining its relationship to some other property of the soil, as in the electric resistance, tensiometer and neutron scattering methods. The neutron scattering method estimates the moisture content of the soil by measurement of its hydrogen content. This paper summarizes the theoretical and practical aspects of the method and provides a bibliography which includes references to papers published more recently than those provided by Sweeny (1962), Ballard and Gardner (1965), and the Commonwealth Bureau of Soils (1968).
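
    In practice the estimate is usually obtained through a site-specific calibration relating the probe's count rate to volumetric moisture content; a commonly used linear form (shown as a general illustration, not a formula taken from this review) is

        \[
            \theta_v = b_0 + b_1 \,\frac{R}{R_s},
        \]

    where \(\theta_v\) is the volumetric moisture content, \(R\) is the neutron count rate in the soil, \(R_s\) is the count rate in a reference standard, and \(b_0\), \(b_1\) are calibration coefficients determined against gravimetric samples.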

    Speeding up Simplification of Polygonal Curves using Nested Approximations

    We develop a multiresolution approach to the problem of polygonal curve approximation. We show theoretically and experimentally that, if the simplification algorithm A used between any two successive levels of resolution satisfies some conditions, the multiresolution algorithm MR will have a complexity lower than the complexity of A. In particular, we show that if A has O(N²/K) complexity (the complexity of a reduced-search dynamic programming solution), where N and K are respectively the initial and the final number of segments, the complexity of MR is in O(N). We experimentally compare the outcomes of MR with those of the optimal "full search" dynamic programming solution and of classical merge and split approaches. The experimental evaluations confirm the theoretical derivations and show that the proposed approach, evaluated on 2D coastal maps, either shows a lower complexity or provides polygonal approximations closer to the initial curves.
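
    To see why an O(N) bound is plausible, suppose (as an illustrative assumption about the level schedule, which is not stated in the abstract) that each level of the pyramid roughly halves the number of points, so level i simplifies N_i = N/2^i points down to N_{i+1} = N/2^{i+1}. With A costing on the order of N_i²/N_{i+1} per level, the total work is bounded by a geometric series:

        \[
            \sum_{i \ge 0} c\,\frac{N_i^{2}}{N_{i+1}}
            = \sum_{i \ge 0} c\,\frac{(N/2^{i})^{2}}{N/2^{i+1}}
            = 2cN \sum_{i \ge 0} 2^{-i}
            = 4cN = O(N).
        \]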

    Risk factors for non-communicable diseases at baseline and their short-term changes in a workplace cohort in Singapore

    Copyright © 2019 by the authors. We aimed to examine the behavioural and clinical risk factors for non-communicable diseases (NCDs) at baseline and their changes over 12 months in a workplace cohort in Singapore. A total of 464 full-time employees (age ≥ 21 years) were recruited from a variety of occupational settings, including offices, control rooms, and workshops. Of these, 424 (91.4%) were followed up at three months and 334 (72.0%) were followed up at 12 months. Standardized questionnaires were used to collect data on health behaviours, and clinical measurements were performed by trained staff using standard instruments and protocols. Age-adjusted changes in risk factors over time were examined using generalized estimating equations or linear mixed-effects models where appropriate. The mean age of the participants at baseline was 39.0 (SD: 11.4) years and 79.5% were men. Nearly a quarter (24.4%) were current smokers, slightly more than half (53.5%) were alcohol drinkers, two-thirds (66%) were consuming <5 servings of fruit and vegetables per day, and 23.1% were physically inactive. More than two-thirds (67%) were overweight or obese and 34.5% had central obesity. The mean follow-up was 8.6 months. After adjusting for age, over 12 months there was a significant increase in the proportion consuming <5 servings of fruit and vegetables per day (by 33%, p = 0.030), the proportion who were physically inactive (by 64%, p < 0.001), and the proportion who were overweight or obese (by 15%, p = 0.018). The burden of several key NCD risk factors at baseline was high, and some worsened within a short period of time in this working population. There is a need for more targeted strategies for behaviour change towards a healthy lifestyle as part of the ongoing health and wellness programs at workplaces in Singapore. Singapore Ministry of National Development and the National Research Foundation; Prime Minister’s Office under the Land and Liveability National Innovation Challenge (L2 NIC) Research Programme.

    Health Effects of Underground Workspaces cohort: study design and baseline characteristics

    The development of underground workspaces is a strategic effort towards healthy urban growth in cities with ever-increasing land scarcity. Despite the growth in underground workspaces, there is limited information regarding the impact of this environment on workers’ health. The Health Effects of Underground Workspaces (HEUW) study is a cohort study that was set up to examine the health effects of working in underground workspaces. In this paper, we describe the rationale for the study, the study design, data collection, and baseline characteristics of participants. The HEUW study recruited 464 participants at baseline, of whom 424 (91.4%) were followed up at 3 months and 334 (72.0%) at 12 months from baseline. We used standardized and validated questionnaires to collect information on socio-demographic and lifestyle characteristics, medical history, family history of chronic diseases, sleep quality, health-related quality of life, chronotype, psychological distress, occupational factors, and comfort levels with indoor environmental quality parameters. Clinical and anthropometric parameters including blood pressure, spirometry, height, weight, and waist and hip circumference were also measured. Biochemical tests of participants’ blood and urine samples were conducted to measure levels of glucose, lipids, and melatonin. We also conducted objective measurements of individuals’ workplace environment, assessing air quality, light intensity, temperature, thermal comfort, and bacterial and fungal counts. The findings of this study will help to identify modifiable lifestyle and environmental parameters that are negatively affecting workers’ health. The findings may be used to guide the development of more health-promoting workspaces that attempt to negate any potential deleterious health effects from working in underground workspaces. Singapore Ministry of National Development; National Research Foundation, Prime Minister’s Office, Singapore.
