23 research outputs found

    Experiments to Distribute Map Generalization Processes

    Extended version published as hal-02155541. International audience. Automatic map generalization requires computationally intensive processes that are often unable to deal with large datasets. Distributing the generalization process is the only way to make it scalable and usable in practice. But map generalization is a highly contextual process: the surroundings of a map feature need to be known in order to generalize it, which is a problem because distribution may partition the dataset and parallelize the processing of each part. This paper proposes experiments to evaluate past propositions to distribute map generalization and to identify the main remaining issues. Those propositions are first discussed, and then the experiment hypotheses and apparatus are described. The experiments confirmed that regular partitioning was the quickest strategy, but also the least effective at taking context into account. Geographical partitioning, though less effective for now, is quite promising regarding the quality of the results, as it better integrates the geographical context.
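
    The contrast drawn in this abstract, regular partitioning that is fast but context-blind versus strategies that preserve a feature's surroundings, can be illustrated with a small sketch. This is not the paper's implementation; the grid size, buffer, and function names are invented for illustration. The idea shown is that enlarging each cell by a buffer duplicates near-border features into neighbouring partitions, so each worker still sees the local context.

    ```python
    # Illustrative sketch (not the paper's code): regular-grid partitioning of
    # point features for parallel generalization. Each cell is enlarged by a
    # buffer so that a feature's immediate surroundings travel with its partition.

    from collections import defaultdict

    def partition_with_buffer(features, cell_size, buffer):
        """Assign (x, y) features to grid cells; a feature lands in every
        cell whose buffered extent contains it, so context is duplicated
        rather than lost at cell borders."""
        cells = defaultdict(list)
        for x, y in features:
            # candidate cells: the home cell plus any neighbour within `buffer`
            for cx in range(int((x - buffer) // cell_size),
                            int((x + buffer) // cell_size) + 1):
                for cy in range(int((y - buffer) // cell_size),
                                int((y + buffer) // cell_size) + 1):
                    cells[(cx, cy)].append((x, y))
        return dict(cells)

    features = [(0.5, 0.5), (0.99, 0.5), (1.01, 0.5), (2.5, 2.5)]
    parts = partition_with_buffer(features, cell_size=1.0, buffer=0.05)
    # the two near-border features fall into both cell (0, 0) and cell (1, 0),
    # so either partition can generalize them with full local context
    ```

    The duplication is the price of context: the geographical partitioning the paper favours aims to cut along natural boundaries instead, so fewer features need to be shared between partitions.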

    Modelling geographic phenomena at multiple levels of detail : a model generalisation approach based on aggregation

    Considerable interest remains in capturing geographical information once at the fine scale and, from this, automatically deriving information at various levels of detail and scale via the process of map generalisation. This research aims to develop a methodology for transforming geographic phenomena at a high level of detail directly into geographic phenomena at higher levels of abstraction. Intuitive and meaningful interpretation of geographical phenomena requires their representation at multiple levels of detail, owing to the scale-dependent nature of their properties. Prior to the cartographic portrayal of that information, model generalisation is required in order to derive the higher-order phenomena typically associated with the smaller scales. This research presents a model generalisation approach able to support the derivation of phenomena typically present at 1:250,000 scale mapping directly from a large-scale topographic database (1:1250/1:2500/1:10,000). Such a transformation involves the creation of higher-order or composite objects, such as settlements, forests, hills and ranges, from lower-order or component objects, such as buildings, trees, streets, and vegetation, in the source database. In order to perform this transformation it is important to model the meaning of, and relationships among, source database objects rather than to consider objects in terms of their geometric primitives (points, lines and polygons). This research focuses on two types of relationships: taxonomic and partonomic. These relationships provide different but complementary strategies for transforming source database objects into the required target database objects. The proposed methodology highlights the importance of partonomic relations for transforming spatial databases over large changes in level of detail. The proposed approach involves identifying these relationships and then utilising them to create higher-order objects.
The utility of the results obtained via the proposed methodology is demonstrated using spatial analysis techniques and the creation of 'links' between objects at different representations, as needed for multiple-representation databases. The output database can then act as input to cartographic generalisation in order to create maps (digital or paper). The results are evaluated against manually generalised datasets. EThOS - Electronic Theses Online Service, United Kingdom.
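The partonomic strategy this thesis emphasises, building higher-order objects such as settlements out of component objects such as buildings, can be sketched very crudely as proximity-based aggregation. The function name, the threshold, and the single-link clustering used here are illustrative assumptions, not the thesis' actual parameters or method.

```python
# Hedged sketch of the partonomic idea: component objects (here, building
# centroids) are aggregated into a higher-order "settlement" object when
# they lie within a proximity threshold of one another. Single-link
# clustering via union-find; all values are invented for illustration.

from math import hypot

def aggregate_settlements(buildings, threshold):
    """Buildings closer than `threshold` belong to the same settlement
    (a simple part-of relation built from proximity alone)."""
    parent = list(range(len(buildings)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i, (xi, yi) in enumerate(buildings):
        for j in range(i + 1, len(buildings)):
            xj, yj = buildings[j]
            if hypot(xi - xj, yi - yj) <= threshold:
                parent[find(j)] = find(i)

    settlements = {}
    for i, b in enumerate(buildings):
        settlements.setdefault(find(i), []).append(b)
    return list(settlements.values())

buildings = [(0, 0), (10, 0), (12, 3), (100, 100)]
settlements = aggregate_settlements(buildings, threshold=15)
# the first three buildings merge into one settlement; the distant one stays alone
```

A real model-generalisation pipeline would of course also consult the taxonomic relations the thesis describes (what kind of object each component is), not just geometric proximity.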

    Experiments to Distribute and Parallelize Map Generalization Processes

    Automatic map generalization requires computationally intensive processes that are often unable to deal with large datasets. Distributing the generalization process is the only way to make it scalable and usable in practice. But map generalization is a highly contextual process: the surroundings of a map feature need to be known in order to generalize it, which is a problem because distribution may partition the dataset and parallelize the processing of each part. This paper proposes experiments to evaluate past propositions to distribute map generalization and to identify the main remaining issues. Those propositions are first discussed, and then the experiment hypotheses and apparatus are described. The experiments confirmed that regular partitioning was the quickest strategy, but less effective when taking context into account. Geographical partitioning, though less effective for now, is quite promising regarding the quality of the results, as it better integrates the geographical context.

    A functional perspective on map generalisation

    In the context of map generalisation, the ambition is to store once, and then maintain, a very detailed geographic database. Using a mix of modelling and cartographic generalisation techniques, the intention is to derive map products at varying levels of detail, from the fine scale to the highly synoptic. We argue that in modelling this process, it is highly advantageous to take a 'functional perspective' on map generalisation rather than a geometric one. In other words, to model the function as it manifests itself in the shapes and patterns of distribution of the phenomena being mapped, whether they be hospitals, airports, or cities. By modelling the functional composition of such features we can create relationships (partonomic, taxonomic and topological) that lend themselves directly to modelling, to analysis and, most importantly, to the process of generalisation. Borrowing from ideas in robotic vision, this paper presents an approach for the automatic identification of functional sites (a collection of topographic features that perform a collective function) and demonstrates their utility in multi-scale representation and generalisation.
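    The notion of a functional site, a set of topographic features acting as one unit, can be rendered as a toy grouping test. Everything here is invented for illustration (the feature names, extents, and function tags); the paper's actual detection draws on much richer evidence than a bounding box and a shared tag.

    ```python
    # A toy rendering of the "functional site" idea: topographic features
    # tagged with a function are grouped into a site when they fall inside a
    # candidate site extent. All names and coordinates are hypothetical.

    def detect_site(features, extent, function):
        """Return the names of features inside `extent` (xmin, ymin, xmax, ymax)
        that share `function` -- a crude stand-in for collective-function tests."""
        xmin, ymin, xmax, ymax = extent
        return [name for name, (x, y), f in features
                if f == function and xmin <= x <= xmax and ymin <= y <= ymax]

    features = [
        ("ward_block",  (2, 2), "hospital"),
        ("helipad",     (3, 4), "hospital"),
        ("car_park",    (4, 1), "hospital"),
        ("corner_shop", (9, 9), "retail"),
    ]
    site = detect_site(features, extent=(0, 0, 5, 5), function="hospital")
    # → ['ward_block', 'helipad', 'car_park']
    ```

    Once such a site is identified, generalisation can treat it as a single higher-order object, collapsing the hospital's component buildings into one symbol at small scales.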