Cartographic Algorithms: Problems of Implementation and Evaluation and the Impact of Digitising Errors
Cartographic generalisation remains one of the outstanding challenges in digital cartography and Geographical Information Systems (GIS). It is generally assumed that computerisation will lead to the removal of spurious variability introduced by the subjective decisions of individual cartographers. This paper demonstrates through an in-depth study of a line simplification algorithm that computerisation introduces its own sources of variability. The algorithm, referred to as the Douglas-Peucker algorithm in cartographic literature, has been widely used in image processing, pattern recognition and GIS for some 20 years. An analysis of this algorithm and study of some implementations in wide use identify the presence of variability resulting from the subjective decisions of software implementors. Spurious variability in software complicates the processes of evaluation and comparison of alternative algorithms for cartographic tasks. No doubt, variability in implementation could be removed by rigorous study and specification of algorithms. Such future work must address the presence of digitising error in cartographic data. Our analysis suggests that it would be difficult to adapt the Douglas-Peucker algorithm to cope with digitising error without altering the method. Copyright © 1991, Wiley Blackwell. All rights reserved.
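The abstract does not specify any particular implementation; as background, the textbook form of the Douglas-Peucker procedure it analyses can be sketched as a recursive farthest-point test against a distance tolerance (variable names and the tolerance semantics here are illustrative, not taken from the paper):

```python
import math

def perpendicular_distance(p, a, b):
    # Distance from point p to the line through segment endpoints a and b.
    (x, y), (x1, y1), (x2, y2) = p, a, b
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, tolerance):
    # Keep the interior point farthest from the chord between the first and
    # last points whenever it exceeds the tolerance; otherwise collapse the
    # whole run to its two endpoints. Recurse on the two halves.
    if len(points) < 3:
        return list(points)
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax > tolerance:
        left = douglas_peucker(points[:index + 1], tolerance)
        right = douglas_peucker(points[index:], tolerance)
        return left[:-1] + right  # drop duplicated split point
    return [points[0], points[-1]]
```

Even in this short sketch, implementors face the kinds of subjective choices the paper discusses, such as whether the tolerance test is strict or inclusive and how the split point is shared between recursive calls.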
Line generalisation by repeated elimination of points
This paper presents a new approach to line generalisation which uses the concept of "effective area" for progressive simplification of a line by point elimination. Two coastlines are used to compare its performance with that of the widely used Douglas-Peucker algorithm. The results from the area-based algorithm compare favourably with manual generalisation of the same lines. It is capable of achieving both imperceptible minimal simplifications and caricatural generalisations. By careful selection of cutoff values, it is possible to use the same algorithm for scale-dependent and scale-independent generalisations. More importantly, it offers scope for modelling cartographic lines as consisting of features within features so that their geometric manipulation may be modified by application- and/or user-defined rules and weights. The paper examines the merits and limitations of the algorithm and the opportunities it offers for further research and progress in the field of line generalisation. © 1993 Maney Publishing
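The core idea (drop the point whose triangle with its neighbours has the smallest area, repeat until every remaining triangle exceeds a cutoff) can be sketched as follows. This is a deliberately naive O(n²) illustration; the function and parameter names are ours, and an efficient implementation would recompute only the two neighbouring areas using a priority queue:

```python
def effective_area(a, b, c):
    # Area of the triangle formed by point b and its neighbours a and c.
    return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def visvalingam(points, min_area):
    # Repeatedly eliminate the interior point with the smallest effective
    # area until every remaining point's triangle reaches the cutoff.
    pts = list(points)
    while len(pts) > 2:
        areas = [effective_area(pts[i - 1], pts[i], pts[i + 1])
                 for i in range(1, len(pts) - 1)]
        smallest = min(range(len(areas)), key=areas.__getitem__)
        if areas[smallest] >= min_area:
            break
        del pts[smallest + 1]
    return pts
```

Varying `min_area` gives the spectrum the abstract describes, from imperceptible minimal simplification (tiny cutoff) to caricatural generalisation (large cutoff).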
LIFE ETHICS EVINCED IN TAMIL LITERATURE
A collection of scholarly articles on Tamil literature.
INTERNATIONAL JOURNAL OF TAMIL LANGUAGE AND LITERARY STUDIES (IJTLLS)
The articles are about noble ideas and thoughts evinced in Tamil Literature
INTERNATIONAL JOURNAL OF TAMIL LANGUAGE AND LITERARY STUDIES
This is the collection of articles in Vol. 2, Issue 1 of the journal.
International Journal of Tamil Language and Literary Studies
The conference proceedings bring out the life ethics evinced in Tamil literature.
A Trapezoidal Fuzzy Membership Genetic Algorithm (TFMGA) for Energy and Network Lifetime Maximization under Coverage Constrained Problems in Heterogeneous Wireless Sensor Networks
Maximizing the network lifetime of Heterogeneous Wireless Sensor Networks (HWSNs) is a difficult problem. Although many methods have been developed in recent work to maximize network lifetime, the energy efficiency of sensor nodes in HWSNs remains a very difficult issue, and the target coverage problem has also become increasingly important and difficult. In this paper, a new Markov Chain Monte Carlo (MCMC) method is introduced to address the energy efficiency of sensor nodes in HWSNs. Initially, a graph model is constructed to represent the HWSN, with each vertex representing the assignment of a sensor node to a subset. At the same time, a Trapezoidal Fuzzy Membership Genetic Algorithm (TFMGA) is proposed to maximize the number of Disjoint Connected Covers (DCC) and K-Coverage (KC), referred to as TFMGA-MDCCKC. Based on gene and chromosome information from the TFMGA, the gene seeks an optimal path on the construction graph that maximizes the MDCCKC objective; the gene thus focuses on finding one more connected cover while avoiding the creation of unnecessary subsets. A local search procedure is designed for the TFMGA to increase search efficiency. The proposed TFMGA-MDCCKC approach has been applied to a variety of HWSNs. The results show that the approach is efficient and successful in finding optimal results for maximizing the lifetime of HWSNs. Experimental results show that the proposed TFMGA-MDCCKC approach performs better than the Bacteria Foraging Optimization (BFO) and Ant Colony Optimization (ACO) based approaches, and its performance is close to that of the energy-conserving strategy.
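The abstract does not define its fuzzy component precisely, but the standard trapezoidal membership function that gives the TFMGA its name maps a value to a degree of membership in [0, 1] via four breakpoints. A minimal sketch, with illustrative parameter names (a, b, c, d are the feet and shoulders of the trapezoid, a ≤ b ≤ c ≤ d):

```python
def trapezoidal_membership(x, a, b, c, d):
    # Standard trapezoidal membership function: 0 outside [a, d],
    # rising linearly on [a, b], 1 on the plateau [b, c],
    # falling linearly on [c, d].
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)
```

In a genetic algorithm such a function would typically grade quantities like residual node energy or coverage quality before they enter the fitness evaluation; how the paper wires it into fitness is not stated in the abstract.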
Image Steganography using Hybrid Edge Detector and Ridgelet Transform
Steganography is the art of hiding highly sensitive information in digital images, text, video, and audio. In this paper, the authors propose a frequency-domain steganography method operating in the Ridgelet transform domain, exploiting the transform's ability to represent digital images with straight edges. In the embedding phase, the proposed hybrid edge detector acts as a preprocessing step to obtain the edge image from the cover image; the edge image is then partitioned into several blocks so as to operate on straight edges, and the Ridgelet transform is applied to each block. The most significant gradient vectors (significant edges) are then selected to embed the secret data. Because the secret data is hidden in the significant gradient vectors, the imperceptibility of the stego image is increased, and the hybrid edge detector increases the embedding capacity. Experimental results demonstrate that the peak signal-to-noise ratio (PSNR) of the stego image versus the cover image is guaranteed to be above 49 dB, much higher than that of all data hiding techniques reported in the literature.

Defence Science Journal, Vol. 65, No. 3, May 2015, pp. 214-219, DOI: http://dx.doi.org/10.14429/dsj.65.787
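The PSNR figure quoted above is the standard imperceptibility metric, computed from the mean squared error between cover and stego images. A minimal sketch of that calculation (images given as flat pixel sequences; this is the generic definition, not code from the paper):

```python
import math

def psnr(cover, stego, max_value=255.0):
    # Peak signal-to-noise ratio in dB between two equally sized images.
    # Higher PSNR means the stego image is closer to the cover image.
    mse = sum((c - s) ** 2 for c, s in zip(cover, stego)) / len(cover)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_value ** 2 / mse)
```

For 8-bit images, a uniform error of one intensity level already gives about 48 dB, so the reported guarantee of above 49 dB corresponds to sub-level average distortion.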
In-medium dependence and Coulomb effects of the pion production in heavy ion collisions
The properties of the high energy pions observed in heavy ion collisions, in particular in the system Au on Au at 1 GeV/nucleon, are investigated. The reaction dynamics is described within the Quantum Molecular Dynamics (QMD) approach. It is shown that high energy pions freeze out early and originate from the hot, compressed matter. Δ-resonances are found to give an important contribution to the high energy tail of the pion spectrum. Further, the role of in-medium effects in the description of charged pion yields and spectra is investigated using a microscopic potential derived from the Brueckner G-matrix, which is obtained with the Reid soft-core potential. It is seen that the high energy part of the spectra is relatively more suppressed by in-medium effects than the low energy part. A comparison to experiments further demonstrates that the present calculations describe reasonably well the neutral (TAPS) and charged (FOPI) pion spectra. The observed energy dependence of the π⁻/π⁺ ratio, i.e. deviations from the isobar model prediction, is due to Coulomb effects and again indicates that high energy pions probe the hot and dense phase of the reaction. These findings are confirmed independently by a simple phase space analysis.

Comment: 28 pages LaTeX, prepared with elsevier-style, 13 PS-figures