810 research outputs found

    Tree Species Composition and Forest Stratification along the Gradients in the Dry Deciduous Forests of Godavari Valley, Telangana, India

    It is important to understand tree species composition, abundance, species diversity and stratification in tropical dry deciduous forests that are under threat. A quadrat study was carried out in the dry deciduous forests along the ecological gradients in the Godavari Valley of northern Telangana, India. The study recorded 110 flowering plant taxa belonging to 82 genera and 37 families in 120 sampled plots, in which 15,192 individuals of ≥10 cm girth at breast height were enumerated. Tectona grandis (teak) is the principal component of the forest cover in the region and often formed pure stands in Adilabad and, to some extent, in Nizamabad districts. Further down the valley, in Warangal district, teak was gradually replaced by Terminalia alata. Twenty tree species were dominant at one location or another, and the top 10 dominant taxa accounted for nearly 41% of the total density of the forest cover. Tree relative density ranged from 0.007% to 20.84%, and Importance Value Index values ranged from 0.245 (12 spp., including some exotics) to 32.6 (teak). These baseline data provide a basis for detecting change along the gradients in the tropical forest ecosystem of a major river valley in the region and for identifying the drivers of change.
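
    The Importance Value Index cited above is conventionally computed as the sum of a species' relative density, relative frequency, and relative dominance. A minimal sketch of that standard calculation follows; the plot records are invented for illustration and are not data from this study.

        # Sketch of the conventional Importance Value Index (IVI) calculation:
        # IVI = relative density + relative frequency + relative dominance.
        # The records below are illustrative, not data from the study.
        from collections import defaultdict

        # Each record: (plot_id, species, basal_area_m2) for one stem >= 10 cm GBH.
        records = [
            (1, "Tectona grandis", 0.030), (1, "Terminalia alata", 0.022),
            (2, "Tectona grandis", 0.041), (2, "Anogeissus latifolia", 0.015),
            (3, "Terminalia alata", 0.028), (3, "Tectona grandis", 0.036),
        ]

        stems = defaultdict(int)      # stems per species
        plots = defaultdict(set)      # plots in which each species occurs
        basal = defaultdict(float)    # total basal area per species
        for plot_id, sp, ba in records:
            stems[sp] += 1
            plots[sp].add(plot_id)
            basal[sp] += ba

        total_stems = sum(stems.values())
        total_freq = sum(len(p) for p in plots.values())
        total_basal = sum(basal.values())

        for sp in stems:
            rel_density = 100 * stems[sp] / total_stems
            rel_frequency = 100 * len(plots[sp]) / total_freq
            rel_dominance = 100 * basal[sp] / total_basal
            ivi = rel_density + rel_frequency + rel_dominance
            print(f"{sp}: IVI = {ivi:.1f}")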

    Purging of untrustworthy recommendations from a grid

    In grid computing, trust is of great significance, and a considerable body of research has proposed models for trusted resource-sharing mechanisms. Trust is a belief or perception that various researchers have tried to capture with computational models. Trust in an entity can be direct or indirect. Direct trust arises either from a first impression of the entity or from direct interaction with it. Indirect trust arises from reputation or from recommendations received from recommenders within a particular domain of the grid, from other domains in the grid, or from outside the grid altogether. Unfortunately, malicious indirect trust leads to the misuse of valuable grid resources. This paper proposes a mechanism for identifying and purging untrustworthy recommendations in a grid environment, and the results demonstrate how untrustworthy entities can be purged. Comment: 8 pages, 4 figures, 1 table; published in the International Journal of Next-Generation Networks (IJNGN), Vol. 3, No. 4, December 201
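
    The abstract does not spell out the purging rule, so the following is only a generic sketch of the idea: recommendations that deviate from the median of all received recommendations by more than an assumed threshold are treated as untrustworthy and discarded. The recommender names, trust values, and threshold are illustrative assumptions, not the authors' model.

        # Illustrative (not the paper's) filter: drop recommendations that deviate
        # from the median recommendation by more than a fixed threshold.
        from statistics import median

        def purge_untrustworthy(recommendations, threshold=0.3):
            """recommendations: dict mapping recommender id -> trust value in [0, 1]."""
            mid = median(recommendations.values())
            kept, purged = {}, {}
            for recommender, value in recommendations.items():
                (kept if abs(value - mid) <= threshold else purged)[recommender] = value
            return kept, purged

        # Hypothetical recommendations about one grid resource.
        recs = {"nodeA": 0.82, "nodeB": 0.79, "nodeC": 0.15, "nodeD": 0.88}
        kept, purged = purge_untrustworthy(recs)
        print("kept:", kept)        # nodeA, nodeB, nodeD
        print("purged:", purged)    # nodeC, an outlying recommendation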

    Characterization of sleep EEG

    The physiological relationship between the various components of sleep and their variation due to drug administration is one of the primary tools for analyzing a drug's effect, and a number of studies have been performed in this direction in recent years. The electroencephalogram (EEG) has been characterized with variables ranging from measurements of the duration of different sleep stages to the activities that define the stages themselves. Advances in computer hardware and software have improved data acquisition and storage, but analysis of long stretches of data has always been a problem in terms of time and storage. The present study aims to characterize sleep data from a subject suffering from a neurologic disorder and to identify the effect of oxycodone, a narcotic drug, on the subject during sleep. The data analyzed are a whole-night recording of about six hours. The EEG signals are analyzed using random-data analysis procedures, with stationarity assumed as the basis of the analysis; the possibility that long stretches of data introduce nonstationarity is, however, not ruled out. The analysis will be performed using the fast Fourier transform, with spectral analysis as the primary tool for identifying the activity of various frequency components and its variation with time. Three groups of parameters will be considered: mean-square values and correlation functions in the time domain, spectral analysis in the frequency domain, and probability density and distribution functions in the amplitude domain. Algorithms will be developed for computing these parameters and other statistical properties.
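
    As one concrete reading of the spectral-analysis step, the sketch below estimates a power spectral density with Welch's FFT-based method and integrates it over conventional EEG bands. The synthetic signal, sampling rate, and band edges are assumptions for illustration; none of this is taken from the recording described above.

        # Welch/FFT spectral analysis of a synthetic EEG-like signal (illustrative only).
        import numpy as np
        from scipy.signal import welch
        from scipy.integrate import trapezoid

        fs = 256                          # assumed sampling rate, Hz
        t = np.arange(0, 30, 1 / fs)      # 30 s of signal
        # Synthetic signal: 2 Hz (delta) and 10 Hz (alpha) components plus noise.
        x = (40 * np.sin(2 * np.pi * 2 * t)
             + 15 * np.sin(2 * np.pi * 10 * t)
             + 5 * np.random.randn(t.size))

        f, psd = welch(x, fs=fs, nperseg=4 * fs)   # 4 s windows

        bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
        for name, (lo, hi) in bands.items():
            mask = (f >= lo) & (f < hi)
            power = trapezoid(psd[mask], f[mask])  # band power via trapezoidal rule
            print(f"{name:5s}: {power:8.1f} uV^2")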

    AFIS Based Likelihood Ratios for Latent Fingerprint Comparisons

    Latent fingerprints are among the most common pieces of evidence found at a crime scene and represent accidental or unintentional prints collected as part of a criminal investigation. They are deposited when friction ridge skin comes into contact with a surface and typically require chemical processing to become visible to the naked eye. The comparison and identification of fingerprints depend on factors such as substrate quality, surface, duration, environmental conditions, and examiner experience. These factors can reduce clarity or content and can even cause distortions relative to a fingerprint taken under controlled conditions. Since the release of the National Academy of Sciences (NAS) report in 2009, the field of fingerprint analysis has come under much scrutiny; in particular, the need for more research into the accuracy and reliability of identifications made by fingerprint examiners has been raised. One method for comparing a latent fingerprint to known prints is an Automated Fingerprint Identification System (AFIS). The AFIS used in this research was the AFIX Tracker®, in which five variables were assessed (match score, match minutiae, match status, delta match score, and marked minutiae) to determine which variable(s) best indicated a true match. Bayesian networks were then constructed to compute likelihood ratios and to evaluate the dependency of the variables on one another, and the performance of the likelihood ratios in determining the identity of the unknown latent was assessed using Tippett and empirical cross entropy (ECE) plots. Receiver operating characteristic (ROC) curves and Bayesian networks were constructed to perform statistical analysis of the matches obtained while comparing a latent print to a ten-print card, and a combination of Tippett and ECE plots was used to assess the performance of the AFIX Tracker® in classifying unknown prints. A match minutiae count of 15 or higher resulted in a 100% true-match result, whereas no more than 13 match minutiae were found for the non-matches. Moreover, the delta match scores differed notably between matches and non-matches (0.1-153 for matches versus 0-0.1 for non-matches). Overall, it was determined that approximately 87% of the time a randomly selected known match would have a higher number of match minutiae than a non-match.
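
    The reported ~87% figure is, in effect, an area under an ROC curve: the probability that a randomly chosen true match scores higher than a randomly chosen non-match. Below is a minimal sketch of that kind of evaluation on invented match-minutiae counts rather than the study's data, with a crude likelihood-ratio estimate added; the Bayesian-network modelling itself is not reproduced.

        # ROC/AUC sketch on invented match-minutiae counts (not the study's data).
        import numpy as np
        from sklearn.metrics import roc_auc_score

        # 1 = same-source (true match) comparison, 0 = different-source comparison.
        labels = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0])
        match_minutiae = np.array([22, 17, 15, 19, 14, 9, 16, 7, 13, 10])

        auc = roc_auc_score(labels, match_minutiae)
        print(f"AUC = {auc:.2f}")   # probability a match outranks a non-match

        # A crude likelihood ratio at a given score, from the two score distributions.
        def simple_lr(score, scores, labels, bandwidth=3.0):
            same = scores[labels == 1]
            diff = scores[labels == 0]
            p_same = np.mean(np.abs(same - score) <= bandwidth)
            p_diff = np.mean(np.abs(diff - score) <= bandwidth)
            return p_same / max(p_diff, 1e-6)

        print("LR at 15 minutiae:", round(simple_lr(15, match_minutiae, labels), 1))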