21 research outputs found

    Agonistic behavior of captive saltwater crocodile, Crocodylus porosus in Kota Tinggi, Johor

    Agonistic behavior in Crocodylus porosus is well known in the wild, but data on this behavior among captive individuals, especially in farm settings, are limited. Studying the aggressive behavior of C. porosus in captivity is important because the resulting data may contribute to conservation and to the safety of handlers and visitors. This study therefore focused on captive C. porosus, systematically describing its agonistic behaviour in relation to feeding time, time of day and stocking density per pool. The study was carried out over 35 days in two ponds. The data were analysed using Pearson's chi-square test to examine the relationship between categorical factors. C. porosus was more aggressive during daylight, at both feeding and non-feeding times, in the breeding enclosure (Pond C, stocking density = 0.0369 crocodiles/m2) compared with the non-breeding pond (Pond B, stocking density = 0.3317 crocodiles/m2), where aggression occurred only at night. Pond C showed the highest aggression scores at both feeding and non-feeding times, which is consistent with its function as a breeding ground. Chi-square analysis showed no significant difference between ponds (χ2 = 2.541, df = 3, p = 0.47); thus, there is no relationship between the categorical factors. The aggressive behaviour of C. porosus is important for farm management in evaluating future risks in translocation and, more generally, in the conservation of C. porosus.
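    The reported p-value can be recovered from the test statistic alone. A minimal sketch (not the study's analysis code) that recomputes p from χ² = 2.541 at df = 3, using the closed-form chi-square survival function available for three degrees of freedom:

```python
import math

def chi2_sf_df3(x):
    """Survival function P(X > x) for a chi-square variable with 3 degrees
    of freedom, using the closed form available for odd df:
    sf(x) = erfc(sqrt(x/2)) + sqrt(2x/pi) * exp(-x/2)."""
    return math.erfc(math.sqrt(x / 2)) + math.sqrt(2 * x / math.pi) * math.exp(-x / 2)

# Test statistic reported in the abstract: chi-square = 2.541, df = 3
p_value = chi2_sf_df3(2.541)
print(round(p_value, 2))  # 0.47, matching the reported p = 0.47
```

    Since p = 0.47 is far above the conventional 0.05 threshold, the null hypothesis of independence between pond and aggression category is not rejected, as the abstract states.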

    Pseudorandom Number Generation in Smart Cards: An Implementation, Performance and Randomness Analysis

    Smart cards rely on pseudorandom number generators to provide uniqueness and freshness in their cryptographic services, i.e. encryption and digital signatures. Their implementations are kept proprietary by smart card manufacturers in order to remain competitive. In this paper we look at how these generators are implemented in general purpose computers, and how the architecture of such generators can be modified to suit the smart card environment. Six variations of this modified model were implemented in Java Card, along with an analysis of their performance and randomness. The randomness of the implemented algorithms was analysed using the NIST statistical test suite. Finally, an overall analysis is provided that can help smart card designers make informed decisions when implementing pseudorandom number generators.
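    The NIST statistical test suite mentioned above comprises fifteen tests; the simplest is the frequency (monobit) test, which checks that ones and zeros are roughly balanced. A minimal sketch of that single test (not the full suite, and not the paper's implementation):

```python
import math

def monobit_p_value(bits):
    """NIST SP 800-22 frequency (monobit) test: map bits to +/-1, sum them,
    and compute p = erfc(|S_n| / sqrt(2n)). For a single sequence the usual
    pass criterion is p >= 0.01."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(2 * n))

# A perfectly balanced sequence passes with p = 1.0;
# a constant sequence fails with p near 0.
balanced = [0, 1] * 500
constant = [1] * 1000
print(monobit_p_value(balanced))  # 1.0
print(monobit_p_value(constant) < 0.01)  # True (fails the test)
```

    Passing the monobit test is necessary but far from sufficient; the remaining NIST tests (runs, block frequency, approximate entropy, etc.) probe structure that a biased-but-balanced generator would still exhibit.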

    Data analytics 2016: proceedings of the fifth international conference on data analytics


    Addressing the new generation of spam (Spam 2.0) through Web usage models

    New Internet collaborative media introduce new ways of communicating that are not immune to abuse. A fake eye-catching profile on a social networking website, a promotional review, a response to a thread in an online forum with unsolicited content, or a manipulated Wiki page are examples of the new generation of spam on the web, referred to as Web 2.0 Spam or Spam 2.0. Spam 2.0 is defined as the propagation of unsolicited, anonymous, mass content to infiltrate legitimate Web 2.0 applications. The current literature does not address Spam 2.0 in depth, and the outcome of efforts to date is inadequate. The aim of this research is to formalise a definition of Spam 2.0 and to provide Spam 2.0 filtering solutions. Early detection, extendibility, robustness and adaptability are key factors in the design of the proposed method. This dissertation provides a comprehensive survey of state-of-the-art web spam and Spam 2.0 filtering methods to highlight the unresolved issues and open problems, while at the same time effectively capturing the knowledge in the domain of spam filtering. It proposes three solutions in the area of Spam 2.0 filtering: (1) characterising and profiling Spam 2.0, (2) an Early-Detection based Spam 2.0 Filtering (EDSF) approach, and (3) an On-the-Fly Spam 2.0 Filtering (OFSF) approach. All the proposed solutions are tested against real-world datasets and their performance is compared with that of existing Spam 2.0 filtering methods. This work has coined the term 'Spam 2.0', provided insight into the nature of Spam 2.0, and proposed filtering mechanisms to address this new and rapidly evolving problem.
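    Web-usage-based filtering of the kind described rests on the observation that automated spam bots navigate differently from humans. A hypothetical rule-based sketch of this idea, where the feature names and thresholds are invented for illustration and are not the dissertation's actual EDSF/OFSF models:

```python
# Hypothetical illustration of usage-based spam bot detection.
# Feature names and thresholds are invented for this sketch.

def looks_automated(session):
    """Flag a session whose usage pattern deviates from typical human
    browsing: near-zero time on page, no pointer activity, and a form
    submitted faster than a human could plausibly type it."""
    rules = [
        session["avg_seconds_per_page"] < 1.0,   # pages skimmed too fast
        session["mouse_events"] == 0,            # no pointer activity
        session["form_fill_seconds"] < 2.0,      # instant form submission
    ]
    return sum(rules) >= 2  # flag when a majority of rules trigger

human = {"avg_seconds_per_page": 14.2, "mouse_events": 310, "form_fill_seconds": 25.0}
bot = {"avg_seconds_per_page": 0.3, "mouse_events": 0, "form_fill_seconds": 0.4}
print(looks_automated(human))  # False
print(looks_automated(bot))    # True
```

    A content-blind approach like this is what enables early detection: the session can be flagged before the spam content is ever posted.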

    EEG Biometrics During Sleep and Wakefulness: Performance Optimization and Security Implications

    The Internet of Things and Big Data have a wide variety of application domains. In healthcare they have the potential to give rise to remote diagnostics and real-time monitoring. Health sensors and telemedicine applications promise to provide an economic and efficient way to decentralize hospitals by easing their load. In this type of system, physical presence is not verified, which introduces a risk of identity fraud. Therefore, a patient's identity needs to be confirmed before any medical or financial decision is made based on the monitored data. Traditional identification/authentication methods, such as passwords, can be given to someone else, and trait-based biometrics, such as fingerprints, may not cover the entire treatment and can lead to unauthorized post-identification/authentication use. An emerging body of research proposes the use of EEG, as it exhibits unique patterns that are difficult to emulate and useful for distinguishing subjects. However, certain drawbacks need to be overcome to make the adoption of EEG biometrics possible in real-life scenarios: 1) the number of electrodes, 2) continuous identification/authentication during different brain stimuli, and 3) enrollment and identification/authentication duration. To address these shortcomings and their possible solutions, a machine learning perspective was applied. Firstly, a full-night raw database of 38 subjects in wakefulness (AWA) and sleep stages (REM, S1, S2, SWS) was used. The recordings consist of 19 scalp EEG electrodes. Signal pre-processing techniques were applied to remove noise and extract 20 features in the frequency domain. Two additional datasets were created: SX (all sleep stages) and ALL (wakefulness + all sleep stages), bringing to 7 the number of datasets analysed in this thesis. Furthermore, in order to test identification/authentication capabilities, all these datasets were split into Legitimate and Intruder sets. To determine which subjects would belong to the Legitimate set, a 90-10% cross-validation ratio was evaluated with different combinations of numbers of subjects. In the end, a balance between the number of subjects and algorithm performance was found with 21 subjects having more than 44 epochs in each stage; the remaining 16 subjects belong to the Intruder set. In addition, a hold-out set (4 randomly removed epochs from each subject in the Legitimate set) was produced to evaluate results on data never used during training.
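    The Legitimate/Intruder partition and per-subject hold-out described above can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the thesis code: the epoch-count threshold, cap of 21 legitimate subjects, and 4-epoch hold-out come from the abstract, while the toy data shapes are invented.

```python
import random

def split_subjects(subject_epochs, min_epochs=44, legit_count=21,
                   holdout_per_subject=4, seed=0):
    """Partition subjects into Legitimate/Intruder sets: subjects with more
    than `min_epochs` epochs are candidates for the Legitimate set (capped
    at `legit_count`); everyone else is an Intruder. Four epochs per
    legitimate subject are held out for final evaluation."""
    rng = random.Random(seed)
    candidates = [s for s, eps in subject_epochs.items() if len(eps) > min_epochs]
    legitimate = sorted(candidates)[:legit_count]
    intruders = [s for s in subject_epochs if s not in legitimate]
    holdout = {s: rng.sample(subject_epochs[s], holdout_per_subject)
               for s in legitimate}
    return legitimate, intruders, holdout

# Toy data: 38 subjects with invented epoch counts (21 rich, 17 sparse).
epochs = {f"subj{i:02d}": list(range(50 if i < 21 else 30)) for i in range(38)}
legit, intr, hold = split_subjects(epochs)
print(len(legit), len(intr))  # 21 17
print(all(len(v) == 4 for v in hold.values()))  # True
```

    Keeping the hold-out epochs out of every training fold is what makes the reported results an estimate of performance on genuinely unseen data.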

    Power Modeling and Resource Optimization in Virtualized Environments

    The provisioning of on-demand cloud services has revolutionized the IT industry. This emerging paradigm has drastically increased the growth of data centers (DCs) worldwide. Consequently, this rising number of DCs contributes a large share of the world's total power consumption, which has directed the attention of researchers and service providers to power-aware solutions for the deployment and management of these systems and networks. However, such solutions can be beneficial only if derived from precisely estimated power consumption at run-time. Accuracy in power estimation is a challenge in virtualized environments due to the lack of certainty about the actual resources consumed by virtualized entities and about their impact on applications' performance. The heterogeneous cloud, with its multi-tenancy architecture, has also raised several management challenges for both service providers and their clients. Task scheduling and resource allocation in such a system are NP-hard problems; inappropriate allocation of resources causes under-utilization of servers, reducing throughput and energy efficiency. In this context, the cloud framework needs an effective management solution to maximize the use of available resources and capacity, and to reduce the environmental impact of its carbon footprint through reduced power consumption. This thesis addresses the issues of power measurement and resource utilization in virtualized environments as its two primary objectives. First, a survey of prior work on server power modeling and methods in virtualization architectures is carried out. This helps investigate the key challenges that elude precise power estimation when dealing with virtualized entities. A systematic approach is then presented to improve prediction accuracy in these networks, considering resource abstraction at different architectural levels. 
Resource usage monitoring at the host and guest helps in identifying the difference in performance between the two. Using virtual Performance Monitoring Counters (vPMCs) at the guest level provides detailed information that helps improve prediction accuracy and can further be used for resource optimization, consolidation and load balancing. Later, the research also targets the critical issue of optimal resource utilization in cloud computing. This study seeks a generic, robust but simple approach to resource allocation in cloud computing and networking. Inappropriate scheduling in the cloud causes under- and over-utilization of resources, which in turn increases power consumption and degrades system performance. This work first addresses some of the major challenges related to task scheduling in heterogeneous systems. After a critical analysis of existing approaches, this thesis presents a rather simple scheduling scheme based on a combination of heuristic solutions. Improved resource utilization with reduced processing time can be achieved using the proposed energy-efficient scheduling algorithm.
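    Counter-based power models of the kind surveyed here are often linear in utilization metrics. A minimal sketch of the idea, fitting a one-variable model P = a + b·x by ordinary least squares, where x stands in for a utilization metric such as one reported by a vPMC; the calibration samples are invented for illustration:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for a single-variable linear power model
    P = a + b * x (e.g. x = CPU utilization from a performance counter)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Invented calibration samples: (utilization %, measured watts),
# following an idle power of 100 W plus 1 W per percent utilization.
util = [10, 30, 50, 70, 90]
watts = [110, 130, 150, 170, 190]
a, b = fit_linear(util, watts)
print(a, b)            # 100.0 1.0
print(a + b * 60)      # 160.0, predicted power at 60% utilization
```

    Real models typically add terms for memory, disk and network activity, and the thesis's point is that host-level counters alone misattribute power among co-located VMs, which is why guest-level vPMCs improve accuracy.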