1,051 research outputs found

    Hardware Acceleration of Network Intrusion Detection System Using FPGA

    This thesis presents new algorithms and hardware designs for Signature-based Network Intrusion Detection System (SB-NIDS) optimisation, exploiting a hybrid hardware-software co-designed embedded processing platform. The work described concentrates on optimising a complete SB-NIDS Snort application on an FPGA-based hardware-software target, rather than on implementing a single functional unit for hardware acceleration. A Pattern Matching Hardware Accelerator (PMHA) based on a Bloom filter was designed to optimise SB-NIDS performance for execution on a Xilinx MicroBlaze soft-core processor. The Bloom filter approach enables the potentially large number of network intrusion attack patterns to be efficiently represented and searched, primarily using accesses to FPGA on-chip memory. The thesis demonstrates the viability of the hybrid hardware-software co-designed approach for SB-NIDS. Future work is required to investigate the effects of later-generation FPGA technology and multi-core processors in order to clearly establish the benefits over conventional processor platforms for SB-NIDS. The strengths and weaknesses of the hardware accelerators and algorithms are analysed, and experimental results are examined to determine the effectiveness of the implementation. Experimental results confirm that the PMHA is capable of performing network packet analysis for gigabit-rate network traffic. Test results indicate that our SB-NIDS prototype, running on a relatively low-clock-rate embedded processing platform, performs approximately 1.7 times better than Snort executing on a general-purpose PC processor when comparing processor cycles rather than wall-clock time.
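    The thesis targets FPGA block RAM and a MicroBlaze soft core, but the Bloom filter membership test it builds on can be sketched in software. The following is a minimal illustrative sketch, not the thesis's hardware design: the hash scheme (double hashing over two base digests), the bit-array size, the hash count, the prefix length and the example patterns are all assumptions chosen for the example.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash positions derived from two base digests."""

    def __init__(self, num_bits=8192, num_hashes=4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8)

    def _positions(self, item: bytes):
        # Double hashing: position_i = (h1 + i * h2) mod m
        h1 = int.from_bytes(hashlib.sha256(item).digest()[:8], "big")
        h2 = int.from_bytes(hashlib.md5(item).digest()[:8], "big") | 1
        return [(h1 + i * h2) % self.num_bits for i in range(self.num_hashes)]

    def add(self, item: bytes):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def might_contain(self, item: bytes) -> bool:
        # False positives are possible; false negatives are not.
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

# Store fixed-length prefixes of (hypothetical) attack patterns, then slide a
# window over the packet payload; only windows flagged by the filter would be
# handed to an exact match, mirroring the prefilter role the PMHA plays in
# hardware.
PREFIX_LEN = 8
patterns = [b"/etc/passwd", b"cmd.exe /c", b"SELECT * FROM"]
bloom = BloomFilter()
for p in patterns:
    bloom.add(p[:PREFIX_LEN])

payload = b"GET /index.php?f=/etc/passwd HTTP/1.1"
suspicious = [
    i for i in range(len(payload) - PREFIX_LEN + 1)
    if bloom.might_contain(payload[i:i + PREFIX_LEN])
]
print("candidate offsets for exact matching:", suspicious)
```

    In the hardware version the abstract describes, the k bit lookups map to accesses to on-chip memory, which is what keeps the per-window test fast enough for gigabit-rate traffic.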

    A Security Solution for Wireless Local Area Network (WLAN) Using Firewall and VPN

    In the internet era, millions of users share resources for different purposes, and the security risks increase whenever a user is connected to the internet. Internet technology plays an important role in every aspect of human life: we can create virtual connectivity within seconds with anyone in the world and exchange or share information over the internet. Some of this information is sensitive, for defence or personal use, and it can be stolen or destroyed in transit so that the receiver never obtains it; successful communication over the internet therefore requires a protected connection. For this protection we can use a firewall and a VPN. Such networks are much better protected than a normal network, and a network with a VPN and firewall is faster and more efficient than a normal connection, since on a normal network users may face unexpected delays due to malware and viruses. In this paper we describe and analyse the impact of Virtual Private Network technology and firewalls compared with a normal network. We simulated three scenarios: without a firewall, with a firewall, and with a combined firewall and VPN (Firewall_VPN). The simulation results of the three scenarios are compared over a WLAN to analyse the impact of the firewall and VPN on network performance. OPNET 14.5 is used for the simulation work. Keywords: VPN, Firewall, Security, WLAN, OPNET 14.5

    Optimising Firewall Performance in Dynamic Networks

    As more and more devices connect to the internet, a lot of sensitive information is stored in various networks. In order to secure this information and manage the large amount of network traffic that these devices inevitably create, an optimised firewall is needed. To meet this demand, the thesis proposes two algorithms. The first algorithm minimises the rule matching time by using a simple swapping condition that preserves both firewall consistency and firewall integrity while ensuring a greedy reduction of the matching time. The solution is novel in itself and can be considered a generalisation of the algorithm proposed by Fulp in the paper 'Optimization of network firewall policies using ordered sets and directed acyclical graphs'. The second algorithm reads the network traffic and provides network statistics to the first algorithm. The solution is a novel modification of the algorithm by Oommen and Rueda in the paper 'Stochastic learning-based weak estimation of multinomial random variables and its applications to pattern recognition in non-stationary environments'. Experiments show that both algorithms are able to solve the problem of optimising a firewall.
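    The interplay of the two components can be sketched in software. The sketch below is illustrative only and is not the thesis's algorithm: it combines an SLWE-style weak estimator of per-rule traffic probabilities (in the spirit of Oommen and Rueda) with a greedy adjacent-swap reordering that only swaps rules whose match sets do not overlap, so the policy's decisions are preserved (in the spirit of Fulp). The rule representation, the prefix-based overlap test and the discount factor are simplified assumptions.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    src_prefix: str      # simplified match field: source-address prefix
    action: str          # "accept" or "deny"
    prob: float = 0.0    # estimated probability that a packet matches this rule

def slwe_update(rules, matched_index, lam=0.95):
    """SLWE-style weak estimator: boost the matched rule, decay the others.

    Because lam < 1 the estimate keeps tracking non-stationary traffic,
    which is the point of using a weak estimator instead of raw counts."""
    for i, r in enumerate(rules):
        if i == matched_index:
            r.prob = r.prob + (1.0 - lam) * (1.0 - r.prob)
        else:
            r.prob = lam * r.prob

def disjoint(a: Rule, b: Rule) -> bool:
    """Rules with non-overlapping match sets can be reordered safely.

    Overlap is approximated here as one prefix containing the other."""
    return not (a.src_prefix.startswith(b.src_prefix)
                or b.src_prefix.startswith(a.src_prefix))

def greedy_reorder(rules):
    """Adjacent swaps: move frequently matched rules earlier, but only past
    rules they are disjoint from, so the policy semantics are preserved."""
    changed = True
    while changed:
        changed = False
        for i in range(len(rules) - 1):
            a, b = rules[i], rules[i + 1]
            if b.prob > a.prob and disjoint(a, b):
                rules[i], rules[i + 1] = b, a
                changed = True

# Hypothetical policy and traffic trace.
rules = [
    Rule("r1", "10.0.",    "deny"),
    Rule("r2", "192.168.", "accept"),
    Rule("r3", "172.16.",  "accept"),
]
trace = [1, 1, 2, 1, 1, 2, 1, 0, 1]   # index of the rule each packet matched
for idx in trace:
    slwe_update(rules, idx)
greedy_reorder(rules)
print([(r.name, round(r.prob, 3)) for r in rules])
```

    Fulp's actual algorithm reasons over a directed acyclic graph of precedence constraints rather than adjacent swaps, and the thesis generalises it further; the sketch only conveys the intuition of probability-driven, consistency-preserving reordering.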

    How to accelerate your internet : a practical guide to bandwidth management and optimisation using open source software

    xiii, 298 p. : ill. ; 24 cm. Electronic book.
    Access to sufficient Internet bandwidth enables worldwide electronic collaboration, access to informational resources, rapid and effective communication, and grants membership to a global community. Bandwidth is therefore probably the single most critical resource at the disposal of a modern organisation. The goal of this book is to provide practical information on how to gain the largest possible benefit from your connection to the Internet. By applying the monitoring and optimisation techniques discussed here, the effectiveness of your network can be significantly improved.
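    Bandwidth management of the kind the book covers ultimately rests on mechanisms such as traffic shaping. As a purely illustrative aside, not taken from the book, a token-bucket shaper, the mechanism behind many open-source shaping tools, can be sketched as follows; the rate and burst figures are arbitrary assumptions.

```python
import time

class TokenBucket:
    """Token-bucket traffic shaper: bursts up to `burst` bytes are allowed,
    sustained throughput is limited to `rate` bytes per second."""

    def __init__(self, rate: float, burst: float):
        self.rate = rate          # token refill rate, bytes/second
        self.burst = burst        # bucket capacity, bytes
        self.tokens = burst
        self.last = time.monotonic()

    def allow(self, packet_bytes: int) -> bool:
        now = time.monotonic()
        # Refill tokens for the time elapsed since the last check.
        self.tokens = min(self.burst,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bytes <= self.tokens:
            self.tokens -= packet_bytes
            return True       # send now
        return False          # over the limit: queue or drop

# Shape to roughly 128 kB/s with a 32 kB burst allowance (arbitrary figures).
shaper = TokenBucket(rate=128 * 1024, burst=32 * 1024)
for size in (1500, 1500, 30000, 1500):
    print(size, "bytes:", "send" if shaper.allow(size) else "delay")
```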

    AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments

    This report considers the application of Artificial Intelligence (AI) techniques to the problem of misuse detection and misuse localisation within telecommunications environments. A broad survey of techniques is provided, covering inter alia rule-based systems, model-based systems, case-based reasoning, pattern matching, clustering and feature extraction, artificial neural networks, genetic algorithms, artificial immune systems, agent-based systems, data mining and a variety of hybrid approaches. The report then considers the central issue of event correlation, which is at the heart of many misuse detection and localisation systems. The notion of being able to infer misuse by correlating individual, temporally distributed events within a multiple-data-stream environment is explored, and a range of techniques is reviewed, covering model-based approaches, 'programmed' AI and machine learning paradigms. It is found that, in general, correlation is best achieved via rule-based approaches, but that these suffer from a number of drawbacks, such as the difficulty of developing and maintaining an appropriate knowledge base and the inability to generalise from known misuses to new, unseen misuses. Two distinct approaches are evident. One attempts to encode knowledge of known misuses, typically within rules, and use this to screen events; this approach cannot generally detect misuses for which it has not been programmed, i.e. it is prone to issuing false negatives. The other attempts to learn the features of event patterns that constitute normal behaviour and, by observing patterns that do not match expected behaviour, detect when a misuse has occurred; this approach is prone to issuing false positives, i.e. inferring misuse from innocent patterns of behaviour that the system was not trained to recognise. Contemporary approaches favour hybridisation, often combining detection or localisation mechanisms for both abnormal and normal behaviour, the former to capture known cases of misuse, the latter to capture unknown cases. In some systems these mechanisms even update each other to increase detection rates and lower false-positive rates. It is concluded that hybridisation offers the most promising future direction, but that a rule- or state-based component is likely to remain, being the most natural approach to the correlation of complex events. The challenge, then, is to mitigate the weaknesses of canonical programmed systems such that learning, generalisation and adaptation are more readily facilitated.
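    To make the event-correlation idea concrete, the sketch below shows a minimal rule-based correlator over a sliding time window: a rule fires when all of the event types it needs are seen for the same subscriber within the window. It is an illustrative sketch only, not a system described in the report; the event fields, rule format, window lengths and example events are assumptions.

```python
from collections import defaultdict, deque

# A correlation rule: name, the set of event types that must co-occur,
# and the time window (in seconds) within which they must all appear.
RULES = [
    {"name": "possible_sim_cloning",
     "needs": {"call_from_cell_A", "call_from_cell_B"}, "window": 60},
    {"name": "password_guessing",
     "needs": {"login_failed", "login_ok"}, "window": 300},
]

class Correlator:
    """Keeps recent events per subscriber and fires a rule when every event
    type it needs has been observed inside the rule's time window."""

    def __init__(self, rules):
        self.rules = rules
        self.history = defaultdict(deque)   # subscriber -> deque of (time, type)

    def observe(self, subscriber, event_type, timestamp):
        events = self.history[subscriber]
        events.append((timestamp, event_type))
        alerts = []
        for rule in self.rules:
            # Only consider events inside this rule's window when testing it.
            recent = {etype for ts, etype in events
                      if timestamp - ts <= rule["window"]}
            if rule["needs"] <= recent:
                alerts.append((rule["name"], subscriber))
        # Trim anything older than the longest window to bound memory.
        horizon = max(r["window"] for r in self.rules)
        while events and timestamp - events[0][0] > horizon:
            events.popleft()
        return alerts

# Hypothetical event stream: (subscriber, event type, timestamp in seconds).
stream = [
    ("alice", "call_from_cell_A", 10),
    ("bob",   "login_failed",     12),
    ("alice", "call_from_cell_B", 40),   # within 60 s of the cell-A call
]
corr = Correlator(RULES)
for sub, etype, ts in stream:
    for alert in corr.observe(sub, etype, ts):
        print("ALERT:", alert)
```

    The report's point about rule-based correlation follows directly from such a sketch: the rules are easy to read and evaluate, but every new misuse pattern has to be written by hand, and nothing outside the rule set is ever detected.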

    System Architecture and Web Development for Healthcare Big Data Driven Application

    With the current increase in the volume, variety, and complexity of data, Big Data is increasingly becoming a needed paradigm in every sector of activity. Given these facts, there is a growing need for computer systems capable of handling these data, especially their processing, storage, and presentation. All of these points are fundamental to working with the data in a way that allows value and knowledge to be extracted from it, whether to intensify productivity on an assembly line, increase a business's revenue, or improve the quality of life of a given population. The question then arises of how we can develop such computer systems in the context of Big Data applied to the healthcare sector. To respond to the challenges imposed by this scenario, it is necessary to integrate multiple data sources and to process and present them to the end-user in an understandable and timely manner so that their use is viable. As a solution proposal, a system architecture based on microservices is presented, in which the presentation of data uses the latest Web development tools. This architecture uses a Cloud infrastructure to take advantage of its inherent benefits, such as scalability, security, and flexibility. From the analysis of data from different sources, covering various clinical practices that add volume on which to infer, it is expected that advanced data processing techniques will support the development of new treatment methodologies, support current methods, or even create fertile ground for new practices that could improve the quality of life of oncological patients.
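    The abstract describes a microservices-based architecture without giving implementation detail. The sketch below is a purely illustrative, minimal aggregation microservice using only the Python standard library; the endpoint shape, the upstream sources and the merged payload are hypothetical and are not taken from the dissertation.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical upstream sources this service would integrate; in the kind of
# architecture the abstract describes, each would be another microservice or
# clinical data store reached over the network.
def fetch_lab_results(patient_id):
    return {"haemoglobin": 13.2, "unit": "g/dL"}        # stubbed data

def fetch_treatment_plan(patient_id):
    return {"protocol": "FOLFOX", "cycle": 3}           # stubbed data

class PatientSummaryHandler(BaseHTTPRequestHandler):
    """GET /patients/<id>/summary -> merged view of several data sources."""

    def do_GET(self):
        parts = self.path.strip("/").split("/")
        if len(parts) == 3 and parts[0] == "patients" and parts[2] == "summary":
            patient_id = parts[1]
            body = json.dumps({
                "patient_id": patient_id,
                "labs": fetch_lab_results(patient_id),
                "treatment": fetch_treatment_plan(patient_id),
            }).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Each microservice runs independently and can be scaled behind a gateway,
    # which is what a Cloud deployment of this architecture relies on.
    HTTPServer(("0.0.0.0", 8080), PatientSummaryHandler).serve_forever()
```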