AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments
This report considers the application of Artificial Intelligence (AI) techniques to
the problem of misuse detection and misuse localisation within telecommunications
environments. A broad survey of techniques is provided that covers, inter alia,
rule-based systems, model-based systems, case-based reasoning, pattern matching,
clustering and feature extraction, artificial neural networks, genetic algorithms,
artificial immune systems, agent-based systems, data mining and a variety of hybrid
approaches. The report then considers the central issue of event correlation, which
lies at the heart of many misuse detection and localisation systems. The notion of
inferring misuse by correlating individual temporally distributed events within a
multiple-data-stream environment is explored, and a range of techniques is reviewed,
covering model-based approaches, `programmed' AI and machine-learning paradigms.
It is found that, in general, correlation is best achieved via rule-based approaches,
but that these suffer from a number of drawbacks, such as the difficulty of
developing and maintaining an appropriate knowledge base, and the lack of ability
to generalise from known misuses to new unseen misuses. Two distinct approaches
are evident. One attempts to encode knowledge of known misuses, typically within
rules, and use this to screen events. This approach cannot generally detect misuses
for which it has not been programmed, i.e. it is prone to issuing false negatives.
The other attempts to `learn' the features of event patterns that constitute normal
behaviour, and, by observing patterns that do not match expected behaviour, detect
when a misuse has occurred. This approach is prone to issuing false positives,
i.e. inferring misuse from innocent patterns of behaviour that the system was not
trained to recognise. Contemporary approaches are seen to favour hybridisation,
often combining detection or localisation mechanisms for both abnormal and normal
behaviour, the former to capture known cases of misuse, the latter to capture
unknown cases. In some systems, these mechanisms even work together to update
each other to increase detection rates and lower false positive rates. It is concluded
that hybridisation offers the most promising future direction, but that a rule- or
state-based component is likely to remain, being the most natural approach to the
correlation of complex events. The challenge, then, is to mitigate the weaknesses of
canonical programmed systems such that learning, generalisation and adaptation
are more readily facilitated.
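The rule-based correlation of temporally distributed events that the report favours can be pictured with a minimal sketch. The event names, streams, rule and time window below are hypothetical illustrations, not examples taken from the report:

```python
from dataclasses import dataclass

@dataclass
class Event:
    stream: str   # which data stream the event came from
    name: str     # event type observed in that stream
    time: float   # timestamp in seconds

def correlate(events, rule, window):
    """Flag a misuse whenever every event named in `rule` occurs
    (in any stream) within a single sliding time window."""
    events = sorted(events, key=lambda e: e.time)
    alarms = []
    for i, first in enumerate(events):
        seen = set()
        for e in events[i:]:
            if e.time - first.time > window:
                break
            if e.name in rule:
                seen.add(e.name)
        if seen == rule:
            alarms.append(first.time)
    return alarms

# Hypothetical rule: a foreign login and a premium-rate call burst
# in different streams within 60 s suggests subscription fraud.
rule = {"foreign_login", "premium_burst"}
log = [Event("auth", "foreign_login", 10.0),
       Event("billing", "premium_burst", 40.0),
       Event("auth", "foreign_login", 500.0)]
print(correlate(log, rule, window=60))  # → [10.0]
```

The drawback the report identifies is visible here: the rule set must be hand-maintained, and an unprogrammed misuse pattern produces no alarm at all.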
Cloud technology options towards Free Flow of Data
This whitepaper collects the technology solutions that the projects in the Data Protection, Security and Privacy Cluster propose to address the challenges raised by the working areas of the Free Flow of Data initiative. The document describes the technologies, methodologies, models, and tools researched and developed by the clustered projects, mapped to the ten areas of work of the Free Flow of Data initiative. The aim is to facilitate identification of the state of the art in technology options for solving the data security and privacy challenges posed by the Free Flow of Data initiative in Europe. The document references the Cluster, the individual projects and the technologies they have produced.
Spectrum Sensing and Mitigation of Primary User Emulation Attack in Cognitive Radio
The overwhelming growth of wireless communication has led to spectrum shortage issues. In recent years, cognitive radio (CR) has emerged as a comprehensive solution to the issue. It is an artificial intelligence-based radio which is capable of finding free spectrum and utilising it by adapting itself to the environment. Hence, searching for free spectrum, termed spectrum sensing, becomes the key task of the cognitive radio. Some malicious users disrupt the decision-making ability of the cognitive radio. Proper selection of the spectrum sensing scheme and the decision-making capability of the cognitive radio reduce the chance of colliding with the primary user. This chapter discusses a suitable spectrum sensing scheme for low-noise environments and a trilayered solution to mitigate the primary user emulation attack (PUEA) in the physical layer of the cognitive radio. The tag is generated in three stages. Sequences are first generated using DNA and chaotic algorithms. These sequences are then used as the initial seed value for the generation of Gold codes. The output of the generator is taken as the authentication tag. This tag is used to identify malicious users, and thereby the PUEA is mitigated. A threat-free environment enables the cognitive radio to come up with a precise decision about the spectrum holes.
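As an illustration of the Gold-code stage, the sketch below generates a length-31 Gold sequence by XOR-ing a preferred pair of maximal-length LFSR sequences (the classic octal 45/75 polynomial pair). The fixed seed bits merely stand in for the DNA- and chaos-derived seed the chapter describes:

```python
def lfsr(taps, seed, n):
    """Fibonacci LFSR: output the last register bit, feed back
    the XOR of the tapped bits. `taps` are 1-indexed positions."""
    state = list(seed)
    out = []
    for _ in range(n):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return out

def gold_code(seed1, seed2, shift=0):
    """XOR a preferred pair of length-31 m-sequences
    (x^5 + x^2 + 1 and x^5 + x^4 + x^3 + x^2 + 1)."""
    a = lfsr([5, 2], seed1, 31)
    b = lfsr([5, 4, 3, 2], seed2, 31)
    return [a[i] ^ b[(i + shift) % 31] for i in range(31)]

# The chapter derives the seed from DNA and chaotic sequences;
# here a fixed non-zero 5-bit pattern stands in for that seed.
tag = gold_code([1, 0, 1, 1, 0], [0, 1, 1, 0, 1])
print(tag)
```

Different relative shifts of the two component sequences yield different members of the Gold family, which is what makes per-user authentication tags possible.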
Automatic construction and updating of knowledge base from log data
Large software systems can be very complex, and they become more complex still in the recently popular Microservice Architecture, owing to increasing interactions among more components in bigger systems. Plain model-based and data-driven diagnosis approaches can be used for fault detection, but they are usually opaque and demand massive computing power. On the other hand, knowledge-based methods have been shown to be not only effective but also explainable and human-friendly for various tasks such as Fault Analysis, yet they depend on having a knowledge base. The construction and maintenance of knowledge bases is not a trivial problem, and is referred to as the knowledge bottleneck.
Software system logs are the primary and most available, sometimes the only available, data that record system runtime information, which is critical for software system Operation and Maintenance (O&M). I proposed the TREAT framework, which can automatically construct and update a knowledge base from a continual stream of logs. The knowledge base aims to reflect, as faithfully as possible, the latest state of the assisted software system and to facilitate downstream tasks, typically fault localisation. To the best of our knowledge, this is the first effort to construct a fully automated, ever-updating knowledge base from logs that aims at reflecting the internal changing states of a software system. To evaluate the TREAT framework, I devised a knowledge-based solution involving logic programming and inductive logic programming that applies a TREAT-powered knowledge base to fault localisation, and conducted empirical experiments with this solution on a real-life 5G network test-bed system.
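The kind of knowledge a log-driven knowledge base holds can be pictured with a minimal sketch. The log format, regular expressions and predicate names below are hypothetical stand-ins, not TREAT's actual pipeline:

```python
import re

# Hypothetical microservice log lines; real input would be the
# assisted system's own logs.
logs = [
    "2023-01-05 12:00:01 INFO gateway: connected to auth-service",
    "2023-01-05 12:00:02 INFO auth-service: connected to user-db",
    "2023-01-05 12:00:07 ERROR gateway: timeout calling auth-service",
]

PATTERNS = [
    (re.compile(r"(\S+): connected to (\S+)"), "connects_to"),
    (re.compile(r"(\S+): timeout calling (\S+)"), "timeout"),
]

def extract_triples(lines):
    """Turn each matching log line into a (subject, predicate, object)
    triple; a continual stream of such triples keeps the knowledge
    base tracking the system's latest state."""
    kb = set()
    for line in lines:
        for pat, predicate in PATTERNS:
            m = pat.search(line)
            if m:
                kb.add((m.group(1), predicate, m.group(2)))
    return kb

kb = extract_triples(logs)
print(sorted(kb))
```

A downstream fault-localisation rule could then, for instance, suspect any component that both depends on and times out against the same neighbour.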
Since evaluating the TREAT framework by fault localisation is indirect and involves many confounding factors, e.g., the specific solution to fault localisation, I explored and came up with a novel method called LP-Measure that can directly assess the quality of a given knowledge base, in particular the robustness and redundancy of a knowledge graph.
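LP-Measure itself is not described in this abstract; purely as a generic illustration of what "redundancy" of a knowledge graph can mean, the sketch below counts triples that remain derivable by transitivity once removed. The edges and the transitivity assumption are hypothetical:

```python
def transitive_closure(edges):
    """Naive fixpoint computation of the transitive closure."""
    closure = set(edges)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

def redundancy(edges):
    """Fraction of edges still derivable from the rest by
    transitivity: one generic notion of KG redundancy."""
    redundant = 0
    for e in edges:
        rest = set(edges) - {e}
        if e in transitive_closure(rest):
            redundant += 1
    return redundant / len(edges)

# Hypothetical 'depends_on' edges between services: the (a, c)
# edge already follows from (a, b) and (b, c).
edges = {("a", "b"), ("b", "c"), ("a", "c")}
print(redundancy(edges))  # → 0.3333...
```

Robustness can be probed in a dual way, by deleting edges and checking how much of the closure survives; the actual LP-Measure method should be consulted for its precise definitions.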
Besides, it was observed that although the extracted knowledge is of high quality in general, there are also errors in the knowledge extraction process. I surveyed ways to quantify the uncertainty of the knowledge extraction process and to assign a probability of correct extraction to every piece of knowledge, which led to a deep investigation into probability calibration and knowledge graph embeddings, specifically testing and confirming the phenomenon of uncalibrated probabilities in knowledge graph embeddings and establishing how to choose specific calibration models from the existing toolbox.
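One standard calibration model in that toolbox is Platt scaling, which fits a sigmoid over raw plausibility scores using held-out labels. The scores and labels below are made up for illustration, not outputs of a knowledge graph embedding model:

```python
import math

def platt_fit(scores, labels, lr=0.1, steps=2000):
    """Fit p = sigmoid(a*s + b) by gradient descent on log-loss:
    Platt scaling of raw scores into calibrated probabilities."""
    a, b = 1.0, 0.0
    n = len(scores)
    for _ in range(steps):
        ga = gb = 0.0
        for s, y in zip(scores, labels):
            p = 1.0 / (1.0 + math.exp(-(a * s + b)))
            ga += (p - y) * s / n
            gb += (p - y) / n
        a -= lr * ga
        b -= lr * gb
    return a, b

def calibrate(score, a, b):
    """Map a raw score to a calibrated probability of correctness."""
    return 1.0 / (1.0 + math.exp(-(a * score + b)))

# Hypothetical held-out (score, correct?) pairs for extracted triples.
scores = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
labels = [0, 0, 0, 1, 1, 1]
a, b = platt_fit(scores, labels)
print(calibrate(1.5, a, b))
```

After fitting, every extracted triple can carry a probability of correct extraction rather than an uninterpreted embedding score.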
Towards Deterministic Communications in 6G Networks: State of the Art, Open Challenges and the Way Forward
Over the last decade, society and industry have been undergoing rapid
digitization that is expected to lead to the evolution of the cyber-physical
continuum. End-to-end deterministic communications infrastructure is the
essential glue that will bridge the digital and physical worlds of the
continuum. We describe the state of the art and open challenges with respect to
contemporary deterministic communications and compute technologies: 3GPP 5G,
IEEE Time-Sensitive Networking, IETF DetNet, OPC UA as well as edge computing.
While these technologies represent significant technological advancements
towards networking Cyber-Physical Systems (CPS), we argue in this paper that
they rather represent a first generation of systems which are still limited in
different dimensions. In contrast, realizing future deterministic communication
systems requires, firstly, seamless convergence between these technologies and,
secondly, scalability to support heterogeneous, time-varying requirements
arising from diverse CPS applications. In addition, future deterministic
communication networks will have to provide such characteristics end-to-end,
which for CPS refers to the entire communication and computation loop, from
sensors to actuators. In this paper, we discuss the state of the art regarding
the main challenges towards these goals: predictability, end-to-end technology
integration, end-to-end security, and scalable vertical application
interfacing. We then present our vision regarding viable approaches and
technological enablers to overcome these four central challenges. Key
approaches to leverage in that regard are 6G system evolutions, wireless
friendly integration of 6G into TSN and DetNet, novel end-to-end security
approaches, efficient edge-cloud integrations, data-driven approaches for
stochastic characterization and prediction, as well as leveraging digital twins
towards system awareness.
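Determinism of the kind discussed above is commonly analysed with network calculus. The sketch below computes the textbook end-to-end delay bound for a token-bucket-constrained flow crossing rate-latency servers; the hop parameters are illustrative assumptions, not values from the paper:

```python
def delay_bound(burst, rate, hops):
    """End-to-end delay bound for a flow with arrival curve
    b + r*t crossing rate-latency servers (R_i, T_i).
    Concatenating the service curves gives
        D <= sum(T_i) + b / min(R_i),
    valid whenever every R_i >= r ('pay bursts only once')."""
    rates = [R for (R, _) in hops]
    latencies = [T for (_, T) in hops]
    assert min(rates) >= rate, "a hop is slower than the flow's rate"
    return sum(latencies) + burst / min(rates)

# Illustrative flow: 4 kbit burst, 1 Mbit/s sustained rate, three
# hops given as (service rate in bit/s, fixed latency in seconds).
hops = [(10e6, 1e-4), (5e6, 2e-4), (10e6, 1e-4)]
print(delay_bound(4000.0, 1e6, hops))  # → 0.0012 s
```

The bound makes the scalability challenge concrete: the slowest hop's service rate dominates, so end-to-end guarantees require admission control at every technology boundary the paper surveys.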
An Energy Efficient Cross-Layer Network Operation Model for Mobile Wireless Sensor Networks
Wireless sensor networks (WSNs) are modern technologies used to sense and control the environment, whether indoors or outdoors. Sensor nodes are miniature devices that can sense a specific event according to the end users' needs. The types of applications in which such technology can be utilised and implemented are vast, ranging from simple low-end household applications to high-end military applications. WSNs are resource-limited: sensor nodes are expected to work on a limited source of power (e.g., batteries), and the connectivity quality and reliability of the nodes depend on the quality of the hardware from which the nodes are made. Sensor nodes are envisioned to be either stationary or mobile. Mobility increases issues with the quality of the network's operation because it directly affects the quality of the connections between the nodes.
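Energy budgeting of the kind such a cross-layer model addresses is often reasoned about with the first-order radio model. The constants below (50 nJ/bit electronics, 100 pJ/bit/m² amplifier) are commonly assumed textbook values, not parameters of this thesis:

```python
E_ELEC = 50e-9      # J/bit, electronics cost (assumed constant)
EPS_AMP = 100e-12   # J/bit/m^2, amplifier cost (assumed constant)

def tx_energy(bits, distance):
    """First-order radio model: transmitting costs electronics
    energy plus amplifier energy growing with distance squared."""
    return E_ELEC * bits + EPS_AMP * bits * distance ** 2

def rx_energy(bits):
    """Receiving only pays the electronics cost."""
    return E_ELEC * bits

# Relaying via a midpoint node halves each hop's distance but adds
# a receive cost: the trade-off a cross-layer operation model must
# re-evaluate continually as mobile nodes change position.
direct = tx_energy(1000, 100.0)
relayed = tx_energy(1000, 50.0) + rx_energy(1000) + tx_energy(1000, 50.0)
print(direct, relayed)
```

Because amplifier energy grows quadratically with distance, the relayed path wins here despite the extra reception, which is why routing, MAC and mobility decisions are intertwined in energy terms.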