A dubiety-determining based model for database cumulated anomaly intrusion
The concept of Cumulated Anomaly (CA), a new type of database anomaly, is addressed. A typical CA intrusion occurs when a user who is authorized to modify data records under certain constraints deliberately hides his or her intention to change data beyond those constraints across different operations and transactions: each transaction appears authorized and normal, yet their accumulated results exceed given thresholds. Existing intrusion detection techniques are unable to deal with CAs. This paper proposes a detection model for Cumulated Anomaly, the Dubiety-Determining Model (DDM), based mainly on statistical and fuzzy set theories. It measures a dubiety degree, represented by a real number between 0 and 1, for each database transaction, indicating how likely the transaction is to be intrusive. The algorithms used in the DDM are introduced, and a DDM-based software architecture has been designed and implemented for monitoring database transactions. Experimental results show that the DDM method is feasible and effective.
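The abstract does not give the DDM's formulas, but the core idea (a dubiety degree in [0, 1] that grows as a user's accumulated changes approach and exceed a threshold) can be illustrated with a simple fuzzy membership function. The function name and the soft/hard limit parameters below are assumptions for illustration, not the paper's actual algorithm:

```python
def dubiety(accumulated_change, soft_limit, hard_limit):
    """Map a user's accumulated data modification to a dubiety degree in [0, 1].

    A linear fuzzy membership sketch: below soft_limit the transaction
    history looks normal (0.0); at or beyond hard_limit it is fully
    dubious (1.0); in between, dubiety grows linearly.
    """
    if accumulated_change <= soft_limit:
        return 0.0
    if accumulated_change >= hard_limit:
        return 1.0
    return (accumulated_change - soft_limit) / (hard_limit - soft_limit)
```

With `soft_limit=100` and `hard_limit=200`, an accumulated change of 150 yields a dubiety of 0.5, so individually "normal" transactions still raise the score as their cumulative effect nears the threshold.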
Autonomic computing architecture for SCADA cyber security
Cognitive computing relates to intelligent computing platforms based on the disciplines of artificial intelligence, machine learning, and other innovative technologies. These technologies can be used to design systems that mimic the human brain, learn about their environment, and autonomously predict an impending anomalous situation. IBM first used the term ‘Autonomic Computing’ in 2001 to combat the looming complexity crisis (Ganek and Corbi, 2003). The concept was inspired by the human biological autonomic system. An autonomic system is self-healing, self-regulating, self-optimising and self-protecting (Ganek and Corbi, 2003). Therefore, the system should be able to protect itself against both malicious attacks and unintended operator mistakes.
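The self-regulating behaviour described above is commonly structured as a monitor–analyze–plan–execute control loop. The sketch below is a hypothetical toy version of such a loop for a SCADA sensor node; the function names, setpoint, and tolerance are all assumptions for illustration, not the architecture proposed in the paper:

```python
# Minimal sketch of an autonomic control loop (monitor -> analyze -> plan -> execute).
def monitor(readings):
    return sum(readings) / len(readings)  # condense raw sensor readings to a mean

def analyze(mean_value, setpoint, tolerance):
    return abs(mean_value - setpoint) > tolerance  # is the deviation anomalous?

def plan(anomalous):
    return "isolate_and_alert" if anomalous else "continue"  # self-protecting response

def execute(action, log):
    log.append(action)  # stand-in for actuating the chosen response

log = []
for readings in ([9.9, 10.1], [10.0, 14.0]):
    execute(plan(analyze(monitor(readings), setpoint=10.0, tolerance=1.0)), log)
# the first cycle proceeds normally; the second triggers the protective action
```

The loop runs continuously in a real system; here two cycles show one normal reading set and one that exceeds tolerance and triggers a self-protecting action.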
Graph based Anomaly Detection and Description: A Survey
Detecting anomalies in data is a vital task, with numerous high-impact applications in areas such as security, finance, health care, and law enforcement. While many techniques have been developed over the years for spotting outliers and anomalies in unstructured collections of multi-dimensional points, graph data has become ubiquitous, and techniques for structured graph data have recently come into focus. Because objects in graphs exhibit long-range correlations, a suite of novel techniques has been developed for anomaly detection in graph data. This survey aims to provide a general, comprehensive, and structured overview of the state-of-the-art methods for anomaly detection in data represented as graphs. As a key contribution, we give a general framework for the algorithms, categorized under various settings: unsupervised vs. (semi-)supervised approaches, for static vs. dynamic graphs, and for attributed vs. plain graphs. We highlight the effectiveness, scalability, generality, and robustness of the methods. Moreover, we stress the importance of anomaly attribution and highlight the major techniques that facilitate digging out the root cause, or the ‘why’, of the detected anomalies for further analysis and sense-making. Finally, we present several real-world applications of graph-based anomaly detection in diverse domains, including financial, auction, computer-traffic, and social networks. We conclude our survey with a discussion of open theoretical and practical challenges in the field.
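One of the simplest settings the survey's framework covers (unsupervised detection on a plain, static graph) can be sketched with a toy structural detector: score each node by how far its degree deviates from the graph-wide mean. This z-score heuristic is an illustrative assumption, not a specific method from the survey:

```python
from collections import defaultdict
from statistics import mean, pstdev

def degree_anomalies(edges, z_threshold=2.0):
    """Toy unsupervised anomaly detector for a plain, static graph.

    Returns {node: score} for nodes whose degree's absolute z-score
    exceeds z_threshold, i.e. structurally unusual nodes.
    """
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    mu, sigma = mean(deg.values()), pstdev(deg.values())
    if sigma == 0:
        return {}  # all degrees identical: nothing stands out
    return {n: abs(d - mu) / sigma
            for n, d in deg.items()
            if abs(d - mu) / sigma > z_threshold}
```

On a star graph of one hub connected to ten leaves, only the hub is flagged: its degree (10) sits several standard deviations above the mean, while every leaf's degree (1) does not. Real graph-based detectors are far richer, exploiting the long-range correlations and attributes discussed above.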