Smart Computer Security Audit: Reinforcement Learning with a Deep Neural Network Approximator
A significant challenge in modern computer security is the growing skill gap as intruder capabilities increase, making it necessary to begin automating elements of penetration testing so analysts can contend with the growing number of cyber threats. In this paper, we attempt to assist human analysts by automating a single host penetration attack. To do so, a smart agent performs different attack sequences to find vulnerabilities in a target system. As it does so, it accumulates knowledge, learns new attack sequences and improves its own internal penetration testing logic. As a result, this agent (AgentPen for simplicity) is able to successfully penetrate hosts it has never interacted with before. A computer security administrator using this tool would receive a comprehensive, automated sequence of actions leading to a security breach, highlighting potential vulnerabilities, and reducing the amount of menial tasks a typical penetration tester would need to execute. To achieve autonomy, we apply a model-free reinforcement learning algorithm, Q-learning, with an approximator that incorporates a deep neural network architecture. The security audit itself is modelled as a Markov Decision Process in order to test a number of decision-making strategies and compare their convergence to optimality. A series of experimental results is presented to show how this approach can be effectively used to automate penetration testing using a scalable (i.e. not exhaustive) and adaptive approach.
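The Q-learning-with-approximator loop the abstract describes can be sketched on a toy stand-in for the audit MDP. Everything below is an illustrative assumption, not the paper's actual environment: states are stages of access on a single host, the two actions are a no-op "probe" and an "exploit" that advances a stage with some probability, and a small one-hidden-layer NumPy network stands in for the deep approximator.

```python
import numpy as np

# Illustrative toy MDP (an assumption, not the paper's environment):
# states 0..4 are stages of access; action 0 "probe", action 1 "exploit".
N_STATES, N_ACTIONS = 5, 2
rng = np.random.default_rng(0)

def step(s, a):
    # "exploit" advances one access stage with probability 0.7; reaching
    # the final stage yields reward 1 and ends the episode.
    if a == 1 and rng.random() < 0.7:
        s += 1
    if s == N_STATES - 1:
        return s, 1.0, True
    return s, -0.01, False  # small per-action cost; "probe" gains nothing here

# One-hidden-layer Q-network in plain NumPy, standing in for the deep
# approximator: Q(s, .) = W2 @ relu(W1 @ onehot(s)).
H = 16
W1 = rng.normal(0.0, 0.5, (H, N_STATES))
W2 = rng.normal(0.0, 0.5, (N_ACTIONS, H))

def q_values(s):
    x = np.zeros(N_STATES)
    x[s] = 1.0
    h = np.maximum(W1 @ x, 0.0)
    return W2 @ h, h, x

alpha, gamma, eps = 0.05, 0.95, 0.2
for _ in range(2000):
    s, done = 0, False
    while not done:
        q, h, x = q_values(s)
        a = int(rng.integers(N_ACTIONS)) if rng.random() < eps else int(np.argmax(q))
        s2, r, done = step(s, a)
        target = r if done else r + gamma * np.max(q_values(s2)[0])
        td = target - q[a]                    # temporal-difference error
        w2a = W2[a].copy()
        W2[a] += alpha * td * h               # semi-gradient update, output layer
        W1[h > 0] += alpha * td * np.outer(w2a, x)[h > 0]  # and hidden layer
        s = s2

# Greedy policy after training, one action per non-terminal stage;
# it should come to prefer "exploit" (1) as the stages near the goal.
greedy = [int(np.argmax(q_values(s)[0])) for s in range(N_STATES - 1)]
print(greedy)
```

The epsilon-greedy exploration and the semi-gradient TD update are the generic ingredients of Q-learning with function approximation; the paper's agent differs in network depth and, of course, in the action space (real attack primitives rather than a two-action toy).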
Stochastic modeling for performance evaluation of database replication protocols
Performance is often the most important non-functional property for database systems and associated replication solutions. This is true at least in industrial contexts. Evaluating performance using real systems, however, is computationally demanding and costly. In many cases, choosing between several competing replication protocols poses a difficulty in ranking these protocols meaningfully: the ranking is determined not so much by the quality of the competing protocols but, instead, by the quality of the available implementations. Addressing this difficulty requires a level of abstraction in which the impact of the implementations on the comparison is reduced, or entirely eliminated. We propose a stochastic model for performance evaluation of database replication protocols, paying particular attention to: i) empirical validation of a number of assumptions used in the stochastic model, and ii) empirical validation of model accuracy for a chosen replication protocol. For the empirical validations we used the TPC-C benchmark. Our implementation of the model is based on Stochastic Activity Networks (SAN), extended by bespoke code. The model may reduce the cost of performance evaluation in comparison with empirical measurements, while keeping the accuracy of the assessment to an acceptable level.
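The core idea of trading empirical measurement for a validated stochastic model can be illustrated with a much simpler stand-in than the paper's SAN model: a discrete-event simulation of one replica processing transactions (an M/M/1 queue), whose simulated mean response time can be checked against the closed-form result 1/(mu - lam). The arrival and service rates below are illustrative assumptions.

```python
import random

# Minimal discrete-event sketch (not the paper's SAN model): transactions
# arrive as a Poisson process and are served FIFO by a single exponential
# server, i.e. an M/M/1 queue.
def simulate_mm1(lam, mu, n=200_000, seed=1):
    rng = random.Random(seed)
    t = 0.0            # current arrival time
    busy_until = 0.0   # time at which the server becomes free
    total_resp = 0.0
    for _ in range(n):
        t += rng.expovariate(lam)       # next transaction arrival
        start = max(t, busy_until)       # wait if the server is busy
        busy_until = start + rng.expovariate(mu)
        total_resp += busy_until - t     # response = queueing wait + service
    return total_resp / n

lam, mu = 0.8, 1.0
sim = simulate_mm1(lam, mu)
analytic = 1.0 / (mu - lam)              # M/M/1 mean response time
print(f"simulated {sim:.2f} vs analytic {analytic:.2f}")
```

This mirrors, in miniature, the paper's validation step: the model's output is compared against a trusted reference (here an analytic formula; in the paper, TPC-C measurements) before it is used in place of costly experiments.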
Total quality in laboratory diagnostics: the role of commercial companies
Quality is a key focus for clinical laboratories, since it is viewed as a prerequisite for patient safety. It permeates all three phases of the total testing process (preanalytical, analytical and postanalytical), and relies heavily on the quality of diagnostic products, such as in vitro diagnostic (IVD) devices (instruments, assays, reagents and specimen collection tubes) and medical devices (blood collection needles and sets). The diagnostic industry has implemented strict criteria to assure that the quality of their products throughout their life cycle meets the needs of their customers. This is accomplished through established processes to develop products that meet customer needs, as well as regulatory requirements needed to assure their safety and efficacy while adhering to good clinical practices and maintaining a high level of safety of human subjects that participate in clinical trials. At the same time, the commercial companies follow good manufacturing practices to reduce variability within their manufacturing processes and deliver products that are within established specifications. However, the highest level of quality can only be achieved when instrument/assay manufacturers work closely with specimen containment manufacturers to assure total system performance.
Physical origin of the power-law tailed statistical distributions
Starting from the BBGKY hierarchy describing the kinetics of a nonlinear particle system, we obtain the relevant entropy and stationary distribution function. Subsequently, by employing the Lorentz transformations, we propose the relativistic generalization of the exponential and logarithmic functions. The related particle distribution and entropy represent the relativistic extensions of the classical Maxwell-Boltzmann distribution and of the Boltzmann entropy, respectively, and define the statistical mechanics presented in [Phys. Rev. E {\bf 66}, 056125 (2002)] and [Phys. Rev. E {\bf 72}, 036108 (2005)]. The achievements of the present effort support the idea that the experimentally observed power-law tailed statistical distributions in plasma physics are enforced by the relativistic microscopic particle dynamics.
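The relativistic deformations of the exponential and logarithm referred to here are, in the κ-statistics formalism of the cited Phys. Rev. E papers, usually written as follows (a sketch of the standard definitions, not a derivation from this abstract):

$$\exp_\kappa(x) = \left(\sqrt{1+\kappa^2 x^2} + \kappa x\right)^{1/\kappa},
\qquad
\ln_\kappa(x) = \frac{x^\kappa - x^{-\kappa}}{2\kappa},$$

both reducing to the ordinary $\exp$ and $\ln$ as $\kappa \to 0$. For large argument, $\exp_\kappa(-x) \sim (2\kappa x)^{-1/\kappa}$, which is precisely the power-law tail that replaces the exponential decay of the Maxwell-Boltzmann distribution.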
Towards a Formal Verification Methodology for Collective Robotic Systems
We introduce a UML-based notation for graphically modelling systems' security aspects in a simple and intuitive way, and a model-driven process that transforms graphical specifications of access control policies into XACML. These XACML policies are then translated into FACPL, a policy language with a formal semantics, and the resulting policies are evaluated by means of a Java-based software tool.
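FACPL's concrete syntax and semantics are not shown in the abstract; as a hedged sketch of the general idea behind XACML-style policy evaluation, a minimal "permit-overrides" combining algorithm over rules might look like the following. All rule contents and names here are illustrative assumptions.

```python
# Toy model of XACML-style evaluation (not FACPL's actual semantics):
# a rule is an (effect, target) pair; a policy combines rule decisions.

def evaluate_rule(rule, request):
    # A rule applies only if its target predicate matches the request.
    effect, target = rule
    return effect if target(request) else "not-applicable"

def permit_overrides(rules, request):
    # Permit-overrides combining: any permit wins; otherwise any deny wins.
    decisions = [evaluate_rule(r, request) for r in rules]
    if "permit" in decisions:
        return "permit"
    if "deny" in decisions:
        return "deny"
    return "not-applicable"

policy = [
    ("deny",   lambda req: req["role"] == "guest"),
    ("permit", lambda req: req["role"] == "admin" and req["action"] == "read"),
]

print(permit_overrides(policy, {"role": "admin", "action": "read"}))  # permit
print(permit_overrides(policy, {"role": "guest", "action": "read"}))  # deny
```

Giving such combining algorithms a formal semantics, as FACPL does, is what makes the resulting policies amenable to mechanical analysis rather than ad hoc testing.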
Narrowband angle of arrival estimation exploiting graph topology and graph signals
Based on recent results of applying graph signal processing (GSP) to narrowband angle of arrival estimation for uniform linear arrays, we generalise the analysis to the case of arrays with elements placed arbitrarily in three-dimensional space. We comment on the selection of the adjacency matrix, analyse how this new approach compares to the multiple signal classification (MUSIC) algorithm, and provide an efficient implementation. We demonstrate that the GSP approach can perform as well as the MUSIC algorithm in terms of accuracy and computational cost. Simulations indicate that the proposed GSP approach avoids the severe performance degradation with which MUSIC is associated at low signal-to-noise ratios.
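The MUSIC baseline against which the GSP approach is compared can be sketched in a few lines of NumPy for a uniform linear array. The array geometry, source angles, noise level, and the assumption of a known source count below are illustrative, not taken from the paper.

```python
import numpy as np

# Classical MUSIC for a uniform linear array (the comparison point named
# in the abstract); all scenario parameters here are illustrative.
rng = np.random.default_rng(0)
M, d, n_snap = 8, 0.5, 500            # sensors, spacing (wavelengths), snapshots
true_angles = np.deg2rad([-20.0, 30.0])

def steering(theta):
    # Plane-wave ULA response for arrival angle(s) theta, shape (M, len(theta)).
    return np.exp(2j * np.pi * d * np.arange(M)[:, None] * np.sin(theta))

A = steering(true_angles)                                    # (M, 2)
S = rng.normal(size=(2, n_snap)) + 1j * rng.normal(size=(2, n_snap))
X = A @ S + 0.1 * (rng.normal(size=(M, n_snap)) + 1j * rng.normal(size=(M, n_snap)))

R = X @ X.conj().T / n_snap                                  # sample covariance
eigval, eigvec = np.linalg.eigh(R)                           # ascending eigenvalues
En = eigvec[:, : M - 2]                                      # noise subspace (K=2 assumed known)

grid = np.deg2rad(np.linspace(-90, 90, 1801))
a = steering(grid)                                           # (M, G)
# MUSIC pseudospectrum: 1 / ||En^H a(theta)||^2, peaking at source angles.
p = 1.0 / np.sum(np.abs(En.conj().T @ a) ** 2, axis=0)

# Take the two largest local maxima as the angle estimates.
peaks = [g for g in range(1, len(grid) - 1) if p[g] > p[g - 1] and p[g] > p[g + 1]]
est = np.rad2deg(grid[sorted(peaks, key=lambda g: -p[g])[:2]])
print(np.sort(est))
```

The eigendecomposition of the sample covariance is the dominant cost here; the abstract's GSP formulation targets the same estimation problem but for arbitrary 3-D element placements, where the ULA steering structure above no longer applies.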
Normal pre-B cells express a receptor complex of mu heavy chains and surrogate light-chain proteins.
- …