Towards a fuller understanding of neurons with Clustered Compositional Explanations
Compositional Explanations is a method for identifying logical formulas over
concepts that approximate a neuron's behavior. However, these explanations
are tied to the narrow band of neuron activations (i.e., the highest ones)
used to check the alignment, and thus lack completeness. In this paper, we
propose a generalization, called Clustered Compositional Explanations, that
combines Compositional Explanations with clustering and a novel search
heuristic to approximate a broader spectrum of the neurons' behavior. We
define and address the problems that arise when applying these methods to
multiple ranges of activations, analyze the insights retrievable using our
algorithm, and propose desiderata qualities that can be used to study the
explanations returned by different algorithms.

Comment: Accepted at NeurIPS 202
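The core idea sketched in the abstract — clustering a neuron's activations into ranges and then searching for a concept formula aligned with each range — can be illustrated roughly as follows. This is a simplified sketch, not the paper's algorithm: quantile binning stands in for the clustering step, IoU is assumed as the alignment score, the tiny greedy AND/OR search stands in for the paper's search heuristic, and the `sky`/`sea` concept masks are made up.

```python
import numpy as np

def cluster_activation_masks(acts, n_clusters):
    """Split the activation range into bins (quantile bins as a stand-in
    for the clustering step) and return one boolean mask per bin."""
    edges = np.quantile(acts, np.linspace(0, 1, n_clusters + 1))
    return [(acts >= lo) & (acts <= hi) for lo, hi in zip(edges[:-1], edges[1:])]

def iou(a, b):
    """Intersection-over-union between two boolean masks."""
    return (a & b).sum() / max((a | b).sum(), 1)

def best_formula(mask, concepts, beam=2):
    """Greedy search over AND/OR combinations of concept masks (a crude
    stand-in for a compositional search heuristic)."""
    scored = sorted(concepts.items(), key=lambda kv: -iou(mask, kv[1]))
    best_name, best_mask = scored[0]
    best_score = iou(mask, best_mask)
    for name, cmask in scored[1:beam + 1]:
        for op, combined in (("AND", best_mask & cmask), ("OR", best_mask | cmask)):
            s = iou(mask, combined)
            if s > best_score:
                best_name, best_mask, best_score = f"({best_name} {op} {name})", combined, s
    return best_name, best_score

# Toy example: one activation cluster, covered exactly by the union of two concepts
masks = cluster_activation_masks(np.array([0.0, 0.05, 0.1, 0.2, 0.9, 1.0]), 2)
mask = np.array([True, True, True, True, False, False])
concepts = {
    "sky": np.array([True, True, False, False, False, False]),
    "sea": np.array([False, False, True, True, False, False]),
}
formula, score = best_formula(mask, concepts)
```

Here neither concept alone scores above IoU 0.5, but the disjunction of the two covers the cluster mask exactly, so the search returns an OR formula — the kind of compositional explanation a single-concept method would miss.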
Optimal deployment of components of cloud-hosted application for guaranteeing multitenancy isolation
One of the challenges of deploying multitenant cloud-hosted
services that are designed to use (or be integrated with) several
components is how to implement the required degree
of isolation between the components when the workload
changes. Achieving the highest degree of isolation
implies deploying a component exclusively for one tenant,
which leads to high resource consumption and running cost
per component. A low degree of isolation allows resources to
be shared, which can reduce cost, but with known
limitations due to performance and security interference. This
paper presents a model-based algorithm, together with four
variants of a metaheuristic that can be used with it, to provide
near-optimal solutions for deploying components of a
cloud-hosted application in a way that guarantees multitenancy
isolation. When the workload changes, the model-based
algorithm solves an open multiclass queueing network (QN) model to
determine the average number of requests that can access
the components, and then uses a metaheuristic to provide
near-optimal solutions for deploying the components. Performance
evaluation showed that the obtained solutions had
low variability and low percent deviation when compared to the
reference/optimal solution. We also provide recommendations
and best-practice guidelines for deploying components
in a way that guarantees the required degree of isolation.
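The analytical step described above — solving an open multiclass queueing network for given arrival rates and per-component service demands — has a well-known product-form solution. The sketch below is a generic open-QN solver under standard assumptions (stable system, routing folded into per-class service demands), not the paper's exact model; the numeric inputs are invented for illustration.

```python
import numpy as np

def solve_open_qn(arrival_rates, demands):
    """Solve an open multiclass queueing network (product form).

    arrival_rates: lambda[c], per-class arrival rates
    demands:       D[c, i], service demand of class c at component i
    Returns per-component utilisation U_i and the mean number of
    class-c requests at component i, N_{c,i}.
    """
    lam = np.asarray(arrival_rates, dtype=float)
    D = np.asarray(demands, dtype=float)
    util = lam @ D                         # U_i = sum_c lambda_c * D_{c,i}
    if np.any(util >= 1.0):
        raise ValueError("unstable: some component is saturated")
    n = (lam[:, None] * D) / (1.0 - util)  # N_{c,i} = lambda_c D_{c,i} / (1 - U_i)
    return util, n

# Two request classes, two components (hypothetical demands)
util, n = solve_open_qn([0.3, 0.2], [[1.0, 0.5], [0.5, 1.5]])
```

In the paper's setting, the per-component request counts produced by a solver like this would feed the metaheuristic that searches for a deployment satisfying the required isolation degree.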
Scheduling of a Cyber-Physical System Simulation
The work carried out in this Ph.D. thesis is part of a broader effort to automate industrial simulation systems. In the aeronautics industry, and more especially within Airbus, the historical application of simulation is pilot training. There are also more recent uses in the design of systems, as well as in the integration of these systems. These latter applications require a very high degree of representativeness, where historically the most important factor has been the pilot’s feeling.
Systems are now divided into several subsystems that are designed, implemented and validated independently, in order to keep them under control despite their increasing complexity and the reduction in time-to-market. Airbus already has expertise in the simulation of these subsystems, as well as in their integration into a simulation. This expertise is empirical: simulation specialists take the schedules of previous integrations and adapt them to a new integration. This process can be time-consuming and can introduce errors.
The current trends in the industry are towards flexible production methods, integration of logistics tools for tracking, use of simulation tools in production, as well as resources optimization. Products are increasingly iterations of older, improved products, and tests and simulations are increasingly integrated into their life cycles.
Working empirically in an industry that requires flexibility is a constraint, and nowadays it is essential to make simulations easy to modify. The problem is therefore to set up methods and tools that allow representative simulation schedules to be generated a priori.
To solve this problem, we developed a method for describing the elements of a simulation and how the simulation can be executed, along with functions to generate schedules. We then implemented a tool that automates the scheduling search, based on heuristics. Finally, we tested and verified our method and tools in academic and industrial case studies.
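As a rough illustration of heuristic schedule generation, the sketch below implements a classic greedy list-scheduling heuristic (longest ready task first) over tasks with precedence constraints. This is a generic textbook heuristic, not the thesis's method; the task names, durations, and two-worker setup are invented.

```python
def list_schedule(durations, preds, num_workers):
    """Greedy list scheduling: repeatedly pick the longest task whose
    predecessors are all scheduled, and place it on the worker where it
    can start without violating precedence constraints.

    durations: {task: duration}; preds: {task: list of predecessor tasks}
    Returns ({task: (worker, start, finish)}, makespan).
    """
    worker_free = [0.0] * num_workers   # time at which each worker is free
    finish = {}
    schedule = {}
    done = set()
    while len(done) < len(durations):
        ready = [t for t in durations
                 if t not in done and set(preds.get(t, ())) <= done]
        ready.sort(key=lambda t: -durations[t])   # longest-task-first heuristic
        t = ready[0]
        w = min(range(num_workers), key=lambda i: worker_free[i])
        start = max(worker_free[w],
                    max((finish[p] for p in preds.get(t, ())), default=0.0))
        finish[t] = start + durations[t]
        worker_free[w] = finish[t]
        schedule[t] = (w, start, finish[t])
        done.add(t)
    return schedule, max(finish.values())

# Hypothetical simulation tasks: C must wait for both A and B
durations = {"A": 2.0, "B": 3.0, "C": 1.0}
preds = {"C": ["A", "B"]}
schedule, makespan = list_schedule(durations, preds, num_workers=2)
```

A scheduling tool of the kind described in the thesis would replace this single greedy pass with a richer search over priority rules and resource assignments, but the ingredients — ready-task sets, precedence constraints, and a heuristic ordering — are the same.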
Detecting malicious VBScripts using an anomaly host-based IDS based on principal component analysis (PCA)
Intrusion detection research over the last twenty years has focused on the threat of individuals illegally hacking into systems. Today, the intrusion threat to computer systems has changed radically: instead of dealing with hackers, most current work focuses on defending systems against code-driven attacks. Web script languages such as VBScript are receiving increasing attention as a backdoor for attacking many computers through e-mail attachments or infected web sites. The nature of these malicious codes is that they can spread widely, causing serious damage to many applications. Moreover, the majority of anti-virus tools used today can detect known attacks but are unable to detect new, unknown attacks. The work in this thesis presents an anomaly host-based Intrusion Detection System (IDS) that provides protection against web attacks from malicious VBScripts. The core of the system treats anomalies as outliers: the IDS model uses a multivariate statistical technique, Principal Component Analysis (PCA), to reduce the dimensionality of the problem while keeping the major principal components of benign instances. The system can then flag malicious scripts that deviate from normal behavior while letting normal scripts pass, so future or unknown VBScript attacks are effectively captured while maintaining a low false-alarm rate.
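The PCA-based outlier detection described above can be sketched as follows: fit principal components on benign feature vectors only, then score a new sample by its reconstruction error (its distance from the benign subspace) and flag large errors as anomalies. This is a minimal generic sketch, not the thesis's implementation; the toy 2-D features stand in for real VBScript features.

```python
import numpy as np

def fit_pca(benign, k):
    """Fit PCA on benign feature vectors; keep the k major components."""
    mean = benign.mean(axis=0)
    # SVD of the centered data; rows of vt are the principal directions
    _, _, vt = np.linalg.svd(benign - mean, full_matrices=False)
    return mean, vt[:k]

def anomaly_score(x, mean, components):
    """Reconstruction error: distance from the benign principal subspace."""
    centered = x - mean
    projected = centered @ components.T @ components
    return np.linalg.norm(centered - projected)

# Toy benign data lying near the line y = 2x, plus a little noise
rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))
benign = np.hstack([t, 2 * t]) + 0.01 * rng.normal(size=(200, 2))
mean, comps = fit_pca(benign, k=1)

normal_score = anomaly_score(np.array([1.0, 2.0]), mean, comps)    # on-subspace
outlier_score = anomaly_score(np.array([1.0, -2.0]), mean, comps)  # off-subspace
```

Scores below a threshold calibrated on benign data would be passed through; scores above it would be flagged — which is how such a detector can catch unknown attacks while keeping false alarms low.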