
    Security consideration for virtualization

    Virtualization is not a new technology, but it has recently experienced a resurgence of interest in industry and research. New products and technologies are emerging quickly and are being deployed with little consideration for security. It is vital to understand that virtualization does not improve security by default; hence, every aspect of virtualization needs to undergo continual security analysis and audit. Virtualization is a dynamic, fast-changing field with an uncertain outcome. In this paper we outline the security model of hypervisors and illustrate the importance of ongoing security analysis by describing several state-of-the-art threat models. Finally, we provide recommendations and design considerations for a more secure virtual infrastructure.

    CamFlow: Managed Data-sharing for Cloud Services

    A model of cloud services is emerging whereby a few trusted providers manage the underlying hardware and communications, whereas many companies build on this infrastructure to offer higher-level, cloud-hosted PaaS services and/or SaaS applications. From the start, strong isolation between cloud tenants was seen to be of paramount importance, provided first by virtual machines (VMs) and later by containers, which share the operating system (OS) kernel. Increasingly, applications also require facilities to effect isolation and protection of the data they manage. They also require flexible data sharing with other applications, often across the traditional cloud-isolation boundaries; for example, when government provides many related services for its citizens on a common platform. Similar considerations apply to the end-users of applications. In particular, the incorporation of cloud services within 'Internet of Things' architectures is driving the requirements for both protection and cross-application data sharing. These concerns relate to the management of data. Traditional access control is application- and principal/role-specific, applied at policy enforcement points, after which there is no subsequent control over where data flows; a crucial issue once data has left its owner's control, passing through cloud-hosted applications and between cloud services. Information Flow Control (IFC), in addition, offers system-wide, end-to-end flow control based on the properties of the data. We discuss the potential of cloud-deployed IFC for enforcing owners' dataflow policy with regard to protection and sharing, as well as safeguarding against malicious or buggy software. In addition, the audit log associated with IFC provides transparency, giving configurable system-wide visibility over data flows. [...]
    Comment: 14 pages, 8 figures
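
    To make the IFC idea concrete, here is a minimal sketch of label-based flow checking in the spirit the abstract describes. The Label class, tag names, and can_flow rule are illustrative assumptions for this sketch, not CamFlow's actual API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Label:
    """Secrecy and integrity tag sets attached to an entity (file, process, message)."""
    secrecy: frozenset = frozenset()
    integrity: frozenset = frozenset()

def can_flow(src: Label, dst: Label) -> bool:
    """A flow src -> dst is safe iff the destination is at least as secret
    (dst carries every secrecy tag of src) and the source meets every
    integrity requirement of the destination."""
    return src.secrecy <= dst.secrecy and dst.integrity <= src.integrity

# Example: a record tagged 'alice-medical' may flow to a process holding
# that tag, but not to an untagged logging service; the blocked flow
# would also be recorded in the audit log the abstract mentions.
record = Label(secrecy=frozenset({"alice-medical"}))
clinic = Label(secrecy=frozenset({"alice-medical", "clinic"}))
logger = Label()

assert can_flow(record, clinic)
assert not can_flow(record, logger)
```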

    Plugging Side-Channel Leaks with Timing Information Flow Control

    The cloud model's dependence on massive parallelism and resource sharing exacerbates the security challenge of timing side-channels. Timing Information Flow Control (TIFC) is a novel adaptation of IFC techniques that may offer a way to reason about, and ultimately control, the flow of sensitive information through systems via timing channels. With TIFC, objects such as files, messages, and processes carry not just content labels describing the ownership of the object's "bits," but also timing labels describing information contained in timing events affecting the object, such as process creation/termination or message reception. With two system design tools, deterministic execution and pacing queues, TIFC enables the construction of "timing-hardened" cloud infrastructure that permits statistical multiplexing while aggregating and rate-limiting timing information leakage between hosted computations.
    Comment: 5 pages, 3 figures
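
    A minimal sketch of one of the two tools mentioned, a pacing queue: output events are released only at a fixed tick, so their timing reveals at most the agreed rate rather than the sender's behaviour. The class and its interface are illustrative assumptions for this sketch, not the paper's implementation.

```python
import threading
import time
from collections import deque

class PacingQueue:
    """Releases at most one queued item per fixed-length tick, so the
    receiver observes only the constant pace, not the producer's timing."""
    def __init__(self, tick_seconds: float):
        self.tick = tick_seconds
        self._items = deque()
        self._lock = threading.Lock()

    def put(self, item):
        with self._lock:
            self._items.append(item)

    def run(self, deliver, ticks: int):
        """Deliver one item (or an explicit idle marker) every tick."""
        deadline = time.monotonic()
        for _ in range(ticks):
            deadline += self.tick
            time.sleep(max(0.0, deadline - time.monotonic()))
            with self._lock:
                item = self._items.popleft() if self._items else None
            deliver(item)  # None = padded idle slot, indistinguishable in timing

q = PacingQueue(tick_seconds=0.1)
q.put("sensitive-result")
q.run(deliver=print, ticks=3)  # output arrives at a fixed 10 Hz pace regardless of input timing
```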

    Machine Learning based Anomaly Detection for Cybersecurity Monitoring of Critical Infrastructures

    Managing critical infrastructures requires increasing reliance on Information and Communication Technologies. Recent years have shown a remarkable increase in the sophistication of attacks; for this reason, it is necessary to develop new algorithms for monitoring these infrastructures. In this scenario, Machine Learning can be a very useful ally. After a brief introduction to cybersecurity in Industrial Control Systems and an overview of the state of the art in Machine Learning based cybersecurity monitoring, the present work proposes three approaches that target different layers of the control network architecture. The first focuses on covert channels based on the DNS protocol, which can be used to establish a command-and-control channel that allows attackers to send malicious commands. The second focuses on the field layer of electrical power systems, proposing a physics-based anomaly detection algorithm for Distributed Energy Resources. The third proposes a first attempt to integrate physical and cyber security systems in order to face complex threats. All three approaches are supported by promising results, which give hope for practical applications in the near future.
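
    As a concrete illustration of the first approach's target: a common heuristic for spotting DNS-based covert channels flags queries whose leftmost label is unusually long and high-entropy, as happens when data is base32/base64-encoded into subdomains. This sketch is a generic baseline, not the thesis's detection model; the thresholds are illustrative assumptions.

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Bits per character of the string's empirical symbol distribution."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def looks_like_dns_tunnel(qname: str,
                          max_label_len: int = 40,
                          entropy_threshold: float = 3.5) -> bool:
    """Flag queries whose leftmost label is long and near-random, as
    happens when data is encoded into subdomains for exfiltration or C2."""
    label = qname.split(".")[0]
    return len(label) > max_label_len and shannon_entropy(label) > entropy_threshold

print(looks_like_dns_tunnel("www.example.com"))  # False: short, low-entropy label
print(looks_like_dns_tunnel(                     # True: long base32-like label
    "mzxw6ytboi2gs3lpebqw4zjamzxw6ytboi2gs3lpebqw4zjab24q.evil.example"))
```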

    Discovering New Vulnerabilities in Computer Systems

    Vulnerability research plays a key role in preventing and defending against malicious computer system exploitations. Driven by a multi-billion dollar underground economy, cyber criminals today tirelessly launch malicious exploitations, threatening every aspect of daily computing. To effectively protect computer systems from devastation, it is imperative to discover and mitigate vulnerabilities before they fall into the offensive parties' hands. This dissertation is dedicated to the research and discovery of new design and deployment vulnerabilities in three very different types of computer systems.

    The first vulnerability is found in automatic malicious binary (malware) detection systems. Binary analysis, a central piece of technology for malware detection, is divided into two classes: static analysis and dynamic analysis. State-of-the-art detection systems employ both classes of analysis to complement each other's strengths and weaknesses for improved detection results. However, we found that the commonly seen design patterns may suffer from evasion attacks. We demonstrate attacks on the vulnerabilities by designing and implementing a novel binary obfuscation technique.

    The second vulnerability is located in the design of server system power management. Technological advancements have improved server system power efficiency and facilitated energy-proportional computing. However, the change of power profile makes power consumption subject to unaudited influence by remote parties, leaving server systems vulnerable to energy-targeted malicious exploits. We demonstrate an energy-abusing attack on a standalone open Web server, measure the extent of the damage, and present a preliminary defense strategy.

    The third vulnerability is discovered in the application of server virtualization technologies. Server virtualization greatly benefits today's data centers and brings pervasive cloud computing a step closer to the general public. However, the practice of physically co-hosting virtual machines with different security privileges risks introducing covert channels that seriously threaten information security in the cloud. We study the construction of high-bandwidth covert channels via the memory sub-system, and show a practical exploit of cross-virtual-machine covert channels on virtualized x86 platforms.
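
    To illustrate the covert-channel construction at a protocol level: the sender modulates contention on a shared resource during agreed time slots, and the receiver decodes bits by timing its own accesses. In the dissertation the shared resource is the memory sub-system of co-resident VMs; the sketch below is an assumption-laden toy that abstracts the hardware behind injected contend/probe_latency functions and simulates them with a flag, purely to show the slot-based encoding.

```python
import threading
import time

SLOT = 0.05  # seconds per bit; sender and receiver agree on this schedule

def send_bits(bits, contend):
    """Sender: during each slot, create contention on the shared resource
    for a 1 bit, stay quiet for a 0 bit."""
    for b in bits:
        end = time.monotonic() + SLOT
        while time.monotonic() < end:
            if b:
                contend()        # in the real attack: pressure shared memory/cache
            else:
                time.sleep(0.001)

def receive_bits(n, probe_latency, threshold):
    """Receiver: time its own accesses in each slot; high mean latency means
    the sender is contending, which decodes as a 1."""
    out = []
    for _ in range(n):
        end, samples = time.monotonic() + SLOT, []
        while time.monotonic() < end:
            samples.append(probe_latency())
            time.sleep(0.001)
        out.append(1 if sum(samples) / len(samples) > threshold else 0)
    return out

# Toy stand-ins: a flag simulates the latency the receiver would measure.
busy = threading.Event()

def contend():
    busy.set()
    time.sleep(0.001)
    busy.clear()

def probe_latency():
    return 2.0 if busy.is_set() else 1.0

message = [1, 0, 1, 1]
t = threading.Thread(target=send_bits, args=(message, contend))
t.start()
print(receive_bits(len(message), probe_latency, threshold=1.5))  # expect [1, 0, 1, 1]
t.join()
```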