    Immutable Infrastructure Calls for Immutable Architecture

    With the advent of cloud computing and the concept of immutable infrastructure, the scaling and deployment of applications have become significantly easier. As an operations team manages the resulting cluster of machines, both virtual and physical, the possibility of “configuration drift” grows. In this paper we propose a revised view of configuration and architecture: software deployed on a public or private cloud should, to the furthest possible extent, be immutable and source controlled. This reduces configuration drift and helps ensure that updates or changes do not introduce configuration problems in production. We show an example of a software project deployed on Amazon Web Services with an immutable Jenkins setup that manages updates to the whole cluster and is self-regenerating. We also discuss how this approach lends itself naturally to interoperability between clouds, owing to its infrastructure-agnostic nature.
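
    The paper's approach is summarized here only in prose; as a minimal, hypothetical sketch of the immutable deployment style it advocates, the Python snippet below (using boto3, with a made-up AMI ID, tag, and instance type) replaces running instances with fresh ones launched from a versioned machine image instead of modifying servers in place.

        # Illustrative sketch only: roll the fleet forward by launching fresh
        # instances from a versioned, CI-built image and retiring the old ones,
        # rather than mutating running servers. AMI ID, tag, and instance type
        # are hypothetical.
        import boto3

        ec2 = boto3.client("ec2", region_name="eu-west-1")

        NEW_AMI = "ami-0123456789abcdef0"  # image built and version-tagged by CI (e.g. Jenkins)

        def find_running(role):
            """Return IDs of running instances carrying the given role tag."""
            resp = ec2.describe_instances(
                Filters=[
                    {"Name": "tag:role", "Values": [role]},
                    {"Name": "instance-state-name", "Values": ["running"]},
                ]
            )
            return [i["InstanceId"]
                    for r in resp["Reservations"] for i in r["Instances"]]

        def replace_fleet(role, count):
            """Launch immutable replacements from NEW_AMI, then terminate the old fleet."""
            old = find_running(role)
            ec2.run_instances(
                ImageId=NEW_AMI, InstanceType="t3.micro",
                MinCount=count, MaxCount=count,
                TagSpecifications=[{"ResourceType": "instance",
                                    "Tags": [{"Key": "role", "Value": role}]}],
            )
            if old:
                ec2.terminate_instances(InstanceIds=old)

        replace_fleet(role="web", count=2)

    Because every change goes through a new image rather than an in-place edit, the running fleet always matches what is in source control, which is the property the abstract describes as avoiding configuration drift.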

    Blockchain For Food: Making Sense of Technology and the Impact on Biofortified Seeds

    The global food system is under pressure and is in the early stages of a major transition towards more transparency, circularity, and personalisation. In the coming decades there will be an increasing need to produce more food with fewer resources, so raising crop yields and the nutritional value of each crop is arguably an important factor in this global food transition. Biofortification can play an important role in feeding the world: biofortified seeds yield produce with increased nutritional value, mainly minerals and vitamins, while using the same or fewer resources than non-biofortified variants. However, a farmer cannot distinguish a biofortified seed from a regular seed. Because the enhancement is invisible, counterfeit seeds are common, and these fraudulent seeds pose a major obstacle to the wide-scale adoption of biofortified crops. A system that can guarantee the origin of biofortified seeds is therefore required to ensure widespread adoption. This trust-ensuring, immutable proof of origin can be provided via blockchain technology.
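
    The abstract does not name a specific blockchain platform; as a minimal sketch of the underlying idea, the hypothetical Python ledger below chains seed-batch provenance records with SHA-256 hashes so that a recorded origin claim cannot be altered unnoticed afterwards (batch IDs, producers, and certificates are placeholders).

        # Hypothetical hash-chained provenance ledger for seed batches; a real
        # deployment would use an actual blockchain platform rather than this toy.
        import hashlib, json, time

        class SeedProvenanceLedger:
            def __init__(self):
                self.blocks = []  # each block fixes one provenance record

            def add_record(self, batch_id, producer, certificate):
                prev_hash = self.blocks[-1]["hash"] if self.blocks else "0" * 64
                payload = {
                    "batch_id": batch_id,        # e.g. lot number printed on the seed bag
                    "producer": producer,        # certified biofortified-seed producer
                    "certificate": certificate,  # reference issued by a certification body
                    "timestamp": time.time(),
                    "prev_hash": prev_hash,
                }
                digest = hashlib.sha256(
                    json.dumps(payload, sort_keys=True).encode()).hexdigest()
                self.blocks.append({"payload": payload, "hash": digest})
                return digest

            def verify(self):
                """Recompute every hash; any tampering breaks the chain."""
                prev = "0" * 64
                for block in self.blocks:
                    payload = block["payload"]
                    recomputed = hashlib.sha256(
                        json.dumps(payload, sort_keys=True).encode()).hexdigest()
                    if payload["prev_hash"] != prev or recomputed != block["hash"]:
                        return False
                    prev = block["hash"]
                return True

        ledger = SeedProvenanceLedger()
        ledger.add_record("LOT-0042", "Example Seed Co.", "CERT-2021-001")
        assert ledger.verify()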

    Kevoree Modeling Framework (KMF): Efficient modeling techniques for runtime use

    The creation of Domain-Specific Languages (DSLs) is one of the main goals in the field of Model-Driven Software Engineering (MDSE). The main purpose of these DSLs is to facilitate the manipulation of domain-specific concepts by providing developers with tools tailored to their domain of expertise. A natural approach to creating DSLs is to reuse existing modeling standards and tools. In this area, the Eclipse Modeling Framework (EMF) has rapidly become the de facto standard in MDSE for building DSLs and tools based on generative techniques. However, the use of EMF-generated tools in domains such as the Internet of Things (IoT), Cloud Computing, or Models@Runtime runs into several limitations. In this paper, we identify several properties that generated tools must satisfy to be usable in domains other than desktop-based software systems. We then challenge EMF on these properties and describe our approach to overcoming the limitations. Our approach, implemented in the Kevoree Modeling Framework (KMF), is finally evaluated against the identified properties and compared to EMF. Comment: ISBN 978-2-87971-131-7; N° TR-SnT-2014-11 (2014).
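
    KMF itself generates Java and JavaScript model APIs, which the abstract only alludes to; the toy Python sketch below merely illustrates the generative idea behind such frameworks, deriving lightweight, immutable model classes from a small metamodel description (the metamodel format and class names are invented for illustration and are not the KMF or EMF API).

        # Toy illustration of generative modeling techniques: derive one frozen
        # dataclass per metamodel concept. Not the KMF or EMF API.
        from dataclasses import make_dataclass, field

        METAMODEL = {
            "Sensor": ["id", "unit", "value"],
            "Device": ["id", "name", "sensors"],
        }

        def generate_classes(metamodel):
            """Generate one immutable class per concept in the metamodel."""
            return {
                name: make_dataclass(
                    name,
                    [(attr, object, field(default=None)) for attr in attrs],
                    frozen=True,  # frozen objects are cheap to share between threads
                )
                for name, attrs in metamodel.items()
            }

        classes = generate_classes(METAMODEL)
        Sensor = classes["Sensor"]
        reading = Sensor(id="s1", unit="celsius", value=21.5)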

    IPCFA: A Methodology for Acquiring Forensically-Sound Digital Evidence in the Realm of IAAS Public Cloud Deployments

    Cybercrimes and digital security breaches are on the rise: savvy businesses and organizations of all sizes must ready themselves for the worst. Cloud computing has become the new normal, opening even more doors for cybercriminals to commit crimes that are not easily traceable. The fast pace of technology adoption exceeds the speed at which the cybersecurity community and law enforcement agencies (LEAs) can devise countermeasures to investigate and prosecute such criminals. While presenting defensible digital evidence in courts of law is already complex, it becomes more complicated when the crime is tied to public cloud computing, where storage, network, and computing resources are shared and dispersed over multiple geographical areas. Investigating such crimes involves collecting evidence from the public cloud that is sound enough for court. Digital evidence admissibility in the U.S. is governed predominantly by the Federal Rules of Evidence and the Federal Rules of Civil Procedure. Evidence authenticity can be challenged by the Daubert test, which evaluates the forensic process used to generate the presented evidence. Existing digital forensics models, methodologies, and processes have not adequately addressed crimes that take place in the public cloud. It was only in late 2020 that the Scientific Working Group on Digital Evidence (SWGDE) published a document that shed light on best practices for collecting evidence from cloud providers. Yet SWGDE’s publication does not address the gap between the technology and the legal system when it comes to evidence admissibility. The document is high level, focusing on law enforcement processes such as issuing subpoenas and preservation orders to the cloud provider. This research proposes IaaS Public Cloud Forensic Acquisition (IPCFA), a methodology for acquiring forensically sound evidence from public cloud IaaS deployments. IPCFA focuses on bridging the gap between the legal and technical sides of evidence authenticity to help produce admissible evidence that can withstand scrutiny in U.S. courts. Grounded in design science research (DSR), the research is rigorously evaluated using two hypothetical scenarios for crimes that take place in the public cloud. The first scenario takes place on AWS and is walked through hypothetically. The second scenario demonstrates IPCFA’s applicability and effectiveness on Azure Cloud. Both cases are evaluated using a rubric built from the federal and civil digital evidence requirements and international best practices for digital evidence, showing the effectiveness of IPCFA in generating cloud evidence sound enough to be considered admissible in court.
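
    IPCFA is presented as a methodology rather than a tool, so the snippet below is only a hypothetical sketch of one acquisition step it implies: snapshotting a suspect EBS volume and appending a hash-protected chain-of-custody record (the volume ID, case ID, and examiner name are made up, and this is not the rubric or tooling from the dissertation).

        # Hypothetical sketch of a single cloud acquisition step: snapshot a
        # suspect EBS volume and append a tamper-evident chain-of-custody record.
        # Volume ID and case metadata are invented.
        import boto3, hashlib, json, datetime

        ec2 = boto3.client("ec2", region_name="us-east-1")

        def acquire_volume(volume_id, case_id, examiner):
            snap = ec2.create_snapshot(
                VolumeId=volume_id,
                Description=f"Forensic acquisition for case {case_id}",
            )
            record = {
                "case_id": case_id,
                "examiner": examiner,
                "volume_id": volume_id,
                "snapshot_id": snap["SnapshotId"],
                "acquired_at_utc": datetime.datetime.utcnow().isoformat(),
            }
            # Hashing the custody record itself makes later edits detectable.
            record["record_sha256"] = hashlib.sha256(
                json.dumps(record, sort_keys=True).encode()).hexdigest()
            with open(f"custody_{case_id}.json", "a") as log:
                log.write(json.dumps(record) + "\n")
            return record

        acquire_volume("vol-0abc1234def567890", case_id="CASE-001", examiner="J. Doe")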

    CernVM Online and Cloud Gateway: a uniform interface for CernVM contextualization and deployment

    In a virtualized environment, contextualization is the process of configuring a VM instance for the needs of various deployment use cases. Contextualization in CernVM can be done by passing a handwritten context to the user data field of cloud APIs when running CernVM in the cloud, or by using the CernVM web interface when running the VM locally. CernVM Online is a publicly accessible web interface that unifies these two procedures. A user is able to define, store and share CernVM contexts using CernVM Online and then apply them either in a cloud by using CernVM Cloud Gateway or on a local VM with the single-step pairing mechanism. CernVM Cloud Gateway is a distributed system that provides a single interface to use multiple and different clouds (by location or type, private or public). So far, Cloud Gateway has been integrated with the OpenNebula, CloudStack and EC2 tools interfaces. A user with access to a number of clouds can run CernVM cloud agents that communicate with these clouds using their interfaces, and then use one single interface to deploy and scale CernVM clusters. CernVM clusters are defined in CernVM Online and consist of a set of CernVM instances that are contextualized and can communicate with each other. Comment: Conference paper at the 2013 Computing in High Energy Physics (CHEP) Conference, Amsterdam.
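
    The abstract mentions passing a context through the user data field of cloud APIs; the sketch below shows only that generic mechanism with boto3 against EC2, not CernVM Online or Cloud Gateway themselves (the image ID, instance type, and context body are placeholders, not an actual CernVM context definition).

        # Generic illustration of contextualization via the user-data field of a
        # cloud API. Image ID, instance type, and context body are placeholders.
        import boto3

        USER_DATA_CONTEXT = """\
        [placeholder-context]
        repositories=example-repo
        services=example-service
        """

        ec2 = boto3.client("ec2", region_name="eu-west-1")

        response = ec2.run_instances(
            ImageId="ami-00000000000000000",  # placeholder VM image ID
            InstanceType="m5.large",
            MinCount=1,
            MaxCount=1,
            UserData=USER_DATA_CONTEXT,       # context consumed by the instance at first boot
        )
        print(response["Instances"][0]["InstanceId"])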

    Lex Informatica: The Formulation of Information Policy Rules through Technology

    Historically, law and government regulation have established default rules for information policy, including constitutional rules on freedom of expression and statutory rights of ownership of information. This Article will show that for network environments and the Information Society, however, law and government regulation are not the only source of rule-making. Technological capabilities and system design choices impose rules on participants. The creation and implementation of information policy are embedded in network designs and standards as well as in system configurations. Even user preferences and technical choices create overarching, local default rules. This Article argues, in essence, that the set of rules for information flows imposed by technology and communication networks forms a “Lex Informatica” that policymakers must understand, consciously recognize, and encourage.