Risk and Compliance Management for Cloud Computing Services: Designing a Reference Model
More and more companies are making use of Cloud Computing Services in order to reduce costs and to increase the flexibility of their IT infrastructures. Currently, the focus is shifting towards problems of risk and compliance, which also include the realm of Cloud Computing security. For instance, since the storage locations of data may shift or remain unknown to the user, the problem of the applicable jurisdiction arises and impedes the adoption and management of Cloud Computing Services. Therefore, companies need new methods to avoid being fined for compliance violations, to manage risk factors, and to manage processes and decision rights. This paper presents a reference model that serves to support companies in managing and reducing risk and compliance efforts. We developed the model on the solid basis of a systematic literature review and practical requirements derived by analyzing Cloud Computing Service offers.
Panel: Why do we toil? Benefiting research at the cost of practice or vice versa?
In this paper, we present a systematic literature review in the field of IT Outsourcing with a focus on risk management. The source material of the review consists of 97 high-quality journal articles published in 18 journals between 2001 and September 2008. Besides an analysis of related work, this review provides an overview of applied research methods and theories in the field of IT Outsourcing. The articles are then analyzed from a risk management point of view to highlight key risk factors and their specific impact on IT Outsourcing. Identified risk factors are further analyzed in order to assign each risk factor to the phases of a typical IT Outsourcing process (life-cycle). The results of the review show that empirical research is the most frequently applied method and that action research and reference modelling have not been used at all so far. Furthermore, elements of a research agenda are discussed in order to determine further steps toward the construction of a reference model for risk management in IT Outsourcing. This paper mainly aims at an audience of experienced researchers in the field of IT Outsourcing who are looking for research ideas and at junior scientists (e.g. PhD students) entering this emerging field of research.
Understanding the Cloud Computing Ecosystem: Results from a Quantitative Content Analysis
An increasing number of companies make use of Cloud Computing services in order to reduce costs and increase the flexibility of their IT infrastructure. This has enlivened a debate on the benefits and risks of Cloud Computing, among both practitioners and researchers. This study applies quantitative content analysis to explore the Cloud Computing ecosystem. The analyzed data comprises high-quality research articles and practitioner-oriented articles from magazines and web sites. We apply n-grams and the cluster algorithm k-means to analyze the literature. The contribution of this paper is twofold: First, it identifies the key terms and topics that are part of the Cloud Computing ecosystem, which we aggregated into a comprehensive model. Second, this paper discloses the sentiments of key topics as reflected in articles from both practice and academia.
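The abstract above names its method as n-gram extraction followed by k-means clustering. A minimal, self-contained sketch of that combination could look like the following; the sample texts, bigram size, and number of clusters are illustrative assumptions, not the paper's actual corpus or configuration.

```python
# Illustrative sketch only: cluster short texts by bigram counts with a
# tiny k-means. Sample texts and parameters are invented for the example.
from collections import Counter

def ngrams(text, n=2):
    words = text.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

def vectorize(texts, n=2):
    # One dimension per distinct n-gram across the whole corpus.
    vocab = sorted({g for t in texts for g in ngrams(t, n)})
    index = {g: i for i, g in enumerate(vocab)}
    vecs = []
    for t in texts:
        v = [0.0] * len(vocab)
        for g, c in Counter(ngrams(t, n)).items():
            v[index[g]] = float(c)
        vecs.append(v)
    return vecs

def kmeans(vecs, k=2, iters=10):
    # Deterministic init: spread the initial centers over the input.
    step = max(1, len(vecs) // k)
    centers = [vecs[i * step] for i in range(k)]
    labels = [0] * len(vecs)
    for _ in range(iters):
        # Assign each vector to its nearest center (squared distance).
        labels = [
            min(range(k),
                key=lambda j: sum((a - b) ** 2 for a, b in zip(v, centers[j])))
            for v in vecs
        ]
        # Recompute each center as the mean of its members.
        for j in range(k):
            members = [v for v, l in zip(vecs, labels) if l == j]
            if members:
                centers[j] = [sum(col) / len(members) for col in zip(*members)]
    return labels

texts = [
    "cloud computing services reduce costs",
    "cloud computing services increase flexibility",
    "query answering for nested word automata",
    "earliest query answering streaming xml",
]
labels = kmeans(vectorize(texts), k=2)
print(labels)  # texts on the same topic share a cluster label
```

In practice the study would operate on a much larger term space; the point of the sketch is only the pipeline shape: n-gram counts become feature vectors, and k-means groups documents whose n-gram profiles overlap.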
DESIGN AND IMPLEMENTATION OF A COMMUNITY PLATFORM FOR THE EVALUATION AND SELECTION OF CLOUD COMPUTING SERVICES: A MARKET ANALYSIS
The large number of available Cloud Computing Services makes it hard for companies to keep an overview of the market and to identify the services that best fit their needs. Also, the search for the most suitable Cloud Computing Services often takes too much time and money. The community platform presented in this article was designed to assist companies and users in solving this problem by enabling them to identify relevant Cloud Computing Services. Furthermore, users have the option of evaluating individual services and of accessing the evaluations submitted to the community platform by other users. The paper describes the design and the prototypical implementation of the platform and introduces a maturity model for the quality assessment of Cloud Computing Services listed in the platform’s underlying database. The authors also provide recommendations for further action based on a first analysis of the market situation. Our research can be characterized as a design-oriented research approach that focuses on the design of IT artifacts (i.e. the community platform and the underlying maturity model). Both IT artifacts are evaluated by means of expert interviews and by feedback from users testing our community platform.
Earliest Query Answering for Deterministic Nested Word Automata
Earliest query answering (EQA) is an objective of many recent streaming algorithms for XML query answering that aim for close-to-optimal memory management. In this paper, we show that EQA is infeasible even for a small fragment of Forward XPath unless P=NP. We then present an EQA algorithm for queries and schemas defined by deterministic nested word automata (dNWAs) and distinguish a large class of dNWAs for which streaming query answering is feasible in polynomial space and time.
Bounded Delay and Concurrency for Earliest Query Answering
Earliest query answering is needed for streaming XML processing with optimal memory management. We study the feasibility of earliest query answering for node selection queries. Tractable queries are distinguished by a bounded number of concurrently alive answer candidates at every time point and a bounded delay for node selection. We show that both properties are decidable in polynomial time for queries defined by deterministic automata for unranked trees. Our results are obtained by reduction to the bounded valuedness problem for recognizable relations between unranked trees.
The status of the energy calibration, polarization and monochromatization of the FCC-ee
The Future Circular electron-positron Collider, FCC-ee, is designed for unprecedented precision in particle physics experiments from the Z-pole up to above the top-pair threshold, corresponding to a beam energy range from 45.6 to 182.5 GeV. Performing collisions at various particle-physics resonances requires precise knowledge of the centre-of-mass energy (ECM) and collision boosts at all four interaction points. Measurement of the ECM by resonant depolarization of transversely polarized pilot bunches, in combination with a 3D polarimeter, aims to achieve a systematic uncertainty of 4 keV and 100 keV for the Z-pole and W-pair-threshold energies, respectively. The ECM itself depends on the RF-cavity locations, beamstrahlung, longitudinal impedance, the Earth’s tides, opposite-sign dispersion and possible collision offsets. Application of monochromatization schemes is envisaged at certain beam energies to reduce the energy spread. The latest results of studies of the energy calibration, polarization and monochromatization are reported here.
Summary of the ISEV workshop on extracellular vesicles as disease biomarkers, held in Birmingham, UK, during December 2017
This report summarises the presentations and activities of the ISEV Workshop on extracellular vesicle biomarkers held in Birmingham, UK during December 2017. Among the key messages was broad agreement about the importance of biospecimen science. Much greater attention needs to be paid to the provenance of collected samples. The workshop also highlighted clear gaps in our knowledge about pre-analytical factors that alter extracellular vesicles (EVs). The future utility of certified standards for credentialing instruments and software to analyse EVs, and for tracking the influence of isolation steps on the structure and content of EVs, was also discussed. Several example studies were presented, demonstrating the potential utility of EVs in disease diagnosis, prognosis, longitudinal serial testing and stratification of patients. The conclusion of the workshop was that more effort focused on pre-analytical issues and benchmarking of isolation methods is needed to strengthen collaborations and advance more effective biomarkers.
The Large Hadron-Electron Collider at the HL-LHC
The Large Hadron-Electron Collider (LHeC) is designed to move the field of deep inelastic scattering (DIS) to the energy and intensity frontier of particle physics. Exploiting energy-recovery technology, it collides a novel, intense electron beam with a proton or ion beam from the High-Luminosity Large Hadron Collider (HL-LHC). The accelerator and interaction region are designed for concurrent electron-proton and proton-proton operations. This report represents an update to the LHeC's conceptual design report (CDR), published in 2012. It comprises new results on the parton structure of the proton and heavier nuclei, QCD dynamics, and electroweak and top-quark physics. It is shown how the LHeC will open a new chapter of nuclear particle physics by extending the accessible kinematic range of lepton-nucleus scattering by several orders of magnitude. Due to its enhanced luminosity and large energy and the cleanliness of the final hadronic states, the LHeC has a strong Higgs physics programme and its own discovery potential for new physics. Building on the 2012 CDR, this report contains a detailed updated design for the energy-recovery electron linac (ERL), including a new lattice, magnet and superconducting radio-frequency technology, and further components. Challenges of energy recovery are described, and the lower-energy, high-current, three-turn ERL facility, PERLE at Orsay, is presented, which uses the LHeC characteristics, serving as a development facility for the design and operation of the LHeC. An updated detector design is presented corresponding to the acceptance, resolution, and calibration goals that arise from the Higgs and parton-density-function physics programmes. This paper also presents novel results for the Future Circular Collider in electron-hadron (FCC-eh) mode, which utilises the same ERL technology to further extend the reach of DIS to even higher centre-of-mass energies.