
    Decision Support Tools for Cloud Migration in the Enterprise

    This paper describes two tools that aim to support decision making during the migration of IT systems to the cloud. The first is a modeling tool that produces cost estimates of using public IaaS clouds. It enables IT architects to model their applications, data and infrastructure requirements, together with their computational resource usage patterns, and to compare the cost of different cloud providers, deployment options and usage scenarios. The second tool is a spreadsheet that outlines the benefits and risks of using IaaS clouds from an enterprise perspective and provides a starting point for risk assessment. Two case studies were used to evaluate the tools, which proved useful in informing decision makers about the costs, benefits and risks of using the cloud.
    Comment: To appear in IEEE CLOUD 201
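
    The abstract gives no detail of the cost model itself, but the idea of turning modelled resource usage patterns and provider price lists into comparable cost estimates can be sketched as below. This is a minimal sketch; the provider names, prices and usage figures are invented for illustration and do not come from the paper or any real price list.

        # Minimal cost-comparison sketch in the spirit of the modelling tool described
        # above. Provider names, prices and the usage pattern are illustrative
        # placeholders, not figures from the paper or from any real provider.
        from dataclasses import dataclass

        @dataclass
        class UsagePattern:
            vm_hours: float      # instance hours per month
            storage_gb: float    # persistent storage per month
            egress_gb: float     # outbound data transfer per month

        @dataclass
        class PriceList:
            per_vm_hour: float
            per_storage_gb: float
            per_egress_gb: float

        def monthly_cost(usage: UsagePattern, prices: PriceList) -> float:
            """Monthly cost of one deployment option under one provider's prices."""
            return (usage.vm_hours * prices.per_vm_hour
                    + usage.storage_gb * prices.per_storage_gb
                    + usage.egress_gb * prices.per_egress_gb)

        if __name__ == "__main__":
            usage = UsagePattern(vm_hours=2 * 730, storage_gb=500, egress_gb=200)  # two VMs, 24/7
            providers = {
                "provider_a": PriceList(per_vm_hour=0.10, per_storage_gb=0.05, per_egress_gb=0.09),
                "provider_b": PriceList(per_vm_hour=0.12, per_storage_gb=0.04, per_egress_gb=0.08),
            }
            for name, prices in providers.items():
                print(f"{name}: ${monthly_cost(usage, prices):,.2f} per month")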

    Cloud Migration: A Case Study of Migrating an Enterprise IT System to IaaS

    This case study examines the potential benefits and risks of migrating an IT system in the oil & gas industry from an in-house data center to Amazon EC2, viewed from a broad variety of stakeholder perspectives across the enterprise and thus going beyond the typically narrow financial and technical analysis offered by providers. Our results show that the system infrastructure in the case study would have cost 37% less over 5 years on EC2, and that using cloud computing could have potentially eliminated 21% of the support calls for this system. These findings seem significant enough to call for a migration of the system to the cloud, but our stakeholder impact analysis revealed significant associated risks. Whilst the benefits of using the cloud are attractive, we argue that enterprise decision-makers must consider the overall organizational implications of the changes brought about by cloud computing, to avoid implementing local optimizations at the cost of organization-wide performance.
    Comment: Submitted to IEEE CLOUD 201
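
    The abstract reports only relative figures, so the baseline numbers in the short worked example below are hypothetical; only the 37% cost saving and the 21% support-call reduction are taken from the case study.

        # Worked example of the 5-year comparison implied by the figures above.
        # The in-house baseline cost and call volume are made-up placeholders; only
        # the 37% saving and the 21% support-call reduction come from the case study.
        inhouse_5yr_cost = 1_000_000.0                  # hypothetical 5-year in-house cost (USD)
        ec2_5yr_cost = inhouse_5yr_cost * (1 - 0.37)    # "37% less over 5 years on EC2"

        support_calls_per_year = 480                    # hypothetical call volume
        calls_avoided = support_calls_per_year * 0.21   # "21% of the support calls"

        print(f"5-year EC2 cost:       ${ec2_5yr_cost:,.0f} "
              f"(saving ${inhouse_5yr_cost - ec2_5yr_cost:,.0f})")
        print(f"Support calls avoided: {calls_avoided:.0f} per year")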

    Split and Migrate: Resource-Driven Placement and Discovery of Microservices at the Edge

    Microservices architectures combine the use of fine-grained and independently-scalable services with lightweight communication protocols, such as REST calls over HTTP. Microservices bring flexibility to the development and deployment of application back-ends in the cloud. Applications such as collaborative editing tools require frequent interactions between the front-end running on users' machines and a back-end formed of multiple microservices. User-perceived latencies depend on the users' connection to the microservices, but also on the interaction patterns between these services and their databases. Placing services at the edge of the network, closer to the users, is necessary to reduce user-perceived latencies. It is however difficult to decide on the placement of complete stateful microservices at one specific core or edge location without trading a latency reduction for some users against a latency increase for others. We present how to dynamically deploy microservices on a combination of core and edge resources to systematically reduce user-perceived latencies. Our approach enables stateful microservices to be split, and the resulting splits to be placed on appropriate core and edge sites. Koala, a decentralized and resource-driven service discovery middleware, enables REST calls to reach and use the appropriate split, with only minimal changes to a legacy microservices application. Locality awareness using network coordinates further makes it possible to automatically migrate service splits so that they follow the location of the users. We confirm the effectiveness of our approach with a full prototype and an application to ShareLatex, a microservices-based collaborative editing application.
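
    The abstract stays at the architectural level; the sketch below illustrates, under stated assumptions, the kind of coordinate-based locality decision it describes: each site hosting a split advertises a network coordinate, and a user's REST calls are routed to the split at the closest site. The site names, coordinates and URLs are invented, and this is a generic sketch, not Koala's actual interface.

        # Minimal sketch of locality-aware selection of a service split using network
        # coordinates, in the spirit of the approach described above. Coordinate values,
        # site names and URLs are illustrative; this is not Koala's API.
        import math
        from dataclasses import dataclass

        @dataclass
        class Site:
            name: str
            coord: tuple          # network coordinate (e.g. a 2-D latency embedding)
            split_url: str        # base URL of the microservice split hosted at this site

        def distance(a: tuple, b: tuple) -> float:
            """Euclidean distance between two network coordinates (a latency estimate)."""
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

        def closest_split(user_coord: tuple, sites: list) -> Site:
            """Route the user's REST calls to the split whose site is closest in coordinate space."""
            return min(sites, key=lambda s: distance(user_coord, s.coord))

        if __name__ == "__main__":
            sites = [
                Site("core-dc", (0.0, 0.0), "http://core.example.org/api"),
                Site("edge-eu", (12.0, 3.0), "http://edge-eu.example.org/api"),
                Site("edge-us", (-40.0, 8.0), "http://edge-us.example.org/api"),
            ]
            user = (10.5, 2.0)    # coordinate estimated for this user's front-end
            target = closest_split(user, sites)
            print(f"Routing REST calls to {target.name} at {target.split_url}")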