
    Notes on Cloud computing principles

    This letter provides a review of fundamental distributed systems and economic Cloud computing principles. These principles are frequently deployed in their respective fields, but their inter-dependencies are often neglected. Given that Cloud computing is, first and foremost, a new business model, a new model for selling computational resources, the understanding of these concepts is facilitated by treating them in unison. Here, we review some of the most important concepts and how they relate to each other.

    High-Performance Cloud Computing: A View of Scientific Applications

    Scientific computing often requires a massive number of computers for performing large-scale experiments. Traditionally, these needs have been addressed with high-performance computing solutions and installed facilities such as clusters and supercomputers, which are difficult to set up, maintain, and operate. Cloud computing provides scientists with a completely new model of utilizing the computing infrastructure. Compute resources, storage resources, and applications can be dynamically provisioned (and integrated within the existing infrastructure) on a pay-per-use basis, and released when they are no longer needed. Such services are often offered within the context of a Service Level Agreement (SLA), which ensures the desired Quality of Service (QoS). Aneka, an enterprise Cloud computing solution, harnesses the power of compute resources by relying on private and public Clouds and delivers the desired QoS to users. Its flexible and service-based infrastructure supports multiple programming paradigms that allow Aneka to address a variety of scenarios, from finance applications to computational science. As examples of scientific computing in the Cloud, we present a preliminary case study on using Aneka for the classification of gene expression data and the execution of an fMRI brain imaging workflow.
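
    As a hedged illustration of the pay-per-use provisioning cycle this abstract describes (an editorial sketch only, not Aneka's actual API; CloudProvider, SlaTarget, provision, and release are hypothetical names), the acquire-use-release pattern might look like:

        # Minimal sketch of pay-per-use provisioning with an SLA target.
        # All names here (CloudProvider, SlaTarget, provision, release) are
        # hypothetical; they illustrate the pattern, not any vendor's real API.
        from dataclasses import dataclass
        import time

        @dataclass
        class SlaTarget:
            max_latency_ms: float   # QoS bound agreed in the SLA
            min_nodes: int          # nodes to provision for the run

        class CloudProvider:
            """Toy provider that bills per node-second of usage."""
            def __init__(self, rate_per_node_second: float):
                self.rate = rate_per_node_second
                self.active = {}    # node id -> provisioning time

            def provision(self, n_nodes: int):
                now = time.time()
                ids = [f"node-{i}" for i in range(n_nodes)]
                for node_id in ids:
                    self.active[node_id] = now
                return ids

            def release(self, node_ids):
                now = time.time()
                # pay only for the time the nodes were actually held
                return sum((now - self.active.pop(i)) * self.rate for i in node_ids)

        def run_experiment(provider: CloudProvider, sla: SlaTarget, workload):
            nodes = provider.provision(sla.min_nodes)    # scale out for the run
            try:
                return workload(nodes)                   # e.g. a classification job
            finally:
                bill = provider.release(nodes)           # release when no longer needed
                print(f"released {len(nodes)} nodes, billed {bill:.6f} units")

    A call such as run_experiment(CloudProvider(0.001), SlaTarget(200.0, 4), my_workflow) would hold the nodes only for the duration of the workflow, which is the economic point of the pay-per-use model.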

    Cloud Computing in the Quantum Era

    Cloud computing has become the prominent technology of this era. Its elasticity, dynamicity, availability, heterogeneity, and pay-as-you-go pricing model have attracted many companies to migrate their business services into the cloud. This gives them more time to focus solely on their businesses and reduces management and backup overhead by leveraging the flexibility of cloud computing. On the other hand, quantum technology is developing very rapidly. Experts expect an efficient quantum computer within the next decade, with a significant impact on several fields, including cryptography and medical research. This paper analyses the impact of quantum technology on cloud computing and vice versa.

    Fog Computing: A Taxonomy, Survey and Future Directions

    In recent years, the number of Internet of Things (IoT) devices/sensors has increased enormously. To support the computational demand of real-time, latency-sensitive applications of largely geo-distributed IoT devices/sensors, a new computing paradigm named "Fog computing" has been introduced. Generally, Fog computing resides closer to the IoT devices/sensors and extends the Cloud-based computing, storage, and networking facilities. In this chapter, we comprehensively analyse the challenges of Fog acting as an intermediate layer between IoT devices/sensors and Cloud datacentres, and review the current developments in this field. We present a taxonomy of Fog computing according to the identified challenges and its key features. We also map the existing works to the taxonomy in order to identify current research gaps in the area of Fog computing. Moreover, based on these observations, we propose future directions for research.
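
    As a rough illustration of the intermediate-layer idea (an editorial sketch, not taken from the surveyed chapter; Task, Tier, and place are hypothetical names), a simple placement rule could send latency-sensitive tasks to a nearby fog node and defer everything else to the cloud:

        # Toy placement policy: latency-sensitive IoT tasks run at the nearby fog
        # node, the rest are deferred to the cloud datacentre. Hypothetical sketch.
        from dataclasses import dataclass

        @dataclass
        class Task:
            name: str
            deadline_ms: float    # end-to-end latency the application tolerates

        @dataclass
        class Tier:
            name: str
            round_trip_ms: float  # typical network round trip to this tier

        def place(task: Task, fog: Tier, cloud: Tier) -> Tier:
            """Pick the farthest (higher-capacity) tier that still meets the deadline."""
            if cloud.round_trip_ms <= task.deadline_ms:
                return cloud
            if fog.round_trip_ms <= task.deadline_ms:
                return fog
            raise ValueError(f"{task.name}: no tier can meet {task.deadline_ms} ms")

        if __name__ == "__main__":
            fog = Tier("fog-gateway", round_trip_ms=5.0)
            cloud = Tier("cloud-dc", round_trip_ms=80.0)
            for t in (Task("video-analytics", 30.0), Task("batch-aggregation", 500.0)):
                print(t.name, "->", place(t, fog, cloud).name)

    Under these assumed round-trip times, the 30 ms video-analytics task lands on the fog gateway while the 500 ms batch job goes to the cloud, which is the division of labour the fog literature argues for.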

    Geoinformation, Geotechnology, and Geoplanning in the 1990s

    Over the last decade, there have been significant changes in the geographic information available to support those involved in spatial planning and policy-making in different contexts. Moreover, developments have occurred apace in the technology with which to handle geoinformation. This paper provides an overview of trends during the 1990s in data provision, in the technology required to manipulate and analyse spatial information, and in the domain of planning where applications of computer technology in the processing of geodata are prominent. It draws largely on experience in western Europe, in the UK and the Netherlands in particular, and suggests that there are a number of pressures for a strengthened role for geotechnology in geoplanning in the years ahead.