
    Towards an open cloud marketplace: vision and first steps

    As one of the most promising emerging concepts in Information Technology (IT), cloud computing is transforming how IT is consumed and managed, yielding improved cost efficiencies and delivering flexible, on-demand scalability by reducing computing infrastructures, platforms, and services to commodities acquired and paid for on demand from a set of cloud providers. Today, the transition of cloud computing from a subject of research and innovation to a critical infrastructure is proceeding at an incredibly fast pace. A potentially dangerous consequence of this speedy transition to practice is the premature adoption, and ossification, of the models, technologies, and standards underlying this critical infrastructure. This state of affairs is exacerbated by the fact that innovative research on production-scale platforms is becoming the purview of a small number of public cloud providers. Specifically, academic research communities are effectively excluded from the opportunity to contribute meaningfully to the evolution, not to mention the innovation and healthy mutation, of cloud computing technologies. As the dependence of our society and economy on cloud computing increases, so does the realization that the academic research community cannot be shut out from contributing to the design and evolution of this critical infrastructure. In this article we offer an alternative vision: an Open Cloud eXchange (OCX), a public cloud marketplace in which many stakeholders, rather than a single cloud provider, participate in implementing and operating the cloud, creating an ecosystem that brings the innovation of a broader community to bear on a healthier and more efficient cloud marketplace

    MalStone: Towards A Benchmark for Analytics on Large Data Clouds

    Developing data mining algorithms that are suitable for cloud computing platforms is currently an active area of research, as is developing cloud computing platforms appropriate for data mining. Currently, the most common benchmarks for cloud computing are the Terasort (and related) benchmarks. Although the Terasort benchmark is quite useful, it was not designed for data mining per se. In this paper, we introduce a benchmark called MalStone that is specifically designed to measure the performance of cloud computing middleware that supports the type of data-intensive computing common when building data mining models. We also introduce MalGen, a utility for generating data on clouds that can be used with MalStone
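The abstract describes MalStone as measuring the kind of data-intensive aggregation used when building data mining models over log data. The following is a minimal illustrative sketch of that style of computation, not the official MalStone specification: the record fields and the per-site "marked entity" ratio are assumptions for illustration only.

```python
from collections import defaultdict

def site_mark_ratios(records):
    """For each site, the fraction of visiting entities that are marked.

    records: iterable of (site_id, entity_id, marked) tuples -- a
    hypothetical log schema standing in for MalStone's input data.
    """
    visits = defaultdict(set)   # site -> all entities seen at that site
    marked = defaultdict(set)   # site -> entities seen there that are marked
    for site, entity, flag in records:
        visits[site].add(entity)
        if flag:
            marked[site].add(entity)
    return {s: len(marked[s]) / len(visits[s]) for s in visits}

# Tiny illustrative log; a benchmark run would stream billions of records.
log = [
    ("site-a", "e1", False),
    ("site-a", "e2", True),
    ("site-b", "e1", False),
    ("site-b", "e3", True),
    ("site-b", "e4", True),
]
ratios = site_mark_ratios(log)
```

In a real MalStone run this group-by-site aggregation is what stresses the middleware, since the records for one site are scattered across the cloud and must be brought together.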

    Guidance Notes for Cloud Research Users

    There is a rapidly increasing range of research activities that involve outsourcing computing and storage resources to public Cloud Service Providers (CSPs), who provide managed, scalable resources virtualised as a single service. For example, Amazon Elastic Compute Cloud (EC2) and Simple Storage Service (S3) are two widely adopted public cloud solutions, which aim to provide pooled computing and storage services and charge users according to their weighted resource usage. Other examples include the use of Google App Engine and Microsoft Azure as development platforms for research applications. Despite a great deal of activity and publication on cloud computing, the term itself and the technologies that underpin it are still confusing to many. This note, one of the deliverables of the TeciRes project, provides guidance to researchers who are potential end users of public CSPs for research activities. The note provides researchers with information on:
    • The differences between cloud computing and current research computing models, and the relations between them
    • The considerations that have to be taken into account before moving to cloud-aided research
    • The issues associated with cloud computing for research that are currently being investigated
    • Tips and tricks when using cloud computing
    Readers who are interested in provisioning cloud capabilities for research should also refer to our guidance notes for cloud infrastructure service providers. These guidance notes focus on technical aspects only. Readers who are interested in non-technical guidance should refer to the briefing paper produced by the “using cloud computing for research” project
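The note mentions that CSPs "charge users according to their weighted resource usage". A minimal sketch of that billing model follows; the unit rates and metric names are invented for illustration, since real CSP pricing varies by provider, region, and instance type.

```python
# Hypothetical unit rates (USD) -- placeholders, not any provider's prices.
RATES = {
    "compute_hours": 0.10,      # per VM-hour
    "storage_gb_month": 0.023,  # per GB stored per month
    "egress_gb": 0.09,          # per GB of outbound transfer
}

def estimate_cost(usage):
    """Weighted pay-per-use bill: sum over metrics of usage * unit rate."""
    return sum(RATES[metric] * amount for metric, amount in usage.items())

# One month of a single small VM with modest storage and transfer.
monthly_bill = estimate_cost(
    {"compute_hours": 720, "storage_gb_month": 50, "egress_gb": 10}
)
```

The point for researchers is that the bill scales with what is actually consumed, which is what distinguishes this model from fixed-capacity research computing.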

    Greening cloud-enabled big data storage forensics : Syncany as a case study

    The pervasive nature of cloud-enabled big data storage solutions introduces new challenges in the identification, collection, analysis, preservation and archiving of digital evidence. Investigation of such complex platforms to locate and recover traces of criminal activities is a time-consuming process. Hence, cyber forensics researchers are moving towards streamlining the investigation process by locating and documenting residual artefacts (evidence) of forensic value from users’ activities on cloud-enabled big data platforms, in order to reduce the investigation time and resources involved in a real-world investigation. In this paper, we seek to determine the data remnants of forensic value from the Syncany private cloud storage service, a popular storage engine for big data platforms. We demonstrate the types and the locations of the artefacts that can be forensically recovered. Findings from this research contribute to an in-depth understanding of cloud-enabled big data storage forensics, which can result in reduced time and resources spent in real-world investigations involving Syncany-based cloud platforms
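The investigation workflow described, checking known locations for residual artefacts and recording what survives, can be sketched as below. The candidate paths are hypothetical placeholders: the actual Syncany artefact locations depend on platform and version and are exactly what the paper sets out to document.

```python
import os

# Hypothetical candidate locations for illustration only; real Syncany
# artefact paths must come from the paper's findings, not this sketch.
CANDIDATE_PATHS = [
    "~/.config/syncany",   # assumed configuration directory
    "~/.syncany",          # assumed application data directory
]

def find_artefacts(paths):
    """Return the candidate paths that actually exist on this system,
    i.e. locations an examiner would collect and preserve."""
    found = []
    for p in paths:
        full = os.path.expanduser(p)
        if os.path.exists(full):
            found.append(full)
    return found

remnants = find_artefacts(CANDIDATE_PATHS)
```

A real forensic tool would additionally hash and timestamp each artefact at collection time to preserve the chain of evidence; this sketch only shows the triage step.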

    How are cloud-based platforms changing cultural services: Towards a new service integration model

    As a new strategy for public digital cultural services in China, cloud-based cultural platforms are developing rapidly and becoming increasingly important. Nevertheless, there is a great gap between their current operation and the expectations stated in various policy documents, especially in the areas of resource fusion across cultural institutions, online and offline interaction, and smart services. This research builds a new theoretical service integration model for the application of cloud platforms in cultural services, and explores interactions between cloud-based cultural platforms, physical cultural venues and smart individual spaces

    Research Cloud Data Communities

    Big Data, big science, the data deluge: these are topics we hear about more and more in our research pursuits. Then, through media hype, comes cloud computing, the saviour that is going to resolve our Big Data issues. However, it is difficult to pinpoint exactly what researchers can actually do with data and with clouds, how they solve their Big Data problems, and how they get help in using these relatively new tools and infrastructure. Since the beginning of 2012, the NeCTAR Research Cloud has been running at the University of Melbourne, attracting over 1,650 users from around the country. This has not only provided an unprecedented opportunity for researchers to employ clouds in their research, but it has also given us an opportunity to clearly understand how researchers can more easily solve their Big Data problems. The cloud is now used daily, from running web servers and blog sites through to hosting virtual laboratories that can automatically create hundreds of servers depending on research demand. Of course, it has also helped us understand that infrastructure isn’t everything: there are many other skillsets needed to help researchers from the multitude of disciplines use the cloud effectively. How can we solve Big Data problems on cloud infrastructure? One of the key aspects is communities based on research platforms: research is built on collaboration, connection and community, and researchers employ platforms daily, whether bio-imaging platforms, computational platforms or cloud platforms (like Dropbox). Several important features enabled this to work.
Firstly, the barriers to collaboration are eased, allowing communities to access infrastructure that can be instantly built to be anywhere from completely open to completely closed, all managed securely through (nationally) standardised interfaces. Secondly, it is free and easy to build servers and infrastructure, but it is also cheap to fail, allowing for experimentation not only at the code level, but at the server or infrastructure level as well. Thirdly, this (virtual) infrastructure can be shared with collaborators, moving the practice of collaboration from sharing papers and code to sharing servers, pre-configured and ready to go. And finally, the underlying infrastructure is built with Big Data in mind, co-located with major data storage infrastructure and high-performance computers, and interconnected with high-speed networks nationally to research instruments. The research cloud is fundamentally new in that it easily allows communities of researchers, often connected by common geography (research precincts), discipline or long-established collaborations, to build open, collaborative platforms. These open, sharable, and repeatable platforms encourage coordinated use and development, evolving towards common community-oriented methods for Big Data access and data manipulation. In this paper we discuss in detail the critical ingredients in successfully establishing these communities, as well as some outcomes of these communities and their collaboration-enabling platforms. We consider astronomy an exemplar of a research field that has already looked to the cloud as a solution to the ensuing data tsunami

    Empirical Evaluation of Cloud IAAS Platforms using System-level Benchmarks

    Cloud Computing is an emerging paradigm in the field of computing where scalable IT enabled capabilities are delivered ‘as-a-service’ using Internet technology. The Cloud industry adopted three basic types of computing service models based on software level abstraction: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), and Software-as-a-Service (SaaS). Infrastructure-as-a-Service allows customers to outsource fundamental computing resources such as servers, networking, storage, as well as services, where the provider owns and manages the entire infrastructure. This allows customers to only pay for the resources they consume. In a fast-growing IaaS market with multiple cloud platforms offering IaaS services, the user's decision on the selection of the best IaaS platform is quite challenging. Therefore, it is very important for organizations to evaluate and compare the performance of different IaaS cloud platforms in order to minimize cost and maximize performance. Using a vendor-neutral approach, this research focused on four of the top IaaS cloud platforms: Amazon EC2, Microsoft Azure, Google Compute Engine, and Rackspace cloud services. This research compared the performance of IaaS cloud platforms using system-level parameters including server, file I/O, and network. System-level benchmarking provides an objective comparison of the IaaS cloud platforms from a performance perspective. Unixbench, Dbench, and Iperf are the system-level benchmarks chosen to test the performance of the server, file I/O, and network respectively. In order to capture the performance variability, the benchmark tests were performed at different time periods on weekdays and weekends. Each IaaS platform's performance was also tested using various parameters. The benchmark tests conducted on different virtual machine (VM) configurations should help cloud users select the best IaaS platform for their needs.
Also, based on their applications' requirements, cloud users should get a clearer picture of which VM configuration they should choose. In addition to the performance evaluation, the price-per-performance value of all the IaaS cloud platforms was also examined
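The price-per-performance comparison mentioned at the end can be sketched as a simple ratio of hourly price to benchmark score. The scores and prices below are invented placeholders, not results from the study; real values would come from running Unixbench, Dbench, and Iperf on each platform's VMs.

```python
def price_per_performance(results):
    """Dollars per hour per unit of benchmark score; lower is better.

    results: mapping of platform name -> {"score": ..., "price_per_hour": ...}
    using a single aggregate benchmark score per platform (an assumption;
    the study reports separate server, file I/O, and network results).
    """
    return {
        platform: r["price_per_hour"] / r["score"]
        for platform, r in results.items()
    }

# Hypothetical numbers for illustration only.
sample = {
    "Platform-A": {"score": 1450.0, "price_per_hour": 0.10},
    "Platform-B": {"score": 1380.0, "price_per_hour": 0.09},
}
ranking = sorted(price_per_performance(sample).items(), key=lambda kv: kv[1])
```

Repeating this calculation across weekday and weekend runs, as the study does, would also expose how much the ratio drifts with performance variability.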