
    Cloud Technology Management in Public and Big Data Mining

    Cloud technology and the use of big data in the public sector have recently come to the fore. The purpose of this study is to explore the use of cloud technology and big data in the public sector, using a literature review method. The results demonstrate that cloud technology and big data are used to deliver a range of public-sector services, and their use is expected to grow further as technology advances, driven by their advantages in cost, efficiency and speed. The study closes with recommendations for the wider use of cloud technology and big data in the public sector in Turkey.

    Experimental Study of the Cloud Architecture Selection for Effective Big Data Processing

    Big data dictate their own requirements to hardware and software. Simply migrating data processing to the cloud solves the problem of growing computational demand, but it creates new issues: the need to ensure safety, to control quality during data transmission, and to optimise requests. A computational cloud does not simply provide scalable resources; it also involves a network infrastructure with unknown routes and an unpredictable number of user requests. In addition, situations can arise during operation in which the application architecture must change: part of the data needs to be placed in a private cloud, part in a public cloud, and part stays on the client.
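    As a rough illustration of the hybrid placement decision described above, the following Python sketch (our own, not the paper's method) routes each data item to the private cloud, the public cloud, or the client based on an assumed sensitivity label and an assumed size threshold; the DataItem fields and the 100 MB cut-off are purely illustrative.

```python
# A minimal sketch, assuming each data item carries a sensitivity label.
# Not the paper's algorithm; thresholds and labels are illustrative only.

from dataclasses import dataclass

@dataclass
class DataItem:
    name: str
    sensitivity: str   # assumed labels: "high", "medium", "low"
    size_mb: float

def place(item: DataItem, public_size_threshold_mb: float = 100.0) -> str:
    """Return the placement target for one data item."""
    if item.sensitivity == "high":
        return "private-cloud"      # sensitive data never leaves the organisation
    if item.sensitivity == "medium" or item.size_mb > public_size_threshold_mb:
        return "public-cloud"       # bulk or moderately sensitive data is rented out
    return "client"                 # small, non-sensitive data stays local

if __name__ == "__main__":
    items = [
        DataItem("patient_records", "high", 40.0),
        DataItem("clickstream_logs", "low", 850.0),
        DataItem("session_cache", "low", 2.0),
    ]
    for it in items:
        print(it.name, "->", place(it))
```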

    Toward efficient and secure public auditing for dynamic big data storage on cloud

    University of Technology Sydney, Faculty of Engineering and Information Technology.

    Cloud and Big Data are two of the most attractive ICT research topics to have emerged in recent years. Requirements for big data processing are now everywhere, while the pay-as-you-go model of cloud systems is especially cost-efficient for processing big data applications. However, there are still concerns that hinder the proliferation of cloud, and data security/privacy is a top concern for data owners wishing to migrate their applications into the cloud environment. Compared to users of conventional systems, cloud users need to surrender local control of their data to cloud servers. Another challenge for big data is the data dynamism that exists in most big data applications: because of frequent updates, efficiency becomes a major issue in data management. As security always brings compromises in efficiency, it is difficult but nonetheless important to investigate how to efficiently address security challenges over dynamic cloud data. Data integrity is an essential aspect of data security. Besides server-side integrity protection mechanisms, verification by a third-party auditor is of equal importance because it enables users to verify the integrity of their data through the auditor at any user-chosen time. This type of verification is also named 'public auditing' of data.

    Existing public auditing schemes allow the integrity of a dataset stored in the cloud to be externally verified without retrieving the whole original dataset. In practice, however, many challenges hinder the application of such schemes. To name a few: first, the server still has to aggregate a proof, through the cloud controller, from data blocks that are stored and processed across distributed cloud instances, which makes encryption and transfer of these data within the cloud time-consuming. Second, security flaws exist in current designs; the verification processes are insecure against various attacks, which raises concerns about deploying these schemes in practice. Third, when the dataset is large, auditing of dynamic data becomes costly in terms of communication and storage, especially for a large number of small data updates and for updates on multi-replica cloud data storage.

    In this thesis, the research problem of dynamic public data auditing in the cloud is systematically investigated. After analysing the research problems, we address them by developing, testing and publishing a series of security schemes and algorithms for secure and efficient public auditing of dynamic big data storage on cloud. Specifically, our work focuses on the following aspects: cloud-internal authenticated key exchange, authorisation of the third-party auditor, fine-grained update support, index verification, and efficient multi-replica public auditing of dynamic data. To the best of our knowledge, this thesis presents the first series of work to systematically analyse and address this research problem. Experimental results and analyses show that the solutions presented in this thesis are suitable for auditing dynamic big data storage on cloud and represent significant improvements in efficiency and security.
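    To make the public auditing idea concrete, here is a deliberately simplified Python sketch, not the thesis's scheme: the auditor keeps only per-block digests and spot-checks randomly chosen blocks returned by the storage server. Real schemes such as those developed in the thesis use homomorphic authenticators so whole blocks never have to be returned; every function name and parameter below is an assumption made for illustration.

```python
# A simplified spot-check sketch of public auditing (assumed, not the thesis's scheme).

import hashlib
import random

def block_digests(blocks):
    """Digests the data owner publishes to the auditor before outsourcing."""
    return [hashlib.sha256(b).hexdigest() for b in blocks]

def audit(server_blocks, published_digests, sample_size=2, seed=None):
    """Auditor challenges a random subset of block indices and verifies each response."""
    rng = random.Random(seed)
    challenge = rng.sample(range(len(published_digests)), sample_size)
    for i in challenge:
        proof = hashlib.sha256(server_blocks[i]).hexdigest()  # server's response for block i
        if proof != published_digests[i]:
            return False, i  # integrity violation detected at block i
    return True, None

if __name__ == "__main__":
    blocks = [b"block-0 data", b"block-1 data", b"block-2 data", b"block-3 data"]
    digests = block_digests(blocks)
    print(audit(blocks, digests, seed=42))            # (True, None)
    tampered = blocks[:]
    tampered[1] = b"corrupted"
    print(audit(tampered, digests, sample_size=4))    # detects the corrupted block
```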

    Running Big Data Privacy Preservation in the Hybrid Cloud Platform

    Cloud computing is now used throughout industry, driven by rapid growth in information technology and mobile device technology, which makes preserving the privacy of users' data in the cloud environment an important task. A big data platform is a collection of sensitive and non-sensitive data, and to secure big data in the cloud environment, organisations adopt a hybrid cloud approach. Many small-scale enterprises are emerging and doing business with other organisations, and no data owner or customer wants their private data scanned or exposed by the cloud service provider. To improve security, the cloud applies data encryption to the original data in the public cloud. The proposed work investigates how to improve the privacy preservation of image data in the hybrid cloud; to that end, we implement an image encryption algorithm based on the Rubik's cube principle, improving image cryptography for public cloud data security.
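    For illustration only, the snippet below sketches the row/column scrambling step commonly associated with Rubik's-cube-style image encryption. It is our own minimal sketch rather than the paper's exact algorithm, and it omits the XOR diffusion step that full implementations typically add; the key vectors here are simply random shift amounts.

```python
# A minimal sketch of Rubik's-cube-style scrambling (assumed, not the paper's exact method):
# rows and columns of the pixel matrix are circularly shifted by key-derived amounts.

import numpy as np

def rubik_scramble(img: np.ndarray, row_key: np.ndarray, col_key: np.ndarray,
                   decrypt: bool = False) -> np.ndarray:
    out = img.copy()
    if not decrypt:
        for r in range(out.shape[0]):                       # shift each row right
            out[r, :] = np.roll(out[r, :], int(row_key[r]))
        for c in range(out.shape[1]):                       # then shift each column down
            out[:, c] = np.roll(out[:, c], int(col_key[c]))
    else:                                                    # undo in reverse order
        for c in range(out.shape[1]):
            out[:, c] = np.roll(out[:, c], -int(col_key[c]))
        for r in range(out.shape[0]):
            out[r, :] = np.roll(out[r, :], -int(row_key[r]))
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)
    row_key = rng.integers(0, 4, size=4)
    col_key = rng.integers(0, 4, size=4)
    cipher = rubik_scramble(image, row_key, col_key)
    assert np.array_equal(rubik_scramble(cipher, row_key, col_key, decrypt=True), image)
```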

    Data Placement for Privacy-Aware Applications over Big Data in Hybrid Clouds

    Nowadays, a large number of organisations choose to deploy their applications to cloud platforms, especially in the big data era. The hybrid cloud is currently one of the most popular computing paradigms for hosting privacy-aware applications, driven by the requirements of privacy protection and cost saving. However, it remains a challenge to place data while accounting for both the energy consumption in the private cloud and the cost of renting public cloud services. In view of this challenge, a cost- and energy-aware data placement method, named CEDP, for privacy-aware applications over big data in the hybrid cloud is proposed. Technically, a formalised analysis of cost, access time and energy consumption is conducted for the hybrid cloud environment, and a corresponding data placement method is designed to achieve cost savings when renting public cloud services and energy savings for task execution within the private cloud platform. Experimental evaluations validate the efficiency and effectiveness of the proposed method.
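    A toy decision rule in the spirit of the trade-off CEDP addresses, but not the paper's algorithm, might look like the sketch below; every price, energy figure and the capacity check are assumptions chosen only to illustrate how public-cloud rental cost and private-cloud energy cost could be compared.

```python
# A hypothetical cost/energy comparison for hybrid-cloud data placement.
# All prices and energy figures are assumptions, not values from the paper.

def public_cost(size_gb, accesses, rent_per_gb=0.023, egress_per_gb=0.09):
    """Assumed monthly rental plus per-access egress cost of the public cloud."""
    return size_gb * rent_per_gb + accesses * size_gb * egress_per_gb

def private_cost(size_gb, watts_per_gb=0.05, hours=720, price_per_kwh=0.12):
    """Assumed monthly storage-energy cost in the private cloud (serving energy ignored)."""
    return size_gb * watts_per_gb * hours / 1000.0 * price_per_kwh

def place(size_gb, accesses, sensitive, private_free_gb):
    fits_private = size_gb <= private_free_gb
    if sensitive:
        # privacy constraint is hard: sensitive data may not go to the public cloud
        return "private-cloud" if fits_private else "reject"
    if fits_private and private_cost(size_gb) <= public_cost(size_gb, accesses):
        return "private-cloud"
    return "public-cloud"

if __name__ == "__main__":
    print(place(200, 10, sensitive=True,  private_free_gb=500))  # private-cloud
    print(place(800, 2,  sensitive=False, private_free_gb=500))  # public-cloud (no room)
    print(place(100, 1,  sensitive=False, private_free_gb=500))  # private-cloud (cheaper here)
```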

    Editorial for FGCS special issue: Big Data in the cloud

    Research associated with Big Data in the Cloud will be an important topic over the next few years. The topic includes work on demonstrating architectures, applications, services, experiments and simulations in the Cloud that support cases related to the adoption of Big Data. A common approach to Big Data in the Cloud, allowing better access, performance and efficiency when analysing and understanding the data, is to deliver Everything as a Service. Organisations adopting Big Data this way find that the boundaries between private clouds, public clouds and the Internet of Things (IoT) can be very thin. Volume, variety, velocity, veracity and value are the major factors in Big Data systems, but there are other challenges to be resolved. The papers in this special issue address a variety of issues and concerns in Big Data, including: searching and processing Big Data, implementing and modelling event and workflow systems, visualisation, modelling and simulation, and aspects of social media.

    Big Data Analytics on Cloud: challenges, techniques and technologies

    Big Data Analytics is attracting considerable attention from both researchers and business. We are all witnesses to the growth of data that institutions, companies and even individuals store for future use. There is great potential to extract useful information from this Big Data, which is usually stored in the Cloud because local storage is often insufficient for such large volumes. Big Data can be helpful in a large number of sectors, including economic and business activities, public administration, national security, and scientific research in many areas. To be useful, this data must be processed, usually with Big Data Analytics techniques, and the future of business and technology will certainly rely on Big Data Analytics. This paper aims to show how Big Data is analysed, especially when deployed on the Cloud, as well as the challenges, techniques and technologies that are, and can be, used to analyse Big Data on the Cloud. We discuss and implement different methodologies of Big Data Analytics on the Cloud.

    Security problems and challenges in a machine learning-based hybrid big data processing network systems

    Data sources that produce data continuously, in high volume and at high velocity, and with a large variety of data types create Big Data, and pose problems and challenges for the Machine Learning (ML) techniques that help extract, analyse and visualise important information. To overcome these problems and challenges, we propose a hybrid networking model consisting of multiple components: the Hadoop distributed file system (HDFS), a cloud storage system, a security module and an ML unit. Processing Big Data in this networking environment with ML techniques requires user interaction and additional storage, and some artificial delay between the arrivals of data domains through external storage can help HDFS process the Big Data efficiently. To address this, we suggest using a public cloud for data storage, which introduces a meaningful time delay for the data while exploiting its storage capability. However, using a public cloud exposes data transmission and storage to security vulnerabilities. Therefore, we need a security algorithm that provides a flexible key-based encryption technique offering trade-offs between time delay, security strength and storage risk. In this paper we propose a model that uses public cloud provider trust levels to select encryption types for data storage within a Big Data analytics network topology.
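    As a hedged illustration of the trust-level-to-encryption mapping described above (our own sketch, not the paper's model), the snippet below picks a stronger but slower cipher configuration as provider trust decreases; the trust labels, cipher choices and relative delay factors are all assumed.

```python
# A minimal sketch of selecting an encryption configuration from an assumed
# provider trust level; names, key sizes and delay factors are illustrative.

from dataclasses import dataclass

@dataclass
class CipherChoice:
    name: str
    key_bits: int
    relative_delay: float  # assumed relative encryption overhead before upload

# Assumed policy: the lower the trust in the provider, the stronger (and slower)
# the encryption applied before data leaves the HDFS/private side.
POLICY = {
    "high":   CipherChoice("AES-128-GCM", 128, 1.0),
    "medium": CipherChoice("AES-256-GCM", 256, 1.3),
    "low":    CipherChoice("AES-256-GCM + envelope key wrap", 256, 1.8),
}

def select_cipher(provider_trust: str) -> CipherChoice:
    return POLICY.get(provider_trust, POLICY["low"])  # default to the strongest option

if __name__ == "__main__":
    for trust in ("high", "medium", "low", "unknown"):
        c = select_cipher(trust)
        print(f"{trust:>8}: {c.name} ({c.key_bits}-bit, ~{c.relative_delay}x delay)")
```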