
    Secure data sharing in cloud computing: a comprehensive review

    Cloud computing is an emerging technology that relies on shared computing resources. Sharing data within a group is not inherently secure, because the cloud provider cannot be fully trusted. The fundamental challenges for cloud providers are data security, data sharing, resource scheduling and energy consumption. A key-aggregate cryptosystem can be used to secure private and public data in the cloud: it produces a constant-size aggregate key for flexible choices of ciphertext classes in cloud storage. Virtual machine (VM) provisioning lets cloud providers use their available resources effectively and obtain higher profits. The review examines how group members can share data in cloud storage securely, flexibly and efficiently. Data corrupted across different cloud data centers can be recovered using regenerating codes. Security is provided by techniques such as forward security, backward security, key-aggregate cryptosystems, encryption and re-encryption. Energy consumption is reduced using energy-efficient virtual machine scheduling in multi-tenant data centers

    Privacy-Enhanced Dependable and Searchable Storage in a Cloud-of-Clouds

    There is a constant increase in demand for cloud services, particularly cloud-based storage. These services are used by many applications as outsourced storage, with some interesting advantages. Most personal and mobile applications also offer the user the choice of storing their data in the cloud, transparently and sometimes without the user's full awareness of the privacy conditions, to overcome local storage limitations. Companies may also find it cheaper to outsource databases and key-value stores instead of relying on storage solutions in private data centers. This raises concerns about data-privacy guarantees and the danger of data leakage: a cloud system administrator can easily access unprotected data and could also forge, modify or delete it, violating privacy, integrity, availability and authenticity conditions.

    A possible solution would be to encrypt all data and add authenticity and integrity proofs before sending it to the cloud, then decrypt and verify on download. However, this approach is practical mainly for backups or modest data volumes; it is not well suited to online searching over large amounts of cloud-stored data that must be searched, accessed and retrieved dynamically. Such solutions also impose high latency and large amounts of cloud inbound/outbound traffic, increasing operational costs. Moreover, for mobile or embedded devices, power, computation and communication constraints cannot be ignored, since indexing, encrypting/decrypting and signing/verifying all data is computationally expensive. To overcome these drawbacks, this dissertation proposes a trustable and privacy-enhanced storage architecture based on a multi-cloud approach, providing privacy-enhanced, dependable and searchable storage. The solution supports dependable cloud storage and multimodal online search operations over always-encrypted data in a cloud-of-clouds. We implemented a system prototype and conducted an experimental evaluation involving conventional storage clouds as well as a high-speed in-memory cloud-storage backend. Our results show that the proposal offers the required dependability properties and privacy guarantees, and provides efficient information retrieval capabilities without sacrificing precision and recall in the supported indexing and search operations
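
    The dissertation's indexing scheme is not detailed in the abstract. As a minimal sketch of one standard way to support keyword search while data stays encrypted at the provider, the snippet below builds a symmetric searchable index in which the client derives a deterministic HMAC token per keyword and the cloud only ever stores tokens and ciphertexts. The key names, helper functions and the whole construction are illustrative assumptions, not the scheme evaluated in the dissertation.

```python
# Illustrative sketch (not the dissertation's scheme): keyword search over
# encrypted records using HMAC-derived search tokens and Fernet encryption.
import hmac, hashlib
from collections import defaultdict
from cryptography.fernet import Fernet  # assumes the 'cryptography' package is installed

index_key = b"client-side secret for search tokens"   # hypothetical keys,
data_key = Fernet.generate_key()                       # never sent to the cloud
fernet = Fernet(data_key)

def token(keyword: str) -> bytes:
    """Deterministic per-keyword token; the cloud sees only this value."""
    return hmac.new(index_key, keyword.lower().encode(), hashlib.sha256).digest()

# "Cloud-side" state: encrypted blobs plus a token -> doc-id index.
blob_store = {}
inverted_index = defaultdict(set)

def upload(doc_id: str, text: str) -> None:
    blob_store[doc_id] = fernet.encrypt(text.encode())
    for word in set(text.lower().split()):
        inverted_index[token(word)].add(doc_id)

def search(keyword: str) -> list[str]:
    """Client sends token(keyword); cloud returns matching ciphertexts."""
    ids = inverted_index.get(token(keyword), set())
    return [fernet.decrypt(blob_store[i]).decode() for i in ids]

upload("d1", "encrypted multi cloud storage")
upload("d2", "plain local storage")
print(search("storage"))  # both documents, decrypted client-side
```

    Deterministic tokens leak keyword-repetition patterns; schemes like the one described in the dissertation typically add countermeasures and spread index and data across multiple clouds, which this toy example does not attempt.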

    Multicloud Resource Allocation: Cooperation, Optimization and Sharing

    Nowadays our daily life is powered not only by water, electricity, gas and telephony but by the "cloud" as well. Big cloud vendors such as Amazon, Microsoft and Google have built large-scale centralized data centers to achieve economies of scale, on-demand resource provisioning, high resource availability and elasticity. However, those massive data centers also bring about other problems, e.g., bandwidth bottlenecks, privacy, security, huge energy consumption, and legal and physical vulnerabilities. One possible solution to those problems is to employ multicloud architectures. In this thesis, our work contributes to multicloud resource allocation from the three perspectives of cooperation, optimization and data sharing. We address the following problems in the multicloud: how resource providers cooperate in a multicloud, how to reduce information leakage in a multicloud storage system, and how to share big data in a cost-effective way. More specifically, we make the following contributions.

    Cooperation in the decentralized cloud. We propose a decentralized cloud model in which a group of small data centers (SDCs) cooperate with each other to improve performance. Moreover, we design a general strategy function for SDCs to evaluate the performance of cooperation along different dimensions of resource sharing. Through extensive simulations using a realistic data center model, we show that strategies based on reciprocity are more effective than other strategies, e.g., those using prediction based on historical data. Our results show that the reciprocity-based strategy can thrive in a heterogeneous environment with competing strategies.

    Multicloud optimization of information leakage. We first study an important information leakage problem caused by unplanned data distribution in multicloud storage services. We then present StoreSim, an information-leakage-aware storage system for the multicloud. StoreSim aims to store syntactically similar data on the same cloud, thereby minimizing the user's information leakage across multiple clouds. We design an approximate algorithm that efficiently generates similarity-preserving signatures for data chunks based on MinHash and Bloom filters, and a function that computes the information leakage from these signatures. Next, we present an effective storage-plan generation algorithm based on clustering for distributing data chunks across multiple clouds with minimal information leakage. Finally, we evaluate our scheme using two real datasets from Wikipedia and GitHub. We show that our scheme can reduce information leakage by up to 60% compared to unplanned placement. Furthermore, our analysis of system attackability demonstrates that our scheme makes attacks on the information much more complex.

    Smart data sharing. Moving large amounts of distributed data into the cloud, or from one cloud to another, can incur high costs in both time and bandwidth. Data sharing in the multicloud can be optimized from two angles: inter-cloud scheduling and intra-cloud optimization. We first present CoShare, a P2P-inspired, decentralized, cost-effective sharing system for data replication that optimizes network transfer among small data centers. We then propose a data summarization method that reduces the total size of the dataset, thereby reducing network transfer
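
    StoreSim's exact signature construction is not given in the abstract; the sketch below only illustrates the MinHash idea it builds on: chunks that share many shingles get signatures that agree on many slots, so estimated Jaccard similarity can drive placement of similar chunks on the same cloud. The shingle size, number of hash functions and grouping rule are illustrative assumptions.

```python
# Illustrative MinHash sketch of similarity-preserving chunk signatures
# (StoreSim's real construction also involves Bloom filters and clustering).
import hashlib

NUM_HASHES = 64          # assumed signature length
SHINGLE_SIZE = 4         # assumed byte-shingle width

def shingles(chunk: bytes) -> set[bytes]:
    return {chunk[i:i + SHINGLE_SIZE] for i in range(len(chunk) - SHINGLE_SIZE + 1)}

def minhash(chunk: bytes) -> list[int]:
    """One minimum per seeded hash function; similar chunks agree on many slots."""
    sig = []
    for seed in range(NUM_HASHES):
        sig.append(min(
            int.from_bytes(hashlib.blake2b(s, digest_size=8,
                                           salt=seed.to_bytes(8, "big")).digest(), "big")
            for s in shingles(chunk)))
    return sig

def similarity(sig_a: list[int], sig_b: list[int]) -> float:
    """Fraction of matching slots estimates the Jaccard similarity of the chunks."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / NUM_HASHES

a = minhash(b"the quick brown fox jumps over the lazy dog")
b = minhash(b"the quick brown fox jumped over the lazy dog")
c = minhash(b"completely unrelated binary chunk contents here")
print(similarity(a, b), similarity(a, c))  # high vs. low; place a and b on the same cloud
```

    Chunks whose estimated similarity exceeds a threshold would then be clustered onto the same cloud, so that no single provider's partial view reveals much about the data held elsewhere.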

    Efficient data reliability management of cloud storage systems for big data applications

    Cloud service providers strive to provide efficient and reliable service for their clients' Big Data storage needs. Replication is a simple and flexible method to ensure the reliability and availability of data, but it is not an efficient solution for Big Data, which routinely scales to terabytes and petabytes. Hence erasure coding is gaining traction despite its shortcomings. Deploying erasure coding in cloud storage confronts several challenges, such as encoding/decoding complexity, load balancing, high resource consumption during data repair, and read latency; this thesis addresses many of them. Although data durability and availability should never be compromised, clients' requirements on read performance (access latency) may vary with the nature of the data and its access patterns. Access latency is an important metric, and the acceptable latency range can be recorded in the client's SLA. Several proactive recovery methods for erasure codes are proposed in this research to reduce the resource consumption caused by recovery. A novel cache-based solution is also proposed to mitigate the access latency issue of erasure coding
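
    As a rough illustration of why replication scales poorly next to erasure coding, the sketch below compares the raw storage overhead of 3-way replication with a hypothetical (k=10, m=4) Reed-Solomon-style code; the parameters and data volume are assumptions for illustration, not the configurations studied in the thesis.

```python
# Rough storage-overhead comparison: replication vs. (k, m) erasure coding.
# Parameters below are illustrative assumptions, not the thesis's settings.

def replication_overhead(copies: int) -> float:
    """Raw bytes stored per logical byte with n-way replication."""
    return float(copies)

def erasure_overhead(k: int, m: int) -> float:
    """Raw bytes stored per logical byte with a (k data + m parity) code."""
    return (k + m) / k

logical_pb = 5.0  # petabytes of user data, hypothetical
for label, overhead, tolerated in [
    ("3-way replication", replication_overhead(3), 2),   # survives loss of 2 copies
    ("RS(k=10, m=4)", erasure_overhead(10, 4), 4),        # survives loss of any 4 blocks
]:
    print(f"{label}: {overhead:.1f}x overhead, {logical_pb * overhead:.1f} PB raw, "
          f"tolerates {tolerated} failures per object")
```

    The trade-off the thesis targets is visible even in this toy comparison: the code stores far less raw data for equal or better fault tolerance, but reconstructing a lost block requires reading k surviving blocks, which drives the repair-traffic and read-latency problems the proposed proactive recovery and caching techniques aim to reduce.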

    Autonomous Localization of a UAV in a 3D CAD Model

    This thesis presents a novel method for indoor localization and autonomous navigation of Unmanned Aerial Vehicles (UAVs) within a building, given a prebuilt Computer-Aided Design (CAD) model of the building. The proposed system is novel in that it combines machine learning and traditional computer vision techniques into a robust method for localizing and navigating a drone autonomously in indoor, GPS-denied environments by leveraging preexisting knowledge of the environment. The goal of this work is to devise a method that enables a UAV to deduce its current pose within a CAD model quickly and accurately while making efficient use of resources. A 3D CAD model of the building to be navigated is provided as input to the system along with the required goal position. Initially, the UAV has no knowledge of its location within the building. The system, comprising a stereo camera and an Inertial Measurement Unit (IMU) as its sensors, generates a globally consistent map of its surroundings using a Simultaneous Localization and Mapping (SLAM) algorithm. In addition to the map, it also stores spatially correlated 3D features. These 3D features are used to generate correspondences between the SLAM map and the 3D CAD model, and the correspondences in turn yield a transformation between the two, effectively localizing the UAV in the 3D CAD model. In testing, the method successfully localized the UAV in the test building in an average of 15 seconds across the scenarios evaluated, contingent on the abundance of target features in the observed data. In the absence of a motion-capture system, the results were verified by placing tags at known locations on the ground and measuring the error between each tag and the projection of the estimated UAV position onto the ground
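
    The abstract does not specify how the SLAM-to-CAD transformation is computed; one standard way to turn matched 3D features into a rigid transform is the SVD-based (Kabsch) alignment sketched below. The synthetic correspondences and the absence of outlier rejection are simplifying assumptions, not the thesis's exact pipeline.

```python
# Illustrative SVD-based (Kabsch) rigid alignment between matched 3D features,
# e.g. SLAM-map points vs. their CAD-model counterparts.
import numpy as np

def rigid_transform(slam_pts: np.ndarray, cad_pts: np.ndarray):
    """Least-squares R, t such that R @ slam_i + t ~= cad_i for matched rows."""
    c_s, c_c = slam_pts.mean(axis=0), cad_pts.mean(axis=0)
    H = (slam_pts - c_s).T @ (cad_pts - c_c)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                      # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_c - R @ c_s
    return R, t

# Synthetic correspondences: rotate and translate "SLAM" points to get "CAD" points.
rng = np.random.default_rng(0)
slam = rng.uniform(-5.0, 5.0, size=(50, 3))
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, 2.0, 0.5])
cad = slam @ R_true.T + t_true                    # cad_i = R_true @ slam_i + t_true

R_est, t_est = rigid_transform(slam, cad)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True
```

    A real system would wrap such an estimator in RANSAC-style outlier rejection, since SLAM-to-CAD feature matches are rarely all correct.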

    Generation and Editing of 3D Models Using Deep Neural Networks

    Artificial intelligence (AI) and particularly deep neural networks (DNNs) have become very active research topics in recent years and have proven successful in problems such as detection, recognition and segmentation. More recently, DNNs have become popular in data generation problems with the invention of Generative Adversarial Networks (GANs). Using GANs, various types of data such as audio, images or 3D models can be generated. In this thesis, we aim to propose a system that creates artificial 3D models with given characteristics. For this purpose, we focus on the latent modification and generation of 3D point cloud object models with respect to their semantic parts. Unlike existing methods, which use separate networks for part generation and assembly, we propose a single end-to-end autoencoder model that can handle the generation and modification of both semantic parts and global shapes. The proposed method supports part exchange between 3D point cloud models and the composition of different parts into new models by directly editing latent representations. This holistic approach does not need part-based training to learn part representations and does not introduce any extra loss besides the standard reconstruction loss. The experiments demonstrate the robustness of the proposed method across different object categories and varying numbers of points, rotations and scales. The method can generate new models through the integration of generative models such as GANs and VAEs, and can work with unannotated point clouds through the integration of a segmentation module.
    Ph.D. - Doctoral Program. METU Scientific Research Projects Coordination Unit GAP-704-2020-1007
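
    The abstract does not disclose the autoencoder's architecture; the sketch below only illustrates the general idea of editing shapes in latent space under the assumption that the latent vector can be split into per-part segments. The tiny PointNet-style encoder, the fixed four-part layout and all sizes are hypothetical.

```python
# Hypothetical sketch of latent part exchange between two point clouds;
# the real model's architecture and latent layout are not given in the abstract.
import torch
import torch.nn as nn

NUM_PARTS, PART_DIM = 4, 32          # assumed: latent = 4 parts x 32 dims
LATENT = NUM_PARTS * PART_DIM
N_POINTS = 1024

class PointAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(3, 128), nn.ReLU(),
                                     nn.Linear(128, LATENT))
        self.decoder = nn.Sequential(nn.Linear(LATENT, 256), nn.ReLU(),
                                     nn.Linear(256, N_POINTS * 3))

    def encode(self, pts):                            # pts: (B, N, 3)
        return self.encoder(pts).max(dim=1).values    # PointNet-style max pool -> (B, LATENT)

    def decode(self, z):                              # z: (B, LATENT)
        return self.decoder(z).view(-1, N_POINTS, 3)

model = PointAutoencoder()
chair = torch.randn(1, N_POINTS, 3)                   # stand-ins for real shapes
table = torch.randn(1, N_POINTS, 3)

z_chair, z_table = model.encode(chair), model.encode(table)

# Swap the segment assumed to encode part 2 (e.g. "legs") from table into chair.
part = 2
z_mixed = z_chair.clone()
z_mixed[:, part * PART_DIM:(part + 1) * PART_DIM] = \
    z_table[:, part * PART_DIM:(part + 1) * PART_DIM]

new_shape = model.decode(z_mixed)                     # (1, 1024, 3) edited point cloud
print(new_shape.shape)
```

    In this kind of setup, composing new objects amounts to concatenating latent segments taken from different encoded shapes before decoding, which matches the abstract's claim that no part-specific training or extra loss term is needed.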

    Enhancing Landsat time series through multi-sensor fusion and integration of meteorological data

    Over 50 years ago, the United States Interior Secretary, Stewart Udall, directed space agencies to gather "facts about the natural resources of the earth." Today, global climate change and human modification of the landscape make earth observations from a wide variety of sensors essential for understanding and adapting to environmental change. The Landsat program has been an invaluable source for understanding the history of the land surface, with consistent observations from the Thematic Mapper (TM) and Enhanced Thematic Mapper Plus (ETM+) sensors since 1982. This dissertation develops and explores methods for enhancing the TM/ETM+ record by fusing other data sources: Landsat 8 for future continuity, radar data for tropical forest monitoring, and meteorological data for semi-arid vegetation dynamics. Landsat 8 data may be incorporated into existing time series of Landsat 4-7 data for applications like change detection, but vegetation trend analysis requires calibration, especially when using the near-infrared band. The improvements in radiometric quality and cloud masking provided by Landsat 8 reduce noise compared to previous sensors. Tropical forests are notoriously difficult to monitor with Landsat alone because of clouds. This dissertation developed and compared two approaches for fusing Synthetic Aperture Radar (SAR) data from the Advanced Land Observing Satellite (ALOS-1) with Landsat in Peru, and found that radar data increased the accuracy of deforestation detection. Simulations indicate that the benefit of using radar data increases with higher cloud cover. Time series analysis of vegetation indices from Landsat in semi-arid environments is complicated by the response of vegetation to high variability in the timing and amount of precipitation. We found that quantifying dynamics in precipitation and drought index data improved land cover change detection performance compared to more traditional harmonic modeling for grasslands and shrublands in California. This dissertation enhances the value of Landsat data by combining it with other data sources, including other optical sensors, SAR data, and meteorological data. The methods developed here show the potential of data fusion and are especially important in light of recent and upcoming missions like Sentinel-1, Sentinel-2, and the NASA-ISRO Synthetic Aperture Radar (NISAR)
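
    As a point of reference for the "traditional harmonic modeling" the dissertation compares against, the sketch below fits a simple trend plus annual harmonic to a vegetation-index time series with ordinary least squares; the synthetic NDVI series and the single-harmonic design matrix are illustrative assumptions, not the dissertation's models or data.

```python
# Illustrative harmonic model for a vegetation-index time series
# (the baseline style of model the dissertation compares against).
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 10, 1 / 23)                      # ~23 observations/year for 10 years
ndvi = (0.35 + 0.002 * t                          # synthetic series: trend,
        + 0.15 * np.sin(2 * np.pi * t + 0.6)      # annual cycle,
        + rng.normal(0, 0.03, t.size))            # and noise

# Design matrix: intercept, linear trend, one annual harmonic.
X = np.column_stack([np.ones_like(t), t,
                     np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])
coef, *_ = np.linalg.lstsq(X, ndvi, rcond=None)
residuals = ndvi - X @ coef

print("intercept, slope, cos, sin:", np.round(coef, 3))
print("RMSE:", round(float(np.sqrt(np.mean(residuals ** 2))), 3))
```

    Change detection with such a baseline flags dates whose residuals stay well outside the model's noise level; in semi-arid grasslands and shrublands, adding precipitation and drought-index dynamics as covariates is the kind of extension the dissertation finds more effective than the harmonic model alone.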