114 research outputs found

    Data auditing and security in cloud computing: issues, challenges and future directions

    Get PDF
    Cloud computing is one of the significant developments that harnesses large-scale computational power and improves data distribution and data storage facilities. With cloud information services, data must be stored in the cloud and also shared across numerous customers. Cloud data repositories therefore face issues of data integrity, data security and access by unauthorized users. Hence, an independent reviewing and auditing facility is necessary to guarantee that the data is properly maintained and used in the cloud. In this paper, a comprehensive survey of state-of-the-art techniques in data auditing and security is presented, challenging problems in data repository auditing and security are examined, and directions for future research in data auditing and security are discussed.
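
    The independent auditing the abstract calls for is commonly built on challenge-response spot checks. Below is a minimal sketch of that general idea, not of any specific scheme from the surveyed literature; the block size, the HMAC-SHA-256 tags and the in-memory stand-in for the cloud server are assumptions of this sketch.

```python
# Spot-check remote integrity auditing (illustrative sketch only):
# the owner keeps small per-block HMAC tags, then challenges the
# simulated cloud store for random blocks and re-verifies them.
import hashlib
import hmac
import os
import random
import secrets

BLOCK_SIZE = 4096  # illustrative block size


def make_tags(key: bytes, blocks: list[bytes]) -> list[bytes]:
    """Owner side: one HMAC tag per block, bound to the block index."""
    return [hmac.new(key, i.to_bytes(8, "big") + blk, hashlib.sha256).digest()
            for i, blk in enumerate(blocks)]


def audit(key: bytes, tags: list[bytes], fetch_block, samples: int = 10) -> bool:
    """Auditor side: fetch `samples` distinct random blocks and re-check their tags."""
    for i in random.sample(range(len(tags)), k=min(samples, len(tags))):
        blk = fetch_block(i)  # callback standing in for a request to the cloud server
        expected = hmac.new(key, i.to_bytes(8, "big") + blk, hashlib.sha256).digest()
        if not hmac.compare_digest(expected, tags[i]):
            return False
    return True


# Usage with an in-memory list standing in for the cloud repository.
data = os.urandom(10 * BLOCK_SIZE)
blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
key = secrets.token_bytes(32)
tags = make_tags(key, blocks)

cloud = list(blocks)                                      # honest server
print(audit(key, tags, lambda i: cloud[i]))               # True
cloud[3] = os.urandom(BLOCK_SIZE)                         # silent corruption
print(audit(key, tags, lambda i: cloud[i], samples=10))   # False: all 10 blocks checked
```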

    A comprehensive meta-analysis of cryptographic security mechanisms for cloud computing

    Get PDF
    The concept of cloud computing offers measurable computational and information resources as a service over the Internet. The major motivation behind the cloud setup is economic: it promises a reduction in operational and infrastructural expenditure. To turn this into reality, some impediments and hurdles must be tackled, the most profound of which are security, privacy and reliability issues. As user data is handed over to the cloud, it leaves the protection sphere of the data owner, which raises partly new security and privacy concerns. This work focuses on these issues across the various cloud service and deployment models by spotlighting their major challenges. While classical cryptography is an ancient discipline, modern cryptography, developed mostly in the last few decades, provides the mechanisms that must be applied to ensure strong security and privacy in today's real-world scenarios. The technological solutions and the short- and long-term research goals of cloud security are described and addressed using classical as well as modern cryptographic mechanisms. This work explores new directions in cloud computing security while highlighting the correct selection of these fundamental technologies from a cryptographic point of view.
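
    One of the modern mechanisms such surveys point to is authenticated client-side encryption, so that outsourced data never reaches the cloud in the clear. The following is a minimal sketch assuming the third-party cryptography package and illustrative object names; it is not a construction prescribed by the paper.

```python
# Client-side authenticated encryption before outsourcing (illustrative sketch):
# the client keeps the key, the cloud only ever sees ciphertext.
# Requires: pip install cryptography
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM


def encrypt_for_cloud(key: bytes, plaintext: bytes, object_id: str) -> bytes:
    """AES-GCM encryption; the object id is bound as associated data."""
    nonce = os.urandom(12)                        # 96-bit nonce, unique per object
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, object_id.encode())
    return nonce + ciphertext                     # store the nonce with the ciphertext


def decrypt_from_cloud(key: bytes, blob: bytes, object_id: str) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, object_id.encode())


key = AESGCM.generate_key(bit_length=256)         # stays with the data owner
blob = encrypt_for_cloud(key, b"quarterly report", "reports/q3.docx")
assert decrypt_from_cloud(key, blob, "reports/q3.docx") == b"quarterly report"
```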

    SDVADC: Secure Deduplication and Virtual Auditing of Data in Cloud

    Get PDF
    Over the last few years, outsourcing data to cloud repository services has become an appealing option that spares users significant effort in data maintenance and administration. In distributed repository services, deduplication is often exploited to minimize the capacity and bandwidth requirements of the service by erasing repetitive data and caching only a single copy of it. Proof-of-Ownership mechanisms allow any owner of the same data to prove to the distributed repository server, in a dynamic way, that he actually possesses the data. In repositories holding enormous amounts of data, the storage servers aim to minimize the volume of cached data, while customers want to examine the integrity of their data at a reasonable cost. We propose the Secure Deduplication and Virtual Auditing of Data in Cloud (SDVADC) mechanism, which realizes integrity auditing and secure deduplication of data in the cloud.
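
    To make the two ingredients concrete, here is a minimal sketch, not the SDVADC protocol itself, of content-hash deduplication combined with a naive proof-of-ownership challenge; the in-memory store and the HMAC-based response are assumptions of this sketch.

```python
# Hash-based deduplication plus a naive proof-of-ownership check (illustrative):
# before the server links a "duplicate" upload to the existing copy, the client
# must answer a fresh challenge that requires possessing the whole file,
# not just its digest.
import hashlib
import hmac
import secrets

store: dict[str, bytes] = {}   # content-addressed server store: digest -> data


def upload(data: bytes) -> str:
    digest = hashlib.sha256(data).hexdigest()
    if digest not in store:
        store[digest] = data               # the first copy is actually stored
    return digest                          # later owners just reference it


def pow_challenge() -> bytes:
    return secrets.token_bytes(16)         # fresh server nonce


def pow_response(data: bytes, nonce: bytes) -> bytes:
    return hmac.new(nonce, data, hashlib.sha256).digest()   # needs the full file


def verify_ownership(digest: str, nonce: bytes, response: bytes) -> bool:
    expected = hmac.new(nonce, store[digest], hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)


# A second client claiming the same file must prove it really holds the bytes.
digest = upload(b"large backup image ...")
nonce = pow_challenge()
assert verify_ownership(digest, nonce, pow_response(b"large backup image ...", nonce))
```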

    Development of a software infrastructure for the secure distribution of documents using free cloud storage

    Full text link
    The 21st century belongs to the world of computing, especially as a result of so-called cloud computing. This technology enables ubiquitous information management, so people can access all their data from any place and at any time. In this landscape, the emergence of cloud storage has played an important role over the last five years. Nowadays, several free public cloud storage services make it possible for users to keep a free backup of their assets and to manage and share them, representing a low-cost opportunity for Small and Medium Companies (SMEs). However, the adoption of cloud storage involves data outsourcing, so a user has no guarantee about the way her data will be processed and protected. Therefore, it seems necessary to endow public cloud storage with a set of means to protect users' confidentiality and privacy, to assess data integrity and to guarantee a proper backup of information assets. For this reason, this work proposes Encrypted Cloud, a desktop application that works on Windows and Ubuntu and that manages, transparently to the user, a variable number of local directories in which users can place their files in an encrypted and balanced way. In particular, the user can choose the local folders created by the Dropbox or Google Drive desktop applications as local directories for Encrypted Cloud, unifying the free storage space offered by these cloud providers. In addition, Encrypted Cloud allows encrypted files to be shared with other users, using our own protocol for distributing symmetric cryptographic keys. Note that, among other functionalities, it also provides a service that monitors files that are deleted or moved by an unauthorized third party.
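
    As an illustration of the balancing idea described above, and not of the actual Encrypted Cloud implementation, the sketch below encrypts a file with AES-GCM (third-party cryptography package) and drops it into whichever locally synced folder currently holds the least data; the folder paths and helper names are assumptions of this sketch.

```python
# Place encrypted files into the least-used locally synced folder so that the
# free tiers of several providers are pooled (illustrative sketch).
# Requires: pip install cryptography
import os
from pathlib import Path

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Hypothetical locations of the providers' sync folders.
SYNC_DIRS = [Path.home() / "Dropbox" / "EncryptedCloud",
             Path.home() / "Google Drive" / "EncryptedCloud"]


def used_bytes(folder: Path) -> int:
    """Total size of the files already placed in a sync folder."""
    return sum(p.stat().st_size for p in folder.rglob("*") if p.is_file())


def store_encrypted(key: bytes, src: Path) -> Path:
    """Encrypt `src` and drop it into the least-used synced folder."""
    for d in SYNC_DIRS:
        d.mkdir(parents=True, exist_ok=True)
    target_dir = min(SYNC_DIRS, key=used_bytes)        # simple load balancing
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, src.read_bytes(), src.name.encode())
    target = target_dir / (src.name + ".enc")
    target.write_bytes(nonce + ciphertext)
    return target
```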

    Adapting a quality model for a Big Data application: the case of a feature prediction system

    Get PDF
    In the last decade, we have witnessed a considerable increase in projects based on Big Data applications. Some of the most popular types of such applications are recommendation systems, feature prediction and decision making. In this new context, several proposals have arisen for quality models applied to Big Data applications, but their great heterogeneity makes it difficult to select the ideal quality model for developing a specific type of Big Data application. In this Master's thesis, a Systematic Mapping Study (SMS) is conducted, starting from two key research questions. The first asks about the state of the art in identifying risks, issues, problems or challenges in Big Data applications. The second asks which quality models have been applied to date to Big Data applications, and specifically to feature prediction systems. The main objective is to analyze the available quality models and to adapt, from the existing ones, a quality model that can be applied to a specific type of Big Data application: feature prediction systems. The defined model comprises a set of quality characteristics and a set of quality metrics to evaluate them. Finally, the model is applied to a case study in which the defined quality characteristics are evaluated through their quality metrics, and the results are presented and discussed.
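
    As a hypothetical illustration of the kind of metric such a quality model attaches to a quality characteristic, the sketch below computes column-level completeness (the share of non-missing values) over a toy feature-prediction dataset; the field names are invented for the example.

```python
# Completeness metric for one field of a dataset (illustrative sketch).
def completeness(records: list[dict], field: str) -> float:
    """Fraction of records in which `field` is present and not None/empty."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)


# Toy rows of a hypothetical feature-prediction dataset.
rows = [{"age": 34, "income": 52000},
        {"age": None, "income": 48000},
        {"age": 29, "income": ""}]
print(completeness(rows, "age"))      # 0.666...
print(completeness(rows, "income"))   # 0.666...
```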

    A Functional Taxonomy of Data Quality Tools: Insights from Science and Practice

    Get PDF
    For organizations, data quality is a prerequisite for automated decision making and agility. To provide high-quality data, numerous tools have emerged that support the different steps of data quality management. Yet these tools vary in their functional composition and in their support for current trends such as AI, and there is no common, up-to-date understanding of the capabilities a data quality tool should provide. In this paper, we develop a functional taxonomy of data quality tools to address this shortcoming and provide a holistic overview of data quality functionalities. We derived the taxonomy through an iterative approach of deductive reasoning, by conducting a systematic literature review, and inductive reasoning, by reviewing existing data quality tools and gaining insights from experts. By applying our taxonomy to 18 commercial data quality tools, we aim to provide the reader with a review of data quality tools and to reach a functional consensus in the field.

    Challenges and requirements of heterogeneous research data management in environmental sciences: a qualitative study

    Get PDF
    The research focuses on the challenges and requirements of heterogeneous research data management in environmental sciences. Environmental research involves diverse data types, and managing and integrating these data sets effectively is crucial. The issue at hand is the lack of specific guidance on how to select and plan an appropriate data management practice to address the challenges of handling and integrating diverse data types in environmental research. The objective of the research is to identify the issues associated with the current data storage approach in research data management and to determine the requirements for an appropriate system to address these challenges. The research adopts a qualitative approach, utilizing semi-structured interviews to collect data. Content analysis is employed to analyze the gathered data and identify relevant issues and requirements. The study reveals various issues in the current data management process, including inconsistencies in data treatment, the risk of unintentional data deletion, loss of knowledge due to staff turnover, a lack of guidelines, and data scattered across multiple locations. The requirements identified through the interviews emphasize the need for a data management system that integrates automation, open access, centralized storage, online electronic lab notes, systematic data management, secure repositories, reduced hardware storage, and version control with metadata support. The research identifies the current challenges faced by researchers in heterogeneous data management and compiles a list of requirements for an effective solution. The findings contribute to existing knowledge on research-related problems and provide a foundation for developing tailored solutions to meet the specific needs of researchers in environmental sciences.