3 research outputs found

    A policy-based containerized filter for secure information sharing in organizational environments

    In organizational environments, sensitive information is unintentionally exposed and sent to the cloud without encryption by insiders who had previously been informed about cloud risks. To mitigate the effects of this information privacy paradox, we propose the design, development and implementation of SecFilter, a security filter that enables organizations to implement security policies for information sharing. SecFilter automatically performs the following tasks: (a) intercepts files before they are sent to the cloud; (b) searches for sensitive criteria in the context and content of the intercepted files by using mining techniques; (c) calculates the risk level for each identified criterion; (d) assigns a security level to each file based on the risk detected in its content and context; and (e) encrypts each file with a multi-level security engine, based on digital envelopes built from symmetric encryption, attribute-based encryption and digital signatures, which guarantees the security services of confidentiality, integrity and authentication for each file while access control mechanisms are enforced, before the secured file versions are sent to cloud storage. A prototype of SecFilter was implemented for a real-world file sharing application deployed on a private cloud. The fine-tuning of SecFilter components is described, and a case study was conducted based on document sharing from a well-known repository (the MedLine corpus). The experimental evaluation revealed the feasibility and efficiency of applying a security filter to share information in organizational environments. This work has been partially supported by the Spanish “Ministerio de Economia y Competitividad” under the project grant TIN2016-79637-P “Towards Unification of HPC and Big Data paradigms”.
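
    The following minimal Python sketch illustrates the flow of steps (a)–(e); the criteria list, risk thresholds and function names are hypothetical, and the attribute-based encryption and digital-signature parts of the digital envelope are only indicated by comments (symmetric encryption is shown with the cryptography package's Fernet API).

        # Hypothetical SecFilter-style flow; only the symmetric-encryption part of the envelope is shown.
        from cryptography.fernet import Fernet

        SENSITIVE_CRITERIA = {"diagnosis": 3, "patient": 2, "address": 1}  # term -> illustrative risk weight

        def intercept(path):                        # (a) intercept the file before it is uploaded
            with open(path, "rb") as f:
                return f.read()

        def find_criteria(text):                    # (b) mine the content for sensitive criteria
            return {t: text.count(t) for t in SENSITIVE_CRITERIA if t in text}

        def security_level(hits):                   # (c) risk per criterion and (d) per-file security level
            score = sum(SENSITIVE_CRITERIA[t] * n for t, n in hits.items())
            return "high" if score >= 5 else "medium" if score >= 2 else "low"

        def envelope(data):                         # (e) digital envelope: symmetric key encrypts the file;
            key = Fernet.generate_key()             #     wrapping the key with ABE and signing are omitted here
            return Fernet(key).encrypt(data), key

        def secfilter(path):
            data = intercept(path)
            hits = find_criteria(data.decode(errors="ignore").lower())
            ciphertext, key = envelope(data)
            return {"file": path, "level": security_level(hits), "ciphertext": ciphertext, "key": key}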

    Enterprise information security risks: a systematic review of the literature

    Computer security, or cybersecurity, is now a key concern in the networks and communications area of any company; it is therefore important to understand the risks and the security policies that enable unified management of the cyber threats that seek to damage the reputation of organizations in the business sector or to profit from their confidential information. The objective of this research is to conduct a systematic review of the literature based on articles published in databases such as Scopus and Dimensions. To perform a complete documentary analysis, inclusion and exclusion criteria were applied to evaluate the quality of each article. Then, using a quantitative scale, articles were filtered by author, period and country of publication, leaving a total of 86 articles from the two databases. The methodology used was the one proposed by Kitchenham. The review concludes that the vast majority of companies do not invest significantly in equipment and in improving their information technology (IT) infrastructure, exposing themselves to cyber-attacks that continue to grow every day. This research provides a reference that researchers, companies and entrepreneurs can consult to protect their organizations' most important assets.
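
    As a rough illustration of the screening step, the sketch below filters a list of article records by publication period and a quality score derived from the inclusion criteria; the field names and thresholds are hypothetical and not taken from the reviewed study.

        # Hypothetical screening of article records exported from Scopus/Dimensions.
        articles = [
            {"title": "A", "year": 2019, "database": "Scopus",     "quality": 4},
            {"title": "B", "year": 2012, "database": "Dimensions", "quality": 5},
            {"title": "C", "year": 2021, "database": "Scopus",     "quality": 2},
        ]

        def passes(article, min_year=2015, min_quality=3):
            # Inclusion criteria: recent publication and a minimum score on the quality checklist.
            return article["year"] >= min_year and article["quality"] >= min_quality

        selected = [a for a in articles if passes(a)]
        print(f"{len(selected)} articles retained")  # here: 1 article passes both criteria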

    A gearbox model for processing large volumes of data by using pipeline systems encapsulated into virtual containers

    Software pipelines enable organizations to chain applications that add value to content (e.g., confidentiality, reliability, and integrity) before either sharing it with partners or sending it to the cloud. However, the pipeline components add overhead when processing large volumes of data, which can become critical in real-world scenarios. This paper presents a gearbox model for processing large volumes of data using pipeline systems encapsulated into virtual containers. In this model, gears represent applications, whereas gearboxes represent software pipelines. The model was implemented as a collaborative system that automatically performs Gear up (by using parallel patterns) and/or Gear down (by using in-memory storage) until all gears produce uniform data processing velocities. This reduces the delays and bottlenecks produced by the heterogeneous performance of the applications included in software pipelines. A new container tool was designed to encapsulate both the collaborative system and the software pipelines into a virtual container and to deploy it on IT infrastructures. We conducted case studies to evaluate its performance when processing medical images and PDF repositories. The incorporation of a capsule into a cloud storage service for pre-processing medical imagery was also studied. The experimental evaluation revealed the feasibility of applying the gearbox model to the deployment of software pipelines in real-world scenarios, as it can significantly improve the end-user service experience when pre-processing large-scale data in comparison with state-of-the-art solutions such as Sacbe and Parsl. This work has been partially supported by the Spanish “Ministerio de Economia y Competitividad” under the project grant TIN2016-79637-P “Towards Unification of HPC and Big Data paradigms”.
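
    A conceptual sketch of the Gear up / Gear down decision is shown below; the stage names, throughput figures and target velocity are invented for illustration and do not come from the paper's implementation.

        # Hypothetical balancing of a three-stage pipeline (throughputs in MB/s).
        import math

        stages = {"anonymize": 40.0, "compress": 120.0, "encrypt": 35.0}
        target = 70.0                                   # illustrative uniform velocity to reach

        plan = {}
        for name, velocity in stages.items():
            if velocity < target:
                # Gear up: replicate the slow stage by applying a parallel pattern.
                plan[name] = {"replicas": math.ceil(target / velocity), "buffer": None}
            else:
                # Gear down: keep a single replica and absorb bursts with in-memory storage.
                plan[name] = {"replicas": 1, "buffer": "in-memory"}

        for name, cfg in plan.items():
            print(name, cfg)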