Classical Cryptographic Protocols in a Quantum World
Cryptographic protocols, such as protocols for secure function evaluation
(SFE), have played a crucial role in the development of modern cryptography.
The extensive theory of these protocols, however, deals almost exclusively with
classical attackers. If we accept that quantum information processing is the
most realistic model of physically feasible computation, then we must ask: what
classical protocols remain secure against quantum attackers?
Our main contribution is showing the existence of classical two-party
protocols for the secure evaluation of any polynomial-time function under
reasonable computational assumptions (for example, it suffices that the
learning with errors problem be hard for quantum polynomial time). Our result
shows that the basic two-party feasibility picture from classical cryptography
remains unchanged in a quantum world.

Comment: Full version of an old paper in Crypto'11. Invited to IJQI. This is the
authors' copy with different formatting.
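The learning-with-errors assumption invoked above can be illustrated with a toy sketch. This is our illustration, not part of the paper, and the parameters are far too small for real security: given samples (a, b = ⟨a, s⟩ + e mod q), recovering s is believed hard even for quantum polynomial time.

```python
import random

# Toy LWE instance: secret s, samples (a, b = <a, s> + e mod q).
# Illustrative parameters only -- vastly smaller than anything secure.
q, n = 97, 8                                   # modulus and secret dimension
s = [random.randrange(q) for _ in range(n)]    # secret vector

def lwe_sample(secret, modulus=q):
    """Return one LWE sample (a, b) with a small error term."""
    a = [random.randrange(modulus) for _ in range(len(secret))]
    e = random.randint(-2, 2)                  # small noise
    b = (sum(x * y for x, y in zip(a, secret)) + e) % modulus
    return a, b

samples = [lwe_sample(s) for _ in range(16)]
# Without the error e, Gaussian elimination on enough samples would reveal s;
# the noise is exactly what makes recovery (conjecturally) quantum-hard.
```
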
SafeSpark: a secure data analytics platform using cryptographic techniques and trusted hardware
Master's dissertation in Informatics Engineering.

Nowadays, most companies resort to data analytics frameworks to extract value from the
increasing amounts of digital information. These systems give companies substantial
competitive advantages, since they support tasks such as making marketing decisions or
predicting user behavior.
Therefore, organizations tend to leverage the cloud to store and perform analytics over
the data. Database services in the cloud offer significant advantages, such as a high
level of efficiency and flexibility and a reduction in the costs inherent to maintaining
and managing private infrastructures. The problem is that these services are often a target
for malicious attacks, which means that sensitive and private personal information can be
compromised.
Current secure analytical processing solutions use a limited set of cryptographic
techniques or technologies, which makes it impossible to explore different trade-offs
between performance, security, and functionality for different applications. Moreover,
these systems also do not explore the combination of multiple cryptographic techniques
and trusted hardware to protect sensitive data.
The work presented here addresses this challenge, by using cryptographic schemes and
the Intel SGX technology to protect confidential information, ensuring a practical solution
which can be adapted to applications with different requirements. In detail, this dissertation
begins by presenting a background study on cryptographic schemes and the Intel SGX
technology, followed by a review of the state of the art on secure data analytics frameworks.
A new solution based on the Apache Spark framework, called SafeSpark, is proposed. It
provides a modular and extensible architecture and prototype, which allows protecting
information and processing analytical queries over encrypted data, using three cryptographic
schemes and the SGX technology. We validated the prototype with an experimental evaluation,
in which we analyze the performance costs of the solution as well as its resource usage.
For this purpose, we use the TPC-DS benchmark to evaluate the proposed solution, and
the results show that it is possible to perform analytical processing on protected data with
a performance impact between 1.13x and 4.1x.

This work was partially funded by FCT - Fundação para a Ciência e a Tecnologia, I.P. (Portuguese Foundation for Science and Technology) within project UID/EEA/50014/2019.
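The abstract does not name the three cryptographic schemes SafeSpark combines. As an illustration of the kind of technique such platforms rely on (our assumption, not SafeSpark's actual implementation), the sketch below uses a deterministic keyed transform as a stand-in for a deterministic (DET) cipher: equal plaintexts map to equal ciphertexts, so equality-based analytics such as GROUP BY can run server-side without decryption.

```python
import hmac
import hashlib

# Hypothetical key, for illustration only.
KEY = b"demo-key"

def det_encrypt(value: str) -> str:
    """Deterministic keyed transform standing in for a DET cipher:
    the same plaintext always yields the same ciphertext."""
    return hmac.new(KEY, value.encode(), hashlib.sha256).hexdigest()

rows = ["lisbon", "porto", "lisbon", "braga", "porto", "lisbon"]
enc_rows = [det_encrypt(r) for r in rows]

# Server-side GROUP BY evaluated over ciphertexts alone:
counts = {}
for c in enc_rows:
    counts[c] = counts.get(c, 0) + 1
```

The trade-off this illustrates is exactly the one the dissertation explores: deterministic schemes enable equality queries but leak repetition patterns, which is why such systems let stronger schemes or trusted hardware be chosen per column.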
Easylife: the data reduction and survey handling system for VIPERS
We present Easylife, the software environment developed within the framework
of the VIPERS project for automatic data reduction and survey handling.
Easylife is a comprehensive system to automatically reduce spectroscopic data,
to monitor the survey advancement at all stages, to distribute data within the
collaboration and to release data to the whole community. It is based on the
OPTICON-funded project FASE, and inherits the FASE capabilities of modularity
and scalability. After describing the software architecture, the main reduction
and quality control features and the main services made available, we show its
performance in terms of reliability of results. We also show how it can be
ported to other projects with different characteristics.

Comment: pre-print, 17 pages, 4 figures, accepted for publication in
Publications of the Astronomical Society of the Pacific.
MoPS: A Modular Protection Scheme for Long-Term Storage
Current trends in technology, such as cloud computing, allow outsourcing the
storage, backup, and archiving of data. This provides efficiency and
flexibility, but also poses new risks for data security. In particular, it has become
crucial to develop protection schemes that ensure security even in the long term,
i.e., beyond the lifetime of keys, certificates, and cryptographic
primitives. However, all current solutions fail to provide optimal performance
for different application scenarios. Thus, in this work, we present MoPS, a
modular protection scheme to ensure authenticity and integrity for data stored
over long periods of time. MoPS does not come with any requirements regarding
the storage architecture and can therefore be used together with existing
archiving or storage systems. It supports a set of techniques which can be
plugged together, combined, and migrated in order to create customized
solutions that fulfill the requirements of different application scenarios in
the best possible way. As a proof of concept we implemented MoPS and provide
performance measurements. Furthermore, our implementation provides additional
features, such as guidance for non-expert users and export functionalities for
external verifiers.

Comment: Original Publication (in the same form): ASIACCS 201
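The renewal-and-migration idea behind such long-term schemes can be sketched as follows. This is our simplified illustration, not MoPS's actual evidence format: before a primitive weakens, a new protection layer is computed over the data plus all previous evidence, so integrity never rests on a single algorithm for the whole archival period.

```python
import hashlib

document = b"archived record"
evidence = []  # ordered list of (algorithm, digest) protection layers

def digest(data: bytes, algo: str) -> str:
    return hashlib.new(algo, data).hexdigest()

def renew(algo: str):
    """Add a layer covering the document *and* all prior evidence."""
    prior = "".join(a + d for a, d in evidence).encode()
    evidence.append((algo, digest(document + prior, algo)))

renew("sha1")      # initial layer with an (eventually weak) primitive
renew("sha256")    # migration: new layer also covers the old evidence
renew("sha3_256")  # later renewal with a newer primitive

def verify() -> bool:
    """Recompute every layer in order; all must match."""
    for i, (algo, d) in enumerate(evidence):
        prior = "".join(a + x for a, x in evidence[:i]).encode()
        if digest(document + prior, algo) != d:
            return False
    return True
```

Because each layer binds the earlier ones, a later break of SHA-1 does not invalidate the chain: the SHA-256 and SHA3-256 layers were created while SHA-1 evidence was still trustworthy.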
An algorithmic and architectural study on Montgomery exponentiation in RNS
Modular exponentiation on large numbers is computationally intensive. An effective way to perform this operation is to use Montgomery exponentiation in the Residue Number System (RNS). This paper presents an algorithmic and architectural study of this exponentiation approach. From the algorithmic point of view, new and state-of-the-art opportunities arising from the reorganization of operations and precomputations are considered. From the architectural perspective, the design opportunities offered by well-known computer arithmetic techniques are studied, with the aim of developing an efficient arithmetic cell architecture. Furthermore, since efficient RNS bases with low Hamming weight are being considered with ever more interest, four additional cell architectures specifically tailored to these bases are developed, and the trade-off between their benefits and drawbacks is carefully explored. An overall comparison among all the considered algorithmic approaches and cell architectures is presented, with the aim of providing the reader with an extensive overview of the opportunities for Montgomery exponentiation in RNS.
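The core technique can be illustrated in plain binary form rather than RNS. The sketch below is our illustration of Montgomery exponentiation itself, not the paper's RNS cell architectures: operands stay in the Montgomery domain x·R mod N, so each step needs only multiplications, additions, and shifts, never a trial division by N.

```python
def montgomery_exp(base, exp, N):
    """Square-and-multiply exponentiation using Montgomery reduction.
    Requires N odd (so that gcd(N, R) = 1 for R a power of two)."""
    R_bits = N.bit_length()
    R = 1 << R_bits
    N_inv = pow(-N, -1, R)           # -N^{-1} mod R (Python 3.8+)

    def redc(T):
        # Montgomery reduction: returns T * R^{-1} mod N for T < N*R.
        m = (T * N_inv) % R
        t = (T + m * N) >> R_bits    # exact division by R
        return t - N if t >= N else t

    x = (base * R) % N               # enter the Montgomery domain
    acc = R % N                      # Montgomery form of 1
    for bit in bin(exp)[2:]:         # left-to-right over the exponent bits
        acc = redc(acc * acc)        # square
        if bit == "1":
            acc = redc(acc * x)      # conditional multiply
    return redc(acc)                 # leave the Montgomery domain
```

In the paper's RNS setting the same algorithm runs with every operand split across a base of coprime moduli, which is what turns the large multiplications inside `redc` into small, parallel per-channel operations.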